| datasetId | card |
|---|---|
reorderdata/ReorderData | ---
task_categories:
- graph-ml
--- |
james-burton/OrientalMuseum_min5-3Dwhite-name | ---
dataset_info:
features:
- name: obj_num
dtype: string
- name: file
dtype: string
- name: image
dtype: image
- name: root
dtype: string
- name: description
dtype: string
- name: label
dtype:
class_label:
names:
'0': Aegis
'1': Ajaeng Holder
'2': Album Painting
'3': Amulet Mould
'4': Animal Figurine
'5': Animal Mummy
'6': Animal bone
'7': Arm Guard
'8': Axe Head
'9': Axle-caps
'10': Ball
'11': Ballista Bolt
'12': Band
'13': Basin
'14': Baton
'15': Belt Hook
'16': Betel Nut Cutter
'17': Blouse
'18': Blu-ray disc
'19': Bolt
'20': Book Cover
'21': Box
'22': Brush Pot
'23': Brush Rest
'24': Brush Tray
'25': Bulb Bowl
'26': Bullet Mould
'27': Burnisher
'28': Cabinet
'29': Cannon
'30': Cap
'31': Carved stone
'32': Case
'33': Cash Box
'34': Chest
'35': Cigar Holder
'36': Clapper
'37': Clay pipe (smoking)
'38': Comb
'39': Cosmetic and Medical Equipment and Implements
'40': Cricket pot
'41': Cross-bow Lock
'42': Cup And Saucer
'43': Cup, Saucer
'44': Cushion Cover
'45': DVDs
'46': Dagger
'47': Dice Box
'48': Dice Shaker
'49': Disc
'50': Domestic Equipment and Utensils
'51': Double Dagger
'52': Ear Protector
'53': Ear Stud
'54': Earring
'55': Elephant Goad
'56': Erotic Figurine
'57': Eye Protector
'58': Figurine Mould
'59': Finger Ring
'60': Funerary Cone
'61': Funerary goods
'62': Funerary money
'63': Furosode
'64': Greek crosses
'65': Hand Jade
'66': Hand Protector
'67': Handwarmer
'68': Hanging
'69': Headband
'70': Heart Scarab
'71': Human Figurine
'72': Incense Holder
'73': Inkstick
'74': Kite
'75': Knee Protector
'76': Kohl Pot
'77': Kundika
'78': Leaflet
'79': Letter
'80': Lock
'81': Mah Jong Rack
'82': Majiang set
'83': Manuscript Page
'84': Mat
'85': Mica Painting
'86': Miniature Painting
'87': Miniature Portrait
'88': Mortar
'89': Mould
'90': Mouth Jade
'91': Mouth Protector
'92': Mouth-piece
'93': Mummy Label
'94': Nail Protector
'95': Nose Protector
'96': Opium Pipe
'97': Opium Weight
'98': Oracle Bone
'99': Ostraka
'100': Palette
'101': Panel
'102': Part
'103': Pelmet
'104': Pencase
'105': Pendant
'106': Perfumer
'107': Phylactery
'108': Pigstick
'109': Pipe
'110': Pipe Case
'111': Pipe Holder
'112': Pith Painting
'113': Plaque
'114': Plate
'115': Poh Kam
'116': Pounder
'117': Prayer Wheel
'118': Rank Square
'119': Rubber
'120': Sake Cup
'121': Scabbard Chape
'122': Scabbard Slide
'123': Scarab Seal
'124': Scarf
'125': Score Board
'126': Screen
'127': Seal
'128': Seal Paste Pot
'129': Shaft Terminal
'130': Shield
'131': Shroud Weight
'132': Sleeve Band
'133': Sleeve Weight
'134': Slide
'135': Soles
'136': Spillikins
'137': Staff Head
'138': Stamp
'139': Stand
'140': Stand of Incense Burner
'141': Stem Bowl
'142': Stem Cup
'143': Story Cloth
'144': Strainer
'145': Sword Guard
'146': Table
'147': Table Runner
'148': Thangka
'149': Tomb Figure
'150': Tomb Model
'151': Washer
'152': Water Dropper
'153': Water Pot
'154': Wine Pot
'155': Woodblock Print
'156': Writing Desk
'157': accessories
'158': adzes
'159': alabastra
'160': albums
'161': altar components
'162': amphorae
'163': amulets
'164': anchors
'165': animation cels
'166': animation drawings
'167': anklets
'168': armbands
'169': armor
'170': armrests
'171': arrowheads
'172': arrows
'173': autograph albums
'174': axes
'175': 'axes: woodworking tools'
'176': back scratchers
'177': badges
'178': bags
'179': bandages
'180': bangles
'181': banners
'182': baskets
'183': beads
'184': beakers
'185': bedspreads
'186': bells
'187': belts
'188': bezels
'189': blades
'190': board games
'191': boats
'192': boilers
'193': booklets
'194': books
'195': bottles
'196': bowls
'197': boxes
'198': bracelets
'199': bread
'200': brick
'201': brooches
'202': brush washers
'203': brushes
'204': buckets
'205': buckles
'206': business cards
'207': buttons
'208': caddies
'209': calligraphy
'210': candelabras
'211': candleholders
'212': candlesticks
'213': canopic jars
'214': card cases
'215': card tables
'216': cards
'217': carvings
'218': cases
'219': celestial globes
'220': censers
'221': chains
'222': chairs
'223': charms
'224': charts
'225': chess sets
'226': chessmen
'227': chisels
'228': chopsticks
'229': cigarette cases
'230': cigarette holders
'231': cippi
'232': claypipe
'233': cloth
'234': clothing
'235': coats
'236': coffins
'237': coins
'238': collar
'239': compact discs
'240': containers
'241': coverings
'242': covers
'243': cuffs
'244': cups
'245': cushions
'246': cylinder seals
'247': deels
'248': deity figurine
'249': diagrams
'250': dice
'251': dishes
'252': document containers
'253': documents
'254': dolls
'255': doors
'256': drawings
'257': dresses
'258': drums
'259': dung-chen
'260': earrings
'261': embroidery
'262': ensembles
'263': envelopes
'264': 'equipment for personal use: grooming, hygiene and health care'
'265': ewers
'266': fans
'267': 'feet: furniture components'
'268': female figurine
'269': fiddles
'270': figures
'271': figurines
'272': finials
'273': flagons
'274': flags
'275': flasks
'276': fragments
'277': furniture components
'278': gameboards
'279': gaming counters
'280': ge
'281': glassware
'282': goblets
'283': gongs
'284': gowns
'285': greeting cards
'286': hair ornaments
'287': hairpins
'288': hammerstones
'289': handles
'290': handscrolls
'291': harnesses
'292': hats
'293': headdresses
'294': headrests
'295': heads
'296': headscarves
'297': helmets
'298': hobs
'299': hoods
'300': houses
'301': identity cards
'302': illuminated manuscripts
'303': incense burners
'304': incense sticks
'305': ink bottles
'306': inkstands
'307': inkstones
'308': inkwells
'309': inlays
'310': iron
'311': jackets
'312': jar seal
'313': jars
'314': jewelry
'315': juglets
'316': jugs
'317': keys
'318': kimonos
'319': knives
'320': ladles
'321': lamps
'322': lanterns
'323': lanyards
'324': leatherwork
'325': lids
'326': loom weights
'327': maces
'328': manuscripts
'329': maps
'330': masks
'331': medals
'332': miniatures
'333': mirrors
'334': models
'335': money
'336': mounts
'337': mugs
'338': mummies
'339': musical instruments
'340': nails
'341': necklaces
'342': needles
'343': netsukes
'344': nozzles
'345': obelisks
'346': obis
'347': oboes
'348': oil lamps
'349': ornaments
'350': pages
'351': paintings
'352': paper money
'353': paperweights
'354': papyrus
'355': passports
'356': pectorals
'357': pendants
'358': pestles
'359': petticoats
'360': photograph albums
'361': photographs
'362': pictures
'363': pins
'364': pipes
'365': pitchers
'366': playing card boxes
'367': playing cards
'368': plinths
'369': plumb bobs
'370': plume holders
'371': poker
'372': pommels
'373': postage stamps
'374': postcards
'375': posters
'376': pots
'377': pottery
'378': prayers
'379': printing blocks
'380': printing plates
'381': prints
'382': punch bowls
'383': puppets
'384': purses
'385': puzzles
'386': pyxides
'387': quilts
'388': razors
'389': reliefs
'390': rifles
'391': rings
'392': robes
'393': roofing tile
'394': rose bowls
'395': rubbings
'396': rugs
'397': rulers
'398': sandals
'399': saris
'400': sarongs
'401': sashes
'402': sauceboats
'403': saucers
'404': saws
'405': scabbards
'406': scaraboids
'407': scarabs
'408': scepters
'409': scissors
'410': scrolls
'411': sculpture
'412': seed
'413': seppa
'414': shadow puppets
'415': shawls
'416': shears
'417': shell
'418': shelves
'419': sherds
'420': shields
'421': shoes
'422': shrines
'423': sistra
'424': situlae
'425': sketches
'426': skewers
'427': skirts
'428': snuff bottles
'429': socks
'430': spatulas
'431': spearheads
'432': spears
'433': spittoons
'434': spoons
'435': statues
'436': statuettes
'437': steelyards
'438': stelae
'439': sticks
'440': stirrup jars
'441': stools
'442': stoppers
'443': straps
'444': studs
'445': styluses
'446': sugar bowls
'447': swagger sticks
'448': swords
'449': tablets
'450': tacks
'451': talismans
'452': tallies
'453': tangrams
'454': tankards
'455': tea bowls
'456': tea caddies
'457': tea kettles
'458': teacups
'459': teapots
'460': telephones
'461': ties
'462': tiles
'463': toggles
'464': toilet caskets
'465': tools
'466': toys
'467': trays
'468': trophies
'469': trousers
'470': tubes
'471': tureens
'472': tweezers
'473': typewriters
'474': underwear
'475': unidentified
'476': urinals
'477': ushabti
'478': utensils
'479': vases
'480': veils
'481': vessels
'482': waistcoats
'483': watches
'484': weight
'485': weights
'486': whetstones
'487': whistles
'488': whorls
'489': wood blocks
'490': writing boards
- name: other_name
dtype: string
- name: material
dtype: string
- name: production.period
dtype: string
- name: production.place
dtype: string
splits:
- name: validation
num_bytes: 630038648.86
num_examples: 5436
- name: test
num_bytes: 613408499.456
num_examples: 5436
- name: train
num_bytes: 6479571973.5
num_examples: 115500
download_size: 6245167957
dataset_size: 7723019121.816
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: test
path: data/test-*
- split: train
path: data/train-*
---
|
fenffef/afqmc | ---
license: mit
---
|
CyberHarem/birmingham_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of birmingham/バーミンガム/伯明翰 (Azur Lane)
This is the dataset of birmingham/バーミンガム/伯明翰 (Azur Lane), containing 26 images and their tags.
The core tags of this character are `bangs, red_hair, hair_ornament, breasts, red_eyes, short_hair, sidelocks, blunt_bangs, hairband, medium_breasts`, which are pruned in this dataset.
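As a hedged illustration (not the actual crawler code), pruning the core tags from a per-image tag list can be sketched as a simple set filter; the example tags below are hypothetical:

```python
# Hypothetical sketch: core character tags are implied by the character itself,
# so they are removed from each image's tag list.
core_tags = {
    "bangs", "red_hair", "hair_ornament", "breasts", "red_eyes",
    "short_hair", "sidelocks", "blunt_bangs", "hairband", "medium_breasts",
}

image_tags = ["1girl", "red_hair", "dress", "bangs", "solo"]
pruned = [t for t in image_tags if t not in core_tags]
print(pruned)  # ['1girl', 'dress', 'solo']
```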
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 26 | 28.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/birmingham_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 26 | 18.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/birmingham_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 53 | 33.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/birmingham_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 26 | 25.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/birmingham_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 53 | 44.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/birmingham_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/birmingham_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some recurring outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, dress, solo, bare_shoulders, black_gloves, looking_at_viewer, half_gloves, cape, simple_background, blue_hairband, blush, panties, short_hair_with_long_locks, black_thighhighs, white_background |
| 1 | 5 |  |  |  |  |  | 1girl, goggles_on_head, solo, arm_strap, holding, looking_at_viewer, outdoors, white_one-piece_swimsuit, bare_shoulders, closed_mouth, orange_hair, short_hair_with_long_locks, thigh_strap, water, blue_sky, covered_navel, day, legs, orange_eyes, skindentation, splashing, thighs, wet |
| 2 | 5 |  |  |  |  |  | 1girl, solo, fur_trim, hair_flower, looking_at_viewer, oil-paper_umbrella, sash, chinese_clothes, double_bun, floral_print, full_body, holding_umbrella, standing, wide_sleeves, yellow_eyes, long_sleeves, parted_lips, simple_background, snowing, tree, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | dress | solo | bare_shoulders | black_gloves | looking_at_viewer | half_gloves | cape | simple_background | blue_hairband | blush | panties | short_hair_with_long_locks | black_thighhighs | white_background | goggles_on_head | arm_strap | holding | outdoors | white_one-piece_swimsuit | closed_mouth | orange_hair | thigh_strap | water | blue_sky | covered_navel | day | legs | orange_eyes | skindentation | splashing | thighs | wet | fur_trim | hair_flower | oil-paper_umbrella | sash | chinese_clothes | double_bun | floral_print | full_body | holding_umbrella | standing | wide_sleeves | yellow_eyes | long_sleeves | parted_lips | snowing | tree |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:-----------------|:---------------|:--------------------|:--------------|:-------|:--------------------|:----------------|:--------|:----------|:-----------------------------|:-------------------|:-------------------|:------------------|:------------|:----------|:-----------|:---------------------------|:---------------|:--------------|:--------------|:--------|:-----------|:----------------|:------|:-------|:--------------|:----------------|:------------|:---------|:------|:-----------|:--------------|:---------------------|:-------|:------------------|:-------------|:---------------|:------------|:-------------------|:-----------|:---------------|:--------------|:---------------|:--------------|:----------|:-------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | | X | X | | X | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | | X | | | X | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
khaledrabie1979/ROAA-SHORT2 | ---
license: apache-2.0
---
|
CVasNLPExperiments/OxfordFlowers_test_google_flan_t5_xxl_mode_A_T_ns_100 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_ViT_L_14_Attributes_ViT_L_14_text_davinci_003_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 58390
num_examples: 100
download_size: 14948
dataset_size: 58390
---
# Dataset Card for "OxfordFlowers_test_google_flan_t5_xxl_mode_A_T_ns_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
threite/github-ds-tokenized | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 8618263476
num_examples: 16702061
- name: valid
num_bytes: 48072624
num_examples: 93164
download_size: 3804663704
dataset_size: 8666336100
---
# Dataset Card for "github-ds-tokenized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Duskfallcrew/autotrain-data-phototest | ---
task_categories:
- image-classification
- text-to-image
license: creativeml-openrail-m
language:
- en
pretty_name: Phototest
size_categories:
- 1K<n<10K
---
# AutoTrain Dataset for project: phototest
## Dataset Description
This dataset has been automatically processed by AutoTrain for project phototest.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"image": "<768x768 RGB PIL image>",
"target": 0
},
{
"image": "<768x768 RGB PIL image>",
"target": 3
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"target": "ClassLabel(names=['Row 1', 'Row 2', 'Row 3', 'Row 4', 'Row 5'], id=None)"
}
```
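A minimal sketch (assuming the `ClassLabel` semantics above) of mapping the integer `target` field back to its class name:

```python
# The "target" field is an integer index into the ClassLabel names list.
class_names = ['Row 1', 'Row 2', 'Row 3', 'Row 4', 'Row 5']

sample = {"image": "<768x768 RGB PIL image>", "target": 3}
label_name = class_names[sample["target"]]
print(label_name)  # Row 4
```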
### Dataset Splits
This dataset is split into train and validation splits. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 72 |
| valid | 19 | |
jdsannchao/non_existent | ---
dataset_info:
- config_name: attr_qa
features:
- name: img_id
dtype: int64
- name: orig_qa
dtype: string
- name: question_text
dtype: string
- name: answer_text
dtype: string
splits:
- name: train
num_bytes: 43062088
num_examples: 704759
download_size: 12017273
dataset_size: 43062088
- config_name: exist_qa
features:
- name: img_id
dtype: int64
- name: orig_qa
dtype: string
- name: question_text
dtype: string
- name: answer_text
dtype: string
splits:
- name: train
num_bytes: 50290552
num_examples: 733586
download_size: 13928584
dataset_size: 50290552
- config_name: relation_qa
features:
- name: img_id
dtype: int64
- name: orig_qa
dtype: string
- name: question_text
dtype: string
- name: answer_text
dtype: string
splits:
- name: train
num_bytes: 48465571
num_examples: 712248
download_size: 14150304
dataset_size: 48465571
configs:
- config_name: attr_qa
data_files:
- split: train
path: attr_qa/train-*
- config_name: exist_qa
data_files:
- split: train
path: exist_qa/train-*
- config_name: relation_qa
data_files:
- split: train
path: relation_qa/train-*
---
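As an illustrative sketch (the file names below are hypothetical), the per-config `data_files` glob patterns above select each config's split files like this:

```python
import fnmatch

# Hypothetical file listing for this repo; each config keeps its own train shards.
files = [
    "attr_qa/train-00000-of-00001.parquet",
    "exist_qa/train-00000-of-00001.parquet",
    "relation_qa/train-00000-of-00001.parquet",
]

config = "exist_qa"
pattern = f"{config}/train-*"
matched = [f for f in files if fnmatch.fnmatch(f, pattern)]
print(matched)  # ['exist_qa/train-00000-of-00001.parquet']
```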
|
huolongguo10/insecure | ---
license: openrail
task_categories:
- text-classification
language:
- en
tags:
- code
pretty_name: final
size_categories:
- 10K<n<100K
---
Suggested as the final version. Contains XSS, SQL injection, and similar attack data; the benign (secure) samples are drawn from part of SST-2. |
HydraLM/GPTeacher_toolformer_list_dict | ---
dataset_info:
features:
- name: conversations
list:
- name: input
dtype: string
- name: instruction
dtype: string
- name: response
dtype: string
- name: conversation_id
dtype: int64
splits:
- name: train
num_bytes: 3339686
num_examples: 7672
download_size: 809207
dataset_size: 3339686
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "GPTeacher_toolformer_list_dict"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/med_alpaca_standardized_cluster_13 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 196125315
num_examples: 19241
download_size: 58457645
dataset_size: 196125315
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_13"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EarthnDusk/Duskfallcrew_Art | ---
license: creativeml-openrail-m
language:
- en
tags:
- stable diffusion
pretty_name: Duskfallcrew Art Style Dataset
size_categories:
- n<1K
---
# Dataset Card for Duskfallcrew Art Style Dataset
Dataset for the Duskfallcrew art style, aka Kieran Somerville. This is a self-collected, self-made dataset of the artist's own work.
## License Requirements
You have the right to distribute the LoRA weights you train on this dataset, but you do not **own** the dataset. You are welcome to keep using it for LoRA or checkpoint training on any Stable Diffusion, Stable Cascade, or PixArt models.
Please check the full Out-of-Scope Use section for details on prohibited uses.
Largely the only things Earth & Dusk ask are that you do not RESELL the dataset and do not create print-on-demand products with it.
We realize the art isn't that great, but it's our art, and we wanted to share it.
## Dataset Details
### Dataset Description
Comic style art by Duskfallcrew of Earth & Dusk
## Uses
Combining this with other styles in multi-style LoRAs would be wonderful; just note that you do not own the dataset.
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
Modified from: https://freedevproject.org/faipl-1.0-sd/
You may not use this dataset or any derived model for the following:
In any way that violates any applicable national, federal, state, local or international law or regulation;
For the purpose of exploiting, harming or attempting to exploit or harm minors in any way;
To generate or disseminate verifiably false information and/or content with the purpose of harming others;
To generate or disseminate personal identifiable information that can be used to harm an individual;
To defame, disparage or otherwise harass others;
For fully automated decision making that adversely impacts an individual’s legal rights or otherwise creates or modifies a binding, enforceable obligation;
For any use intended to or which has the effect of discriminating against or harming individuals or groups based on online or offline social behavior or known or predicted personal or personality characteristics;
To exploit any of the vulnerabilities of a specific group of persons based on their age, social, physical or mental characteristics, in order to materially distort the behavior of a person pertaining to that group in a manner that causes or is likely to cause that person or another person physical or psychological harm;
For any use intended to or which has the effect of discriminating against individuals or groups based on legally protected characteristics or categories;
To provide medical advice and medical results interpretation;
To generate or disseminate information for the purpose to be used for administration of justice, law enforcement, immigration or asylum processes, such as predicting an individual will commit fraud/crime commitment (e.g. by text profiling, drawing causal relationships between assertions made in documents, indiscriminate and arbitrarily-targeted use).
No Harm
You agree that no contributor’s conduct in the creation of this dataset has caused you any harm. As far as the law allows, you give up your right to pursue any kind of legal claim against any contributor for actions related to the creation of this dataset, even if those actions broke a previous agreement.
Additionally, you agree not to use this dataset for harmful purposes, as listed in Prohibited Uses. These restrictions do not apply to non-model parts of this software.
No Liability
As far as the law allows, this software comes as is, without any warranty or condition, and no contributor will be liable to anyone for any damages related to this dataset or this license, under any kind of legal claim.
## Dataset Card Contact
For queries about copyright and licensing of the dataset: https://www.end-media.org
|
AhBotNLP/ahbot_wakeword | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: label
dtype:
class_label:
names:
'0': ahbot
'1': ahbot_close
'2': background_noise
splits:
- name: train
num_bytes: 1190845036.86
num_examples: 1124
download_size: 0
dataset_size: 1190845036.86
task_categories:
- audio-classification
---
# Dataset Card for "ahbot_wakeword"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LambdaTests/VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_21_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: response
dtype: string
splits:
- name: train
num_bytes: 980
num_examples: 32
download_size: 2073
dataset_size: 980
---
# Dataset Card for "VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_21_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SlapDrone/hf-stack-v1 | ---
dataset_info:
features:
- name: repo_id
dtype: string
- name: file_path
dtype: string
- name: content
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 111464935
num_examples: 7045
download_size: 37891644
dataset_size: 111464935
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
erwinqi/conslam_relabelled_semantic | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 401815947.0
num_examples: 88
- name: validation
num_bytes: 49293721.0
num_examples: 10
download_size: 451129141
dataset_size: 451109668.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
PhilSad/Control-Face-data-sameface | ---
dataset_info:
features:
- name: gender
dtype: string
- name: conditionning_image
dtype: image
- name: objective_image
dtype: image
- name: caption
dtype: string
- name: pers_id
dtype: int64
splits:
- name: train
num_bytes: 141728186.282
num_examples: 10177
download_size: 137859013
dataset_size: 141728186.282
---
# Dataset Card for "Control-Face-data-sameface"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-futin__feed-top_en-c0540d-2175569976 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- futin/feed
eval_info:
task: text_zero_shot_classification
model: facebook/opt-125m
metrics: []
dataset_name: futin/feed
dataset_config: top_en
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: facebook/opt-125m
* Dataset: futin/feed
* Config: top_en
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@futin](https://huggingface.co/futin) for evaluating this model. |
Helix21/med_faq_embeddings | ---
license: mit
---
|
LukeEuser/docvqa_20_unanswerable_questions | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: query
struct:
- name: de
dtype: string
- name: en
dtype: string
- name: es
dtype: string
- name: fr
dtype: string
- name: it
dtype: string
- name: answers
sequence: string
- name: words
sequence: string
- name: bounding_boxes
sequence:
sequence: float32
length: 4
- name: answer
struct:
- name: match_score
dtype: float64
- name: matched_text
dtype: string
- name: start
dtype: int64
- name: text
dtype: string
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 33132040.0
num_examples: 100
- name: test
num_bytes: 6102508.0
num_examples: 20
download_size: 13285946
dataset_size: 39234548.0
---
# Dataset Card for "docvqa_20_unanswerable_questions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Codec-SUPERB/librispeech_asr_dummy_unit | ---
dataset_info:
features:
- name: id
dtype: string
- name: unit
sequence:
sequence: int64
splits:
- name: academicodec_hifi_16k_320d
num_bytes: 535736
num_examples: 63
- name: academicodec_hifi_16k_320d_large_uni
num_bytes: 535736
num_examples: 63
- name: academicodec_hifi_24k_320d
num_bytes: 802552
num_examples: 63
- name: audiodec_24k_320d
num_bytes: 1713544
num_examples: 63
- name: dac_16k
num_bytes: 2089080
num_examples: 63
- name: dac_24k
num_bytes: 8212840
num_examples: 63
- name: dac_44k
num_bytes: 2641068
num_examples: 63
- name: encodec_24k_12bps
num_bytes: 3212072
num_examples: 63
- name: encodec_24k_1_5bps
num_bytes: 402832
num_examples: 63
- name: encodec_24k_24bps
num_bytes: 6422632
num_examples: 63
- name: encodec_24k_3bps
num_bytes: 804152
num_examples: 63
- name: encodec_24k_6bps
num_bytes: 1606792
num_examples: 63
- name: funcodec_en_libritts_16k_gr1nq32ds320
num_bytes: 4291432
num_examples: 63
- name: funcodec_en_libritts_16k_gr8nq32ds320
num_bytes: 4291432
num_examples: 63
- name: funcodec_en_libritts_16k_nq32ds320
num_bytes: 4285032
num_examples: 63
- name: funcodec_en_libritts_16k_nq32ds640
num_bytes: 2152040
num_examples: 63
- name: funcodec_zh_en_16k_nq32ds320
num_bytes: 4285032
num_examples: 63
- name: funcodec_zh_en_16k_nq32ds640
num_bytes: 2152040
num_examples: 63
- name: speech_tokenizer_16k
num_bytes: 1072392
num_examples: 63
download_size: 7889841
dataset_size: 51508436
configs:
- config_name: default
data_files:
- split: academicodec_hifi_16k_320d
path: data/academicodec_hifi_16k_320d-*
- split: academicodec_hifi_16k_320d_large_uni
path: data/academicodec_hifi_16k_320d_large_uni-*
- split: academicodec_hifi_24k_320d
path: data/academicodec_hifi_24k_320d-*
- split: audiodec_24k_320d
path: data/audiodec_24k_320d-*
- split: dac_16k
path: data/dac_16k-*
- split: dac_24k
path: data/dac_24k-*
- split: dac_44k
path: data/dac_44k-*
- split: encodec_24k_12bps
path: data/encodec_24k_12bps-*
- split: encodec_24k_1_5bps
path: data/encodec_24k_1_5bps-*
- split: encodec_24k_24bps
path: data/encodec_24k_24bps-*
- split: encodec_24k_3bps
path: data/encodec_24k_3bps-*
- split: encodec_24k_6bps
path: data/encodec_24k_6bps-*
- split: funcodec_en_libritts_16k_gr1nq32ds320
path: data/funcodec_en_libritts_16k_gr1nq32ds320-*
- split: funcodec_en_libritts_16k_gr8nq32ds320
path: data/funcodec_en_libritts_16k_gr8nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds320
path: data/funcodec_en_libritts_16k_nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds640
path: data/funcodec_en_libritts_16k_nq32ds640-*
- split: funcodec_zh_en_16k_nq32ds320
path: data/funcodec_zh_en_16k_nq32ds320-*
- split: funcodec_zh_en_16k_nq32ds640
path: data/funcodec_zh_en_16k_nq32ds640-*
- split: speech_tokenizer_16k
path: data/speech_tokenizer_16k-*
---
|
MicPie/unpredictable_cluster26 | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- en
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: UnpredicTable-cluster26
size_categories:
- 100K<n<1M
source_datasets: []
task_categories:
- multiple-choice
- question-answering
- zero-shot-classification
- text2text-generation
- table-question-answering
- text-generation
- text-classification
- tabular-classification
task_ids:
- multiple-choice-qa
- extractive-qa
- open-domain-qa
- closed-domain-qa
- closed-book-qa
- open-book-qa
- language-modeling
- multi-class-classification
- natural-language-inference
- topic-classification
- multi-label-classification
- tabular-multi-class-classification
- tabular-multi-label-classification
---
# Dataset Card for "UnpredicTable-cluster26" - Dataset of Few-shot Tasks from Tables
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
  - [Data Fields](#data-fields)
  - [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** https://ethanperez.net/unpredictable
- **Repository:** https://github.com/JunShern/few-shot-adaptation
- **Paper:** Few-shot Adaptation Works with UnpredicTable Data
- **Point of Contact:** junshern@nyu.edu, perez@nyu.edu
### Dataset Summary
The UnpredicTable dataset consists of web tables formatted as few-shot tasks for fine-tuning language models to improve their few-shot performance.
There are several dataset versions available:
* [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full): Starting from the initial WTC corpus of 50M tables, we apply our tables-to-tasks procedure to produce our resulting dataset, [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full), which comprises 413,299 tasks from 23,744 unique websites.
* [UnpredicTable-unique](https://huggingface.co/datasets/MicPie/unpredictable_unique): This is the same as [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full) but filtered to have a maximum of one task per website. [UnpredicTable-unique](https://huggingface.co/datasets/MicPie/unpredictable_unique) contains exactly 23,744 tasks from 23,744 websites.
* [UnpredicTable-5k](https://huggingface.co/datasets/MicPie/unpredictable_5k): This dataset contains 5k random tables from the full dataset.
* UnpredicTable data subsets based on a manual human quality rating (please see our publication for details of the ratings):
* [UnpredicTable-rated-low](https://huggingface.co/datasets/MicPie/unpredictable_rated-low)
* [UnpredicTable-rated-medium](https://huggingface.co/datasets/MicPie/unpredictable_rated-medium)
* [UnpredicTable-rated-high](https://huggingface.co/datasets/MicPie/unpredictable_rated-high)
* UnpredicTable data subsets based on the website of origin:
* [UnpredicTable-baseball-fantasysports-yahoo-com](https://huggingface.co/datasets/MicPie/unpredictable_baseball-fantasysports-yahoo-com)
* [UnpredicTable-bulbapedia-bulbagarden-net](https://huggingface.co/datasets/MicPie/unpredictable_bulbapedia-bulbagarden-net)
* [UnpredicTable-cappex-com](https://huggingface.co/datasets/MicPie/unpredictable_cappex-com)
* [UnpredicTable-cram-com](https://huggingface.co/datasets/MicPie/unpredictable_cram-com)
* [UnpredicTable-dividend-com](https://huggingface.co/datasets/MicPie/unpredictable_dividend-com)
* [UnpredicTable-dummies-com](https://huggingface.co/datasets/MicPie/unpredictable_dummies-com)
* [UnpredicTable-en-wikipedia-org](https://huggingface.co/datasets/MicPie/unpredictable_en-wikipedia-org)
* [UnpredicTable-ensembl-org](https://huggingface.co/datasets/MicPie/unpredictable_ensembl-org)
* [UnpredicTable-gamefaqs-com](https://huggingface.co/datasets/MicPie/unpredictable_gamefaqs-com)
* [UnpredicTable-mgoblog-com](https://huggingface.co/datasets/MicPie/unpredictable_mgoblog-com)
* [UnpredicTable-mmo-champion-com](https://huggingface.co/datasets/MicPie/unpredictable_mmo-champion-com)
* [UnpredicTable-msdn-microsoft-com](https://huggingface.co/datasets/MicPie/unpredictable_msdn-microsoft-com)
* [UnpredicTable-phonearena-com](https://huggingface.co/datasets/MicPie/unpredictable_phonearena-com)
* [UnpredicTable-sittercity-com](https://huggingface.co/datasets/MicPie/unpredictable_sittercity-com)
* [UnpredicTable-sporcle-com](https://huggingface.co/datasets/MicPie/unpredictable_sporcle-com)
* [UnpredicTable-studystack-com](https://huggingface.co/datasets/MicPie/unpredictable_studystack-com)
* [UnpredicTable-support-google-com](https://huggingface.co/datasets/MicPie/unpredictable_support-google-com)
* [UnpredicTable-w3-org](https://huggingface.co/datasets/MicPie/unpredictable_w3-org)
* [UnpredicTable-wiki-openmoko-org](https://huggingface.co/datasets/MicPie/unpredictable_wiki-openmoko-org)
* [UnpredicTable-wkdu-org](https://huggingface.co/datasets/MicPie/unpredictable_wkdu-org)
* UnpredicTable data subsets based on clustering (for the clustering details please see our publication):
* [UnpredicTable-cluster00](https://huggingface.co/datasets/MicPie/unpredictable_cluster00)
* [UnpredicTable-cluster01](https://huggingface.co/datasets/MicPie/unpredictable_cluster01)
* [UnpredicTable-cluster02](https://huggingface.co/datasets/MicPie/unpredictable_cluster02)
* [UnpredicTable-cluster03](https://huggingface.co/datasets/MicPie/unpredictable_cluster03)
* [UnpredicTable-cluster04](https://huggingface.co/datasets/MicPie/unpredictable_cluster04)
* [UnpredicTable-cluster05](https://huggingface.co/datasets/MicPie/unpredictable_cluster05)
* [UnpredicTable-cluster06](https://huggingface.co/datasets/MicPie/unpredictable_cluster06)
* [UnpredicTable-cluster07](https://huggingface.co/datasets/MicPie/unpredictable_cluster07)
* [UnpredicTable-cluster08](https://huggingface.co/datasets/MicPie/unpredictable_cluster08)
* [UnpredicTable-cluster09](https://huggingface.co/datasets/MicPie/unpredictable_cluster09)
* [UnpredicTable-cluster10](https://huggingface.co/datasets/MicPie/unpredictable_cluster10)
* [UnpredicTable-cluster11](https://huggingface.co/datasets/MicPie/unpredictable_cluster11)
* [UnpredicTable-cluster12](https://huggingface.co/datasets/MicPie/unpredictable_cluster12)
* [UnpredicTable-cluster13](https://huggingface.co/datasets/MicPie/unpredictable_cluster13)
* [UnpredicTable-cluster14](https://huggingface.co/datasets/MicPie/unpredictable_cluster14)
* [UnpredicTable-cluster15](https://huggingface.co/datasets/MicPie/unpredictable_cluster15)
* [UnpredicTable-cluster16](https://huggingface.co/datasets/MicPie/unpredictable_cluster16)
* [UnpredicTable-cluster17](https://huggingface.co/datasets/MicPie/unpredictable_cluster17)
* [UnpredicTable-cluster18](https://huggingface.co/datasets/MicPie/unpredictable_cluster18)
* [UnpredicTable-cluster19](https://huggingface.co/datasets/MicPie/unpredictable_cluster19)
* [UnpredicTable-cluster20](https://huggingface.co/datasets/MicPie/unpredictable_cluster20)
* [UnpredicTable-cluster21](https://huggingface.co/datasets/MicPie/unpredictable_cluster21)
* [UnpredicTable-cluster22](https://huggingface.co/datasets/MicPie/unpredictable_cluster22)
* [UnpredicTable-cluster23](https://huggingface.co/datasets/MicPie/unpredictable_cluster23)
* [UnpredicTable-cluster24](https://huggingface.co/datasets/MicPie/unpredictable_cluster24)
* [UnpredicTable-cluster25](https://huggingface.co/datasets/MicPie/unpredictable_cluster25)
* [UnpredicTable-cluster26](https://huggingface.co/datasets/MicPie/unpredictable_cluster26)
* [UnpredicTable-cluster27](https://huggingface.co/datasets/MicPie/unpredictable_cluster27)
* [UnpredicTable-cluster28](https://huggingface.co/datasets/MicPie/unpredictable_cluster28)
* [UnpredicTable-cluster29](https://huggingface.co/datasets/MicPie/unpredictable_cluster29)
* [UnpredicTable-cluster-noise](https://huggingface.co/datasets/MicPie/unpredictable_cluster-noise)
### Supported Tasks and Leaderboards
Since the tables come from the web, the distribution of tasks and topics is very broad. Our dataset is very wide: it contains thousands of tasks, each with only a few examples, in contrast to most current NLP datasets, which are very deep, i.e., tens of tasks with many examples each. This implies that our dataset covers a broad range of potential tasks, e.g., multiple-choice, question-answering, table-question-answering, text-classification, etc.
The intended use of this dataset is to improve few-shot performance by fine-tuning/pre-training on our dataset.
### Languages
English
## Dataset Structure
### Data Instances
Each task is represented as a JSON Lines file and consists of several few-shot examples. Each example is a dictionary with a 'task' field identifying the task, followed by 'input', 'options', and 'output' fields. The 'input' field contains several column elements from the same table row, while the 'output' field is the target, representing an individual column of that row. Each task contains several such examples, which can be concatenated into a few-shot task. For multiple-choice classification, the 'options' field contains the possible classes a model must choose from.
There are also additional meta-data fields such as 'pageTitle', 'title', 'outputColName', 'url', 'wdcFile'.
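To illustrate, here is a minimal sketch of how examples with the field layout described above can be concatenated into a few-shot prompt. The example data and the prompt template are assumptions for illustration only, not the exact format used in the paper:

```python
# Minimal sketch: concatenating UnpredicTable-style examples into a
# few-shot prompt. The exact template used in the paper may differ;
# this only illustrates the 'input'/'options'/'output' field layout.
examples = [
    {"task": "demo", "input": "Country: France | Capital: ?",
     "options": ["Paris", "Rome"], "output": "Paris"},
    {"task": "demo", "input": "Country: Italy | Capital: ?",
     "options": ["Paris", "Rome"], "output": "Rome"},
]

def build_prompt(solved, query):
    """Concatenate solved examples, then append the unsolved query."""
    parts = [f"Input: {ex['input']}\nOutput: {ex['output']}" for ex in solved]
    parts.append(f"Input: {query}\nOutput:")
    return "\n\n".join(parts)

prompt = build_prompt(examples, "Country: Spain | Capital: ?")
print(prompt)
```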
### Data Fields
'task': task identifier
'input': column elements of a specific row in the table.
'options': for multiple choice classification, it provides the options to choose from.
'output': target column element of the same row as input.
'pageTitle': the title of the page containing the table.
'outputColName': output column name
'url': url to the website containing the table
'wdcFile': WDC Web Table Corpus file
### Data Splits
The UnpredicTable datasets do not come with additional data splits.
## Dataset Creation
### Curation Rationale
Few-shot training on multi-task datasets has been demonstrated to improve language models' few-shot learning (FSL) performance on new tasks, but it is unclear which training tasks lead to effective downstream task adaptation. Few-shot learning datasets are typically produced with expensive human curation, limiting the scale and diversity of the training tasks available to study. As an alternative source of few-shot data, we automatically extract 413,299 tasks from diverse internet tables. We provide this as a research resource to investigate the relationship between training data and few-shot learning.
### Source Data
#### Initial Data Collection and Normalization
We use internet tables from the English-language Relational Subset of the WDC Web Table Corpus 2015 (WTC). The WTC dataset tables were extracted from the July 2015 Common Crawl web corpus (http://webdatacommons.org/webtables/2015/EnglishStatistics.html). The dataset contains 50,820,165 tables from 323,160 web domains. We then convert the tables into few-shot learning tasks. Please see our publication for more details on the data collection and conversion pipeline.
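As a rough illustration of the tables-to-tasks idea, one column of a table is chosen as the output and the remaining columns of each row form the input. The following is a simplified sketch under that reading; the actual pipeline, including its filtering and quality heuristics, is described in the publication:

```python
# Simplified sketch of converting a web table into a few-shot task:
# pick one column as the output; the remaining cells of each row become
# the input. The real pipeline adds filtering and quality heuristics.
def table_to_task(header, rows, output_col):
    out_idx = header.index(output_col)
    task = []
    for row in rows:
        inputs = [f"{h}: {v}" for i, (h, v) in enumerate(zip(header, row))
                  if i != out_idx]
        task.append({
            "input": " | ".join(inputs),
            "output": row[out_idx],
            # candidate classes: all values seen in the output column
            "options": sorted({r[out_idx] for r in rows}),
        })
    return task

header = ["Phone", "OS", "Year"]
rows = [["Pixel 4", "Android", "2019"], ["iPhone 11", "iOS", "2019"]]
task = table_to_task(header, rows, output_col="OS")
print(task[0])
```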
#### Who are the source language producers?
The dataset is extracted from [WDC Web Table Corpora](http://webdatacommons.org/webtables/).
### Annotations
#### Annotation process
Manual annotation was only carried out for the [UnpredicTable-rated-low](https://huggingface.co/datasets/MicPie/unpredictable_rated-low),
[UnpredicTable-rated-medium](https://huggingface.co/datasets/MicPie/unpredictable_rated-medium), and [UnpredicTable-rated-high](https://huggingface.co/datasets/MicPie/unpredictable_rated-high) data subsets to rate task quality. Detailed annotation instructions can be found in our publication.
#### Who are the annotators?
Annotations were carried out by a lab assistant.
### Personal and Sensitive Information
The data was extracted from [WDC Web Table Corpora](http://webdatacommons.org/webtables/), which in turn extracted tables from the [Common Crawl](https://commoncrawl.org/). We did not filter the data in any way. Thus any user identities or otherwise sensitive information (e.g., data that reveals racial or ethnic origins, sexual orientations, religious beliefs, political opinions or union memberships, or locations; financial or health data; biometric or genetic data; forms of government identification, such as social security numbers; criminal history, etc.) might be contained in our dataset.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset is intended for use as a research resource to investigate the relationship between training data and few-shot learning. As such, it contains high- and low-quality data, as well as diverse content that may be untruthful or inappropriate. Without careful investigation, it should not be used for training models that will be deployed for use in decision-critical or user-facing situations.
### Discussion of Biases
Since our dataset contains tables scraped from the web, it will also contain many toxic, racist, sexist, and otherwise harmful biases and texts. We have not run any analysis of the biases prevalent in our dataset, nor have we explicitly filtered the content. A model trained on our dataset may therefore reflect harmful biases and toxic text present in the data.
### Other Known Limitations
No additional known limitations.
## Additional Information
### Dataset Curators
Jun Shern Chan, Michael Pieler, Jonathan Jao, Jérémy Scheurer, Ethan Perez
### Licensing Information
Apache 2.0
### Citation Information
```
@misc{chan2022few,
author = {Chan, Jun Shern and Pieler, Michael and Jao, Jonathan and Scheurer, Jérémy and Perez, Ethan},
title = {Few-shot Adaptation Works with UnpredicTable Data},
publisher={arXiv},
year = {2022},
url = {https://arxiv.org/abs/2208.01009}
}
```
|
CyberHarem/bache_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of bache/バッチ/贝奇 (Azur Lane)
This is the dataset of bache/バッチ/贝奇 (Azur Lane), containing 361 images and their tags.
The core tags of this character are `blonde_hair, long_hair, purple_eyes, bangs, two_side_up, breasts, fang, hat, small_breasts, black_headwear, symbol-shaped_pupils`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 361 | 536.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bache_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800              | 361      | 271.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bache_azurlane/resolve/main/dataset-800.zip)                | IMG+TXT    | Dataset with the shorter side not exceeding 800 pixels.               |
| stage3-p480-800 | 979 | 651.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bache_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200             | 361      | 460.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bache_azurlane/resolve/main/dataset-1200.zip)               | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 979 | 978.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bache_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/bache_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be minable from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, fishnet_thighhighs, fur-trimmed_jacket, looking_at_viewer, micro_shorts, midriff, navel, simple_background, single_thighhigh, sleeveless, solo, white_background, yellow_jacket, bandaid_on_knee, bare_shoulders, belt, black_sailor_collar, off_shoulder, open_mouth, yellow_neckerchief, :3, black_shirt, loose_socks, pink_collar, crop_top, sailor_hat, denim_shorts, :d, blush, full_body, ok_sign |
| 1 | 8 |  |  |  |  |  | 1girl, bare_shoulders, black_sailor_collar, black_shirt, fur-trimmed_jacket, long_sleeves, looking_at_viewer, micro_shorts, midriff, off_shoulder, open_fly, open_jacket, sleeveless_shirt, solo, yellow_jacket, :d, blush, crop_top, navel, open_mouth, sailor_hat, yellow_neckerchief, pink_collar, simple_background, single_thighhigh, white_background, :3, black_shorts, brown_belt, collarbone, cowboy_shot, denim_shorts, fishnet_thighhighs, short_shorts, armpits, chain, cutoffs, hand_on_hip, hand_up, ok_sign, open_shorts, sparkle, pouch |
| 2 | 5 |  |  |  |  |  | 1girl, belt, denim_shorts, fur-trimmed_jacket, looking_at_viewer, micro_shorts, midriff, off_shoulder, open_mouth, sailor_hat, solo, yellow_jacket, bare_shoulders, black_sailor_collar, black_shirt, blush, chain, cowboy_shot, crop_top, fishnet_thighhighs, navel, ok_sign, open_clothes, open_fly, pink_collar, sleeveless, :3, :d, single_thighhigh, sparkle, white_background, yellow_neckerchief |
| 3 | 18 |  |  |  |  |  | 1girl, looking_at_viewer, solo, open_mouth, white_thighhighs, blush, denim_shorts, eyewear_on_head, micro_shorts, sunglasses, black_bikini, jacket, bikini_top_only, short_shorts, smile, simple_background, cutoffs, looking_back, tail, white_background, heart-shaped_pupils, jewelry, navel |
| 4 | 7 |  |  |  |  |  | 1girl, blush, looking_at_viewer, smile, :3, loli, navel, open_mouth, solo, barefoot, micro_bikini, simple_background, white_bikini, ass, cameltoe, collarbone, eyepatch_bikini, feet, full_body, grey_background, heart-shaped_pupils, spread_legs, toes, covered_nipples, lying, soles, untied |
| 5 | 15 |  |  |  |  |  | 1girl, blush, single_thighhigh, solo, visor_cap, tennis_uniform, thigh_strap, white_thighhighs, bare_shoulders, looking_at_viewer, open_mouth, smile, detached_sleeves, clothing_cutout, very_long_hair, covered_navel, dress, twintails, panties, tennis_ball, tennis_racket |
| 6 | 11 |  |  |  |  |  | 1girl, rabbit_ears, solo, looking_at_viewer, open_mouth, playboy_bunny, fake_animal_ears, :3, bowtie, black_leotard, blush, detached_collar, pantyhose, smile, strapless_leotard, simple_background, rabbit_tail, white_background, wrist_cuffs, bare_shoulders, heart-shaped_pupils, jacket |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | fishnet_thighhighs | fur-trimmed_jacket | looking_at_viewer | micro_shorts | midriff | navel | simple_background | single_thighhigh | sleeveless | solo | white_background | yellow_jacket | bandaid_on_knee | bare_shoulders | belt | black_sailor_collar | off_shoulder | open_mouth | yellow_neckerchief | :3 | black_shirt | loose_socks | pink_collar | crop_top | sailor_hat | denim_shorts | :d | blush | full_body | ok_sign | long_sleeves | open_fly | open_jacket | sleeveless_shirt | black_shorts | brown_belt | collarbone | cowboy_shot | short_shorts | armpits | chain | cutoffs | hand_on_hip | hand_up | open_shorts | sparkle | pouch | open_clothes | white_thighhighs | eyewear_on_head | sunglasses | black_bikini | jacket | bikini_top_only | smile | looking_back | tail | heart-shaped_pupils | jewelry | loli | barefoot | micro_bikini | white_bikini | ass | cameltoe | eyepatch_bikini | feet | grey_background | spread_legs | toes | covered_nipples | lying | soles | untied | visor_cap | tennis_uniform | thigh_strap | detached_sleeves | clothing_cutout | very_long_hair | covered_navel | dress | twintails | panties | tennis_ball | tennis_racket | rabbit_ears | playboy_bunny | fake_animal_ears | bowtie | black_leotard | detached_collar | pantyhose | strapless_leotard | rabbit_tail | wrist_cuffs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------------|:---------------------|:--------------------|:---------------|:----------|:--------|:--------------------|:-------------------|:-------------|:-------|:-------------------|:----------------|:------------------|:-----------------|:-------|:----------------------|:---------------|:-------------|:---------------------|:-----|:--------------|:--------------|:--------------|:-----------|:-------------|:---------------|:-----|:--------|:------------|:----------|:---------------|:-----------|:--------------|:-------------------|:---------------|:-------------|:-------------|:--------------|:---------------|:----------|:--------|:----------|:--------------|:----------|:--------------|:----------|:--------|:---------------|:-------------------|:------------------|:-------------|:---------------|:---------|:------------------|:--------|:---------------|:-------|:----------------------|:----------|:-------|:-----------|:---------------|:---------------|:------|:-----------|:------------------|:-------|:------------------|:--------------|:-------|:------------------|:--------|:--------|:---------|:------------|:-----------------|:--------------|:-------------------|:------------------|:-----------------|:----------------|:--------|:------------|:----------|:--------------|:----------------|:--------------|:----------------|:-------------------|:---------|:----------------|:------------------|:------------|:--------------------|:--------------|:--------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | X | X | X | | X | | X | X | X | X | X | X | | X | X | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | | X | X | X | X | X | | X | X | X | X | X | X | X | X | | X | X | X | X | X | X | | X | | X | | | | | | X | | | X | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 18 |  |  |  |  |  | X | | | X | X | | X | X | | | X | X | | | | | | | X | | | | | | | | X | | X | | | | | | | | | | | X | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | | | X | | | X | X | | | X | | | | | | | | X | | X | | | | | | | | X | X | | | | | | | | X | | | | | | | | | | | | | | | | | | X | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 15 |  |  |  |  |  | X | | | X | | | | | X | | X | | | | X | | | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 6 | 11 |  |  |  |  |  | X | | | X | | | | X | | | X | X | | | X | | | | X | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_MaziyarPanahi__TheTop-5x7B-Instruct-S3-v0.1 | ---
pretty_name: Evaluation run of MaziyarPanahi/TheTop-5x7B-Instruct-S3-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MaziyarPanahi/TheTop-5x7B-Instruct-S3-v0.1](https://huggingface.co/MaziyarPanahi/TheTop-5x7B-Instruct-S3-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MaziyarPanahi__TheTop-5x7B-Instruct-S3-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-18T23:12:31.708653](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__TheTop-5x7B-Instruct-S3-v0.1/blob/main/results_2024-02-18T23-12-31.708653.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6571641282160704,\n\
\ \"acc_stderr\": 0.031918970852064334,\n \"acc_norm\": 0.6561506230894164,\n\
\ \"acc_norm_stderr\": 0.03258982989656136,\n \"mc1\": 0.4834761321909425,\n\
\ \"mc1_stderr\": 0.017493940190057723,\n \"mc2\": 0.6447306680251751,\n\
\ \"mc2_stderr\": 0.015519245883344577\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.689419795221843,\n \"acc_stderr\": 0.01352229209805306,\n\
\ \"acc_norm\": 0.7090443686006825,\n \"acc_norm_stderr\": 0.013273077865907595\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7168890659231228,\n\
\ \"acc_stderr\": 0.004495891440519419,\n \"acc_norm\": 0.8800039832702649,\n\
\ \"acc_norm_stderr\": 0.0032429275808698544\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4312169312169312,\n \"acc_stderr\": 0.02550648169813821,\n \"\
acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.02550648169813821\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163224,\n \"\
acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163224\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.024509803921568603,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.024509803921568603\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.02158649400128137,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.02158649400128137\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n\
\ \"acc_stderr\": 0.013468201614066307,\n \"acc_norm\": 0.8288633461047255,\n\
\ \"acc_norm_stderr\": 0.013468201614066307\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4480446927374302,\n\
\ \"acc_stderr\": 0.016631976628930595,\n \"acc_norm\": 0.4480446927374302,\n\
\ \"acc_norm_stderr\": 0.016631976628930595\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242553,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242553\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042107,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042107\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4791395045632334,\n\
\ \"acc_stderr\": 0.012759117066518015,\n \"acc_norm\": 0.4791395045632334,\n\
\ \"acc_norm_stderr\": 0.012759117066518015\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.02767846864214472,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.02767846864214472\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6862745098039216,\n \"acc_stderr\": 0.018771683893528176,\n \
\ \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.018771683893528176\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784603,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784603\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4834761321909425,\n\
\ \"mc1_stderr\": 0.017493940190057723,\n \"mc2\": 0.6447306680251751,\n\
\ \"mc2_stderr\": 0.015519245883344577\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8366219415943172,\n \"acc_stderr\": 0.010390695970273764\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7202426080363912,\n \
\ \"acc_stderr\": 0.012364384016735319\n }\n}\n```"
repo_url: https://huggingface.co/MaziyarPanahi/TheTop-5x7B-Instruct-S3-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|arc:challenge|25_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|gsm8k|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hellaswag|10_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T23-12-31.708653.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T23-12-31.708653.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- '**/details_harness|winogrande|5_2024-02-18T23-12-31.708653.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-18T23-12-31.708653.parquet'
- config_name: results
data_files:
- split: 2024_02_18T23_12_31.708653
path:
- results_2024-02-18T23-12-31.708653.parquet
- split: latest
path:
- results_2024-02-18T23-12-31.708653.parquet
---
# Dataset Card for Evaluation run of MaziyarPanahi/TheTop-5x7B-Instruct-S3-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [MaziyarPanahi/TheTop-5x7B-Instruct-S3-v0.1](https://huggingface.co/MaziyarPanahi/TheTop-5x7B-Instruct-S3-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MaziyarPanahi__TheTop-5x7B-Instruct-S3-v0.1",
"harness_winogrande_5",
split="train")
```
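The per-task config names listed above follow a fixed pattern (`harness_hendrycksTest_<task>_<n_shot>` for the MMLU subtasks), so they can be built programmatically. A minimal sketch (the helper name is our own, not part of any library):

```python
# Sketch: build the config name for an MMLU subtask, following the
# harness_hendrycksTest_<task>_<n_shot> naming scheme used in this card.
def mmlu_config_name(task: str, n_shot: int = 5) -> str:
    return f"harness_hendrycksTest_{task}_{n_shot}"

print(mmlu_config_name("world_religions"))  # harness_hendrycksTest_world_religions_5
```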
## Latest results
These are the [latest results from run 2024-02-18T23:12:31.708653](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__TheTop-5x7B-Instruct-S3-v0.1/blob/main/results_2024-02-18T23-12-31.708653.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6571641282160704,
"acc_stderr": 0.031918970852064334,
"acc_norm": 0.6561506230894164,
"acc_norm_stderr": 0.03258982989656136,
"mc1": 0.4834761321909425,
"mc1_stderr": 0.017493940190057723,
"mc2": 0.6447306680251751,
"mc2_stderr": 0.015519245883344577
},
"harness|arc:challenge|25": {
"acc": 0.689419795221843,
"acc_stderr": 0.01352229209805306,
"acc_norm": 0.7090443686006825,
"acc_norm_stderr": 0.013273077865907595
},
"harness|hellaswag|10": {
"acc": 0.7168890659231228,
"acc_stderr": 0.004495891440519419,
"acc_norm": 0.8800039832702649,
"acc_norm_stderr": 0.0032429275808698544
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.02550648169813821,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.02550648169813821
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163224,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163224
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.024509803921568603,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.024509803921568603
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128137,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.013468201614066307,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.013468201614066307
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4480446927374302,
"acc_stderr": 0.016631976628930595,
"acc_norm": 0.4480446927374302,
"acc_norm_stderr": 0.016631976628930595
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242553,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242553
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042107,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042107
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4791395045632334,
"acc_stderr": 0.012759117066518015,
"acc_norm": 0.4791395045632334,
"acc_norm_stderr": 0.012759117066518015
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.02767846864214472,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.02767846864214472
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.018771683893528176,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.018771683893528176
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784603,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4834761321909425,
"mc1_stderr": 0.017493940190057723,
"mc2": 0.6447306680251751,
"mc2_stderr": 0.015519245883344577
},
"harness|winogrande|5": {
"acc": 0.8366219415943172,
"acc_stderr": 0.010390695970273764
},
"harness|gsm8k|5": {
"acc": 0.7202426080363912,
"acc_stderr": 0.012364384016735319
}
}
```
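As a quick sanity check, averages like the aggregate `all` accuracy can be recomputed from the per-task entries. For example, over an illustrative two-task subset of the figures above (the real aggregate averages over every task, not just these two):

```python
# Sketch: average the "acc" values over a tiny, illustrative subset of
# the per-task results shown above.
subset = {
    "harness|hendrycksTest-virology|5": 0.5481927710843374,
    "harness|hendrycksTest-world_religions|5": 0.8362573099415205,
}
mean_acc = sum(subset.values()) / len(subset)
print(round(mean_acc, 4))  # 0.6922
```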
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
EarthnDusk/FFXIV_Data_and_Lora | ---
license: creativeml-openrail-m
task_categories:
- text-to-image
language:
- en
tags:
- ffxiv
- video game
- mmorpg
- stable diffusion
pretty_name: Final Fantasy XIV Miqote and More Data + Lora
size_categories:
- 1K<n<10K
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
At0x/AIUniverse | ---
license: creativeml-openrail-m
---
|
Satish678/req2case | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 36287
num_examples: 158
download_size: 12320
dataset_size: 36287
---
# Dataset Card for "req2case"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alvarobartt/arc-c-okapi-eval-es | ---
language:
- es
license: cc-by-sa-4.0
size_categories:
- n<1K
- 1K<n<10K
task_categories:
- multiple-choice
- question-answering
task_ids:
- multiple-choice-qa
- open-domain-qa
tags:
- chatgpt-translated
dataset_info:
features:
- name: id
dtype: string
- name: en_question
dtype: string
- name: es_question
dtype: string
- name: en_choices
struct:
- name: label
sequence: string
- name: text
sequence: string
- name: es_choices
struct:
- name: label
sequence: string
- name: text
sequence: string
- name: en_answerKey
dtype: string
- name: es_answerKey
dtype: string
splits:
- name: train
num_bytes: 721053
num_examples: 1118
- name: validation
num_bytes: 199156
num_examples: 297
- name: test
num_bytes: 774487
num_examples: 1170
download_size: 919075
dataset_size: 1694696
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
# ARC-Challenge translated to Spanish
This dataset was generated by the Natural Language Processing Group of the University of Oregon, who took the
original ARC-Challenge dataset in English and translated it into different languages using ChatGPT.
This dataset contains only the Spanish translation; the other languages are covered by the original
subsets posted by the University of Oregon at http://nlp.uoregon.edu/download/okapi-eval/datasets/.
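Going by the features declared in the metadata above, each row pairs the English original with its Spanish translation. The record shape looks roughly like this (all field values here are made up for illustration; only the keys come from the card's metadata):

```python
# Illustrative record shape for this dataset, based on the features
# listed in the metadata above (values are hypothetical).
record = {
    "id": "...",
    "en_question": "Which gas do plants absorb from the air?",
    "es_question": "¿Qué gas absorben las plantas del aire?",
    "en_choices": {"label": ["A", "B", "C", "D"],
                   "text": ["Oxygen", "Carbon dioxide", "Nitrogen", "Helium"]},
    "es_choices": {"label": ["A", "B", "C", "D"],
                   "text": ["Oxígeno", "Dióxido de carbono", "Nitrógeno", "Helio"]},
    "en_answerKey": "B",
    "es_answerKey": "B",
}
print(sorted(record))
```

Note that the answer key is shared across languages, so `en_answerKey` and `es_answerKey` should agree for well-formed rows.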
## Disclaimer
All the credits for this dataset go to the original authors of ARC-Challenge (licensed as CC BY SA 4.0), and to the authors of
this translation via ChatGPT (licensed as CC BY NC 4.0, allowing only non-commercial use).
## References
* [Think you have Solved Question Answering? Try ARC, the AI2 Reasoning Challenge](https://arxiv.org/abs/1803.05457)
* [Okapi: Instruction-tuned Large Language Models in Multiple Languages with Reinforcement Learning from Human Feedback](https://arxiv.org/abs/2307.16039) |
irds/mr-tydi_bn_test | ---
pretty_name: '`mr-tydi/bn/test`'
viewer: false
source_datasets: ['irds/mr-tydi_bn']
task_categories:
- text-retrieval
---
# Dataset Card for `mr-tydi/bn/test`
The `mr-tydi/bn/test` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/mr-tydi#mr-tydi/bn/test).
# Data
This dataset provides:
- `queries` (i.e., topics); count=111
- `qrels`: (relevance assessments); count=130
- For `docs`, use [`irds/mr-tydi_bn`](https://huggingface.co/datasets/irds/mr-tydi_bn)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/mr-tydi_bn_test', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/mr-tydi_bn_test', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Zhang2021MrTyDi,
title={{Mr. TyDi}: A Multi-lingual Benchmark for Dense Retrieval},
author={Xinyu Zhang and Xueguang Ma and Peng Shi and Jimmy Lin},
year={2021},
journal={arXiv:2108.08787},
}
@article{Clark2020TyDiQa,
title={{TyDi QA}: A Benchmark for Information-Seeking Question Answering in Typologically Diverse Languages},
author={Jonathan H. Clark and Eunsol Choi and Michael Collins and Dan Garrette and Tom Kwiatkowski and Vitaly Nikolaev and Jennimaria Palomaki},
year={2020},
journal={Transactions of the Association for Computational Linguistics}
}
```
|
bigcode/the-stack-march-sample-special-tokens-stripped | ---
dataset_info:
features:
- name: content
dtype: string
splits:
- name: train
num_bytes: 3034084423
num_examples: 746856
download_size: 1107347598
dataset_size: 3034084423
---
# Dataset Card for "the-stack-march-sample-special-tokens-stripped"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HuggingFaceTB/cosmopedia | ---
dataset_info:
- config_name: auto_math_text
features:
- name: prompt
dtype: string
- name: text_token_length
dtype: int64
- name: text
dtype: string
- name: seed_data
dtype: string
- name: format
dtype: string
- name: audience
dtype: string
splits:
- name: train
num_bytes: 8777587297.907892
num_examples: 1949895
download_size: 4461401898
dataset_size: 8777587297.907892
- config_name: khanacademy
features:
- name: prompt
dtype: string
- name: text_token_length
dtype: int64
- name: text
dtype: string
- name: seed_data
dtype: string
- name: format
dtype: string
- name: audience
dtype: string
splits:
- name: train
num_bytes: 108591354.09210858
num_examples: 24123
download_size: 49139761
dataset_size: 108591354.09210858
- config_name: openstax
features:
- name: text_token_length
dtype: int64
- name: prompt
dtype: string
- name: text
dtype: string
- name: seed_data
dtype: string
- name: format
dtype: string
- name: audience
dtype: string
splits:
- name: train
num_bytes: 667837450
num_examples: 126332
download_size: 346992522
dataset_size: 667837450
- config_name: stanford
features:
- name: text_token_length
dtype: int64
- name: prompt
dtype: string
- name: text
dtype: string
- name: seed_data
dtype: string
- name: format
dtype: string
- name: audience
dtype: string
splits:
- name: train
num_bytes: 6341291506
num_examples: 1020024
download_size: 3302284560
dataset_size: 6341291506
- config_name: stories
features:
- name: text
dtype: string
- name: prompt
dtype: string
- name: text_token_length
dtype: int64
- name: seed_data
dtype: string
- name: format
dtype: string
- name: audience
dtype: string
splits:
- name: train
num_bytes: 21314739648
num_examples: 4992964
download_size: 11902294709
dataset_size: 21314739648
- config_name: web_samples_v1
features:
- name: text_token_length
dtype: int64
- name: prompt
dtype: string
- name: text
dtype: string
- name: seed_data
dtype: string
- name: format
dtype: string
- name: audience
dtype: string
splits:
- name: train
num_bytes: 69075726295
num_examples: 12426348
download_size: 38978124936
dataset_size: 69075726295
- config_name: web_samples_v2
features:
- name: text_token_length
dtype: int64
- name: prompt
dtype: string
- name: text
dtype: string
- name: seed_data
dtype: string
- name: format
dtype: string
- name: audience
dtype: string
splits:
- name: train
num_bytes: 58711802939
num_examples: 10345867
download_size: 32658254617
dataset_size: 58711802939
- config_name: wikihow
features:
- name: text_token_length
dtype: int64
- name: prompt
dtype: string
- name: text
dtype: string
- name: seed_data
dtype: string
- name: format
dtype: string
- name: audience
dtype: string
splits:
- name: train
num_bytes: 892720528
num_examples: 179191
download_size: 502284600
dataset_size: 892720528
configs:
- config_name: auto_math_text
data_files:
- split: train
path: data/auto_math_text/train-*
- config_name: khanacademy
data_files:
- split: train
path: data/khanacademy/train-*
- config_name: openstax
data_files:
- split: train
path: data/openstax/train-*
- config_name: stanford
data_files:
- split: train
path: data/stanford/train-*
- config_name: stories
data_files:
- split: train
path: data/stories/train-*
- config_name: web_samples_v1
data_files:
- split: train
path: data/web_samples_v1/train-*
- config_name: web_samples_v2
data_files:
- split: train
path: data/web_samples_v2/train-*
- config_name: wikihow
data_files:
- split: train
path: data/wikihow/train-*
license: apache-2.0
language:
- en
tags:
- synthetic
---
# Cosmopedia v0.1
<center>
<img src="https://cdn-uploads.huggingface.co/production/uploads/61c141342aac764ce1654e43/8a9ZTW8sC4utjEPIrZegN.png" alt="Cosmopedia v0.1" width="600" height="300">
<p><em>Image generated by DALL-E, the <a href="https://huggingface.co/datasets/HuggingFaceTB/miscellaneous/blob/main/cosmopedia_dalle_prompt_by_mixtral.txt">prompt</a> was generated by Mixtral-8x7B-Instruct-v0.1</em></p>
</center>
```
User: What do you think "Cosmopedia" could mean? Hint: in our case it's not related to cosmology.
Mixtral-8x7B-Instruct-v0.1: A possible meaning for "Cosmopedia" could be an encyclopedia or collection of information about
different cultures, societies, and topics from around the world, emphasizing diversity and global connectedness.
```
**Cosmopedia** is a dataset of synthetic textbooks, blogposts, stories, posts and WikiHow articles generated by [Mixtral-8x7B-Instruct-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1). The dataset contains over **30 million files** and **25 billion tokens**, making it the largest open synthetic dataset to date.
It covers a variety of topics; we tried to map the world knowledge present in web datasets like [RefinedWeb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb) and [RedPajama](https://huggingface.co/datasets/togethercomputer/RedPajama-Data-1T), and to generate synthetic content that covers it. This is v0.1 of Cosmopedia, with ample room for improvement and for topics to be covered more comprehensively. We hope this dataset will help the community's research efforts in the increasingly intriguing domain of synthetic data. You can find a clickable map by Nomic at [https://atlas.nomic.ai/map/cosmopedia](https://atlas.nomic.ai/map/cosmopedia).
This work is inspired by the great work of [Phi1.5](https://huggingface.co/papers/2309.05463).
# TL;DR
This is a synthetic dataset of 30M samples generated by [Mixtral-8x7B-Instruct-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1). It contains 8 splits, depending on the source of the seed samples used in the prompts; the model is asked to generate content related to them. The splits range from web samples to educational resources like Stanford, OpenStax and KhanAcademy; we also use some instruction-tuning datasets as seed samples for the stories.
Here's how you can load a dataset split:
```python
from datasets import load_dataset
ds = load_dataset("HuggingFaceTB/cosmopedia", "stories", split="train", num_proc=12)
ds[0]
```
If you want a smaller subset of the dataset, check out [Cosmopedia-100k](https://huggingface.co/datasets/HuggingFaceTB/cosmopedia-100k). We also trained a 1.8B model on Cosmopedia: [Cosmo-1B](https://huggingface.co/HuggingFaceTB/cosmopedian-1b).
# Dataset splits
The prompts are all based on the concept of using a seed sample (for example, an extract from a web page) and asking the model to generate new content (a textbook, a story, a blogpost...) related to that seed sample.
The dataset consists of 8 splits, depending on the source of the seed data used in each split. Some seed samples appear more than once, when we ask for a different style (e.g. an academic textbook vs. a blogpost) or audience (e.g. young children vs. college students). For example, each sample in `stanford` was used with 4 different prompt styles and audiences; see the `format` and `audience` columns for more details.
We observed that tailoring the audience and prompt style accordingly significantly enhances diversity; the proportion of duplicates eliminated via MinHash was under 1%.
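This style/audience fan-out can be sketched as a simple prompt-template grid. Below is a minimal illustration; the template wording and the `STYLES`/`AUDIENCES` values are hypothetical placeholders, not the actual Cosmopedia prompts (which live in the project's GitHub repository):

```python
# Hypothetical sketch of the style x audience prompt fan-out.
# The real Cosmopedia templates differ; this only shows the combinatorics.
STYLES = ["textbook", "blogpost"]
AUDIENCES = ["young children", "college students"]

def build_prompts(seed_extract: str) -> list[str]:
    """Build one prompt per (style, audience) pair for a single seed sample."""
    return [
        f"Write a {style} aimed at {audience}, related to this extract:\n{seed_extract}"
        for style in STYLES
        for audience in AUDIENCES
    ]
```

With 2 styles and 2 audiences, each seed sample yields 4 prompts, mirroring the 4 prompt variants used for the `stanford` split.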
The graph below shows the distribution of seed datasets, generations formats and audiences in Cosmopedia:
<center>
<img src="https://cdn-uploads.huggingface.co/production/uploads/61c141342aac764ce1654e43/V7MGV2OrCfLO5TxKPUXs4.png" alt="distributions" width="1000" height="500">
</center>
Below are the 8 splits:
- `web_samples_v1`: this and `web_samples_v2` are the largest splits (together they make up ~75% of the dataset); we use samples from an internal web dataset similar to [RefinedWeb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb). These samples were selected based on their topic, using a clustering method explained in the section below.
- `web_samples_v2`: similar to `web_samples_v1`, but using different samples. We call it v2 because we refined the prompts for this split (e.g. asking for more depth over breadth in the concept explanations, and requesting that the model not generate a title or introductory sentences, which might be redundant across samples).
- `stanford`: we scraped course outlines from [stanford.edu](https://explorecourses.stanford.edu/search?q=all%20courses), and each time we prompt the model with one of the course units.
- `stories`: we generated stories to add a commonsense and day-to-day knowledge aspect to the dataset. For this split we use samples from [UltraChat](https://huggingface.co/datasets/stingning/ultrachat) (only the questions-about-the-world [subset](https://huggingface.co/datasets/loubnabnl/ultrachat_questions_about_world)) and [OpenHermes2.5](https://huggingface.co/datasets/teknium/OpenHermes-2.5). These are synthetic instruction-tuning datasets that are already curated and cover a wide range of topics.
- `wikihow`: in this split, we asked the model to generate WikiHow articles from WikiHow titles that we scraped; the list is available [here](https://github.com/huggingface/cosmopedia/blob/main/prompts/wikihow/wikihowcom-20231012-titles.txt). Note that you can find more WikiHow articles in the other splits by filtering on the `format` column.
- `openstax`: we scraped course outlines with unit introductions from [OpenStax](https://openstax.org/), a resource suggested by the [AFAIK](https://afaik.io/) team.
- `khanacademy`: we scraped the outlines of the courses on [KhanAcademy](https://www.khanacademy.org), and asked the model to generate a textbook for each.
- `auto_math_text`: to improve the model's science knowledge, we use samples from the [AutoMathText](https://huggingface.co/datasets/math-ai/AutoMathText/) dataset as seed samples. The dataset covers more than just math; see this clustering [plot](https://huggingface.co/datasets/HuggingFaceTB/miscellaneous/blob/main/AMT_plots/topics_distpng.png) we made.
### Dataset features
The dataset has the following features:
- `prompt`: the prompt we used to generate the content with Mixtral-8x7B-Instruct-v0.1.
- `text`: the synthetic generated content.
- `seed_data`: the prompts include some text from another dataset/an external source; `seed_data` is the name of that dataset (e.g. web, Stanford courses...).
- `text_token_length`: the number of tokens in `text`, computed using [Mistral-7B](https://huggingface.co/mistralai/Mistral-7B-v0.1)'s tokenizer.
- `format`: the style of `text`; this can for example be a textbook, a blogpost, a story... It can also be inferred from the prompt.
- `audience`: the target audience defined in the prompt.
# Dataset creation
The "Dataset splits" section already provides an overview of the data creation pipeline. In this section, we will explain the topic clustering method for web samples and our iterative process for refining the prompts, in addition to decontamination.
### Topic clustering
Our goal was to generate a vast quantity of synthetic data covering a wide range of topics (essentially, anything useful found on the web) in a cleaner format like textbooks. A natural strategy was to begin with web samples, using them as seeds for the generation.
This approach, employed by Li et al. in [Phi-1.5](https://huggingface.co/papers/2309.05463), appears to be the most scalable method for synthetic data generation, given the availability of web datasets with trillions of tokens.
The prompted model will use an extract from these seed samples as a reference for generation, so the topic might matter more than the actual content of the file. To filter out less relevant topics and to provide the model with context for generating content, we first clustered millions of files from a web dataset.
Then we prompted Mixtral 8x7B with extracts from 10 random samples in each cluster and asked it to find the topic they have in common and to provide an educational score for that topic. The dataset with clusters and topics can be inspected in this [demo](https://huggingface.co/spaces/HuggingFaceTB/inspect_web_clusters); the clustering code is available in [text-clustering](https://github.com/huggingface/text-clustering).
The educational score seems to work for "very uneducational" topics like adult content and "highly educational" topics like College Mathematics, but isn't very reliable in between. So we manually inspected the 145 clusters we found and discarded 35 of them. The final list of topics is available [here](https://github.com/huggingface/cosmopedia/blob/dd5cd1f7fcfae255c9cfbe704ba2187965523457/prompts/web_samples/filter_and_classify_clusters.py#L8).
We don't do any further filtering inside the clusters. We include the topic of the sample in the prompt 100% of the time for `web_samples_v1`, but only 50% of the time for `web_samples_v2`, where we tried to refine the prompts in case the topic isn't accurate or the topic list isn't comprehensive.
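The clustering step itself is a standard centroid method over document embeddings. As a self-contained stand-in for the actual embedding-space pipeline (which lives in the text-clustering repo), here is a minimal Lloyd's k-means with deterministic farthest-point initialization, on toy 2-D points:

```python
def kmeans(points, k, iters=20):
    """Minimal Lloyd's k-means with farthest-point initialization.
    `points` are (x, y) tuples standing in for document embeddings;
    this is an illustration, not the production clustering code."""
    def d2(p, c):
        # Squared Euclidean distance between a point and a center.
        return (p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2

    # Farthest-point init: deterministic and spreads the initial centers out.
    centers = [points[0]]
    while len(centers) < k:
        centers.append(max(points, key=lambda p: min(d2(p, c) for c in centers)))

    for _ in range(iters):
        # Assign each point to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda i: d2(p, centers[i]))].append(p)
        # Move each center to the mean of its cluster (keep it if the cluster is empty).
        centers = [
            (sum(x for x, _ in cl) / len(cl), sum(y for _, y in cl) / len(cl)) if cl else centers[i]
            for i, cl in enumerate(clusters)
        ]
    return centers, clusters
```

In the real pipeline the "points" are high-dimensional text embeddings of millions of web files, and each resulting cluster is then labeled and scored by Mixtral as described above.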
Below are the clusters found in Cosmopedia:
<center>
<img src="https://cdn-uploads.huggingface.co/production/uploads/61c141342aac764ce1654e43/jMKGaE_UnEfH3j8iZYXVN.png" alt="Cosmopedia clusters" width="1200" height="750">
<p><em>Cosmopedia clusters.</em></p>
</center>
### Diversity
We find that reusing the same seed sample while changing the generation style, target format, and/or audience results in different generations covering the same topic from different angles. For example, when asking the model for a children's textbook, we needed to remind it that it can't use complex concepts and that the tone should be adapted to children. The same goes for textbooks aimed at college students vs. researchers: we had to emphasize the level of depth we wanted for each, and how academic the textbooks should be.
By carefully iterating on the prompts using [HuggingChat](https://huggingface.co/chat/) and then generating a few hundred samples, we managed to reduce the redundancy. For example, we noticed that the model always started stories with "Once upon a time" and forum posts with "A few years back"; explicitly asking it to avoid these openings resulted in more diverse beginnings (don't worry, "Once upon a time" still appears in stories!). The same goes for blogposts and textbooks, where the introductory sentences were initially repetitive.
Running MinHash deduplication on the splits detects less than 1% of the files as duplicates.
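For reference, the MinHash idea behind this deduplication can be sketched in a few lines: hash word shingles under many seeded hash functions, keep the minimum per seed as a signature, and compare signatures; the fraction of matching positions estimates the Jaccard similarity of the shingle sets. This is a toy stdlib version for illustration, not the deduplication code actually used (which is in the Cosmopedia repo):

```python
import hashlib

def minhash_signature(text, num_perm=128, ngram=3):
    """Toy MinHash signature over word 3-grams (illustrative only)."""
    words = text.lower().split()
    shingles = {" ".join(words[i:i + ngram]) for i in range(max(1, len(words) - ngram + 1))}
    sig = []
    for seed in range(num_perm):
        # Seeded hash: keep the minimum hash value over all shingles.
        sig.append(min(
            int.from_bytes(hashlib.blake2b(f"{seed}:{s}".encode(), digest_size=8).digest(), "big")
            for s in shingles
        ))
    return sig

def estimated_jaccard(sig_a, sig_b):
    """Fraction of matching signature positions approximates Jaccard similarity."""
    return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)
```

Pairs whose estimated Jaccard similarity exceeds a chosen threshold are flagged as duplicates and dropped.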
### Decontamination
Given how we generate synthetic content, there is a possibility that the seed samples or the model's training data contain benchmark contamination. Therefore, we run a decontamination pipeline to make sure our dataset doesn't contain any samples from the test benchmarks.
We use a 10-gram overlap to retrieve potentially contaminated samples, similarly to [Phi-1](https://huggingface.co/papers/2306.11644).
After retrieving the candidates, we run a diff between the dataset sample and the benchmark sample using `difflib.SequenceMatcher` and discard the sample if `len(matched_substrings)/len(benchmark_sample) > 0.5`.
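Put together, the two steps above (10-gram candidate retrieval, then a `difflib.SequenceMatcher` coverage check) can be sketched as follows; this is an illustrative reimplementation, the actual code is in the Cosmopedia repository:

```python
import difflib

def ngrams(text, n=10):
    """Set of word n-grams of a text (lowercased)."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def is_contaminated(sample, benchmark_sample, n=10, threshold=0.5):
    """Flag a dataset sample as contaminated if it shares a 10-gram with the
    benchmark sample AND the matched substrings cover more than half of it."""
    # Cheap retrieval step: any shared word 10-gram?
    if not (ngrams(sample, n) & ngrams(benchmark_sample, n)):
        return False
    # Expensive confirmation step: character-level diff coverage.
    matcher = difflib.SequenceMatcher(None, sample, benchmark_sample, autojunk=False)
    matched = sum(block.size for block in matcher.get_matching_blocks())
    return matched / len(benchmark_sample) > threshold
```

In practice the retrieval step is run as an index lookup over all benchmark n-grams, and only the retrieved candidates go through the diff.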
We run decontamination against all the benchmarks we evaluated the Cosmo-1B model on: MMLU, HellaSwag, PIQA, SIQA, Winogrande, OpenBookQA, ARC-easy, ARC-challenge.
We report the number of contaminated samples removed from each dataset split, as well as the number of unique benchmark samples that they correspond to (in brackets):
| Dataset group | ARC Easy | ARC Challenge | BoolQ | HellaSwag | MMLU | OpenBookQA | PIQA | WinoGrande |
|-----------------------------------------------|----------|---------------|----------------|-----------|------|------------|------|------------|
| web_samples_v1 + web_samples_v2 + stanford + openstax | 30 (13) | 19 (3) | 386 (41) | 6 (5) | 1 (1) | 0 (0) | 5 (3) | 0 (0) |
| auto_math_text + khanacademy | 4 (4) | 13 (2) | 34 (7) | 1 (1) | 0 (0) | 0 (0) | 0 (0) | 0 (0) |
| stories | 33 (20) | 20 (12) | 27 (21) | 3 (3) | 1 (1) | 2 (2) | 6 (4) | 3 (2) |
## Code
The code for topic clustering of the web samples, building the prompts, content generation and data deduplication & decontamination can be found in the [Cosmopedia GitHub repository](https://github.com/huggingface/cosmopedia).
## Citation
```
@software{benallal2024cosmopedia,
author = {Ben Allal, Loubna and Lozhkov, Anton and Penedo, Guilherme and Wolf, Thomas and von Werra, Leandro},
title = {Cosmopedia},
  month        = feb,
year = 2024,
url = {https://huggingface.co/datasets/HuggingFaceTB/cosmopedia}
}
``` |
DavidVivancos/MindBigData2022_MNIST_IN | ---
license: odbl
---
|
Sntng/drone_view_augment_v2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 843832728.694
num_examples: 1503
- name: validation
num_bytes: 57308255.0
num_examples: 100
download_size: 166577777
dataset_size: 901140983.694
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
DominosXpizza/mistral_pokemon | ---
license: apache-2.0
---
|
kaleemWaheed/twitter_dataset_1713054033 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 28509
num_examples: 71
download_size: 15253
dataset_size: 28509
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_ehartford__samantha-1.1-llama-33b | ---
pretty_name: Evaluation run of ehartford/samantha-1.1-llama-33b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ehartford/samantha-1.1-llama-33b](https://huggingface.co/ehartford/samantha-1.1-llama-33b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ehartford__samantha-1.1-llama-33b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T11:42:44.859774](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__samantha-1.1-llama-33b/blob/main/results_2023-09-17T11-42-44.859774.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.20994127516778524,\n\
\ \"em_stderr\": 0.004170789326061049,\n \"f1\": 0.2829341442953027,\n\
\ \"f1_stderr\": 0.004181823285876536,\n \"acc\": 0.4024903466008606,\n\
\ \"acc_stderr\": 0.008664723950310687\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.20994127516778524,\n \"em_stderr\": 0.004170789326061049,\n\
\ \"f1\": 0.2829341442953027,\n \"f1_stderr\": 0.004181823285876536\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0401819560272934,\n \
\ \"acc_stderr\": 0.00540943973697051\n },\n \"harness|winogrande|5\":\
\ {\n \"acc\": 0.7647987371744278,\n \"acc_stderr\": 0.011920008163650865\n\
\ }\n}\n```"
repo_url: https://huggingface.co/ehartford/samantha-1.1-llama-33b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|arc:challenge|25_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T11_42_44.859774
path:
- '**/details_harness|drop|3_2023-09-17T11-42-44.859774.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T11-42-44.859774.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T11_42_44.859774
path:
- '**/details_harness|gsm8k|5_2023-09-17T11-42-44.859774.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T11-42-44.859774.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hellaswag|10_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T11_42_44.859774
path:
- '**/details_harness|winogrande|5_2023-09-17T11-42-44.859774.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T11-42-44.859774.parquet'
- config_name: results
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- results_2023-08-18T14:31:51.159426.parquet
- split: 2023_09_17T11_42_44.859774
path:
- results_2023-09-17T11-42-44.859774.parquet
- split: latest
path:
- results_2023-09-17T11-42-44.859774.parquet
---
# Dataset Card for Evaluation run of ehartford/samantha-1.1-llama-33b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ehartford/samantha-1.1-llama-33b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ehartford/samantha-1.1-llama-33b](https://huggingface.co/ehartford/samantha-1.1-llama-33b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ehartford__samantha-1.1-llama-33b",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-09-17T11:42:44.859774](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__samantha-1.1-llama-33b/blob/main/results_2023-09-17T11-42-44.859774.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.20994127516778524,
"em_stderr": 0.004170789326061049,
"f1": 0.2829341442953027,
"f1_stderr": 0.004181823285876536,
"acc": 0.4024903466008606,
"acc_stderr": 0.008664723950310687
},
"harness|drop|3": {
"em": 0.20994127516778524,
"em_stderr": 0.004170789326061049,
"f1": 0.2829341442953027,
"f1_stderr": 0.004181823285876536
},
"harness|gsm8k|5": {
"acc": 0.0401819560272934,
"acc_stderr": 0.00540943973697051
},
"harness|winogrande|5": {
"acc": 0.7647987371744278,
"acc_stderr": 0.011920008163650865
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/akanishi_erika_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of akanishi_erika/赤西瑛梨華 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of akanishi_erika/赤西瑛梨華 (THE iDOLM@STER: Cinderella Girls), containing 44 images and their tags.
The core tags of this character are `green_eyes, long_hair, braid, brown_hair, breasts, twin_braids, hair_ornament, large_breasts, hairclip`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 44 | 30.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akanishi_erika_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800              | 44       | 23.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akanishi_erika_idolmastercinderellagirls/resolve/main/dataset-800.zip)                 | IMG+TXT    | Dataset with the shorter side not exceeding 800 pixels.               |
| stage3-p480-800 | 86 | 43.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akanishi_erika_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200             | 44       | 28.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akanishi_erika_idolmastercinderellagirls/resolve/main/dataset-1200.zip)                | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 86 | 52.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akanishi_erika_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/akanishi_erika_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some recurring outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, open_mouth, smile, solo, looking_at_viewer, cleavage, black_hair, blush, hair_flower, sweat, white_background |
| 1 | 6 |  |  |  |  |  | 1girl, card_(medium), character_name, flower_(symbol), pink_background, smile, solo, open_mouth, skirt, bracelet |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | open_mouth | smile | solo | looking_at_viewer | cleavage | black_hair | blush | hair_flower | sweat | white_background | card_(medium) | character_name | flower_(symbol) | pink_background | skirt | bracelet |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:--------|:-------|:--------------------|:-----------|:-------------|:--------|:--------------|:--------|:-------------------|:----------------|:-----------------|:------------------|:------------------|:--------|:-----------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | | | | | | | | X | X | X | X | X | X |
|
open-llm-leaderboard/details_Severian__Nexus-IKM-Mistral-7B | ---
pretty_name: Evaluation run of Severian/Nexus-IKM-Mistral-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Severian/Nexus-IKM-Mistral-7B](https://huggingface.co/Severian/Nexus-IKM-Mistral-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Severian__Nexus-IKM-Mistral-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-04T19:40:09.114133](https://huggingface.co/datasets/open-llm-leaderboard/details_Severian__Nexus-IKM-Mistral-7B/blob/main/results_2024-03-04T19-40-09.114133.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2817231321365924,\n\
\ \"acc_stderr\": 0.032077202372413315,\n \"acc_norm\": 0.2844166087228834,\n\
\ \"acc_norm_stderr\": 0.03294909681534637,\n \"mc1\": 0.23745410036719705,\n\
\ \"mc1_stderr\": 0.014896277441041852,\n \"mc2\": NaN,\n \"\
mc2_stderr\": NaN\n },\n \"harness|arc:challenge|25\": {\n \"acc\"\
: 0.21843003412969283,\n \"acc_stderr\": 0.012074291605700962,\n \"\
acc_norm\": 0.29266211604095566,\n \"acc_norm_stderr\": 0.013295916103619411\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.26707827126070505,\n\
\ \"acc_stderr\": 0.004415293656599497,\n \"acc_norm\": 0.29107747460665206,\n\
\ \"acc_norm_stderr\": 0.004533307758521325\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n\
\ \"acc_stderr\": 0.03749850709174022,\n \"acc_norm\": 0.2518518518518518,\n\
\ \"acc_norm_stderr\": 0.03749850709174022\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.28289473684210525,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.28289473684210525,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.38,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2641509433962264,\n \"acc_stderr\": 0.02713429162874171,\n\
\ \"acc_norm\": 0.2641509433962264,\n \"acc_norm_stderr\": 0.02713429162874171\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.037455547914624576,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.037455547914624576\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.04605661864718381\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403326\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3468208092485549,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.3468208092485549,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207764,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207764\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.33191489361702126,\n \"acc_stderr\": 0.03078373675774563,\n\
\ \"acc_norm\": 0.33191489361702126,\n \"acc_norm_stderr\": 0.03078373675774563\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n\
\ \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643898,\n \"\
acc_norm\": 0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643898\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24516129032258063,\n\
\ \"acc_stderr\": 0.02447224384089553,\n \"acc_norm\": 0.24516129032258063,\n\
\ \"acc_norm_stderr\": 0.02447224384089553\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.32019704433497537,\n \"acc_stderr\": 0.032826493853041504,\n\
\ \"acc_norm\": 0.32019704433497537,\n \"acc_norm_stderr\": 0.032826493853041504\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.28484848484848485,\n \"acc_stderr\": 0.035243908445117836,\n\
\ \"acc_norm\": 0.28484848484848485,\n \"acc_norm_stderr\": 0.035243908445117836\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.24242424242424243,\n \"acc_stderr\": 0.03053289223393203,\n \"\
acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03053289223393203\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.33678756476683935,\n \"acc_stderr\": 0.03410780251836183,\n\
\ \"acc_norm\": 0.33678756476683935,\n \"acc_norm_stderr\": 0.03410780251836183\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.30256410256410254,\n \"acc_stderr\": 0.02329088805377274,\n\
\ \"acc_norm\": 0.30256410256410254,\n \"acc_norm_stderr\": 0.02329088805377274\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2222222222222222,\n \"acc_stderr\": 0.025348097468097845,\n \
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.025348097468097845\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.029344572500634342,\n\
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.029344572500634342\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2251655629139073,\n \"acc_stderr\": 0.03410435282008936,\n \"\
acc_norm\": 0.2251655629139073,\n \"acc_norm_stderr\": 0.03410435282008936\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.29541284403669726,\n \"acc_stderr\": 0.019560619182975997,\n \"\
acc_norm\": 0.29541284403669726,\n \"acc_norm_stderr\": 0.019560619182975997\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.24074074074074073,\n \"acc_stderr\": 0.029157522184605607,\n \"\
acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.029157522184605607\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.27450980392156865,\n \"acc_stderr\": 0.031321798030832904,\n \"\
acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.031321798030832904\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.30493273542600896,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.30493273542600896,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.32061068702290074,\n \"acc_stderr\": 0.040933292298342784,\n\
\ \"acc_norm\": 0.32061068702290074,\n \"acc_norm_stderr\": 0.040933292298342784\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.34710743801652894,\n \"acc_stderr\": 0.04345724570292535,\n \"\
acc_norm\": 0.34710743801652894,\n \"acc_norm_stderr\": 0.04345724570292535\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3425925925925926,\n\
\ \"acc_stderr\": 0.045879047413018105,\n \"acc_norm\": 0.3425925925925926,\n\
\ \"acc_norm_stderr\": 0.045879047413018105\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.26380368098159507,\n \"acc_stderr\": 0.03462419931615624,\n\
\ \"acc_norm\": 0.26380368098159507,\n \"acc_norm_stderr\": 0.03462419931615624\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.04246624336697624,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.04246624336697624\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2912621359223301,\n \"acc_stderr\": 0.04498676320572921,\n\
\ \"acc_norm\": 0.2912621359223301,\n \"acc_norm_stderr\": 0.04498676320572921\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3247863247863248,\n\
\ \"acc_stderr\": 0.03067902276549883,\n \"acc_norm\": 0.3247863247863248,\n\
\ \"acc_norm_stderr\": 0.03067902276549883\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2796934865900383,\n\
\ \"acc_stderr\": 0.016050792148036536,\n \"acc_norm\": 0.2796934865900383,\n\
\ \"acc_norm_stderr\": 0.016050792148036536\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.27167630057803466,\n \"acc_stderr\": 0.023948512905468348,\n\
\ \"acc_norm\": 0.27167630057803466,\n \"acc_norm_stderr\": 0.023948512905468348\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.19106145251396647,\n\
\ \"acc_stderr\": 0.013148479802450801,\n \"acc_norm\": 0.19106145251396647,\n\
\ \"acc_norm_stderr\": 0.013148479802450801\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2875816993464052,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.2875816993464052,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.26688102893890675,\n\
\ \"acc_stderr\": 0.025122637608816653,\n \"acc_norm\": 0.26688102893890675,\n\
\ \"acc_norm_stderr\": 0.025122637608816653\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2808641975308642,\n \"acc_stderr\": 0.02500646975579922,\n\
\ \"acc_norm\": 0.2808641975308642,\n \"acc_norm_stderr\": 0.02500646975579922\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2801418439716312,\n \"acc_stderr\": 0.02678917235114023,\n \
\ \"acc_norm\": 0.2801418439716312,\n \"acc_norm_stderr\": 0.02678917235114023\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24445893089960888,\n\
\ \"acc_stderr\": 0.010976425013113893,\n \"acc_norm\": 0.24445893089960888,\n\
\ \"acc_norm_stderr\": 0.010976425013113893\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.21323529411764705,\n \"acc_stderr\": 0.024880971512294268,\n\
\ \"acc_norm\": 0.21323529411764705,\n \"acc_norm_stderr\": 0.024880971512294268\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25326797385620914,\n \"acc_stderr\": 0.017593486895366835,\n \
\ \"acc_norm\": 0.25326797385620914,\n \"acc_norm_stderr\": 0.017593486895366835\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.35454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.35454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2897959183673469,\n \"acc_stderr\": 0.029043088683304328,\n\
\ \"acc_norm\": 0.2897959183673469,\n \"acc_norm_stderr\": 0.029043088683304328\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.31840796019900497,\n\
\ \"acc_stderr\": 0.03294118479054096,\n \"acc_norm\": 0.31840796019900497,\n\
\ \"acc_norm_stderr\": 0.03294118479054096\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3253012048192771,\n\
\ \"acc_stderr\": 0.03647168523683229,\n \"acc_norm\": 0.3253012048192771,\n\
\ \"acc_norm_stderr\": 0.03647168523683229\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.03565079670708311,\n\
\ \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.03565079670708311\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23745410036719705,\n\
\ \"mc1_stderr\": 0.014896277441041852,\n \"mc2\": NaN,\n \"\
mc2_stderr\": NaN\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5027624309392266,\n\
\ \"acc_stderr\": 0.014052271211616438\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Severian/Nexus-IKM-Mistral-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|arc:challenge|25_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|arc:challenge|25_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|gsm8k|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|gsm8k|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hellaswag|10_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hellaswag|10_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-04T19-39-31.628664.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-04T19-40-09.114133.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-04T19-40-09.114133.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- '**/details_harness|winogrande|5_2024-03-04T19-39-31.628664.parquet'
- split: 2024_03_04T19_40_09.114133
path:
- '**/details_harness|winogrande|5_2024-03-04T19-40-09.114133.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-04T19-40-09.114133.parquet'
- config_name: results
data_files:
- split: 2024_03_04T19_39_31.628664
path:
- results_2024-03-04T19-39-31.628664.parquet
- split: 2024_03_04T19_40_09.114133
path:
- results_2024-03-04T19-40-09.114133.parquet
- split: latest
path:
- results_2024-03-04T19-40-09.114133.parquet
---
# Dataset Card for Evaluation run of Severian/Nexus-IKM-Mistral-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Severian/Nexus-IKM-Mistral-7B](https://huggingface.co/Severian/Nexus-IKM-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Severian__Nexus-IKM-Mistral-7B",
"harness_winogrande_5",
                    split="latest")
```
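Because the splits are named after run timestamps (with underscores in place of colons), the most recent run can also be picked programmatically rather than relying on the `"latest"` alias. A minimal sketch, assuming the timestamp format shown in the split names above:

```python
from datetime import datetime

# Split names as they appear in this dataset's configurations
# (colons in the timestamps are replaced by underscores).
splits = ["2024_03_04T19_39_31.628664", "2024_03_04T19_40_09.114133"]

def latest_split(names):
    """Return the split name with the most recent timestamp."""
    return max(names, key=lambda s: datetime.strptime(s, "%Y_%m_%dT%H_%M_%S.%f"))

print(latest_split(splits))  # 2024_03_04T19_40_09.114133
```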
## Latest results
These are the [latest results from run 2024-03-04T19:40:09.114133](https://huggingface.co/datasets/open-llm-leaderboard/details_Severian__Nexus-IKM-Mistral-7B/blob/main/results_2024-03-04T19-40-09.114133.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.2817231321365924,
"acc_stderr": 0.032077202372413315,
"acc_norm": 0.2844166087228834,
"acc_norm_stderr": 0.03294909681534637,
"mc1": 0.23745410036719705,
"mc1_stderr": 0.014896277441041852,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|arc:challenge|25": {
"acc": 0.21843003412969283,
"acc_stderr": 0.012074291605700962,
"acc_norm": 0.29266211604095566,
"acc_norm_stderr": 0.013295916103619411
},
"harness|hellaswag|10": {
"acc": 0.26707827126070505,
"acc_stderr": 0.004415293656599497,
"acc_norm": 0.29107747460665206,
"acc_norm_stderr": 0.004533307758521325
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.03749850709174022,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.03749850709174022
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.28289473684210525,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.28289473684210525,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2641509433962264,
"acc_stderr": 0.02713429162874171,
"acc_norm": 0.2641509433962264,
"acc_norm_stderr": 0.02713429162874171
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.037455547914624576,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.037455547914624576
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3468208092485549,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.3468208092485549,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.04576665403207764,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.04576665403207764
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.33191489361702126,
"acc_stderr": 0.03078373675774563,
"acc_norm": 0.33191489361702126,
"acc_norm_stderr": 0.03078373675774563
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25132275132275134,
"acc_stderr": 0.022340482339643898,
"acc_norm": 0.25132275132275134,
"acc_norm_stderr": 0.022340482339643898
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24516129032258063,
"acc_stderr": 0.02447224384089553,
"acc_norm": 0.24516129032258063,
"acc_norm_stderr": 0.02447224384089553
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.32019704433497537,
"acc_stderr": 0.032826493853041504,
"acc_norm": 0.32019704433497537,
"acc_norm_stderr": 0.032826493853041504
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.28484848484848485,
"acc_stderr": 0.035243908445117836,
"acc_norm": 0.28484848484848485,
"acc_norm_stderr": 0.035243908445117836
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03053289223393203,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03053289223393203
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.33678756476683935,
"acc_stderr": 0.03410780251836183,
"acc_norm": 0.33678756476683935,
"acc_norm_stderr": 0.03410780251836183
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.30256410256410254,
"acc_stderr": 0.02329088805377274,
"acc_norm": 0.30256410256410254,
"acc_norm_stderr": 0.02329088805377274
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.025348097468097845,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.025348097468097845
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.029344572500634342,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.029344572500634342
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2251655629139073,
"acc_stderr": 0.03410435282008936,
"acc_norm": 0.2251655629139073,
"acc_norm_stderr": 0.03410435282008936
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.29541284403669726,
"acc_stderr": 0.019560619182975997,
"acc_norm": 0.29541284403669726,
"acc_norm_stderr": 0.019560619182975997
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.029157522184605607,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.029157522184605607
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.031321798030832904,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.031321798030832904
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.30493273542600896,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.30493273542600896,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.32061068702290074,
"acc_stderr": 0.040933292298342784,
"acc_norm": 0.32061068702290074,
"acc_norm_stderr": 0.040933292298342784
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.34710743801652894,
"acc_stderr": 0.04345724570292535,
"acc_norm": 0.34710743801652894,
"acc_norm_stderr": 0.04345724570292535
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3425925925925926,
"acc_stderr": 0.045879047413018105,
"acc_norm": 0.3425925925925926,
"acc_norm_stderr": 0.045879047413018105
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26380368098159507,
"acc_stderr": 0.03462419931615624,
"acc_norm": 0.26380368098159507,
"acc_norm_stderr": 0.03462419931615624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.04246624336697624,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.04246624336697624
},
"harness|hendrycksTest-management|5": {
"acc": 0.2912621359223301,
"acc_stderr": 0.04498676320572921,
"acc_norm": 0.2912621359223301,
"acc_norm_stderr": 0.04498676320572921
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3247863247863248,
"acc_stderr": 0.03067902276549883,
"acc_norm": 0.3247863247863248,
"acc_norm_stderr": 0.03067902276549883
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2796934865900383,
"acc_stderr": 0.016050792148036536,
"acc_norm": 0.2796934865900383,
"acc_norm_stderr": 0.016050792148036536
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.27167630057803466,
"acc_stderr": 0.023948512905468348,
"acc_norm": 0.27167630057803466,
"acc_norm_stderr": 0.023948512905468348
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.19106145251396647,
"acc_stderr": 0.013148479802450801,
"acc_norm": 0.19106145251396647,
"acc_norm_stderr": 0.013148479802450801
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2875816993464052,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.2875816993464052,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.26688102893890675,
"acc_stderr": 0.025122637608816653,
"acc_norm": 0.26688102893890675,
"acc_norm_stderr": 0.025122637608816653
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2808641975308642,
"acc_stderr": 0.02500646975579922,
"acc_norm": 0.2808641975308642,
"acc_norm_stderr": 0.02500646975579922
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2801418439716312,
"acc_stderr": 0.02678917235114023,
"acc_norm": 0.2801418439716312,
"acc_norm_stderr": 0.02678917235114023
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24445893089960888,
"acc_stderr": 0.010976425013113893,
"acc_norm": 0.24445893089960888,
"acc_norm_stderr": 0.010976425013113893
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.21323529411764705,
"acc_stderr": 0.024880971512294268,
"acc_norm": 0.21323529411764705,
"acc_norm_stderr": 0.024880971512294268
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25326797385620914,
"acc_stderr": 0.017593486895366835,
"acc_norm": 0.25326797385620914,
"acc_norm_stderr": 0.017593486895366835
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.35454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.35454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2897959183673469,
"acc_stderr": 0.029043088683304328,
"acc_norm": 0.2897959183673469,
"acc_norm_stderr": 0.029043088683304328
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.31840796019900497,
"acc_stderr": 0.03294118479054096,
"acc_norm": 0.31840796019900497,
"acc_norm_stderr": 0.03294118479054096
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3253012048192771,
"acc_stderr": 0.03647168523683229,
"acc_norm": 0.3253012048192771,
"acc_norm_stderr": 0.03647168523683229
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.03565079670708311,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.03565079670708311
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23745410036719705,
"mc1_stderr": 0.014896277441041852,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|winogrande|5": {
"acc": 0.5027624309392266,
"acc_stderr": 0.014052271211616438
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/aac4766c | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 188
num_examples: 10
download_size: 1336
dataset_size: 188
---
# Dataset Card for "aac4766c"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Thouph/experimental-dataset | ---
license: cc-by-nc-4.0
---
|
vumichien/preprocessed_jsut_jsss_css10 | ---
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float32
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 7003135912
num_examples: 18160
download_size: 7021090523
dataset_size: 7003135912
---
# Dataset Card for "preprocessed_jsut_jsss_css10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
clinicalnlplab/LitCovid_test | ---
dataset_info:
features:
- name: id
dtype: string
- name: query
dtype: string
- name: answer
dtype: string
- name: choices
sequence: string
- name: gold
sequence: int64
splits:
- name: train
num_bytes: 73641714
num_examples: 24960
- name: valid
num_bytes: 18488585
num_examples: 6239
- name: test
num_bytes: 7628379
num_examples: 2500
download_size: 33079636
dataset_size: 99758678
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
---
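The schema above pairs a `choices` list with `gold` indices. A minimal sketch of recovering the answer strings from a row (the example row is hypothetical, not drawn from the dataset):

```python
def gold_answers(row):
    """Map the gold indices into the choices list to recover answer text."""
    return [row["choices"][i] for i in row["gold"]]

# Hypothetical row following the schema above
# (choices: sequence of string, gold: sequence of int64).
row = {
    "choices": ["Treatment", "Diagnosis", "Prevention", "Mechanism"],
    "gold": [0, 2],
}
print(gold_answers(row))  # ['Treatment', 'Prevention']
```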
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_210 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1123132256.0
num_examples: 220568
download_size: 1148109270
dataset_size: 1123132256.0
---
# Dataset Card for "chunk_210"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dru-ac/ArBNTopic | ---
task_categories:
- text-classification
- zero-shot-classification
- text-generation
language:
- ar
size_categories:
- 10K<n<100K
---
This dataset was used to fine-tune the text classification model `ArGTClass`, available [here](https://huggingface.co/dru-ac/ArGTClass).
The dataset was compiled using samples from the following sources:
- the `SANAD` newspapers dataset, available [here](https://huggingface.co/datasets/arbml/SANAD)
- `ARTopicDS-Books`, available [here](example.com)
|
nhantruongcse/summary-vietnamese-news-token-TFeval_vit5_large_vietnews | ---
dataset_info:
features:
- name: Content
dtype: string
- name: Summary
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 61526294
num_examples: 8229
download_size: 27275716
dataset_size: 61526294
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pccl-org/formal-logic-simple-order-multi-token-dynamic-objects-paired-relationship-0-2000 | ---
dataset_info:
features:
- name: greater_than
sequence: int64
- name: less_than
sequence: int64
- name: paired_example
sequence:
sequence:
sequence: int64
- name: correct_example
sequence:
sequence: int64
- name: incorrect_example
sequence:
sequence: int64
- name: distance
dtype: int64
- name: index
dtype: int64
- name: index_in_distance
dtype: int64
splits:
- name: train
num_bytes: 247387816
num_examples: 873250
download_size: 85633818
dataset_size: 247387816
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
SJ-Donald/orca-dpo-pairs-ko | ---
license: apache-2.0
tags:
- orca-pairs
- mncai/orca_dpo_pairs_ko
- Ja-ck/Orca-DPO-Pairs-KO
- We-Want-GPU/Yi-Ko-DPO-Orca-DPO-Pairs
---
# SJ-Donald/orca-dpo-pairs-ko
SJ-Donald/orca-dpo-pairs-ko is a merged dataset built from the following datasets:
## Datasets
* [mncai/orca_dpo_pairs_ko](https://huggingface.co/datasets/mncai/orca_dpo_pairs_ko)
* [Ja-ck/Orca-DPO-Pairs-KO](https://huggingface.co/datasets/Ja-ck/Orca-DPO-Pairs-KO)
* [We-Want-GPU/Yi-Ko-DPO-Orca-DPO-Pairs](https://huggingface.co/datasets/We-Want-GPU/Yi-Ko-DPO-Orca-DPO-Pairs)
The datasets above are merged and duplicate rows are dropped.
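The merge-and-deduplicate step can be sketched in plain Python (a simplified sketch over the card's row schema, not the exact script used):

```python
def merge_and_dedupe(*datasets):
    """Concatenate row lists and keep only the first copy of each row."""
    seen = set()
    merged = []
    for rows in datasets:
        for row in rows:
            key = (row["system"], row["question"], row["chosen"], row["rejected"])
            if key not in seen:
                seen.add(key)
                merged.append(row)
    return merged

# Toy rows following the card's schema: system, question, chosen, rejected.
ds_a = [{"system": "", "question": "q1", "chosen": "a", "rejected": "b"}]
ds_b = [
    {"system": "", "question": "q1", "chosen": "a", "rejected": "b"},  # duplicate
    {"system": "", "question": "q2", "chosen": "c", "rejected": "d"},
]
print(len(merge_and_dedupe(ds_a, ds_b)))  # 2
```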
## How to use
```Python
from datasets import load_dataset
ds = load_dataset("SJ-Donald/orca-dpo-pairs-ko")
print(ds)
# DatasetDict({
#     train: Dataset({
#         features: ['system', 'question', 'chosen', 'rejected'],
#         num_rows: 36009
#     })
# })
``` |
MU-NLPC/Calc-ape210k_selftrain_experiment_balanced | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: question_chinese
dtype: string
- name: chain
dtype: string
- name: result
dtype: string
- name: result_float
dtype: float64
- name: equation
dtype: string
- name: model_checkpoint
dtype: string
- name: correct
dtype: string
- name: incorrect_1
dtype: string
splits:
- name: train
num_bytes: 55832831
num_examples: 48194
download_size: 23380890
dataset_size: 55832831
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Calc-ape210k_selftrain_experiment_melted"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
towhid/aesir-test420 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: test
num_bytes: 7240
num_examples: 17
download_size: 6311
dataset_size: 7240
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "aesir-test420"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
C-MTEB/OCNLI | ---
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
dataset_info:
features:
- name: sent1
sequence: string
- name: sent2
sequence: string
- name: labels
sequence: int64
splits:
- name: validation
num_bytes: 222873
num_examples: 1
download_size: 153558
dataset_size: 222873
---
# Dataset Card for "OCNLI"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-one-sec-cv12/chunk_115 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1459738916
num_examples: 286673
download_size: 1477815325
dataset_size: 1459738916
---
# Dataset Card for "chunk_115"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/burnet_pokemon | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of burnet (Pokémon)
This is the dataset of burnet (Pokémon), containing 69 images and their tags.
The core tags of this character are `white_hair, dark_skin, dark-skinned_female, breasts, yellow_eyes, long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 69 | 51.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/burnet_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 69 | 34.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/burnet_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 120 | 62.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/burnet_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 69 | 48.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/burnet_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 120 | 85.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/burnet_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/burnet_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, large_breasts, nipples, blush, hetero, navel, penis, pussy, 1boy, bar_censor, collarbone, looking_at_viewer, open_mouth, smile, solo_focus, bare_shoulders, female_pubic_hair, heart, shirt_lift, simple_background, tank_top, tongue_out, torn_clothes |
| 1 | 7 |  |  |  |  |  | 1girl, simple_background, grin, necklace, solo, closed_eyes, white_background, teeth, blush, sidelocks, upper_body |
| 2 | 10 |  |  |  |  |  | 1girl, smile, closed_mouth, looking_at_viewer, necklace, solo, cleavage, green_eyes, tank_top, collarbone, eyelashes, shirt, white_background, simple_background, bare_arms, bare_shoulders, sidelocks, sleeveless, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | large_breasts | nipples | blush | hetero | navel | penis | pussy | 1boy | bar_censor | collarbone | looking_at_viewer | open_mouth | smile | solo_focus | bare_shoulders | female_pubic_hair | heart | shirt_lift | simple_background | tank_top | tongue_out | torn_clothes | grin | necklace | solo | closed_eyes | white_background | teeth | sidelocks | upper_body | closed_mouth | cleavage | green_eyes | eyelashes | shirt | bare_arms | sleeveless |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------------|:----------|:--------|:---------|:--------|:--------|:--------|:-------|:-------------|:-------------|:--------------------|:-------------|:--------|:-------------|:-----------------|:--------------------|:--------|:-------------|:--------------------|:-----------|:-------------|:---------------|:-------|:-----------|:-------|:--------------|:-------------------|:--------|:------------|:-------------|:---------------|:-----------|:-------------|:------------|:--------|:------------|:-------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | | | X | | | | | | | | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | | | | | | | |
| 2 | 10 |  |  |  |  |  | X | | | | | | | | | | X | X | | X | | X | | | | X | X | | | | X | X | | X | | X | X | X | X | X | X | X | X | X |
|
joey234/mmlu-electrical_engineering-neg-prepend-verbal | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
- name: fewshot_context_neg
dtype: string
- name: fewshot_context_ori
dtype: string
- name: neg_prompt
dtype: string
splits:
- name: dev
num_bytes: 6493
num_examples: 5
- name: test
num_bytes: 857717
num_examples: 145
download_size: 121746
dataset_size: 864210
---
# Dataset Card for "mmlu-electrical_engineering-neg-prepend-verbal"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BumblingOrange/Hanks_Embeddings | ---
license: bigscience-bloom-rail-1.0
---
This is a collection of embeddings that I decided to make public. Additionally, it will be where I host any future embeddings I decide to train. |
open-llm-leaderboard/details_PotatoOff__HamSter-0.2 | ---
pretty_name: Evaluation run of PotatoOff/HamSter-0.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PotatoOff/HamSter-0.2](https://huggingface.co/PotatoOff/HamSter-0.2) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PotatoOff__HamSter-0.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-16T20:12:25.047225](https://huggingface.co/datasets/open-llm-leaderboard/details_PotatoOff__HamSter-0.2/blob/main/results_2024-01-16T20-12-25.047225.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4993855534029302,\n\
\ \"acc_stderr\": 0.034244491357846386,\n \"acc_norm\": 0.5077537035345174,\n\
\ \"acc_norm_stderr\": 0.03517731824473503,\n \"mc1\": 0.3268053855569155,\n\
\ \"mc1_stderr\": 0.016419874731135025,\n \"mc2\": 0.49629739509694737,\n\
\ \"mc2_stderr\": 0.015731600227202613\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4786689419795222,\n \"acc_stderr\": 0.014598087973127106,\n\
\ \"acc_norm\": 0.5008532423208191,\n \"acc_norm_stderr\": 0.014611369529813272\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5668193586934873,\n\
\ \"acc_stderr\": 0.0049450236570322765,\n \"acc_norm\": 0.7365066719776937,\n\
\ \"acc_norm_stderr\": 0.004396273173717463\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5333333333333333,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.5333333333333333,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.04046336883978251,\n\
\ \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.04046336883978251\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.43,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5547169811320755,\n \"acc_stderr\": 0.030588052974270655,\n\
\ \"acc_norm\": 0.5547169811320755,\n \"acc_norm_stderr\": 0.030588052974270655\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4930555555555556,\n\
\ \"acc_stderr\": 0.04180806750294938,\n \"acc_norm\": 0.4930555555555556,\n\
\ \"acc_norm_stderr\": 0.04180806750294938\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.47398843930635837,\n\
\ \"acc_stderr\": 0.03807301726504511,\n \"acc_norm\": 0.47398843930635837,\n\
\ \"acc_norm_stderr\": 0.03807301726504511\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.0379328118530781,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.0379328118530781\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.03177821250236922,\n\
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.03177821250236922\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n\
\ \"acc_stderr\": 0.045378153549393924,\n \"acc_norm\": 0.3684210526315789,\n\
\ \"acc_norm_stderr\": 0.045378153549393924\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.34656084656084657,\n \"acc_stderr\": 0.024508777521028424,\n \"\
acc_norm\": 0.34656084656084657,\n \"acc_norm_stderr\": 0.024508777521028424\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.027869320571664625,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.027869320571664625\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.37438423645320196,\n \"acc_stderr\": 0.03405155380561952,\n\
\ \"acc_norm\": 0.37438423645320196,\n \"acc_norm_stderr\": 0.03405155380561952\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6242424242424243,\n \"acc_stderr\": 0.03781887353205982,\n\
\ \"acc_norm\": 0.6242424242424243,\n \"acc_norm_stderr\": 0.03781887353205982\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6818181818181818,\n \"acc_stderr\": 0.03318477333845331,\n \"\
acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.03318477333845331\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7357512953367875,\n \"acc_stderr\": 0.03182155050916646,\n\
\ \"acc_norm\": 0.7357512953367875,\n \"acc_norm_stderr\": 0.03182155050916646\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.49230769230769234,\n \"acc_stderr\": 0.025348006031534778,\n\
\ \"acc_norm\": 0.49230769230769234,\n \"acc_norm_stderr\": 0.025348006031534778\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073838,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073838\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.03242225027115007,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.03242225027115007\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.653211009174312,\n \"acc_stderr\": 0.020406097104093024,\n \"\
acc_norm\": 0.653211009174312,\n \"acc_norm_stderr\": 0.020406097104093024\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.03324708911809118,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.03324708911809118\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.03308611113236435,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03308611113236435\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6497890295358649,\n \"acc_stderr\": 0.031052391937584346,\n \
\ \"acc_norm\": 0.6497890295358649,\n \"acc_norm_stderr\": 0.031052391937584346\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5246636771300448,\n\
\ \"acc_stderr\": 0.03351695167652628,\n \"acc_norm\": 0.5246636771300448,\n\
\ \"acc_norm_stderr\": 0.03351695167652628\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5267175572519084,\n \"acc_stderr\": 0.04379024936553894,\n\
\ \"acc_norm\": 0.5267175572519084,\n \"acc_norm_stderr\": 0.04379024936553894\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.628099173553719,\n \"acc_stderr\": 0.044120158066245044,\n \"\
acc_norm\": 0.628099173553719,\n \"acc_norm_stderr\": 0.044120158066245044\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04643454608906274,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04643454608906274\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5644171779141104,\n \"acc_stderr\": 0.03895632464138937,\n\
\ \"acc_norm\": 0.5644171779141104,\n \"acc_norm_stderr\": 0.03895632464138937\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6213592233009708,\n \"acc_stderr\": 0.04802694698258973,\n\
\ \"acc_norm\": 0.6213592233009708,\n \"acc_norm_stderr\": 0.04802694698258973\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7863247863247863,\n\
\ \"acc_stderr\": 0.026853450377009157,\n \"acc_norm\": 0.7863247863247863,\n\
\ \"acc_norm_stderr\": 0.026853450377009157\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6781609195402298,\n\
\ \"acc_stderr\": 0.016706381415057904,\n \"acc_norm\": 0.6781609195402298,\n\
\ \"acc_norm_stderr\": 0.016706381415057904\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5780346820809249,\n \"acc_stderr\": 0.02658923114217426,\n\
\ \"acc_norm\": 0.5780346820809249,\n \"acc_norm_stderr\": 0.02658923114217426\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2837988826815642,\n\
\ \"acc_stderr\": 0.015078358970751765,\n \"acc_norm\": 0.2837988826815642,\n\
\ \"acc_norm_stderr\": 0.015078358970751765\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5065359477124183,\n \"acc_stderr\": 0.028627470550556054,\n\
\ \"acc_norm\": 0.5065359477124183,\n \"acc_norm_stderr\": 0.028627470550556054\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5369774919614148,\n\
\ \"acc_stderr\": 0.028320325830105908,\n \"acc_norm\": 0.5369774919614148,\n\
\ \"acc_norm_stderr\": 0.028320325830105908\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.558641975308642,\n \"acc_stderr\": 0.027628737155668773,\n\
\ \"acc_norm\": 0.558641975308642,\n \"acc_norm_stderr\": 0.027628737155668773\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.375886524822695,\n \"acc_stderr\": 0.028893955412115886,\n \
\ \"acc_norm\": 0.375886524822695,\n \"acc_norm_stderr\": 0.028893955412115886\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3709256844850065,\n\
\ \"acc_stderr\": 0.01233739168453031,\n \"acc_norm\": 0.3709256844850065,\n\
\ \"acc_norm_stderr\": 0.01233739168453031\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4264705882352941,\n \"acc_stderr\": 0.030042615832714874,\n\
\ \"acc_norm\": 0.4264705882352941,\n \"acc_norm_stderr\": 0.030042615832714874\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0201965949335412,\n \
\ \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0201965949335412\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5636363636363636,\n\
\ \"acc_stderr\": 0.04750185058907296,\n \"acc_norm\": 0.5636363636363636,\n\
\ \"acc_norm_stderr\": 0.04750185058907296\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5265306122448979,\n \"acc_stderr\": 0.03196412734523272,\n\
\ \"acc_norm\": 0.5265306122448979,\n \"acc_norm_stderr\": 0.03196412734523272\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7114427860696517,\n\
\ \"acc_stderr\": 0.03203841040213321,\n \"acc_norm\": 0.7114427860696517,\n\
\ \"acc_norm_stderr\": 0.03203841040213321\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39759036144578314,\n\
\ \"acc_stderr\": 0.038099730845402184,\n \"acc_norm\": 0.39759036144578314,\n\
\ \"acc_norm_stderr\": 0.038099730845402184\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6432748538011696,\n \"acc_stderr\": 0.03674013002860954,\n\
\ \"acc_norm\": 0.6432748538011696,\n \"acc_norm_stderr\": 0.03674013002860954\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3268053855569155,\n\
\ \"mc1_stderr\": 0.016419874731135025,\n \"mc2\": 0.49629739509694737,\n\
\ \"mc2_stderr\": 0.015731600227202613\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.696921862667719,\n \"acc_stderr\": 0.012916727462634463\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/PotatoOff/HamSter-0.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|arc:challenge|25_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|gsm8k|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hellaswag|10_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T20-12-25.047225.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T20-12-25.047225.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- '**/details_harness|winogrande|5_2024-01-16T20-12-25.047225.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-16T20-12-25.047225.parquet'
- config_name: results
data_files:
- split: 2024_01_16T20_12_25.047225
path:
- results_2024-01-16T20-12-25.047225.parquet
- split: latest
path:
- results_2024-01-16T20-12-25.047225.parquet
---
# Dataset Card for Evaluation run of PotatoOff/HamSter-0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [PotatoOff/HamSter-0.2](https://huggingface.co/PotatoOff/HamSter-0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
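As a minimal illustration of what the aggregated metrics represent: the `"all"` entry in the results JSON further below is a macro-average of the per-task scores. The sketch uses three task accuracies copied from this run and is not the leaderboard's exact aggregation code.

```python
# Hypothetical sketch: macro-averaging per-task accuracies, as the
# aggregated "all" entry does over all evaluated tasks. The three
# values are copied from this run's hendrycksTest results.
per_task_acc = {
    "hendrycksTest-abstract_algebra": 0.3,
    "hendrycksTest-anatomy": 0.5333333333333333,
    "hendrycksTest-astronomy": 0.5526315789473685,
}

# Unweighted mean over the tasks included above.
macro_avg = sum(per_task_acc.values()) / len(per_task_acc)
print(f"macro-average acc over {len(per_task_acc)} tasks: {macro_avg:.4f}")
```

The real aggregate (0.4994 here) averages over all 60+ tasks, so it differs from this three-task illustration.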
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PotatoOff__HamSter-0.2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-16T20:12:25.047225](https://huggingface.co/datasets/open-llm-leaderboard/details_PotatoOff__HamSter-0.2/blob/main/results_2024-01-16T20-12-25.047225.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; each task's results can be found in its own configuration, under the "latest" split):
```json
{
"all": {
"acc": 0.4993855534029302,
"acc_stderr": 0.034244491357846386,
"acc_norm": 0.5077537035345174,
"acc_norm_stderr": 0.03517731824473503,
"mc1": 0.3268053855569155,
"mc1_stderr": 0.016419874731135025,
"mc2": 0.49629739509694737,
"mc2_stderr": 0.015731600227202613
},
"harness|arc:challenge|25": {
"acc": 0.4786689419795222,
"acc_stderr": 0.014598087973127106,
"acc_norm": 0.5008532423208191,
"acc_norm_stderr": 0.014611369529813272
},
"harness|hellaswag|10": {
"acc": 0.5668193586934873,
"acc_stderr": 0.0049450236570322765,
"acc_norm": 0.7365066719776937,
"acc_norm_stderr": 0.004396273173717463
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.04046336883978251,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.04046336883978251
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5547169811320755,
"acc_stderr": 0.030588052974270655,
"acc_norm": 0.5547169811320755,
"acc_norm_stderr": 0.030588052974270655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4930555555555556,
"acc_stderr": 0.04180806750294938,
"acc_norm": 0.4930555555555556,
"acc_norm_stderr": 0.04180806750294938
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.47398843930635837,
"acc_stderr": 0.03807301726504511,
"acc_norm": 0.47398843930635837,
"acc_norm_stderr": 0.03807301726504511
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.0379328118530781,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.0379328118530781
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.03177821250236922,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.03177821250236922
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.045378153549393924,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.045378153549393924
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.34656084656084657,
"acc_stderr": 0.024508777521028424,
"acc_norm": 0.34656084656084657,
"acc_norm_stderr": 0.024508777521028424
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6,
"acc_stderr": 0.027869320571664625,
"acc_norm": 0.6,
"acc_norm_stderr": 0.027869320571664625
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.37438423645320196,
"acc_stderr": 0.03405155380561952,
"acc_norm": 0.37438423645320196,
"acc_norm_stderr": 0.03405155380561952
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6242424242424243,
"acc_stderr": 0.03781887353205982,
"acc_norm": 0.6242424242424243,
"acc_norm_stderr": 0.03781887353205982
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.03318477333845331,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.03318477333845331
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7357512953367875,
"acc_stderr": 0.03182155050916646,
"acc_norm": 0.7357512953367875,
"acc_norm_stderr": 0.03182155050916646
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.49230769230769234,
"acc_stderr": 0.025348006031534778,
"acc_norm": 0.49230769230769234,
"acc_norm_stderr": 0.025348006031534778
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073838,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073838
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.03242225027115007,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.03242225027115007
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.653211009174312,
"acc_stderr": 0.020406097104093024,
"acc_norm": 0.653211009174312,
"acc_norm_stderr": 0.020406097104093024
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.03324708911809118,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.03324708911809118
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03308611113236435,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03308611113236435
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6497890295358649,
"acc_stderr": 0.031052391937584346,
"acc_norm": 0.6497890295358649,
"acc_norm_stderr": 0.031052391937584346
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5246636771300448,
"acc_stderr": 0.03351695167652628,
"acc_norm": 0.5246636771300448,
"acc_norm_stderr": 0.03351695167652628
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5267175572519084,
"acc_stderr": 0.04379024936553894,
"acc_norm": 0.5267175572519084,
"acc_norm_stderr": 0.04379024936553894
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.628099173553719,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.628099173553719,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04643454608906274,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04643454608906274
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5644171779141104,
"acc_stderr": 0.03895632464138937,
"acc_norm": 0.5644171779141104,
"acc_norm_stderr": 0.03895632464138937
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.6213592233009708,
"acc_stderr": 0.04802694698258973,
"acc_norm": 0.6213592233009708,
"acc_norm_stderr": 0.04802694698258973
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7863247863247863,
"acc_stderr": 0.026853450377009157,
"acc_norm": 0.7863247863247863,
"acc_norm_stderr": 0.026853450377009157
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6781609195402298,
"acc_stderr": 0.016706381415057904,
"acc_norm": 0.6781609195402298,
"acc_norm_stderr": 0.016706381415057904
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.02658923114217426,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.02658923114217426
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2837988826815642,
"acc_stderr": 0.015078358970751765,
"acc_norm": 0.2837988826815642,
"acc_norm_stderr": 0.015078358970751765
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5065359477124183,
"acc_stderr": 0.028627470550556054,
"acc_norm": 0.5065359477124183,
"acc_norm_stderr": 0.028627470550556054
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5369774919614148,
"acc_stderr": 0.028320325830105908,
"acc_norm": 0.5369774919614148,
"acc_norm_stderr": 0.028320325830105908
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.558641975308642,
"acc_stderr": 0.027628737155668773,
"acc_norm": 0.558641975308642,
"acc_norm_stderr": 0.027628737155668773
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.375886524822695,
"acc_stderr": 0.028893955412115886,
"acc_norm": 0.375886524822695,
"acc_norm_stderr": 0.028893955412115886
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3709256844850065,
"acc_stderr": 0.01233739168453031,
"acc_norm": 0.3709256844850065,
"acc_norm_stderr": 0.01233739168453031
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4264705882352941,
"acc_stderr": 0.030042615832714874,
"acc_norm": 0.4264705882352941,
"acc_norm_stderr": 0.030042615832714874
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0201965949335412,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0201965949335412
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5636363636363636,
"acc_stderr": 0.04750185058907296,
"acc_norm": 0.5636363636363636,
"acc_norm_stderr": 0.04750185058907296
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5265306122448979,
"acc_stderr": 0.03196412734523272,
"acc_norm": 0.5265306122448979,
"acc_norm_stderr": 0.03196412734523272
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7114427860696517,
"acc_stderr": 0.03203841040213321,
"acc_norm": 0.7114427860696517,
"acc_norm_stderr": 0.03203841040213321
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39759036144578314,
"acc_stderr": 0.038099730845402184,
"acc_norm": 0.39759036144578314,
"acc_norm_stderr": 0.038099730845402184
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6432748538011696,
"acc_stderr": 0.03674013002860954,
"acc_norm": 0.6432748538011696,
"acc_norm_stderr": 0.03674013002860954
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3268053855569155,
"mc1_stderr": 0.016419874731135025,
"mc2": 0.49629739509694737,
"mc2_stderr": 0.015731600227202613
},
"harness|winogrande|5": {
"acc": 0.696921862667719,
"acc_stderr": 0.012916727462634463
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
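The per-task entries above can be aggregated into a single score; a minimal sketch using a few `acc` values copied from the results (whether the leaderboard's "all" figure is exactly this unweighted mean over these tasks is an assumption):

```python
# A small subset of the per-task results printed above.
results = {
    "harness|arc:challenge|25": {"acc": 0.4786689419795222},
    "harness|hellaswag|10": {"acc": 0.5668193586934873},
    "harness|winogrande|5": {"acc": 0.696921862667719},
    "harness|gsm8k|5": {"acc": 0.0},
}

# Unweighted mean accuracy across the selected tasks.
task_accs = [v["acc"] for v in results.values()]
mean_acc = sum(task_accs) / len(task_accs)
print(f"mean acc over {len(task_accs)} tasks: {mean_acc:.4f}")
```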
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Hemg/brain-tumour-dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Glioma
'1': Meningioma
'2': Pituitary tumor
splits:
- name: train
num_bytes: 1579718564.462
num_examples: 18398
- name: validation
num_bytes: 83608820.0
num_examples: 828
download_size: 1622392078
dataset_size: 1663327384.462
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
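The `class_label` feature above maps integer ids to the three tumour classes; a minimal sketch of that mapping (the `names` list is copied from the YAML, the helper names are illustrative):

```python
# Class names as declared in the dataset's class_label feature.
names = ["Glioma", "Meningioma", "Pituitary tumor"]

id2label = dict(enumerate(names))               # 0 -> "Glioma", ...
label2id = {n: i for i, n in enumerate(names)}  # "Glioma" -> 0, ...

print(id2label[1])
# → Meningioma
```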
|
mserras/alpaca-es-hackaton-test | ---
dataset_info:
features:
- name: text
dtype: 'null'
- name: inputs
struct:
- name: 1-instruction
dtype: string
- name: 2-input
dtype: string
- name: 3-output
dtype: string
- name: prediction
dtype: 'null'
- name: prediction_agent
dtype: 'null'
- name: annotation
dtype: string
- name: annotation_agent
dtype: string
- name: vectors
struct:
- name: input
sequence: float64
- name: instruction
sequence: float64
- name: output
sequence: float64
- name: multi_label
dtype: bool
- name: explanation
dtype: 'null'
- name: id
dtype: string
- name: metadata
struct:
- name: en_index
dtype: int64
- name: sf-unprocessable-score
dtype: float64
- name: tr-flag-1-instruction
dtype: bool
- name: tr-flag-2-input
dtype: bool
- name: tr-flag-3-output
dtype: bool
- name: status
dtype: string
- name: event_timestamp
dtype: timestamp[us]
- name: metrics
struct:
- name: text_length
dtype: int64
splits:
- name: train
num_bytes: 984283413
num_examples: 51942
download_size: 652179041
dataset_size: 984283413
---
# Dataset Card for "alpaca-es-hackaton-test"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Kukedlc__FrankeMerge-12.5B | ---
pretty_name: Evaluation run of Kukedlc/FrankeMerge-12.5B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Kukedlc/FrankeMerge-12.5B](https://huggingface.co/Kukedlc/FrankeMerge-12.5B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kukedlc__FrankeMerge-12.5B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-22T03:49:51.405000](https://huggingface.co/datasets/open-llm-leaderboard/details_Kukedlc__FrankeMerge-12.5B/blob/main/results_2024-03-22T03-49-51.405000.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; the results and the \"latest\" split are available for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6424553947276627,\n\
\ \"acc_stderr\": 0.032415537863378585,\n \"acc_norm\": 0.6448206393016681,\n\
\ \"acc_norm_stderr\": 0.03306942684822286,\n \"mc1\": 0.5006119951040392,\n\
\ \"mc1_stderr\": 0.01750348793889251,\n \"mc2\": 0.6687960206616382,\n\
\ \"mc2_stderr\": 0.01554482752476538\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6535836177474402,\n \"acc_stderr\": 0.013905011180063228,\n\
\ \"acc_norm\": 0.6834470989761092,\n \"acc_norm_stderr\": 0.013592431519068077\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7005576578370842,\n\
\ \"acc_stderr\": 0.004570777326263901,\n \"acc_norm\": 0.877414857598088,\n\
\ \"acc_norm_stderr\": 0.0032729014349397612\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.042446332383532265,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.042446332383532265\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.02898545565233439,\n\
\ \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.02898545565233439\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411018,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411018\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n\
\ \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5158730158730159,\n\
\ \"acc_stderr\": 0.044698818540726076,\n \"acc_norm\": 0.5158730158730159,\n\
\ \"acc_norm_stderr\": 0.044698818540726076\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723285,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723285\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6871794871794872,\n \"acc_stderr\": 0.023507579020645354,\n\
\ \"acc_norm\": 0.6871794871794872,\n \"acc_norm_stderr\": 0.023507579020645354\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590177,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590177\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967408,\n \"\
acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967408\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.02616056824660146,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.02616056824660146\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n\
\ \"acc_stderr\": 0.03050028317654585,\n \"acc_norm\": 0.7085201793721974,\n\
\ \"acc_norm_stderr\": 0.03050028317654585\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.02220930907316561,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.02220930907316561\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.80970625798212,\n\
\ \"acc_stderr\": 0.014036945850381384,\n \"acc_norm\": 0.80970625798212,\n\
\ \"acc_norm_stderr\": 0.014036945850381384\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.02500931379006971,\n\
\ \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.02500931379006971\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.358659217877095,\n\
\ \"acc_stderr\": 0.016040454426164464,\n \"acc_norm\": 0.358659217877095,\n\
\ \"acc_norm_stderr\": 0.016040454426164464\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959614,\n\
\ \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959614\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.4745762711864407,\n \"acc_stderr\": 0.012753716929101004,\n\
\ \"acc_norm\": 0.4745762711864407,\n \"acc_norm_stderr\": 0.012753716929101004\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462927,\n \"\
acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462927\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6454248366013072,\n \"acc_stderr\": 0.019353360547553707,\n \
\ \"acc_norm\": 0.6454248366013072,\n \"acc_norm_stderr\": 0.019353360547553707\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910508,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910508\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826369,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826369\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5006119951040392,\n\
\ \"mc1_stderr\": 0.01750348793889251,\n \"mc2\": 0.6687960206616382,\n\
\ \"mc2_stderr\": 0.01554482752476538\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8153117600631413,\n \"acc_stderr\": 0.010905978112156888\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5367702805155421,\n \
\ \"acc_stderr\": 0.01373519195646865\n }\n}\n```"
repo_url: https://huggingface.co/Kukedlc/FrankeMerge-12.5B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|arc:challenge|25_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|gsm8k|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hellaswag|10_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T03-49-51.405000.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T03-49-51.405000.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- '**/details_harness|winogrande|5_2024-03-22T03-49-51.405000.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-22T03-49-51.405000.parquet'
- config_name: results
data_files:
- split: 2024_03_22T03_49_51.405000
path:
- results_2024-03-22T03-49-51.405000.parquet
- split: latest
path:
- results_2024-03-22T03-49-51.405000.parquet
---
# Dataset Card for Evaluation run of Kukedlc/FrankeMerge-12.5B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kukedlc/FrankeMerge-12.5B](https://huggingface.co/Kukedlc/FrankeMerge-12.5B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kukedlc__FrankeMerge-12.5B",
"harness_winogrande_5",
split="train")
```
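The aggregated `"all"` entry shown below is a macro-average over the per-task scores. As a minimal local sketch (using two illustrative per-task values from this card, not the full result set), such an average could be recomputed like this:

```python
# Recompute a macro-average accuracy from per-task results.
# The two sample values below are illustrative entries copied from
# this card; the real "all" entry averages over every task.
per_task = {
    "harness|hendrycksTest-virology|5": {"acc": 0.5120481927710844},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.7777777777777778},
}

macro_avg = sum(t["acc"] for t in per_task.values()) / len(per_task)
print(round(macro_avg, 4))  # average accuracy over the sampled tasks
```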
## Latest results
These are the [latest results from run 2024-03-22T03:49:51.405000](https://huggingface.co/datasets/open-llm-leaderboard/details_Kukedlc__FrankeMerge-12.5B/blob/main/results_2024-03-22T03-49-51.405000.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6424553947276627,
"acc_stderr": 0.032415537863378585,
"acc_norm": 0.6448206393016681,
"acc_norm_stderr": 0.03306942684822286,
"mc1": 0.5006119951040392,
"mc1_stderr": 0.01750348793889251,
"mc2": 0.6687960206616382,
"mc2_stderr": 0.01554482752476538
},
"harness|arc:challenge|25": {
"acc": 0.6535836177474402,
"acc_stderr": 0.013905011180063228,
"acc_norm": 0.6834470989761092,
"acc_norm_stderr": 0.013592431519068077
},
"harness|hellaswag|10": {
"acc": 0.7005576578370842,
"acc_stderr": 0.004570777326263901,
"acc_norm": 0.877414857598088,
"acc_norm_stderr": 0.0032729014349397612
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.042446332383532265,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.042446332383532265
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6679245283018868,
"acc_stderr": 0.02898545565233439,
"acc_norm": 0.6679245283018868,
"acc_norm_stderr": 0.02898545565233439
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411018,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411018
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5158730158730159,
"acc_stderr": 0.044698818540726076,
"acc_norm": 0.5158730158730159,
"acc_norm_stderr": 0.044698818540726076
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723285,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723285
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6871794871794872,
"acc_stderr": 0.023507579020645354,
"acc_norm": 0.6871794871794872,
"acc_norm_stderr": 0.023507579020645354
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.015555802713590177,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.015555802713590177
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967408,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967408
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.02616056824660146,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.02616056824660146
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.03050028317654585,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.03050028317654585
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.02220930907316561,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.02220930907316561
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.80970625798212,
"acc_stderr": 0.014036945850381384,
"acc_norm": 0.80970625798212,
"acc_norm_stderr": 0.014036945850381384
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.684971098265896,
"acc_stderr": 0.02500931379006971,
"acc_norm": 0.684971098265896,
"acc_norm_stderr": 0.02500931379006971
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.358659217877095,
"acc_stderr": 0.016040454426164464,
"acc_norm": 0.358659217877095,
"acc_norm_stderr": 0.016040454426164464
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.023891879541959614,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.023891879541959614
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4745762711864407,
"acc_stderr": 0.012753716929101004,
"acc_norm": 0.4745762711864407,
"acc_norm_stderr": 0.012753716929101004
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462927,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462927
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6454248366013072,
"acc_stderr": 0.019353360547553707,
"acc_norm": 0.6454248366013072,
"acc_norm_stderr": 0.019353360547553707
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910508,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910508
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826369,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826369
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5006119951040392,
"mc1_stderr": 0.01750348793889251,
"mc2": 0.6687960206616382,
"mc2_stderr": 0.01554482752476538
},
"harness|winogrande|5": {
"acc": 0.8153117600631413,
"acc_stderr": 0.010905978112156888
},
"harness|gsm8k|5": {
"acc": 0.5367702805155421,
"acc_stderr": 0.01373519195646865
}
}
```
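Once the results JSON above is parsed into a dictionary, per-benchmark aggregates can be recomputed by filtering on the task prefix. The snippet below averages the accuracy of the MMLU (`hendrycksTest`) subtasks; only three of the 57 subtasks from the JSON above are included, so this is an illustrative sketch rather than a reproduction of the leaderboard's own aggregation:

```python
# A few MMLU subtask entries copied from the results JSON above
# (truncated to three subtasks for brevity).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.29},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5925925925925926},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6842105263157895},
}

# Select every MMLU subtask by its task-id prefix and average the accuracies.
mmlu_accs = [
    scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
]
mmlu_mean = sum(mmlu_accs) / len(mmlu_accs)
print(f"Mean MMLU accuracy over {len(mmlu_accs)} subtasks: {mmlu_mean:.4f}")
# Mean MMLU accuracy over 3 subtasks: 0.5223
```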
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_AtAndDev__ShortKing-3b-v0.3 | ---
pretty_name: Evaluation run of AtAndDev/ShortKing-3b-v0.3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AtAndDev/ShortKing-3b-v0.3](https://huggingface.co/AtAndDev/ShortKing-3b-v0.3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AtAndDev__ShortKing-3b-v0.3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-29T09:17:13.395928](https://huggingface.co/datasets/open-llm-leaderboard/details_AtAndDev__ShortKing-3b-v0.3/blob/main/results_2023-10-29T09-17-13.395928.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0017827181208053692,\n\
\ \"em_stderr\": 0.00043200973460391266,\n \"f1\": 0.05457843959731554,\n\
\ \"f1_stderr\": 0.001344563821795035,\n \"acc\": 0.34071397754750704,\n\
\ \"acc_stderr\": 0.008118865064946825\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0017827181208053692,\n \"em_stderr\": 0.00043200973460391266,\n\
\ \"f1\": 0.05457843959731554,\n \"f1_stderr\": 0.001344563821795035\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.012130401819560273,\n \
\ \"acc_stderr\": 0.0030152942428909517\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6692975532754538,\n \"acc_stderr\": 0.013222435887002698\n\
\ }\n}\n```"
repo_url: https://huggingface.co/AtAndDev/ShortKing-3b-v0.3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|arc:challenge|25_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_29T09_17_13.395928
path:
- '**/details_harness|drop|3_2023-10-29T09-17-13.395928.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-29T09-17-13.395928.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_29T09_17_13.395928
path:
- '**/details_harness|gsm8k|5_2023-10-29T09-17-13.395928.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-29T09-17-13.395928.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hellaswag|10_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T03-04-04.830920.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T03-04-04.830920.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T03-04-04.830920.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_29T09_17_13.395928
path:
- '**/details_harness|winogrande|5_2023-10-29T09-17-13.395928.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-29T09-17-13.395928.parquet'
- config_name: results
data_files:
- split: 2023_10_04T03_04_04.830920
path:
- results_2023-10-04T03-04-04.830920.parquet
- split: 2023_10_29T09_17_13.395928
path:
- results_2023-10-29T09-17-13.395928.parquet
- split: latest
path:
- results_2023-10-29T09-17-13.395928.parquet
---
# Dataset Card for Evaluation run of AtAndDev/ShortKing-3b-v0.3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/AtAndDev/ShortKing-3b-v0.3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [AtAndDev/ShortKing-3b-v0.3](https://huggingface.co/AtAndDev/ShortKing-3b-v0.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AtAndDev__ShortKing-3b-v0.3",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-29T09:17:13.395928](https://huggingface.co/datasets/open-llm-leaderboard/details_AtAndDev__ShortKing-3b-v0.3/blob/main/results_2023-10-29T09-17-13.395928.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.0017827181208053692,
"em_stderr": 0.00043200973460391266,
"f1": 0.05457843959731554,
"f1_stderr": 0.001344563821795035,
"acc": 0.34071397754750704,
"acc_stderr": 0.008118865064946825
},
"harness|drop|3": {
"em": 0.0017827181208053692,
"em_stderr": 0.00043200973460391266,
"f1": 0.05457843959731554,
"f1_stderr": 0.001344563821795035
},
"harness|gsm8k|5": {
"acc": 0.012130401819560273,
"acc_stderr": 0.0030152942428909517
},
"harness|winogrande|5": {
"acc": 0.6692975532754538,
"acc_stderr": 0.013222435887002698
}
}
```
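As a quick cross-check on the `"all"` entry above, the aggregate accuracy appears to be the unweighted mean of the per-task accuracies (a sketch based only on the numbers in this card; the exact aggregation logic lives in the leaderboard code, not here):

```python
# Sketch: verify that the "all" accuracy in the latest results
# equals the unweighted mean of the per-task accuracies reported above.
latest = {
    "all": {"acc": 0.34071397754750704},
    "harness|gsm8k|5": {"acc": 0.012130401819560273},
    "harness|winogrande|5": {"acc": 0.6692975532754538},
}

# Collect the accuracies of the individual tasks (everything except "all").
task_accs = [v["acc"] for k, v in latest.items() if k != "all"]
mean_acc = sum(task_accs) / len(task_accs)

# The aggregate matches the unweighted mean to floating-point precision.
assert abs(mean_acc - latest["all"]["acc"]) < 1e-12
```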
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
mask-distilled-one-sec-cv12/chunk_177 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1161648144
num_examples: 228132
download_size: 1184808516
dataset_size: 1161648144
---
# Dataset Card for "chunk_177"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Ahmed-ibn-Harun/wake-w | ---
license: mit
---
|
nayohan/032_broadcast_translation | ---
dataset_info:
features:
- name: domain
dtype: string
- name: subdomain
dtype: string
- name: style
dtype: string
- name: source
dtype: string
- name: target
dtype: string
- name: source_text
dtype: string
- name: target_mt
dtype: string
- name: target_text
dtype: string
splits:
- name: train
num_bytes: 158190564
num_examples: 587084
download_size: 82685546
dataset_size: 158190564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Kishorereddy123/accurate_QA | ---
dataset_info:
features:
- name: Question_Answer
dtype: string
splits:
- name: train
num_bytes: 53333.17741935484
num_examples: 86
- name: test
num_bytes: 23565.822580645163
num_examples: 38
download_size: 45319
dataset_size: 76899.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-markdown-133000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1088429
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
VatsaDev/codegolf | ---
license: mit
task_categories:
- text-generation
language:
- en
tags:
- code
- challenge
- codegolf
pretty_name: Codegolf
size_categories:
- 10K<n<100K
---
The entire Code Golf Stack Exchange, filtered to questions with a score above 0: 14K code questions with all of their answers.
- Good for learning from complex code questions, unique challenges, code optimizations, and code that isn't mainstream; could help dataset diversity. |
Taskin123/Classification | ---
license: apache-2.0
---
|
snipaid/snippet-mlsum-500 | ---
license: mit
language: de
tags:
- news
- headline
- teaser
- keywords
- tweet
- serp title-tag
- serp meta-description
- news snippets
task_categories:
- summarization
- text2text-generation
size_categories:
- n<1K
---
# Dataset Card for Snippet-MLSUM-500
### Dataset Summary
This dataset is a sample of ~500 news articles from the [MLSUM](https://huggingface.co/datasets/mlsum) dataset, augmented with machine generated news snippets.
### Supported Tasks
This dataset was created to support the task of generating news snippets such as title, teaser, keywords, serp and tweet for news articles in German language.
### Languages
de - German
## Dataset Structure
text: a string feature.
title: a string feature.
teaser: a string feature.
keywords: a string feature.
serp_title: a string feature.
serp_description: a string feature.
tweet: a string feature.
url: a string feature.
date: a string feature.
topic: a string feature.
## Dataset Creation
The news articles in this dataset are a random sample of ~500 news articles from MLSUM balanced by topic.
Features text, title, teaser (originally summary in MLSUM), url, date and topic are copied from MLSUM.
Features keywords, serp_title, serp_description and tweet are machine generated with GPT-3.5.
Generated features comply with length limits in place for SERPs and Tweets at the time of publishing.
## Considerations for Using the Data
### Known Limitations
Part of the snippet data is machine generated. Be aware that these features (specifically: keywords, serp_title, serp_description and tweet) may exhibit signs of model hallucination.
## Additional Information
See [Snippet-MLSUM-500-V2](https://huggingface.co/datasets/snipaid/snippet-mlsum-500-v2) if you are interested in a dataset with combined serp and additional summary data.
### Licensing Information
This dataset is licensed under MIT license. |
rai-sandeep/test_ds_1 | ---
dataset_info:
features:
- name: category
dtype: string
- name: topic
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 4689
num_examples: 4
download_size: 11810
dataset_size: 4689
---
# Dataset Card for "test_ds_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Serum/for_sd | ---
license: openrail
---
|
PikwikCStudios/Carson | ---
license: mit
---
|
CodecSR/esc50_synth | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 44100
- name: id
dtype: string
splits:
- name: original
num_bytes: 882135256.0
num_examples: 2000
- name: academicodec_hifi_16k_320d_large_uni
num_bytes: 882057006.0
num_examples: 2000
- name: academicodec_hifi_24k_320d
num_bytes: 882057006.0
num_examples: 2000
- name: audiodec_24k_300d
num_bytes: 882137006.0
num_examples: 2000
- name: audiodec_48k_300d_uni
num_bytes: 882137006.0
num_examples: 2000
- name: dac_16k
num_bytes: 882137006.0
num_examples: 2000
- name: dac_24k
num_bytes: 882137006.0
num_examples: 2000
- name: dac_44k
num_bytes: 882137006.0
num_examples: 2000
- name: encodec_24k_12bps
num_bytes: 882137006.0
num_examples: 2000
- name: encodec_24k_1_5bps
num_bytes: 882137006.0
num_examples: 2000
- name: encodec_24k_24bps
num_bytes: 882137006.0
num_examples: 2000
- name: encodec_24k_3bps
num_bytes: 882137006.0
num_examples: 2000
- name: encodec_24k_6bps
num_bytes: 882137006.0
num_examples: 2000
- name: facodec_16k
num_bytes: 881737006.0
num_examples: 2000
- name: funcodec_en_libritts_16k_nq32ds320
num_bytes: 882137006.0
num_examples: 2000
- name: funcodec_en_libritts_16k_nq32ds640
num_bytes: 882137006.0
num_examples: 2000
- name: funcodec_zh_en_16k_nq32ds320
num_bytes: 882137006.0
num_examples: 2000
- name: funcodec_zh_en_16k_nq32ds640
num_bytes: 882137006.0
num_examples: 2000
- name: language_codec_chinese_24k_nq8_12kbps
num_bytes: 883337006.0
num_examples: 2000
- name: language_codec_paper_24k_nq8_12kbps
num_bytes: 883337006.0
num_examples: 2000
- name: speech_tokenizer_16k
num_bytes: 883337006.0
num_examples: 2000
download_size: 16345948622
dataset_size: 18527915376.0
configs:
- config_name: default
data_files:
- split: original
path: data/original-*
- split: academicodec_hifi_16k_320d_large_uni
path: data/academicodec_hifi_16k_320d_large_uni-*
- split: academicodec_hifi_24k_320d
path: data/academicodec_hifi_24k_320d-*
- split: audiodec_24k_300d
path: data/audiodec_24k_300d-*
- split: audiodec_48k_300d_uni
path: data/audiodec_48k_300d_uni-*
- split: dac_16k
path: data/dac_16k-*
- split: dac_24k
path: data/dac_24k-*
- split: dac_44k
path: data/dac_44k-*
- split: encodec_24k_12bps
path: data/encodec_24k_12bps-*
- split: encodec_24k_1_5bps
path: data/encodec_24k_1_5bps-*
- split: encodec_24k_24bps
path: data/encodec_24k_24bps-*
- split: encodec_24k_3bps
path: data/encodec_24k_3bps-*
- split: encodec_24k_6bps
path: data/encodec_24k_6bps-*
- split: facodec_16k
path: data/facodec_16k-*
- split: funcodec_en_libritts_16k_nq32ds320
path: data/funcodec_en_libritts_16k_nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds640
path: data/funcodec_en_libritts_16k_nq32ds640-*
- split: funcodec_zh_en_16k_nq32ds320
path: data/funcodec_zh_en_16k_nq32ds320-*
- split: funcodec_zh_en_16k_nq32ds640
path: data/funcodec_zh_en_16k_nq32ds640-*
- split: language_codec_chinese_24k_nq8_12kbps
path: data/language_codec_chinese_24k_nq8_12kbps-*
- split: language_codec_paper_24k_nq8_12kbps
path: data/language_codec_paper_24k_nq8_12kbps-*
- split: speech_tokenizer_16k
path: data/speech_tokenizer_16k-*
---
|
luukschmitz/validation_500 | ---
license: apache-2.0
---
|
peterholdsworth/vangogh | ---
license: unknown
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-107000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1018501
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
aymen31/PlantVillage | ---
license: other
---
|
DayaneGuimaraes/verbosInfinitivo | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3144.0
num_examples: 24
- name: test
num_bytes: 917.0
num_examples: 7
download_size: 4374
dataset_size: 4061.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
trl-internal-testing/hh-rlhf-trl-style | ---
dataset_info:
features:
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 327157884
num_examples: 160800
- name: test
num_bytes: 17602645
num_examples: 8552
download_size: 191942872
dataset_size: 344760529
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# TRL's Anthropic HH Dataset
We preprocess the dataset using our standard `prompt, chosen, rejected` format.
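For illustration, a single record in this `prompt, chosen, rejected` format looks roughly like the following. The field values here are made up; only the structure mirrors the features declared in the metadata above (`prompt` is a string, while `chosen` and `rejected` are lists of messages with `content` and `role`):

```python
# Hypothetical example record matching the declared features.
example = {
    "prompt": "How do I bake bread?",
    "chosen": [
        {"role": "user", "content": "How do I bake bread?"},
        {"role": "assistant", "content": "Mix flour, water, salt, and yeast..."},
    ],
    "rejected": [
        {"role": "user", "content": "How do I bake bread?"},
        {"role": "assistant", "content": "I don't know."},
    ],
}

# Both conversations start from the same user prompt; they differ only in
# which assistant reply was preferred.
assert example["chosen"][0]["content"] == example["prompt"]
assert example["rejected"][0]["content"] == example["prompt"]
```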
## Reproduce this dataset
1. Download `anthropic_hh.py` from https://huggingface.co/datasets/trl-internal-testing/hh-rlhf-trl-style/tree/0.1.0.
2. Run `python examples/datasets/anthropic_hh.py --push_to_hub --hf_entity trl-internal-testing`
|
pa-shk/scifact | ---
dataset_info:
- config_name: docs
features:
- name: doc
dtype: string
splits:
- name: train
num_bytes: 7288933
num_examples: 5183
download_size: 4177186
dataset_size: 7288933
- config_name: qrels
features:
- name: query
dtype: string
- name: doc
dtype: string
splits:
- name: train
num_bytes: 1502296
num_examples: 919
- name: test
num_bytes: 549599
num_examples: 339
download_size: 799417
dataset_size: 2051895
configs:
- config_name: docs
data_files:
- split: train
path: docs/train-*
- config_name: qrels
data_files:
- split: train
path: qrels/train-*
- split: test
path: qrels/test-*
---
|
Hack90/experiment_one_viral_genomes_test_set | ---
dataset_info:
features:
- name: id
dtype: string
- name: sequence
dtype: string
- name: name
dtype: string
- name: description
dtype: string
- name: features
dtype: int64
- name: seq_length
dtype: int64
- name: sequence_quality
dtype: float64
- name: text
dtype: string
- name: __index_level_0__
dtype: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 567591779
num_examples: 78918
download_size: 95799817
dataset_size: 567591779
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
FINNUMBER/NQA_Instruction | ---
dataset_info:
- config_name: Numerical Reasoning Arithmetic
features:
- name: Q
dtype: string
- name: A
dtype: string
- name: C
dtype: string
- name: Rationale
dtype: string
- name: type
dtype: string
- name: correct
dtype: string
splits:
- name: train
num_bytes: 34274762
num_examples: 23064
- name: test
num_bytes: 12977495
num_examples: 9553
download_size: 26818152
dataset_size: 47252257
- config_name: Numerical Reasoning Comparison
features:
- name: Q
dtype: string
- name: A
dtype: string
- name: C
dtype: string
- name: Rationale
dtype: string
- name: type
dtype: string
- name: correct
dtype: string
splits:
- name: train
num_bytes: 35502510
num_examples: 23016
- name: test
num_bytes: 5536935
num_examples: 3783
download_size: 23032974
dataset_size: 41039445
- config_name: Numerical Reasoning Extraction
features:
- name: Q
dtype: string
- name: A
dtype: string
- name: C
dtype: string
- name: Rationale
dtype: string
- name: type
dtype: string
- name: correct
dtype: string
splits:
- name: train
num_bytes: 43262111
num_examples: 21000
- name: test
num_bytes: 8579210
num_examples: 5213
download_size: 30067726
dataset_size: 51841321
configs:
- config_name: Numerical Reasoning Arithmetic
data_files:
- split: train
path: Numerical Reasoning Arithmetic/train-*
- split: test
path: Numerical Reasoning Arithmetic/test-*
- config_name: Numerical Reasoning Comparison
data_files:
- split: train
path: Numerical Reasoning Comparison/train-*
- split: test
path: Numerical Reasoning Comparison/test-*
- config_name: Numerical Reasoning Extraction
data_files:
- split: train
path: Numerical Reasoning Extraction/train-*
- split: test
path: Numerical Reasoning Extraction/test-*
---
|
caldervf/cicero_raw_dataset | ---
dataset_info:
features:
- name: _id
dtype: string
- name: title
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 7984313
num_examples: 1143
download_size: 0
dataset_size: 7984313
---
# Dataset Card for "cicero_raw_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mrfakename/ipa-phonemes-word-pairs | ---
license: cc-by-sa-4.0
language:
- en
pretty_name: Words + IPA Phoneme
---
* license: cc-by-sa 4.0
* size: \~275k pairs, \~7mb (\~4mb parquet)
* generated using: phonemizer/espeak
check out [openphonemizer](https://github.com/NeuralVox/OpenPhonemizer) for more details! |
une/uneune_image1 | ---
license: cc-by-4.0
---
# Dataset Card for uneune_image1
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
今まで私が描いたイラスト100枚のデータセットです。
512×512にトリミングしてあります。
さっくりとstableDiffusionでの学習用に使えるデータセットが欲しかったので作りました。
This is a dataset of 100 illustrations I have drawn so far.
They are cropped to 512x512.
I wanted a dataset that could easily be used for training with Stable Diffusion, so I made this one. |
Jayseon/kfood_demo | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1297086.0
num_examples: 20
download_size: 1298218
dataset_size: 1297086.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
BXYMartin/OpenHearthstone | ---
license: gpl-3.0
language:
- en
tags:
- hearthstone
pretty_name: v
size_categories:
- 1K<n<10K
---
This dataset was collected as an initial proof of concept for the OpenHearthstone data collection pipeline.
The data was collected in PvE mode, with actions guided by Silverfish.
The dataset contains 57 games; the mean action count per game is ~30 and the win rate is around 60%. |
Falah/Alzheimer_MRI | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Mild_Demented
'1': Moderate_Demented
'2': Non_Demented
'3': Very_Mild_Demented
splits:
- name: train
num_bytes: 22560791.2
num_examples: 5120
- name: test
num_bytes: 5637447.08
num_examples: 1280
download_size: 28289848
dataset_size: 28198238.28
license: apache-2.0
task_categories:
- image-classification
language:
- en
tags:
- medical
pretty_name: Alzheimer_MRI Disease Classification Dataset
size_categories:
- 1K<n<10K
---
# Alzheimer_MRI Disease Classification Dataset
The Falah/Alzheimer_MRI dataset is a resource for researchers and medical applications focused on classifying Alzheimer's disease from MRI scans. It consists of brain MRI images labeled into four categories:
- '0': Mild_Demented
- '1': Moderate_Demented
- '2': Non_Demented
- '3': Very_Mild_Demented
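For reference, the integer-to-name mapping above can be written out as a plain Python dict (a small illustrative sketch based on the class names listed in this card, not part of the dataset loader itself):

```python
# label ids as stored in the dataset -> human-readable class names (from the card)
id2label = {
    0: "Mild_Demented",
    1: "Moderate_Demented",
    2: "Non_Demented",
    3: "Very_Mild_Demented",
}
# inverse mapping, useful when converting predictions back to label ids
label2id = {name: i for i, name in id2label.items()}
print(label2id["Non_Demented"])  # -> 2
```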
## Dataset Information
- Train split:
- Name: train
- Number of bytes: 22,560,791.2
- Number of examples: 5,120
- Test split:
- Name: test
- Number of bytes: 5,637,447.08
- Number of examples: 1,280
- Download size: 28,289,848 bytes
- Dataset size: 28,198,238.28 bytes
## Citation
If you use this dataset in your research or health medicine applications, we kindly request that you cite the following publication:
```
@dataset{alzheimer_mri_dataset,
author = {Falah.G.Salieh},
title = {Alzheimer MRI Dataset},
year = {2023},
publisher = {Hugging Face},
version = {1.0},
url = {https://huggingface.co/datasets/Falah/Alzheimer_MRI}
}
```
## Usage Example
Here's an example of how to load the dataset using the Hugging Face library:
```python
from datasets import load_dataset
# Load the Falah/Alzheimer_MRI dataset
dataset = load_dataset('Falah/Alzheimer_MRI', split='train')
# Print the number of examples and the first few samples
print("Number of examples:", len(dataset))
print("Sample data:")
for example in dataset.select(range(5)):
    print(example)
``` |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-17000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 991917
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
shiertier/12T_danbooru | ---
license: mit
---
|
amandlek/mimicgen_datasets | ---
license: cc-by-nc-sa-4.0
---
# Dataset Card for MimicGen Datasets
## Dataset Summary
This repository contains the official release of datasets for the [CoRL 2023](https://www.corl2023.org/) paper "MimicGen: A Data Generation System for Scalable Robot Learning using Human Demonstrations".
The datasets contain over 48,000 task demonstrations across 12 tasks, grouped into the following categories:
- **source**: 120 human demonstrations across 12 tasks, used to automatically generate the other datasets
- **core**: 26,000 task demonstrations across 12 tasks (26 task variants)
- **object**: 2,000 task demonstrations on the Mug Cleanup task with different mugs
- **robot**: 16,000 task demonstrations across 4 different robot arms on 2 tasks (4 task variants)
- **large_interpolation**: 6,000 task demonstrations across 6 tasks that pose significant challenges for modern imitation learning methods
For more information please see the [website](https://mimicgen.github.io), the [paper](https://arxiv.org/abs/2310.17596), and the [code](https://github.com/NVlabs/mimicgen_environments).
## Dataset Structure
Each dataset is an hdf5 file that is readily compatible with [robomimic](https://robomimic.github.io/) --- the structure is explained [here](https://robomimic.github.io/docs/datasets/overview.html#dataset-structure).
As described in the paper, each task has a default reset distribution (D_0). Source human demonstrations (usually 10 demos) were collected on this distribution and MimicGen was subsequently used to generate large datasets (usually 1000 demos) across different task reset distributions (e.g. D_0, D_1, D_2), objects, and robots.
The datasets are split into different types:
- **source**: source human datasets used to generate all data -- this generally consists of 10 human demonstrations collected on the D_0 variant for each task.
- **core**: datasets generated with MimicGen for different task reset distributions. These correspond to the core set of results in Figure 4 of the paper.
- **object**: datasets generated with MimicGen for different objects. These correspond to the results in Appendix G of the paper.
- **robot**: datasets generated with MimicGen for different robots. These correspond to the results in Appendix F of the paper.
- **large_interpolation**: datasets generated with MimicGen using much larger interpolation segments. These correspond to the results in Appendix H in the paper.
**Note**: We found that the large_interpolation datasets pose a significant challenge for imitation learning, and have substantial room for improvement.
## Citation
Please cite the [MimicGen paper](https://arxiv.org/abs/2310.17596) if you use these datasets in your work:
```bibtex
@inproceedings{mandlekar2023mimicgen,
title={MimicGen: A Data Generation System for Scalable Robot Learning using Human Demonstrations},
author={Mandlekar, Ajay and Nasiriany, Soroush and Wen, Bowen and Akinola, Iretiayo and Narang, Yashraj and Fan, Linxi and Zhu, Yuke and Fox, Dieter},
booktitle={7th Annual Conference on Robot Learning},
year={2023}
}
``` |
davidfant/rapidapi-example-responses | ---
dataset_info:
features:
- name: id
dtype: string
- name: api_name
dtype: string
- name: api_description
dtype: string
- name: api_score
dtype: float64
- name: endpoint_name
dtype: string
- name: endpoint_description
dtype: string
- name: response_status_code
dtype: int64
- name: response_summary
dtype: string
- name: response_json
dtype: string
- name: response_json_schema
dtype: string
splits:
- name: train
num_bytes: 115936364
num_examples: 28059
download_size: 27933521
dataset_size: 115936364
---
# Dataset Card for "rapidapi-example-responses"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kgr123/quality_counter_1000_4_buckets | ---
dataset_info:
features:
- name: context
dtype: string
- name: word
dtype: string
- name: claim
dtype: string
- name: label
dtype: int64
splits:
- name: test
num_bytes: 5846463
num_examples: 1929
- name: train
num_bytes: 5805342
num_examples: 1935
- name: validation
num_bytes: 5881218
num_examples: 1941
download_size: 4199180
dataset_size: 17533023
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
sfurkan/Kanun-Yonetmelik-Tuzuk | ---
license: apache-2.0
---
|
wiki_bio | ---
annotations_creators:
- found
language_creators:
- found
language:
- en
license:
- cc-by-sa-3.0
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- table-to-text
task_ids: []
paperswithcode_id: wikibio
pretty_name: WikiBio
dataset_info:
features:
- name: input_text
struct:
- name: table
sequence:
- name: column_header
dtype: string
- name: row_number
dtype: int16
- name: content
dtype: string
- name: context
dtype: string
- name: target_text
dtype: string
splits:
- name: train
num_bytes: 619269257
num_examples: 582659
- name: test
num_bytes: 77264695
num_examples: 72831
- name: val
num_bytes: 77335069
num_examples: 72831
download_size: 333998704
dataset_size: 773869021
---
# Dataset Card for WikiBio
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Repository:** https://github.com/DavidGrangier/wikipedia-biography-dataset
- **Paper:** https://arxiv.org/pdf/1603.07771.pdf
- **GitHub:** https://github.com/DavidGrangier/wikipedia-biography-dataset
### Dataset Summary
This dataset contains 728,321 biographies extracted from Wikipedia, each consisting of the first paragraph of the biography and the tabular infobox.
### Supported Tasks and Leaderboards
The main purpose of this dataset is developing text generation models.
### Languages
English.
## Dataset Structure
### Data Instances
More Information Needed
### Data Fields
The structure of a single sample is the following:
```json
{
"input_text":{
"context":"pope michael iii of alexandria\n",
"table":{
"column_header":[
"type",
"ended",
"death_date",
"title",
"enthroned",
"name",
"buried",
"religion",
"predecessor",
"nationality",
"article_title",
"feast_day",
"birth_place",
"residence",
"successor"
],
"content":[
"pope",
"16 march 907",
"16 march 907",
"56th of st. mark pope of alexandria & patriarch of the see",
"25 april 880",
"michael iii of alexandria",
"monastery of saint macarius the great",
"coptic orthodox christian",
"shenouda i",
"egyptian",
"pope michael iii of alexandria\n",
"16 -rrb- march -lrb- 20 baramhat in the coptic calendar",
"egypt",
"saint mark 's church",
"gabriel i"
],
"row_number":[1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
}
},
"target_text":"pope michael iii of alexandria -lrb- also known as khail iii -rrb- was the coptic pope of alexandria and patriarch of the see of st. mark -lrb- 880 -- 907 -rrb- .\nin 882 , the governor of egypt , ahmad ibn tulun , forced khail to pay heavy contributions , forcing him to sell a church and some attached properties to the local jewish community .\nthis building was at one time believed to have later become the site of the cairo geniza .\n"
}
```
where the `"table"` field stores all the information of the Wikipedia infobox (the infobox headers in `"column_header"` and the corresponding values in `"content"`).
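Since `"column_header"` and `"content"` are parallel lists, they can be zipped into a header-to-value mapping. A small illustrative helper (not part of the dataset loader), using abbreviated fields from the sample above:

```python
def infobox_to_dict(table):
    """Zip the parallel column_header/content lists of a WikiBio sample."""
    return dict(zip(table["column_header"], table["content"]))

# abbreviated fields from the sample shown above
table = {
    "column_header": ["type", "name", "nationality"],
    "content": ["pope", "michael iii of alexandria", "egyptian"],
}
print(infobox_to_dict(table)["name"])  # -> michael iii of alexandria
```

Note that multi-row infoboxes repeat headers across rows (tracked via `row_number`), so in this simple dict later entries would overwrite earlier ones; group by `row_number` first if that matters.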
### Data Splits
- Train: 582,659 samples.
- Test: 72,831 samples.
- Validation: 72,831 samples.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
This dataset was announced in the paper <em>Neural Text Generation from Structured Data with Application to the Biography Domain</em> [(arxiv link)](https://arxiv.org/pdf/1603.07771.pdf) and is stored in [this](https://github.com/DavidGrangier/wikipedia-biography-dataset) repo (owned by DavidGrangier).
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
This dataset is distributed under the Creative Commons CC BY-SA 3.0 License.
### Citation Information
To cite the original paper in BibTeX format:
```
@article{DBLP:journals/corr/LebretGA16,
author = {R{\'{e}}mi Lebret and
David Grangier and
Michael Auli},
title = {Generating Text from Structured Data with Application to the Biography
Domain},
journal = {CoRR},
volume = {abs/1603.07771},
year = {2016},
url = {http://arxiv.org/abs/1603.07771},
archivePrefix = {arXiv},
eprint = {1603.07771},
timestamp = {Mon, 13 Aug 2018 16:48:30 +0200},
biburl = {https://dblp.org/rec/journals/corr/LebretGA16.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
### Contributions
Thanks to [@alejandrocros](https://github.com/alejandrocros) for adding this dataset. |
textvqa | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: TextVQA
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- visual-question-answering
task_ids:
- visual-question-answering
dataset_info:
- config_name: train
features:
- name: image_id
dtype: string
- name: question_id
dtype: int32
- name: question
dtype: string
- name: question_tokens
sequence: string
- name: image
dtype: image
- name: image_width
dtype: int32
- name: image_height
dtype: int32
- name: flickr_original_url
dtype: string
- name: flickr_300k_url
dtype: string
- name: answers
sequence: string
- name: image_classes
sequence: string
- name: set_name
dtype: string
splits:
- name: train
num_bytes: 21381310
num_examples: 34602
- name: validation
num_bytes: 3077854
num_examples: 5000
- name: test
num_bytes: 3025046
num_examples: 5734
download_size: 8070116310
dataset_size: 27484210
- config_name: val
features:
- name: image_id
dtype: string
- name: question_id
dtype: int32
- name: question
dtype: string
- name: question_tokens
sequence: string
- name: image
dtype: image
- name: image_width
dtype: int32
- name: image_height
dtype: int32
- name: flickr_original_url
dtype: string
- name: flickr_300k_url
dtype: string
- name: answers
sequence: string
- name: image_classes
sequence: string
- name: set_name
dtype: string
splits:
- name: train
num_bytes: 21381310
num_examples: 34602
- name: validation
num_bytes: 3077854
num_examples: 5000
- name: test
num_bytes: 3025046
num_examples: 5734
download_size: 8070116310
dataset_size: 27484210
- config_name: test
features:
- name: image_id
dtype: string
- name: question_id
dtype: int32
- name: question
dtype: string
- name: question_tokens
sequence: string
- name: image
dtype: image
- name: image_width
dtype: int32
- name: image_height
dtype: int32
- name: flickr_original_url
dtype: string
- name: flickr_300k_url
dtype: string
- name: answers
sequence: string
- name: image_classes
sequence: string
- name: set_name
dtype: string
splits:
- name: train
num_bytes: 21381310
num_examples: 34602
- name: validation
num_bytes: 3077854
num_examples: 5000
- name: test
num_bytes: 3025046
num_examples: 5734
download_size: 8070116310
dataset_size: 27484210
- config_name: textvqa
features:
- name: image_id
dtype: string
- name: question_id
dtype: int32
- name: question
dtype: string
- name: question_tokens
sequence: string
- name: image
dtype: image
- name: image_width
dtype: int32
- name: image_height
dtype: int32
- name: flickr_original_url
dtype: string
- name: flickr_300k_url
dtype: string
- name: answers
sequence: string
- name: image_classes
sequence: string
- name: set_name
dtype: string
splits:
- name: train
num_bytes: 22073350
num_examples: 34602
- name: validation
num_bytes: 3177854
num_examples: 5000
- name: test
num_bytes: 3139726
num_examples: 5734
download_size: 8070116310
dataset_size: 28390930
---
# Dataset Card for TextVQA
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
  - [Data Fields](#data-fields)
  - [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://textvqa.org
- **Repository:** https://github.com/facebookresearch/mmf
- **Paper:** https://arxiv.org/abs/1904.08920
- **Leaderboard:** https://eval.ai/web/challenges/challenge-page/874/overview
- **Point of Contact:** mailto:amanpreet@nyu.edu
### Dataset Summary
TextVQA requires models to read and reason about text in images to answer questions about them.
Specifically, models need to incorporate a new modality of text present in the images and reason
over it to answer TextVQA questions. The TextVQA dataset contains 45,336 questions over 28,408 images
from the OpenImages dataset. The dataset uses [VQA accuracy](https://visualqa.org/evaluation.html) metric for evaluation.
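The VQA accuracy metric scores a predicted answer as fully correct when at least three of the ten annotators gave it. A minimal pure-Python sketch of the commonly used simplification min(#matches / 3, 1) — the official metric additionally averages over subsets of annotators, and the function name here is illustrative:

```python
def vqa_accuracy(prediction, annotator_answers):
    """Consensus VQA accuracy (simplified): min(#matching annotators / 3, 1)."""
    matches = sum(1 for ans in annotator_answers if ans == prediction)
    return min(matches / 3.0, 1.0)

# 10 annotator answers, as in the sample instance shown below in this card
answers = ["simon clancy"] * 8 + ["simon ciancy", "the brand is bayard"]
print(vqa_accuracy("simon clancy", answers))  # -> 1.0
```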
### Supported Tasks and Leaderboards
- `visual-question-answering`: The dataset can be used for Visual Question Answering tasks where given an image, you have to answer a question based on the image. For the TextVQA dataset specifically, the questions require reading and reasoning about the scene text in the given image.
### Languages
The questions in the dataset are in English.
## Dataset Structure
### Data Instances
A typical sample contains the question in the `question` field, an image object in the `image` field, the OpenImages image id in `image_id`, and a lot of other useful metadata. Ten answers per question are stored in the `answers` attribute. For the test set, the `answers` field contains 10 empty strings, since the answers are not available.
An example looks like this:
```
{'question': 'who is this copyrighted by?',
'image_id': '00685bc495504d61',
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=384x512 at 0x276021C5EB8>,
'image_classes': ['Vehicle', 'Tower', 'Airplane', 'Aircraft'],
'flickr_original_url': 'https://farm2.staticflickr.com/5067/5620759429_4ea686e643_o.jpg',
'flickr_300k_url': 'https://c5.staticflickr.com/6/5067/5620759429_f43a649fb5_z.jpg',
'image_width': 786,
'image_height': 1024,
'answers': ['simon clancy',
'simon ciancy',
'simon clancy',
'simon clancy',
'the brand is bayard',
'simon clancy',
'simon clancy',
'simon clancy',
'simon clancy',
'simon clancy'],
'question_tokens': ['who', 'is', 'this', 'copyrighted', 'by'],
'question_id': 3,
'set_name': 'train'
},
```
### Data Fields
- `question`: string, the question that is being asked about the image
- `image_id`: string, id of the image which is same as the OpenImages id
- `image`: A `PIL.Image.Image` object containing the image about which the question is being asked. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`.
- `image_classes`: List[str], the OpenImages classes to which the image belongs.
- `flickr_original_url`: string, URL to original image on Flickr
- `flickr_300k_url`: string, URL to resized and low-resolution image on Flickr.
- `image_width`: int, Width of the original image.
- `image_height`: int, Height of the original image.
- `question_tokens`: List[str], a pre-tokenized version of the question.
- `answers`: List[str], a list of 10 human-annotated answers for the question, collected from 10 different users. For the test set, the list contains empty strings since the answers are not available.
- `question_id`: int, Unique id of the question.
- `set_name`: string, the set to which this question belongs.
### Data Splits
There are three splits. `train`, `validation` and `test`. The `train` and `validation` sets share images with OpenImages `train` set and have their answers available. For test set answers, we return a list of ten empty strings. To get inference results and numbers on `test` set, you need to go to the [EvalAI leaderboard](https://eval.ai/web/challenges/challenge-page/874/overview) and upload your predictions there. Please see instructions at [https://textvqa.org/challenge/](https://textvqa.org/challenge/).
## Dataset Creation
### Curation Rationale
From the paper:
> Studies have shown that a dominant class of questions asked by visually impaired users on images of their surroundings involves reading text in the image. But today’s VQA models can not read! Our paper takes a first step towards addressing this problem. First, we introduce a new “TextVQA” dataset to facilitate progress on this important problem. Existing datasets either have a small proportion of questions about text (e.g., the VQA dataset) or are too small (e.g., the VizWiz dataset). TextVQA contains 45,336 questions on 28,408 images that require reasoning about text to answer.
### Source Data
#### Initial Data Collection and Normalization
The initial images were sourced from [OpenImages](https://storage.googleapis.com/openimages/web/factsfigures_v4.html) v4 dataset. These were first filtered based on automatic heuristics using an OCR system where we only took images which had at least some text detected in them. See [annotation process](#annotation-process) section to understand the next stages.
#### Who are the source language producers?
English Crowdsource Annotators
### Annotations
#### Annotation process
After automatically filtering for images containing text, the images were manually verified by human annotators to make sure they did contain text. In the next stage, annotators were asked to write questions involving the scene text in the image. For some images, two questions were collected in this stage whenever possible. Finally, in the last stage, ten different human annotators answered the questions written in the previous stage.
#### Who are the annotators?
Annotators are from one of the major data collection platforms such as AMT. Exact details are not mentioned in the paper.
### Personal and Sensitive Information
The dataset has PII issues similar to OpenImages and can at times contain human faces, license plates, and documents. Using the provided `image_classes` data field is one option for trying to filter out some of this information.
## Considerations for Using the Data
### Social Impact of Dataset
The paper helped realize the importance of scene text recognition and reasoning in general purpose machine learning applications and has led to many follow-up works including [TextCaps](https://textvqa.org/textcaps) and [TextOCR](https://textvqa.org/textocr). Similar datasets were introduced over the time which specifically focus on sight-disabled users such as [VizWiz](https://vizwiz.org) or focusing specifically on the same problem as TextVQA like [STVQA](https://paperswithcode.com/dataset/st-vqa), [DocVQA](https://arxiv.org/abs/2007.00398v3) and [OCRVQA](https://ocr-vqa.github.io/). Currently, most methods train on combined dataset from TextVQA and STVQA to achieve state-of-the-art performance on both datasets.
### Discussion of Biases
Question-only bias where a model is able to answer the question without even looking at the image is discussed in the [paper](https://arxiv.org/abs/1904.08920) which was a major issue with original VQA dataset. The outlier bias in answers is prevented by collecting 10 different answers which are also taken in consideration by the evaluation metric.
### Other Known Limitations
- The dataset is English-only, but it does involve images with non-English Latin characters, so it can involve some multilingual understanding.
- The performance on the dataset is also dependent on the quality of OCR used as the OCR errors can directly lead to wrong answers.
- The metric used for calculating accuracy is the same as [VQA accuracy](https://visualqa.org/evaluation.html). This involves exact one-to-one matching with the given answers and thus gives no credit for near-miss answers caused by OCR errors.
## Additional Information
### Dataset Curators
- [Amanpreet Singh](https://github.com/apsdehal)
- Vivek Natarjan
- Meet Shah
- Yu Jiang
- Xinlei Chen
- Dhruv Batra
- Devi Parikh
- Marcus Rohrbach
### Licensing Information
CC by 4.0
### Citation Information
```bibtex
@inproceedings{singh2019towards,
title={Towards VQA Models That Can Read},
author={Singh, Amanpreet and Natarjan, Vivek and Shah, Meet and Jiang, Yu and Chen, Xinlei and Batra, Dhruv and Parikh, Devi and Rohrbach, Marcus},
booktitle={Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition},
pages={8317-8326},
year={2019}
}
```
### Contributions
Thanks to [@apsdehal](https://github.com/apsdehal) for adding this dataset. |
rwitz2/teapartyproblem | ---
license: mit
---
|
FaalSa/dataL | ---
dataset_info:
features:
- name: start
dtype: timestamp[s]
- name: target
sequence: float32
- name: item_id
dtype: string
- name: feat_static_cat
sequence: uint64
splits:
- name: train
num_bytes: 57629
num_examples: 1
- name: validation
num_bytes: 58109
num_examples: 1
- name: test
num_bytes: 58589
num_examples: 1
download_size: 14993
dataset_size: 174327
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
mohdumar/SPHERE_100M | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: sha
dtype: string
- name: raw
dtype: string
- name: vector
sequence: float64
splits:
- name: train
num_bytes: 700040913966
num_examples: 100000000
download_size: 299664412819
dataset_size: 700040913966
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
iohadrubin/nq_bm25_top100 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: answer
sequence: string
- name: qid
dtype: string
- name: ctxs
sequence: string
splits:
- name: train
num_bytes: 5029465701
num_examples: 79168
- name: validation
num_bytes: 556151568
num_examples: 8757
- name: test
num_bytes: 230146934
num_examples: 3610
download_size: 3270179648
dataset_size: 5815764203
---
# Dataset Card for "nq_bm25_top100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
generated with
```python
"""
python3.10 -m pip install pyserini==0.25.0
sudo apt install openjdk-11-jdk
cd /dev/shm
mkdir pyserini_cache
cd pyserini_cache
wget https://git.uwaterloo.ca/jimmylin/anserini-indexes/raw/master/index-wikipedia-dpr-20210120-d1b9e6.tar.gz
mkdir indexes
tar xvfz index-wikipedia-dpr-20210120-d1b9e6.tar.gz -C indexes
# rm index-wikipedia-dpr-20210120-d1b9e6.tar.gz
"""
from pyserini.search import LuceneSearcher
import json
import datasets
def decode_doc(doc):
return json.loads(doc.raw())["contents"]
def search_question(batch, searcher):
qids = batch["qid"]
hits = searcher.batch_search(batch["question"],qids,threads=300,k=100)
ctxs_per_doc = [[hit.docid for hit in hits[qid]] for qid in qids]
ctxs = sum(ctxs_per_doc,[])
doc_res = searcher.batch_doc(ctxs,threads=300)
docs_raw = [[decode_doc(doc_res[x]) for x in doc_hits] for doc_hits in ctxs_per_doc]
batch["ctxs"] = docs_raw
return batch
WIKI_INDEX_PATH = "/dev/shm/pyserini_cache/indexes/index-wikipedia-dpr-20210120-d1b9e6/"
def gen(split):
nq = datasets.load_dataset("iohadrubin/nq_closedbook", cache_dir="/dev/shm/datasets")
dataset = nq[split]
qid = list(map(str,range(len(dataset))))
dataset = dataset.add_column("qid",qid)
return dataset
def example_generator(split):
dataset = gen(split)
searcher = LuceneSearcher(WIKI_INDEX_PATH)
itr_dataset = dataset.to_iterable_dataset()
mapped_itr_dataset = itr_dataset.map(search_question,
batch_size=50,
batched=True,
fn_kwargs={"searcher":searcher},
)
yield from iter(mapped_itr_dataset)
import fire
import os
# python3.10 -m EasyLM.nq_data generate_nq
def generate_nq():
searcher = LuceneSearcher(WIKI_INDEX_PATH)
dataset_dict = {}
for split in [
"train",
"validation","test",
]:
dataset = gen(split)
dataset_dict[split] = dataset.map(search_question,
batch_size=300,
batched=True,
fn_kwargs={"searcher":searcher},
cache_file_name=f"/dev/shm/datasets/nq_bm25_top100_{split}.arrow"
)
dataset_dict= datasets.DatasetDict(dataset_dict)
dataset_dict.push_to_hub("nq_bm25_top100",token=os.environ["HF_TOKEN"])
``` |
zolak/twitter_dataset_80_1713109057 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 3080992
num_examples: 7805
download_size: 1544042
dataset_size: 3080992
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|