datasetId | card |
|---|---|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/0e33ea6d | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 190
num_examples: 10
download_size: 1329
dataset_size: 190
---
# Dataset Card for "0e33ea6d"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MatsuoDochiai/Jack | ---
license: openrail
---
|
linhtran92/random | ---
dataset_info:
features:
- name: id
dtype: string
- name: sentence
dtype: string
- name: intent
dtype: string
- name: sentence_annotation
dtype: string
- name: entities
list:
- name: type
dtype: string
- name: filler
dtype: string
- name: file
dtype: string
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: origin_transcription
dtype: string
- name: sentence_norm
dtype: string
splits:
- name: train
num_bytes: 1085064441.2989166
num_examples: 2094
download_size: 260034262
dataset_size: 1085064441.2989166
---
# Dataset Card for "random"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
arbml/Satirical_Fake_News | ---
dataset_info:
features:
- name: Text
dtype: string
splits:
- name: train
num_bytes: 6131349
num_examples: 3221
download_size: 3223892
dataset_size: 6131349
---
# Dataset Card for "Satirical_Fake_News"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dhuynh95/Fibo2 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 139900
num_examples: 100
download_size: 10506
dataset_size: 139900
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
BBang22/customllama2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1491058
num_examples: 644
download_size: 544960
dataset_size: 1491058
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Mutugi/housing | ---
license: apache-2.0
---
|
MLRS/mapa_maltese | ---
license: cc-by-4.0
task_categories:
- token-classification
task_ids:
- named-entity-recognition
language:
- mt
pretty_name: MAPA Maltese
size_categories:
- 1K<n<10K
---
# MAPA Maltese
Named-Entity Recognition dataset from the [MAPA Project](https://mapa-project.eu/).
This dataset has some fixes as detailed in [Cross-Lingual Transfer from Related Languages: Treating Low-Resource Maltese as Multilingual Code-Switching](https://aclanthology.org/2024.eacl-long.61):
- Manually fixed some inconsistencies between Level 1 & Level 2 tags.
- Manually added the labels for some spans which were marked as entity spans but didn't have the tags.
- Manually fixed incorrectly marked spans with respect to tokenisation (either having a sub-word marked as an entity span, or having part of a previous word marked as an entity span; in both cases the whole word should've been marked as a span).
- Re-tokenised the dataset using the [MLRS Tokeniser](https://mlrs.research.um.edu.mt/), mainly so as not to split off `-` & `'` characters as separate tokens as done by the [official converter](https://gitlab.com/MAPA-EU-Project/mapa_project/-/blob/master/documentation/detection_training.md#converting-inception-tsv-files-to-jsonlines), since these are linguistically important characters in Maltese.
While doing so, any tokens that the tokeniser did not split but which contained multiple entity sub-spans were also split into separate tokens.
Lastly, all tokens ending with `-`/`'` were checked to ensure that these weren't miscellaneous characters (e.g. for number ranges or quotation marks), in which case they were manually split into separate tokens.
For `EurLex` documents, the same training/validation/testing splits from [joelniklaus/mapa](https://huggingface.co/datasets/joelniklaus/mapa) are kept.
Documents from the other domains are split in similar ratios.
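The dataset can be loaded with the Hugging Face `datasets` library. A minimal loading sketch (the split and column names below are assumptions; inspect the loaded dataset to confirm the actual schema):
```python
from datasets import load_dataset

# Minimal loading sketch; the split and column names are assumptions,
# so check the loaded dataset and its features to confirm the actual schema.
dataset = load_dataset("MLRS/mapa_maltese")

print(dataset)                 # lists the available splits and features
example = dataset["train"][0]  # assumes a "train" split exists
print(example)                 # inspect the tokens and Level 1 / Level 2 tags
```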
## Citations
If you used this dataset, please cite these works:
- The original dataset:
```bibtex
@inproceedings{gianola-2020-mapa,
author = {Lucie Gianola and Ēriks Ajausks and Victoria Arranz and Chomicha Bendahman and Laurent Bié and Claudia Borg and Aleix Cerdà and Khalid Choukri and Montse Cuadros and Ona de Gibert and Hans Degroote and Elena Edelman and Thierry Etchegoyhen and Ángela Franco Torres and Mercedes García Hernandez and Aitor García Pablos and Albert Gatt and Cyril Grouin and Manuel Herranz and Alejandro Adolfo Kohan and Thomas Lavergne and Maite Melero and Patrick Paroubek and Mickaël Rigault and Mike Rosner and Roberts Rozis and Lonneke van der Plas and Rinalds Vīksna and Pierre Zweigenbaum},
title = {Automatic Removal of Identifying Information in Official EU Languages for Public Administrations: The {MAPA} Project},
booktitle = {Proceedings of the 33rd International Conference on Legal Knowledge and Information Systems ({JURIX'20})},
pages = {223--226},
year = {2020},
publisher = {IOS Press},
url = {https://ebooks.iospress.nl/volumearticle/56182},
doi = {10.3233/FAIA200869},
}
```
- The fixes & training/validation/testing splits:
```bibtex
@misc{micallef-etal-2024-maltese-etymology,
title = "Cross-Lingual Transfer from Related Languages: Treating Low-Resource {M}altese as Multilingual Code-Switching",
author = "Micallef, Kurt and
Habash, Nizar and
Borg, Claudia and
Eryani, Fadhl and
Bouamor, Houda",
editor = "Graham, Yvette and
Purver, Matthew",
booktitle = "Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics (Volume 1: Long Papers)",
month = mar,
year = "2024",
address = "St. Julian{'}s, Malta",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2024.eacl-long.61",
pages = "1014--1025",
}
```
|
duraad/nep-spell-3k | ---
license: mit
---
|
akshaysaju9660/llama_tests | ---
license: llama2
---
|
huggingface/autotrain-data-yrsq-dnj7-jghjk3 | Invalid username or password. |
qbourbon/pb_trainset-2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': 000_airplane
'1': 001_alarm_clock
'2': 002_angel
'3': 003_ant
'4': 004_apple
'5': 005_arm
'6': 006_armchair
'7': 007_ashtray
'8': 008_axe
'9': 009_backpack
'10': 010_banana
'11': 011_barn
'12': 012_baseball_bat
'13': 013_basket
'14': 014_bathtub
'15': 015_bear_(animal)
'16': 016_bed
'17': 017_bee
'18': 018_beer-mug
'19': 019_bell
'20': 020_bench
'21': 021_bicycle
'22': 022_binoculars
'23': 023_blimp
'24': 024_book
'25': 025_bookshelf
'26': 026_boomerang
'27': 027_bottle_opener
'28': 028_bowl
'29': 029_brain
'30': 030_bread
'31': 031_bridge
'32': 032_bulldozer
'33': 033_bus
'34': 034_bush
'35': 035_butterfly
'36': 036_cabinet
'37': 037_cactus
'38': 038_cake
'39': 039_calculator
'40': 040_camel
'41': 041_camera
'42': 042_candle
'43': 043_cannon
'44': 044_canoe
'45': 045_car_(sedan)
'46': 046_carrot
'47': 047_castle
'48': 048_cat
'49': 049_cell_phone
'50': 050_chair
'51': 051_chandelier
'52': 052_church
'53': 053_cigarette
'54': 054_cloud
'55': 055_comb
'56': 056_computer_monitor
'57': 057_computer-mouse
'58': 058_couch
'59': 059_cow
'60': 060_crab
'61': 061_crane_(machine)
'62': 062_crocodile
'63': 063_crown
'64': 064_cup
'65': 065_diamond
'66': 066_dog
'67': 067_dolphin
'68': 068_donut
'69': 069_door
'70': 070_door_handle
'71': 071_dragon
'72': 072_duck
'73': 073_ear
'74': 074_elephant
'75': 075_envelope
'76': 076_eye
'77': 077_eyeglasses
'78': 078_face
'79': 079_fan
'80': 080_feather
'81': 081_fire_hydrant
'82': 082_fish
'83': 083_flashlight
'84': 084_floor_lamp
'85': 085_flower_with_stem
'86': 086_flying_bird
'87': 087_flying_saucer
'88': 088_foot
'89': 089_fork
'90': 090_frog
'91': 091_frying-pan
'92': 092_giraffe
'93': 093_grapes
'94': 094_grenade
'95': 095_guitar
'96': 096_hamburger
'97': 097_hammer
'98': 098_hand
'99': 099_harp
'100': 100_hat
'101': 101_head
'102': 102_head-phones
'103': 103_hedgehog
'104': 104_helicopter
'105': 105_helmet
'106': 106_horse
'107': 107_hot_air_balloon
'108': 108_hot-dog
'109': 109_hourglass
'110': 110_house
'111': 111_human-skeleton
'112': 112_ice-cream-cone
'113': 113_ipod
'114': 114_kangaroo
'115': 115_key
'116': 116_keyboard
'117': 117_knife
'118': 118_ladder
'119': 119_laptop
'120': 120_leaf
'121': 121_lightbulb
'122': 122_lighter
'123': 123_lion
'124': 124_lobster
'125': 125_loudspeaker
'126': 126_mailbox
'127': 127_megaphone
'128': 128_mermaid
'129': 129_microphone
'130': 130_microscope
'131': 131_monkey
'132': 132_moon
'133': 133_mosquito
'134': 134_motorbike
'135': 135_mouse_(animal)
'136': 136_mouth
'137': 137_mug
'138': 138_mushroom
'139': 139_nose
'140': 140_octopus
'141': 141_owl
'142': 142_palm_tree
'143': 143_panda
'144': 144_paper_clip
'145': 145_parachute
'146': 146_parking_meter
'147': 147_parrot
'148': 148_pear
'149': 149_pen
'150': 150_penguin
'151': 151_person_sitting
'152': 152_person_walking
'153': 153_piano
'154': 154_pickup_truck
'155': 155_pig
'156': 156_pigeon
'157': 157_pineapple
'158': 158_pipe_(for_smoking)
'159': 159_pizza
'160': 160_potted_plant
'161': 161_power_outlet
'162': 162_present
'163': 163_pretzel
'164': 164_pumpkin
'165': 165_purse
'166': 166_rabbit
'167': 167_race_car
'168': 168_radio
'169': 169_rainbow
'170': 170_revolver
'171': 171_rifle
'172': 172_rollerblades
'173': 173_rooster
'174': 174_sailboat
'175': 175_santa_claus
'176': 176_satellite
'177': 177_satellite_dish
'178': 178_saxophone
'179': 179_scissors
'180': 180_scorpion
'181': 181_screwdriver
'182': 182_sea_turtle
'183': 183_seagull
'184': 184_shark
'185': 185_sheep
'186': 186_ship
'187': 187_shoe
'188': 188_shovel
'189': 189_skateboard
'190': 190_skull
'191': 191_skyscraper
'192': 192_snail
'193': 193_snake
'194': 194_snowboard
'195': 195_snowman
'196': 196_socks
'197': 197_space_shuttle
'198': 198_speed-boat
'199': 199_spider
'200': 200_sponge_bob
'201': 201_spoon
'202': 202_squirrel
'203': 203_standing_bird
'204': 204_stapler
'205': 205_strawberry
'206': 206_streetlight
'207': 207_submarine
'208': 208_suitcase
'209': 209_sun
'210': 210_suv
'211': 211_swan
'212': 212_sword
'213': 213_syringe
'214': 214_t-shirt
'215': 215_table
'216': 216_tablelamp
'217': 217_teacup
'218': 218_teapot
'219': 219_teddy-bear
'220': 220_telephone
'221': 221_tennis-racket
'222': 222_tent
'223': 223_tiger
'224': 224_tire
'225': 225_toilet
'226': 226_tomato
'227': 227_tooth
'228': 228_toothbrush
'229': 229_tractor
'230': 230_traffic_light
'231': 231_train
'232': 232_tree
'233': 233_trombone
'234': 234_trousers
'235': 235_truck
'236': 236_trumpet
'237': 237_tv
'238': 238_umbrella
'239': 239_van
'240': 240_vase
'241': 241_violin
'242': 242_walkie_talkie
'243': 243_wheel
'244': 244_wheelbarrow
'245': 245_windmill
'246': 246_wine-bottle
'247': 247_wineglass
'248': 248_wrist-watch
'249': 249_zebra
'250': mistery_category
splits:
- name: train
num_bytes: 151506666.84822693
num_examples: 5136
download_size: 148171712
dataset_size: 151506666.84822693
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
YufeiHFUT/BioRED_llama_modify | ---
dataset_info:
features:
- name: data
dtype: string
splits:
- name: train
num_bytes: 13500564
num_examples: 3831
- name: validation
num_bytes: 4073661
num_examples: 1114
- name: test
num_bytes: 3568904
num_examples: 990
download_size: 2887435
dataset_size: 21143129
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
MikeTrizna/bee_specimens | ---
license: cc0-1.0
dataset_info:
features:
- name: occurrenceID
dtype: string
- name: catalogNumber
dtype: string
- name: recordedBy
dtype: string
- name: year
dtype: int64
- name: month
dtype: int64
- name: day
dtype: int64
- name: country
dtype: string
- name: stateProvince
dtype: string
- name: county
dtype: string
- name: locality
dtype: string
- name: decimalLatitude
dtype: float64
- name: decimalLongitude
dtype: float64
- name: identifiedBy
dtype: string
- name: scientificName
dtype: string
- name: genus
dtype: string
- name: subgenus
dtype: string
- name: specificEpithet
dtype: string
- name: infraspecificEpithet
dtype: string
- name: scientificNameAuthorship
dtype: string
- name: PixelXDimension
dtype: float64
- name: PixelYDimension
dtype: float64
- name: accessURI
dtype: string
splits:
- name: train
num_bytes: 26732760
num_examples: 73387
download_size: 7117791
dataset_size: 26732760
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for Bee_Specimens
## Dataset Summary
The USNM Bumblebee Dataset is a natural history dataset containing, for each of 73,497 Bumblebee specimens in the family Apidae, a single image in lateral or dorsal view, together with a tab-separated values file of occurrence data. Occurrence data includes the species classification, the date and site/location of collection, and other metadata conforming to the Darwin Core data standard (https://dwc.tdwg.org). 11,421 specimens are not identified to species; these specimens are included as 'Bombus sp.' or 'Xylocopa sp.'. The collecting sites/locations of the majority of specimens (55,301) have been georeferenced. The dataset is worldwide in scope, but is limited to the specimens available in the Smithsonian USNM collection.
## Languages
English
## Data Instances
A typical data point comprises the specimen metadata and image information for a single bumblebee specimen.
An example from the dataset looks as follows:
```json
{
  "occurrenceID": "http://n2t.net/ark:/65665/30042e2d8-669d-4520-b456-e3c64203eff8",
  "catalogNumber": "USNMENT01732649",
  "recordedBy": "R. Craig",
  "year": "1949",
  "month": "4",
  "day": "13",
  "country": "United States",
  "stateProvince": "California",
  "county": "Fresno",
  "locality": "Auberry",
  "decimalLatitude": "37.0808",
  "decimalLongitude": "-119.485",
  "identifiedBy": "O'Brien, L. R.",
  "scientificName": "Xylocopa (Notoxylocopa) tabaniformis orpifex",
  "genus": "Xylocopa",
  "subgenus": "Notoxylocopa",
  "specificEpithet": "tabaniformis",
  "infraspecificEpithet": "orpifex",
  "scientificNameAuthorship": "Smith",
  "accessURI": "https://ids.si.edu/ids/deliveryService?id=NMNH-USNMENT01732649",
  "PixelXDimension": 2000,
  "PixelYDimension": 1212
}
```
## Data Fields
Specimen metadata fields conform to the Darwin Core data standard and are detailed here: https://dwc.tdwg.org. Image metadata fields conform to the Audiovisual Core data standard and are detailed here: https://ac.tdwg.org/.
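Because the occurrence fields above are flat columns, the dataset is convenient to explore with pandas. A minimal sketch (assuming the standard Hugging Face `datasets` loading path; the queries shown are purely illustrative):
```python
from datasets import load_dataset

# Load the single "train" split declared in the dataset metadata.
ds = load_dataset("MikeTrizna/bee_specimens", split="train")

# Convert the flat Darwin Core columns to a pandas DataFrame for exploration.
df = ds.to_pandas()

# Illustrative queries over the fields documented above.
print(df["genus"].value_counts().head())  # specimens per genus
with_coords = df.dropna(subset=["decimalLatitude", "decimalLongitude"])
print(f"{len(with_coords)} specimens have georeferenced coordinates")
```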
## Curation Rationale
The dataset represents a portion of the U.S. National Entomological Collection. The U.S. National Entomological Collection (USNM) traces its origins in part to the acquisition of the U.S. Department of Agriculture Collection of 138,000 specimens donated in 1885. These specimens became the foundation of one of the world’s largest and most important accessible entomological collections, with over 33 million specimens cared for by the combined staff of three government agencies: the Smithsonian Institution; the Systematic Entomology Laboratory (Agricultural Research Service, United States Department of Agriculture); and the Walter Reed Biosystematics Unit (Walter Reed Army Institute of Research). The specimens were imaged in a mass-digitization project in collaboration with the Digitization Program Office. The goal was to digitize every Bombus specimen in the collection.
## Initial Data Collection and Normalization
Bumblebee specimens were collected over a period of 150 years (earliest specimen dates from 1807, most recent specimen dates from 2020). The specimens were collected and identified by many different individual researchers over this time. The initial images of about 49,000 specimens were taken in a rapid-capture project by a dedicated team in 2014, with additional specimen images (about 25,000) taken in 2018. The labels containing the information on site/location, date of collection, collector, and identifier were removed from the insect pin. The occurrence data were transcribed from the labels by online volunteers and a professional transcription service into Darwin Core fields. Following quality control of the transcribed data by NMNH staff, the data were imported into the institutional database (EMu).
NMNH specimen data are exported to the Global Biodiversity Information Facility (GBIF) on a weekly basis through an installation of the Integrated Publishing Toolkit (IPT, https://collections.nmnh.si.edu/ipt/). Some data transformation takes place within EMu, and GBIF likewise normalizes the data to meet its standards.
## Who are the source language producers?
The occurrence data were produced by humans over the museum’s history: they were observed and written onto paper labels pinned with the specimens at the time of collection, and later transcribed from those labels.
## Annotations
The annotations consist of the specimen occurrence data in Darwin Core fields.
## Annotation process
The occurrence data were transcribed from the labels by online volunteers and a professional transcription service into Darwin Core fields.
## Who are the annotators?
Original collectors and identifiers were entomologists and researchers from the Smithsonian and other institutions; collectors may not have been bumblebee specialists. Data transcription was performed by online volunteers and professional transcription service workers. Demographic data on the transcribers is unknown.
## Personal and Sensitive Information
The dataset contains the names of the collectors and identifiers.
## Social Impact of Dataset
Digitized natural history collections have the potential to be used in diverse research applications in evolutionary biology, ecology, and climate change.
The dataset contains records for species listed on the U.S. Endangered Species List: Bombus affinis, Bombus franklini, and Bombus terricola.
Some site/location names could cause harm as they are insensitive or racist towards indigenous communities.
## Discussion of Biases
Estimates of species geographic ranges based on these data may not be complete. There are many reasons collectors may collect more frequently from some areas rather than others, including their own taxonomic interests, proximity to collections institutions, accessibility via roads, ability to acquire permits for a specific area, or for geopolitical reasons.
The majority of specimens in this dataset originate from North America.
Most specimens are expected to be female, because bumblebees are social insects and it is more common to find female bees.
## Other Known Limitations
As with all natural history collections data, there is the potential that some metadata are inaccurate or inconsistent given that they have been collected and recorded over the course of the past 150 years. Smithsonian staff seek to correct these errors as they are identified but the dataset as presented is a snapshot in time.
Species identifications may be inaccurate or not up-to-date based on the latest classification.
Collector names may not be consistent across records (e.g. the same person’s name may be written differently). For women’s names, which were often historically recorded as Mrs. <spouse’s name>, only the spouse’s name may appear.
Locality data may use historical place names that are no longer used.
Dates may sometimes have been recorded by original collectors inconsistently or may be incomplete (no month/day information).
For specimens collected from Brazil, specimen images are not included in the dataset.
For endangered species, locality data is not included in the dataset.
## Dataset Curators
Smithsonian National Museum of Natural History, Department of Entomology.
Jessica Bird (Data Manager in the Department of Entomology) is the main contact person for the dataset.
## Licensing Information
Public domain, Creative Commons CC0.
## Citation Information
Orrell T, Informatics Office (2023). NMNH Extant Specimen Records (USNM, US). Version 1.72. National Museum of Natural History, Smithsonian Institution. Occurrence dataset. https://collections.nmnh.si.edu/ipt/resource?r=nmnh_extant_dwc-a&v=1.72
## Contributions
Thanks to NMNH for adding this dataset. |
TagsTest2024/tiny_llava_20240227183328 | ---
dataset_info:
features:
- name: URL
dtype: string
- name: TEXT
dtype: string
- name: tiny_llava_20240227183328
dtype: string
splits:
- name: ase6.5_5000
num_bytes: 2671356
num_examples: 5000
download_size: 1410411
dataset_size: 2671356
configs:
- config_name: default
data_files:
- split: ase6.5_5000
path: data/ase6.5_5000-*
---
|
Papersnake/xi_talk | ---
license: cc0-1.0
---
# Database of Xi Jinping's Series of Important Speeches
Data current as of 2023.4.23 |
marvinmedeiros52/vozperdigao | ---
license: openrail
---
|
nlplabtdtu/ai_la_trieu_phu | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: ID
dtype: int64
- name: question
dtype: string
- name: answer
dtype: string
- name: solution
dtype: 'null'
- name: options
list:
- name: answer
dtype: string
- name: key
dtype: string
splits:
- name: train
num_bytes: 2425477
num_examples: 13630
download_size: 1180909
dataset_size: 2425477
---
# Dataset Card for "ai_la_trieu_phu"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
arnepeine/medspeech3 | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 2290519.0
num_examples: 24
download_size: 0
dataset_size: 2290519.0
---
# Dataset Card for "medspeech3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cmu-mlsp/prepared_encodec_first_layer_libri100 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 1283112810
num_examples: 57078
- name: validation
num_bytes: 73262048
num_examples: 5406
- name: test
num_bytes: 35793244
num_examples: 2620
download_size: 82659998
dataset_size: 1392168102
---
# Dataset Card for "prepared_encodec_first_layer_libri100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jylins/videoxum | ---
license: apache-2.0
task_categories:
- summarization
language:
- en
tags:
- cross-modal-video-summarization
- video-summarization
- video-captioning
pretty_name: VideoXum
size_categories:
- 10K<n<100K
---
# Dataset Card for VideoXum
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Splits](#data-splits)
- [Data Resources](#data-resources)
- [Data Fields](#data-fields)
- [Annotation Sample](#annotation-sample)
- [Citation](#citation)
## Dataset Description
- **Homepage:** https://videoxum.github.io/
- **Paper:** https://arxiv.org/abs/2303.12060
### Dataset Summary
The VideoXum dataset represents a novel task in the field of video summarization, extending the scope from single-modal to cross-modal video summarization. This new task focuses on creating video summaries that contain both visual and textual elements with semantic coherence. Built upon the foundation of ActivityNet Captions, VideoXum is a large-scale dataset comprising over 14,000 long-duration, open-domain videos. Each video is paired with 10 corresponding video summaries, amounting to a total of 140,000 video-text summary pairs.
### Languages
The textual summaries in the dataset are in English.
## Dataset Structure
### Data Splits
| |train |validation| test | Overall |
|-------------|------:|---------:|------:|--------:|
| # of videos | 8,000 | 2,001 | 4,000 | 14,001 |
### Data Resources
- `train_videoxum.json`: annotations of training set
- `val_videoxum.json`: annotations of validation set
- `test_videoxum.json`: annotations of test set
### Data Fields
- `video_id`: `str` a unique identifier for the video.
- `duration`: `float` total duration of the video in seconds.
- `sampled_frames`: `int` the number of frames sampled from the source video at 1 fps with a uniform sampling scheme.
- `timestamps`: `List_float` a list of timestamp pairs, with each pair representing the start and end times of a segment within the video.
- `tsum`: `List_str` each textual video summary provides a summary of the corresponding video segment as defined by the timestamps.
- `vsum`: `List_float` each visual video summary corresponds to key frames within each video segment as defined by the timestamps. The dimensions (3 x 10) indicate that each video segment was annotated by 10 different workers.
- `vsum_onehot`: `List_bool` one-hot matrix transformed from 'vsum'. The dimensions (10 x 83) denote the one-hot labels spanning the entire length of a video, as annotated by 10 workers.
### Annotation Sample
For each video, we hired workers to annotate ten shortened video summaries.
``` json
{
'video_id': 'v_QOlSCBRmfWY',
'duration': 82.73,
'sampled_frames': 83,
'timestamps': [[0.83, 19.86], [17.37, 60.81], [56.26, 79.42]],
'tsum': ['A young woman is seen standing in a room and leads into her dancing.',
'The girl dances around the room while the camera captures her movements.',
'She continues dancing around the room and ends by laying on the floor.'],
'vsum': [[[ 7.01, 12.37], ...],
[[41.05, 45.04], ...],
[[65.74, 69.28], ...]] (3 x 10 dim)
'vsum_onehot': [[[0,0,0,...,1,1,...], ...],
[[0,0,0,...,1,1,...], ...],
[[0,0,0,...,1,1,...], ...],] (10 x 83 dim)
}
```
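A minimal sketch of reading these annotations (it assumes each `*_videoxum.json` file is a plain JSON list of entries shaped like the sample above; the local file path is an assumption):
```python
import json

# Assumed local path to one of the annotation files listed above.
with open("val_videoxum.json", encoding="utf-8") as f:
    annotations = json.load(f)  # assumed: a list of per-video entries

for entry in annotations[:3]:
    video_id = entry["video_id"]
    # Each segment's [start, end] timestamps pair with one textual summary in `tsum`.
    for (start, end), summary in zip(entry["timestamps"], entry["tsum"]):
        print(f"{video_id} [{start:.2f}-{end:.2f}s]: {summary}")
```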
## Citation
```bibtex
@article{lin2023videoxum,
author = {Lin, Jingyang and Hua, Hang and Chen, Ming and Li, Yikang and Hsiao, Jenhao and Ho, Chiuman and Luo, Jiebo},
title = {VideoXum: Cross-modal Visual and Textural Summarization of Videos},
journal = {IEEE Transactions on Multimedia},
year = {2023},
}
```
|
metaeval/autotnli | ---
license: apache-2.0
language:
- en
task_ids:
- natural-language-inference
task_categories:
- text-classification
---
https://github.com/Dibyakanti/AutoTNLI-code
```bibtex
@inproceedings{kumar-etal-2022-autotnli,
title = "Realistic Data Augmentation Framework for Enhancing Tabular Reasoning",
author = "Kumar, Dibyakanti and
Gupta, Vivek and
Sharma, Soumya and
Zhang, Shuo",
booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
month = dec,
year = "2022",
address = "Online and Abu Dhabi",
publisher = "Association for Computational Linguistics",
url = "https://vgupta123.github.io/docs/autotnli.pdf",
pages = "",
abstract = "Existing approaches to constructing training data for Natural Language Inference (NLI) tasks, such as for semi-structured table reasoning, are either via crowdsourcing or fully automatic methods. However, the former is expensive and time-consuming and thus limits scale, and the latter often produces naive examples that may lack complex reasoning. This paper develops a realistic semi-automated framework for data augmentation for tabular inference. Instead of manually generating a hypothesis for each table, our methodology generates hypothesis templates transferable to similar tables. In addition, our framework entails the creation of rational counterfactual tables based on human written logical constraints and premise paraphrasing. For our case study, we use the InfoTabS (Gupta et al., 2020), which is an entity-centric tabular inference dataset. We observed that our framework could generate human-like tabular inference examples, which could benefit training data augmentation, especially in the scenario with limited supervision.",
}
``` |
autoevaluate/autoeval-staging-eval-project-squad_v2-2eb94bfa-11695556 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- squad_v2
eval_info:
task: extractive_question_answering
model: deepset/minilm-uncased-squad2
metrics: []
dataset_name: squad_v2
dataset_config: squad_v2
dataset_split: validation
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: deepset/minilm-uncased-squad2
* Dataset: squad_v2
* Config: squad_v2
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@ghpkishore](https://huggingface.co/ghpkishore) for evaluating this model. |
psandev/lotr-book | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 2196528.0
num_examples: 268
- name: test
num_bytes: 245880.0
num_examples: 30
download_size: 1126733
dataset_size: 2442408.0
---
# Dataset Card for "lotr-book"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ibranze/araproje_hellaswag_tr_w2 | ---
dataset_info:
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
splits:
- name: validation
num_bytes: 162855.76923076922
num_examples: 250
download_size: 88572
dataset_size: 162855.76923076922
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_hellaswag_tr_w2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
xiaojuan0920/CSKG | ---
license: openrail
task_categories:
- question-answering
language:
- en
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
harishvs/ecommerce-faq-llama2-chat | ---
language:
- en
license: apache-2.0
task_categories:
- question-answering
- text-generation
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 38858
num_examples: 158
download_size: 9384
dataset_size: 38858
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
morenofran/pilolinho | ---
license: openrail
---
|
MatrixStudio/Codeforces-Python-Submissions | ---
dataset_info:
features:
- name: contestId
dtype: int64
- name: index
dtype: string
- name: name
dtype: string
- name: type
dtype: string
- name: rating
dtype: int64
- name: tags
sequence: string
- name: title
dtype: string
- name: time-limit
dtype: string
- name: memory-limit
dtype: string
- name: problem-description
dtype: string
- name: input-specification
dtype: string
- name: output-specification
dtype: string
- name: demo-input
sequence: string
- name: demo-output
sequence: string
- name: note
dtype: string
- name: points
dtype: float64
- name: test_cases
list:
- name: input
dtype: string
- name: output
dtype: string
- name: creationTimeSeconds
dtype: int64
- name: relativeTimeSeconds
dtype: int64
- name: programmingLanguage
dtype: string
- name: verdict
dtype: string
- name: testset
dtype: string
- name: passedTestCount
dtype: int64
- name: timeConsumedMillis
dtype: int64
- name: memoryConsumedBytes
dtype: int64
- name: code
dtype: string
- name: prompt
dtype: string
- name: response
dtype: string
- name: score
dtype: float64
splits:
- name: train
num_bytes: 4233926740
num_examples: 621356
- name: test
num_bytes: 470125693
num_examples: 69040
download_size: 1663054241
dataset_size: 4704052433
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "Codeforces-Python-Submissions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lachine/melson7 | ---
license: gpl-3.0
---
|
jojofan/minguostyle | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 444193006.0
num_examples: 944
download_size: 444181518
dataset_size: 444193006.0
---
# Dataset Card for "minguostyle"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
giux78/70000-90000-ultrafeedback-ita | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train_sft
num_bytes: 147192537
num_examples: 20000
- name: test_sft
num_bytes: 154695659
num_examples: 23110
- name: train_gen
num_bytes: 1347396812
num_examples: 256032
- name: test_gen
num_bytes: 148276089
num_examples: 28304
download_size: 969780599
dataset_size: 1797561097
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
- split: test_sft
path: data/test_sft-*
- split: train_gen
path: data/train_gen-*
- split: test_gen
path: data/test_gen-*
---
|
open-llm-leaderboard/details_lgaalves__gpt2_guanaco-dolly-platypus | ---
pretty_name: Evaluation run of lgaalves/gpt2_guanaco-dolly-platypus
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lgaalves/gpt2_guanaco-dolly-platypus](https://huggingface.co/lgaalves/gpt2_guanaco-dolly-platypus)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lgaalves__gpt2_guanaco-dolly-platypus\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-15T17:11:56.219131](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__gpt2_guanaco-dolly-platypus/blob/main/results_2023-10-15T17-11-56.219131.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0026216442953020135,\n\
\ \"em_stderr\": 0.0005236685642965757,\n \"f1\": 0.04961304530201346,\n\
\ \"f1_stderr\": 0.001421455981669693,\n \"acc\": 0.2505919494869771,\n\
\ \"acc_stderr\": 0.007026223145264506\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0026216442953020135,\n \"em_stderr\": 0.0005236685642965757,\n\
\ \"f1\": 0.04961304530201346,\n \"f1_stderr\": 0.001421455981669693\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5011838989739542,\n\
\ \"acc_stderr\": 0.014052446290529012\n }\n}\n```"
repo_url: https://huggingface.co/lgaalves/gpt2_guanaco-dolly-platypus
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|arc:challenge|25_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_15T17_11_56.219131
path:
- '**/details_harness|drop|3_2023-10-15T17-11-56.219131.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-15T17-11-56.219131.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_15T17_11_56.219131
path:
- '**/details_harness|gsm8k|5_2023-10-15T17-11-56.219131.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-15T17-11-56.219131.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hellaswag|10_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T23:17:05.227048.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T23:17:05.227048.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T23:17:05.227048.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_15T17_11_56.219131
path:
- '**/details_harness|winogrande|5_2023-10-15T17-11-56.219131.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-15T17-11-56.219131.parquet'
- config_name: results
data_files:
- split: 2023_08_31T23_17_05.227048
path:
- results_2023-08-31T23:17:05.227048.parquet
- split: 2023_10_15T17_11_56.219131
path:
- results_2023-10-15T17-11-56.219131.parquet
- split: latest
path:
- results_2023-10-15T17-11-56.219131.parquet
---
# Dataset Card for Evaluation run of lgaalves/gpt2_guanaco-dolly-platypus
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lgaalves/gpt2_guanaco-dolly-platypus
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lgaalves/gpt2_guanaco-dolly-platypus](https://huggingface.co/lgaalves/gpt2_guanaco-dolly-platypus) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lgaalves__gpt2_guanaco-dolly-platypus",
"harness_winogrande_5",
split="train")
```
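The aggregated metrics live in the additional "results" configuration; a minimal loading sketch, assuming the split names listed in the configuration above ("latest" plus the timestamped runs):
```python
from datasets import load_dataset

# Load the aggregated results (the same data used for the leaderboard display).
# "latest" always points to the most recent run; a timestamped split name can be
# passed instead to pin a specific run.
results = load_dataset(
    "open-llm-leaderboard/details_lgaalves__gpt2_guanaco-dolly-platypus",
    "results",
    split="latest",
)
print(results[0])
```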
## Latest results
These are the [latest results from run 2023-10-15T17:11:56.219131](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__gpt2_guanaco-dolly-platypus/blob/main/results_2023-10-15T17-11-56.219131.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0026216442953020135,
"em_stderr": 0.0005236685642965757,
"f1": 0.04961304530201346,
"f1_stderr": 0.001421455981669693,
"acc": 0.2505919494869771,
"acc_stderr": 0.007026223145264506
},
"harness|drop|3": {
"em": 0.0026216442953020135,
"em_stderr": 0.0005236685642965757,
"f1": 0.04961304530201346,
"f1_stderr": 0.001421455981669693
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5011838989739542,
"acc_stderr": 0.014052446290529012
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
jjmachan/NSFW-questions-inter-cleaned_df | ---
dataset_info:
features:
- name: title
dtype: string
- name: subreddit
dtype: string
- name: post_id
dtype: string
- name: score
dtype: int64
- name: link_flair_text
dtype: string
- name: is_self
dtype: bool
- name: over_18
dtype: bool
- name: upvote_ratio
dtype: float64
- name: is_question
dtype: bool
- name: C1
dtype: string
- name: C2
dtype: string
- name: C3
dtype: string
- name: C4
dtype: string
- name: C5
dtype: string
splits:
- name: train
num_bytes: 1974116
num_examples: 12858
download_size: 885500
dataset_size: 1974116
---
# Dataset Card for "NSFW-questions-inter-cleaned_df"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Janiele/eduardo | ---
license: openrail
---
|
VoidZeroe/autonlp-data-second | ---
task_categories:
- conditional-text-generation
---
# AutoNLP Dataset for project: second
## Table of Contents
- [Dataset Description](#dataset-description)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
## Dataset Description
This dataset has been automatically processed by AutoNLP for project second.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"text": "one hundred and forty-two minus fifty-three",
"target": "one hundred and ninety-five"
},
{
"text": "two hundred and twenty minus seventy-one",
"target": "two hundred and ninety-one"
}
]
```
### Data Fields
The dataset has the following fields (also called "features"):
```json
{
"target": "Value(dtype='string', id=None)",
"text": "Value(dtype='string', id=None)"
}
```
### Data Splits
This dataset is split into train and validation splits. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 600000 |
| valid | 150000 |
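A minimal loading sketch, assuming the dataset is published under the repository id shown above (`VoidZeroe/autonlp-data-second`) and exposes the `train` and `valid` splits listed in the table:
```python
from datasets import load_dataset

# Hypothetical example: the repository id and split names are taken from this card.
# AutoNLP project datasets may be private, in which case an access token is required.
train = load_dataset("VoidZeroe/autonlp-data-second", split="train")
valid = load_dataset("VoidZeroe/autonlp-data-second", split="valid")

print(train[0])  # a dict with "text" and "target" string fields
```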
|
datahrvoje/twitter_dataset_1713054300 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 20806
num_examples: 48
download_size: 11156
dataset_size: 20806
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kaitchup/opus-Indonesian-to-English | ---
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: validation
num_bytes: 182024
num_examples: 2000
- name: train
num_bytes: 74451703
num_examples: 989529
download_size: 53126195
dataset_size: 74633727
---
# Dataset Card for "opus-id-en"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lshowway/wikipedia.reorder.vos.de | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2385745587
num_examples: 1137317
download_size: 1068076681
dataset_size: 2385745587
---
# Dataset Card for "wikipedia.reorder.vos.de"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/noa_himesaka_watashinitenshigamaiorita | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Noa Himesaka
This is the dataset of Noa Himesaka, containing 422 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 422 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 1039 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 1159 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 422 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 422 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 422 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 1039 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 1039 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 838 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 1159 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 1159 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
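The packages above are zip archives stored in this repository; a minimal download-and-extract sketch, assuming the archive names listed in the table (e.g. `dataset-raw.zip`):
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# Fetch one of the packages listed above and unpack it locally.
zip_file = hf_hub_download(
    repo_id='CyberHarem/noa_himesaka_watashinitenshigamaiorita',
    repo_type='dataset',
    filename='dataset-raw.zip',  # any package name from the table works
)
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
```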
|
OsamaBsher/AITA-Reddit-Dataset | ---
task_categories:
- text-generation
- text-classification
size_categories:
- 100K<n<1M
---
# Dataset Card for AITA Reddit Posts and Comments
Posts from the AITA subreddit, with the 2 top-voted comments that share the post verdict. Extracted using the Reddit PushShift dumps (from 2013 to April 2023).
## Dataset Details
The dataset contains 270,709 entries, each of which contains the post title, text, verdict, comment1, comment2 and score (number of upvotes).
For more details see paper: https://arxiv.org/abs/2310.18336
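A minimal loading sketch, assuming the dataset is published under the repository id shown above (`OsamaBsher/AITA-Reddit-Dataset`); the split and column names below are assumptions based on the field description in this card, not confirmed by it:
```python
from datasets import load_dataset

# Hypothetical example: column names follow the description above
# (title, text, verdict, comment1, comment2, score) and may differ in the actual repo.
dataset = load_dataset("OsamaBsher/AITA-Reddit-Dataset", split="train")

example = dataset[0]
print(example["title"])
print(example["verdict"], example["score"])
```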
### Dataset Sources
The Reddit PushShift data dumps are part of a data collection effort that crawls Reddit at regular intervals to extract and keep all of its data.
## Dataset Card Authors
@OsamaBsher and Ameer Sabri
## Dataset Card Contact
@OsamaBsher |
jadechip/color-palette-controlnet | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': animals
'1': art
'2': fashion
'3': food
'4': indoor
'5': landscape
'6': logo
'7': people
'8': plants
'9': vehicles
splits:
- name: train
num_bytes: 518662800.0
num_examples: 30000
download_size: 40972396
dataset_size: 518662800.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tourist800/ORKG_train_data_with_prefix | ---
license: mit
---
|
CyberHarem/nagatoro_hayase_donttoywithmemissnagatoro | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Nagatoro Hayase/長瀞さん/長瀞早瀬 (Don't Toy With Me, Miss Nagatoro)
This is the dataset of Nagatoro Hayase/長瀞さん/長瀞早瀬 (Don't Toy With Me, Miss Nagatoro), containing 992 images and their tags.
The core tags of this character are `black_hair, long_hair, dark-skinned_female, dark_skin, brown_eyes, hairclip, hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 992 | 778.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nagatoro_hayase_donttoywithmemissnagatoro/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 992 | 778.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nagatoro_hayase_donttoywithmemissnagatoro/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1989 | 1.32 GiB | [Download](https://huggingface.co/datasets/CyberHarem/nagatoro_hayase_donttoywithmemissnagatoro/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/nagatoro_hayase_donttoywithmemissnagatoro',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, blush, closed_mouth, portrait, solo, frown, close-up |
| 1 | 11 |  |  |  |  |  | 1girl, blush, portrait, solo, looking_at_viewer, shirt |
| 2 | 6 |  |  |  |  |  | 1girl, closed_mouth, earclip, looking_at_viewer, portrait, shirt, solo, blush, asymmetrical_bangs |
| 3 | 8 |  |  |  |  |  | 1girl, blush, looking_at_viewer, skin_fang, smile, solo, fang_out, portrait, white_shirt, closed_mouth, earclip |
| 4 | 5 |  |  |  |  |  | 1girl, backpack, blue_skirt, looking_at_viewer, open_mouth, pleated_skirt, school_uniform, solo, white_shirt, :d, earclip, holding_strap, traditional_media, sleeves_rolled_up |
| 5 | 14 |  |  |  |  |  | 1girl, school_uniform, solo, blush, white_shirt, blue_skirt, pleated_skirt, window |
| 6 | 9 |  |  |  |  |  | 1girl, indoors, no_socks, pleated_skirt, school_uniform, uwabaki, white_shirt, blue_skirt, couch, solo, full_body, sitting, blush, crossed_legs |
| 7 | 6 |  |  |  |  |  | 1girl, earclip, indoors, skin_fang, solo, white_shirt, chalkboard, looking_at_viewer, :d, blush, nail_polish, open_mouth, classroom, collared_shirt, piercing, school_uniform, upper_body |
| 8 | 6 |  |  |  |  |  | 1girl, simple_background, solo, blush, greyscale, no_humans, white_background, comic |
| 9 | 6 |  |  |  |  |  | 1girl, anime_coloring, mole_under_eye, red_eyes, solo, looking_at_viewer, parody, portrait, smile, closed_mouth |
| 10 | 5 |  |  |  |  |  | 1girl, blush, nail_polish, smile, solo, holding_phone, portrait, smartphone, :3, fang, night, pink_nails, red_nails |
| 11 | 9 |  |  |  |  |  | 1girl, blush, hood, smile, solo, upper_body, animal_hat, beanie, coat, jacket, looking_at_viewer, outdoors, skin_fang |
| 12 | 11 |  |  |  |  |  | 1girl, cat_ears, blush, bare_shoulders, blue_skirt, animal_ear_fluff, crop_top, indoors, midriff, shirt, cat_paws, navel, one-piece_tan, paw_gloves, pleated_skirt, sleeveless, small_breasts, earclip, fake_animal_ears, fang, smile, solo_focus, tail |
| 13 | 6 |  |  |  |  |  | 1girl, beret, blue_skirt, outdoors, sitting, sketchbook, bag, holding_pencil, blush, closed_mouth, day, earclip, jacket, shirt, smile, white_headwear, park_bench, solo_focus |
| 14 | 8 |  |  |  |  |  | 1girl, blush, day, outdoors, smile, collarbone, earclip, navel, sports_bra, midriff, ponytail, skin_fang, small_breasts, solo, asymmetrical_bangs, bike_shorts, cleavage, fang_out, looking_at_viewer, one-piece_tan, armpits, blue_sky, closed_mouth, cloud, grass, groin, hand_on_own_hip, tree |
| 15 | 8 |  |  |  |  |  | 1girl, indoors, blush, nail_polish, holding_game_controller, red_nails, tan, blue_shirt, gamepad, green_shirt, playstation_controller, short_sleeves, sitting, smile, solo_focus, t-shirt, barefoot, bike_shorts, bookshelf, collarbone, frown, open_mouth, playing_games |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | closed_mouth | portrait | solo | frown | close-up | looking_at_viewer | shirt | earclip | asymmetrical_bangs | skin_fang | smile | fang_out | white_shirt | backpack | blue_skirt | open_mouth | pleated_skirt | school_uniform | :d | holding_strap | traditional_media | sleeves_rolled_up | window | indoors | no_socks | uwabaki | couch | full_body | sitting | crossed_legs | chalkboard | nail_polish | classroom | collared_shirt | piercing | upper_body | simple_background | greyscale | no_humans | white_background | comic | anime_coloring | mole_under_eye | red_eyes | parody | holding_phone | smartphone | :3 | fang | night | pink_nails | red_nails | hood | animal_hat | beanie | coat | jacket | outdoors | cat_ears | bare_shoulders | animal_ear_fluff | crop_top | midriff | cat_paws | navel | one-piece_tan | paw_gloves | sleeveless | small_breasts | fake_animal_ears | solo_focus | tail | beret | sketchbook | bag | holding_pencil | day | white_headwear | park_bench | collarbone | sports_bra | ponytail | bike_shorts | cleavage | armpits | blue_sky | cloud | grass | groin | hand_on_own_hip | tree | holding_game_controller | tan | blue_shirt | gamepad | green_shirt | playstation_controller | short_sleeves | t-shirt | barefoot | bookshelf | playing_games |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------|:---------------|:-----------|:-------|:--------|:-----------|:--------------------|:--------|:----------|:---------------------|:------------|:--------|:-----------|:--------------|:-----------|:-------------|:-------------|:----------------|:-----------------|:-----|:----------------|:--------------------|:--------------------|:---------|:----------|:-----------|:----------|:--------|:------------|:----------|:---------------|:-------------|:--------------|:------------|:-----------------|:-----------|:-------------|:--------------------|:------------|:------------|:-------------------|:--------|:-----------------|:-----------------|:-----------|:---------|:----------------|:-------------|:-----|:-------|:--------|:-------------|:------------|:-------|:-------------|:---------|:-------|:---------|:-----------|:-----------|:-----------------|:-------------------|:-----------|:----------|:-----------|:--------|:----------------|:-------------|:-------------|:----------------|:-------------------|:-------------|:-------|:--------|:-------------|:------|:-----------------|:------|:-----------------|:-------------|:-------------|:-------------|:-----------|:--------------|:-----------|:----------|:-----------|:--------|:--------|:--------|:------------------|:-------|:--------------------------|:------|:-------------|:----------|:--------------|:-------------------------|:----------------|:----------|:-----------|:------------|:----------------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | X | | X | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | X | X | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | X | X | X | X | | | X | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | | | X | | | X | | X | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 14 |  |  |  |  |  | X | X | | | X | | | | | | | | | | X | | X | | X | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 9 |  |  |  |  |  | X | X | | | X | | | | | | | | | | X | | X | | X | X | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | X | | | X | | | X | | X | | X | | | X | | | X | | X | X | | | | | X | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 6 |  |  |  |  |  | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 6 |  |  |  |  |  | X | | X | X | X | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 5 |  |  |  |  |  | X | X | | X | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 11 | 9 |  |  |  |  |  | X | X | | | X | | | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 12 | 11 |  |  |  |  |  | X | X | | | | | | | X | X | | | X | | | | X | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 13 | 6 |  |  |  |  |  | X | X | X | | | | | | X | X | | | X | | | | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 14 | 8 |  |  |  |  |  | X | X | X | | X | | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | | X | X | | | X | | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 15 | 8 |  |  |  |  |  | X | X | | | | X | | | | | | | X | | | | | X | | | | | | | | X | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | X | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
|
eren23/Amazon-Reviews-2023-amazon_fashion-grouped-100-sub-tagged | ---
dataset_info:
features:
- name: asin
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: review_count
dtype: int64
- name: combined_reviews
dtype: string
- name: summary_reviews
dtype: string
- name: tags
sequence: string
splits:
- name: train
num_bytes: 911245
num_examples: 100
download_size: 498687
dataset_size: 911245
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_zorobin__mistral-class-shishya-all-hal-7b-ep3 | ---
pretty_name: Evaluation run of zorobin/mistral-class-shishya-all-hal-7b-ep3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [zorobin/mistral-class-shishya-all-hal-7b-ep3](https://huggingface.co/zorobin/mistral-class-shishya-all-hal-7b-ep3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_zorobin__mistral-class-shishya-all-hal-7b-ep3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-28T05:47:57.937695](https://huggingface.co/datasets/open-llm-leaderboard/details_zorobin__mistral-class-shishya-all-hal-7b-ep3/blob/main/results_2024-01-28T05-47-57.937695.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.35098970402920293,\n\
\ \"acc_stderr\": 0.033365473911417726,\n \"acc_norm\": 0.3540891126290075,\n\
\ \"acc_norm_stderr\": 0.03427175559062365,\n \"mc1\": 0.24112607099143207,\n\
\ \"mc1_stderr\": 0.014974827279752325,\n \"mc2\": 0.3598229176985082,\n\
\ \"mc2_stderr\": 0.0144824296098062\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.447098976109215,\n \"acc_stderr\": 0.01452938016052685,\n\
\ \"acc_norm\": 0.4658703071672355,\n \"acc_norm_stderr\": 0.014577311315231104\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5972913762198765,\n\
\ \"acc_stderr\": 0.004894407257215806,\n \"acc_norm\": 0.7886875124477196,\n\
\ \"acc_norm_stderr\": 0.004074052113451379\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.23026315789473684,\n \"acc_stderr\": 0.03426059424403165,\n\
\ \"acc_norm\": 0.23026315789473684,\n \"acc_norm_stderr\": 0.03426059424403165\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.24,\n\
\ \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.24,\n \
\ \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3886792452830189,\n \"acc_stderr\": 0.03000048544867599,\n\
\ \"acc_norm\": 0.3886792452830189,\n \"acc_norm_stderr\": 0.03000048544867599\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4097222222222222,\n\
\ \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.4097222222222222,\n\
\ \"acc_norm_stderr\": 0.04112490974670787\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3236994219653179,\n\
\ \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.3236994219653179,\n\
\ \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077636,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3617021276595745,\n \"acc_stderr\": 0.0314108219759624,\n\
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.0314108219759624\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\
\ \"acc_stderr\": 0.04404556157374767,\n \"acc_norm\": 0.32456140350877194,\n\
\ \"acc_norm_stderr\": 0.04404556157374767\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.041042692118062316,\n\
\ \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.041042692118062316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30158730158730157,\n \"acc_stderr\": 0.0236369759961018,\n \"\
acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.0236369759961018\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\
\ \"acc_stderr\": 0.03852273364924316,\n \"acc_norm\": 0.24603174603174602,\n\
\ \"acc_norm_stderr\": 0.03852273364924316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3193548387096774,\n\
\ \"acc_stderr\": 0.02652270967466777,\n \"acc_norm\": 0.3193548387096774,\n\
\ \"acc_norm_stderr\": 0.02652270967466777\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.32019704433497537,\n \"acc_stderr\": 0.032826493853041504,\n\
\ \"acc_norm\": 0.32019704433497537,\n \"acc_norm_stderr\": 0.032826493853041504\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\"\
: 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.44242424242424244,\n \"acc_stderr\": 0.038783721137112745,\n\
\ \"acc_norm\": 0.44242424242424244,\n \"acc_norm_stderr\": 0.038783721137112745\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.41414141414141414,\n \"acc_stderr\": 0.03509438348879629,\n \"\
acc_norm\": 0.41414141414141414,\n \"acc_norm_stderr\": 0.03509438348879629\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.3316062176165803,\n \"acc_stderr\": 0.03397636541089117,\n\
\ \"acc_norm\": 0.3316062176165803,\n \"acc_norm_stderr\": 0.03397636541089117\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.28974358974358977,\n \"acc_stderr\": 0.023000628243687957,\n\
\ \"acc_norm\": 0.28974358974358977,\n \"acc_norm_stderr\": 0.023000628243687957\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085626,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085626\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.029597329730978093,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.029597329730978093\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.42752293577981654,\n \"acc_stderr\": 0.021210910204300434,\n \"\
acc_norm\": 0.42752293577981654,\n \"acc_norm_stderr\": 0.021210910204300434\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2638888888888889,\n \"acc_stderr\": 0.03005820270430985,\n \"\
acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.03005820270430985\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4852941176470588,\n \"acc_stderr\": 0.03507793834791325,\n \"\
acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.03507793834791325\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.46835443037974683,\n \"acc_stderr\": 0.03248197400511075,\n \
\ \"acc_norm\": 0.46835443037974683,\n \"acc_norm_stderr\": 0.03248197400511075\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.336322869955157,\n\
\ \"acc_stderr\": 0.031708824268455005,\n \"acc_norm\": 0.336322869955157,\n\
\ \"acc_norm_stderr\": 0.031708824268455005\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.35877862595419846,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.35877862595419846,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.24793388429752067,\n \"acc_stderr\": 0.039418975265163025,\n \"\
acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.039418975265163025\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04557239513497751,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04557239513497751\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285714,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285714\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.44660194174757284,\n \"acc_stderr\": 0.04922424153458933,\n\
\ \"acc_norm\": 0.44660194174757284,\n \"acc_norm_stderr\": 0.04922424153458933\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.4017094017094017,\n\
\ \"acc_stderr\": 0.03211693751051622,\n \"acc_norm\": 0.4017094017094017,\n\
\ \"acc_norm_stderr\": 0.03211693751051622\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5938697318007663,\n\
\ \"acc_stderr\": 0.017562037406478916,\n \"acc_norm\": 0.5938697318007663,\n\
\ \"acc_norm_stderr\": 0.017562037406478916\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2630057803468208,\n \"acc_stderr\": 0.023703099525258172,\n\
\ \"acc_norm\": 0.2630057803468208,\n \"acc_norm_stderr\": 0.023703099525258172\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.026787453111906535,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.026787453111906535\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3279742765273312,\n\
\ \"acc_stderr\": 0.0266644108869376,\n \"acc_norm\": 0.3279742765273312,\n\
\ \"acc_norm_stderr\": 0.0266644108869376\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.345679012345679,\n \"acc_stderr\": 0.026462487777001872,\n\
\ \"acc_norm\": 0.345679012345679,\n \"acc_norm_stderr\": 0.026462487777001872\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2978723404255319,\n \"acc_stderr\": 0.027281608344469414,\n \
\ \"acc_norm\": 0.2978723404255319,\n \"acc_norm_stderr\": 0.027281608344469414\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2561929595827901,\n\
\ \"acc_stderr\": 0.01114917315311058,\n \"acc_norm\": 0.2561929595827901,\n\
\ \"acc_norm_stderr\": 0.01114917315311058\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4007352941176471,\n \"acc_stderr\": 0.029768263528933105,\n\
\ \"acc_norm\": 0.4007352941176471,\n \"acc_norm_stderr\": 0.029768263528933105\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3137254901960784,\n \"acc_stderr\": 0.01877168389352819,\n \
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.01877168389352819\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3090909090909091,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.3090909090909091,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.19591836734693877,\n \"acc_stderr\": 0.025409301953225678,\n\
\ \"acc_norm\": 0.19591836734693877,\n \"acc_norm_stderr\": 0.025409301953225678\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2736318407960199,\n\
\ \"acc_stderr\": 0.03152439186555404,\n \"acc_norm\": 0.2736318407960199,\n\
\ \"acc_norm_stderr\": 0.03152439186555404\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411018,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411018\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3313253012048193,\n\
\ \"acc_stderr\": 0.03664314777288086,\n \"acc_norm\": 0.3313253012048193,\n\
\ \"acc_norm_stderr\": 0.03664314777288086\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5672514619883041,\n \"acc_stderr\": 0.03799978644370607,\n\
\ \"acc_norm\": 0.5672514619883041,\n \"acc_norm_stderr\": 0.03799978644370607\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24112607099143207,\n\
\ \"mc1_stderr\": 0.014974827279752325,\n \"mc2\": 0.3598229176985082,\n\
\ \"mc2_stderr\": 0.0144824296098062\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7292817679558011,\n \"acc_stderr\": 0.012487904760626306\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/zorobin/mistral-class-shishya-all-hal-7b-ep3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|arc:challenge|25_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|gsm8k|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hellaswag|10_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T05-47-57.937695.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T05-47-57.937695.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- '**/details_harness|winogrande|5_2024-01-28T05-47-57.937695.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-28T05-47-57.937695.parquet'
- config_name: results
data_files:
- split: 2024_01_28T05_47_57.937695
path:
- results_2024-01-28T05-47-57.937695.parquet
- split: latest
path:
- results_2024-01-28T05-47-57.937695.parquet
---
# Dataset Card for Evaluation run of zorobin/mistral-class-shishya-all-hal-7b-ep3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [zorobin/mistral-class-shishya-all-hal-7b-ep3](https://huggingface.co/zorobin/mistral-class-shishya-all-hal-7b-ep3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_zorobin__mistral-class-shishya-all-hal-7b-ep3",
"harness_winogrande_5",
split="train")
```
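The same call pattern works for any of the configurations listed in the YAML above. As a small sketch (using configuration and split names taken from that listing, not documented elsewhere on this card), you can also load one task's details for a specific timestamped run, or the aggregated "results" configuration:
```python
from datasets import load_dataset

# Details for one task at a specific timestamped run; the split names come from
# the config listing above, and "latest" always points to the most recent run.
anatomy = load_dataset(
    "open-llm-leaderboard/details_zorobin__mistral-class-shishya-all-hal-7b-ep3",
    "harness_hendrycksTest_anatomy_5",
    split="2024_01_28T05_47_57.937695",
)

# Aggregated metrics for the run, as shown in "Latest results" below.
results = load_dataset(
    "open-llm-leaderboard/details_zorobin__mistral-class-shishya-all-hal-7b-ep3",
    "results",
    split="latest",
)
```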
## Latest results
These are the [latest results from run 2024-01-28T05:47:57.937695](https://huggingface.co/datasets/open-llm-leaderboard/details_zorobin__mistral-class-shishya-all-hal-7b-ep3/blob/main/results_2024-01-28T05-47-57.937695.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.35098970402920293,
"acc_stderr": 0.033365473911417726,
"acc_norm": 0.3540891126290075,
"acc_norm_stderr": 0.03427175559062365,
"mc1": 0.24112607099143207,
"mc1_stderr": 0.014974827279752325,
"mc2": 0.3598229176985082,
"mc2_stderr": 0.0144824296098062
},
"harness|arc:challenge|25": {
"acc": 0.447098976109215,
"acc_stderr": 0.01452938016052685,
"acc_norm": 0.4658703071672355,
"acc_norm_stderr": 0.014577311315231104
},
"harness|hellaswag|10": {
"acc": 0.5972913762198765,
"acc_stderr": 0.004894407257215806,
"acc_norm": 0.7886875124477196,
"acc_norm_stderr": 0.004074052113451379
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.23026315789473684,
"acc_stderr": 0.03426059424403165,
"acc_norm": 0.23026315789473684,
"acc_norm_stderr": 0.03426059424403165
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3886792452830189,
"acc_stderr": 0.03000048544867599,
"acc_norm": 0.3886792452830189,
"acc_norm_stderr": 0.03000048544867599
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4097222222222222,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.4097222222222222,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3236994219653179,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.3236994219653179,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077636,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.0314108219759624,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.0314108219759624
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374767,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374767
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.041042692118062316,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.041042692118062316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.0236369759961018,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.0236369759961018
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.03852273364924316,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.03852273364924316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3193548387096774,
"acc_stderr": 0.02652270967466777,
"acc_norm": 0.3193548387096774,
"acc_norm_stderr": 0.02652270967466777
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.32019704433497537,
"acc_stderr": 0.032826493853041504,
"acc_norm": 0.32019704433497537,
"acc_norm_stderr": 0.032826493853041504
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.44242424242424244,
"acc_stderr": 0.038783721137112745,
"acc_norm": 0.44242424242424244,
"acc_norm_stderr": 0.038783721137112745
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.41414141414141414,
"acc_stderr": 0.03509438348879629,
"acc_norm": 0.41414141414141414,
"acc_norm_stderr": 0.03509438348879629
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3316062176165803,
"acc_stderr": 0.03397636541089117,
"acc_norm": 0.3316062176165803,
"acc_norm_stderr": 0.03397636541089117
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.28974358974358977,
"acc_stderr": 0.023000628243687957,
"acc_norm": 0.28974358974358977,
"acc_norm_stderr": 0.023000628243687957
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085626,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.029597329730978093,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.029597329730978093
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.42752293577981654,
"acc_stderr": 0.021210910204300434,
"acc_norm": 0.42752293577981654,
"acc_norm_stderr": 0.021210910204300434
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03005820270430985,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03005820270430985
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.03507793834791325,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.03507793834791325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.46835443037974683,
"acc_stderr": 0.03248197400511075,
"acc_norm": 0.46835443037974683,
"acc_norm_stderr": 0.03248197400511075
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.336322869955157,
"acc_stderr": 0.031708824268455005,
"acc_norm": 0.336322869955157,
"acc_norm_stderr": 0.031708824268455005
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.35877862595419846,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.35877862595419846,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04557239513497751,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04557239513497751
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285714,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285714
},
"harness|hendrycksTest-management|5": {
"acc": 0.44660194174757284,
"acc_stderr": 0.04922424153458933,
"acc_norm": 0.44660194174757284,
"acc_norm_stderr": 0.04922424153458933
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.4017094017094017,
"acc_stderr": 0.03211693751051622,
"acc_norm": 0.4017094017094017,
"acc_norm_stderr": 0.03211693751051622
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5938697318007663,
"acc_stderr": 0.017562037406478916,
"acc_norm": 0.5938697318007663,
"acc_norm_stderr": 0.017562037406478916
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2630057803468208,
"acc_stderr": 0.023703099525258172,
"acc_norm": 0.2630057803468208,
"acc_norm_stderr": 0.023703099525258172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.026787453111906535,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.026787453111906535
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3279742765273312,
"acc_stderr": 0.0266644108869376,
"acc_norm": 0.3279742765273312,
"acc_norm_stderr": 0.0266644108869376
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.345679012345679,
"acc_stderr": 0.026462487777001872,
"acc_norm": 0.345679012345679,
"acc_norm_stderr": 0.026462487777001872
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2978723404255319,
"acc_stderr": 0.027281608344469414,
"acc_norm": 0.2978723404255319,
"acc_norm_stderr": 0.027281608344469414
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2561929595827901,
"acc_stderr": 0.01114917315311058,
"acc_norm": 0.2561929595827901,
"acc_norm_stderr": 0.01114917315311058
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4007352941176471,
"acc_stderr": 0.029768263528933105,
"acc_norm": 0.4007352941176471,
"acc_norm_stderr": 0.029768263528933105
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.01877168389352819,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.01877168389352819
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3090909090909091,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.3090909090909091,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19591836734693877,
"acc_stderr": 0.025409301953225678,
"acc_norm": 0.19591836734693877,
"acc_norm_stderr": 0.025409301953225678
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2736318407960199,
"acc_stderr": 0.03152439186555404,
"acc_norm": 0.2736318407960199,
"acc_norm_stderr": 0.03152439186555404
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411018,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411018
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3313253012048193,
"acc_stderr": 0.03664314777288086,
"acc_norm": 0.3313253012048193,
"acc_norm_stderr": 0.03664314777288086
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5672514619883041,
"acc_stderr": 0.03799978644370607,
"acc_norm": 0.5672514619883041,
"acc_norm_stderr": 0.03799978644370607
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24112607099143207,
"mc1_stderr": 0.014974827279752325,
"mc2": 0.3598229176985082,
"mc2_stderr": 0.0144824296098062
},
"harness|winogrande|5": {
"acc": 0.7292817679558011,
"acc_stderr": 0.012487904760626306
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
data-store/facebook-sentiment-analysis | ---
dataset_info:
features:
- name: text
dtype: string
- name: labels
sequence: string
splits:
- name: train
num_bytes: 1052010
num_examples: 7126
- name: test
num_bytes: 131478
num_examples: 891
- name: dev
num_bytes: 132085
num_examples: 897
download_size: 779432
dataset_size: 1315573
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: dev
path: data/dev-*
---
|
JaehyungKim/p2c_spam | ---
license: other
license_name: following-original-dataset
license_link: LICENSE
---
|
tyzhu/wiki_find_passage_train100_eval10_num | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 133060
num_examples: 210
- name: validation
num_bytes: 6982
num_examples: 10
download_size: 58150
dataset_size: 140042
---
# Dataset Card for "wiki_find_passage_train100_eval10_num"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Gabriel1322/austin | ---
license: openrail
---
|
open-llm-leaderboard/details_ericzzz__falcon-rw-1b-instruct-openorca | ---
pretty_name: Evaluation run of ericzzz/falcon-rw-1b-instruct-openorca
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ericzzz/falcon-rw-1b-instruct-openorca](https://huggingface.co/ericzzz/falcon-rw-1b-instruct-openorca)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 1 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ericzzz__falcon-rw-1b-instruct-openorca\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-12-02T12:35:28.593271](https://huggingface.co/datasets/open-llm-leaderboard/details_ericzzz__falcon-rw-1b-instruct-openorca/blob/main/results_2023-12-02T12-35-28.593271.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.03411675511751327,\n\
\ \"acc_stderr\": 0.005000212600773262\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.03411675511751327,\n \"acc_stderr\": 0.005000212600773262\n\
\ }\n}\n```"
repo_url: https://huggingface.co/ericzzz/falcon-rw-1b-instruct-openorca
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_02T12_35_28.593271
path:
- '**/details_harness|gsm8k|5_2023-12-02T12-35-28.593271.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-02T12-35-28.593271.parquet'
- config_name: results
data_files:
- split: 2023_12_02T12_35_28.593271
path:
- results_2023-12-02T12-35-28.593271.parquet
- split: latest
path:
- results_2023-12-02T12-35-28.593271.parquet
---
# Dataset Card for Evaluation run of ericzzz/falcon-rw-1b-instruct-openorca
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ericzzz/falcon-rw-1b-instruct-openorca
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ericzzz/falcon-rw-1b-instruct-openorca](https://huggingface.co/ericzzz/falcon-rw-1b-instruct-openorca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 1 configuration, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ericzzz__falcon-rw-1b-instruct-openorca",
"harness_gsm8k_5",
split="train")
```
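Since this repository exposes a single task configuration, the other things worth loading are its "latest" split and the aggregated "results" configuration. A small sketch, using only the configuration and split names from the config listing above:
```python
from datasets import load_dataset

# "latest" always points to the most recent run (here 2023_12_02T12_35_28.593271).
gsm8k_latest = load_dataset(
    "open-llm-leaderboard/details_ericzzz__falcon-rw-1b-instruct-openorca",
    "harness_gsm8k_5",
    split="latest",
)

# Aggregated metrics for the run, as shown in "Latest results" below.
results = load_dataset(
    "open-llm-leaderboard/details_ericzzz__falcon-rw-1b-instruct-openorca",
    "results",
    split="latest",
)
```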
## Latest results
These are the [latest results from run 2023-12-02T12:35:28.593271](https://huggingface.co/datasets/open-llm-leaderboard/details_ericzzz__falcon-rw-1b-instruct-openorca/blob/main/results_2023-12-02T12-35-28.593271.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.03411675511751327,
"acc_stderr": 0.005000212600773262
},
"harness|gsm8k|5": {
"acc": 0.03411675511751327,
"acc_stderr": 0.005000212600773262
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
UserIscool/Prompts | ---
license: mit
task_categories:
- text-classification
language:
- en
tags:
- code
pretty_name: AI classification
size_categories:
- n<1K
--- |
Nexdata/American_English_Speech_Synthesis_Corpus-Male | ---
YAML tags:
- copy-paste the tags obtained with the tagging app: https://github.com/huggingface/datasets-tagging
---
# Dataset Card for Nexdata/American_English_Speech_Synthesis_Corpus-Male
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/1159?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
Male audio data of American English. It is recorded by native American English speakers with an authentic accent, and the phoneme coverage is balanced. A professional phonetician participated in the annotation. It precisely matches the research and development needs of speech synthesis.
For more details, please refer to the link: https://www.nexdata.ai/datasets/1159?source=Huggingface
### Supported Tasks and Leaderboards
tts: The dataset can be used to train a model for Text to Speech (TTS).
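As a minimal, hypothetical sketch of how one might start exploring this corpus for TTS work (the card does not document the data fields or splits, so nothing below is guaranteed by the card; inspect what the repository actually exposes first):
```python
from datasets import load_dataset

# Hypothetical sketch: check the splits and columns the repository exposes
# before building any TTS preprocessing pipeline on top of it.
ds = load_dataset("Nexdata/American_English_Speech_Synthesis_Corpus-Male")
print(ds)  # available splits
for split in ds.values():
    print(split.features)  # column names and types (e.g. audio, transcript are assumptions)
    break
```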
### Languages
American English
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions
|
chailey/EthTransactions_V1 | ---
license: openrail
---
|
open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v3.3 | ---
pretty_name: Evaluation run of bardsai/jaskier-7b-dpo-v3.3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [bardsai/jaskier-7b-dpo-v3.3](https://huggingface.co/bardsai/jaskier-7b-dpo-v3.3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v3.3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-13T15:49:58.893408](https://huggingface.co/datasets/open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v3.3/blob/main/results_2024-02-13T15-49-58.893408.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6493302568511367,\n\
\ \"acc_stderr\": 0.032148382244220834,\n \"acc_norm\": 0.648903728648512,\n\
\ \"acc_norm_stderr\": 0.03281862542360137,\n \"mc1\": 0.6389228886168911,\n\
\ \"mc1_stderr\": 0.016814312844836882,\n \"mc2\": 0.7900424254093005,\n\
\ \"mc2_stderr\": 0.013557770618845038\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7005119453924915,\n \"acc_stderr\": 0.013385021637313572,\n\
\ \"acc_norm\": 0.7226962457337884,\n \"acc_norm_stderr\": 0.013082095839059374\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7126070503883688,\n\
\ \"acc_stderr\": 0.004516215206715359,\n \"acc_norm\": 0.8888667596096396,\n\
\ \"acc_norm_stderr\": 0.003136547276689888\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266344,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266344\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.02537952491077839,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.02537952491077839\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644237,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644237\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.024121125416941197,\n\
\ \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.024121125416941197\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683512,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683512\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135367,\n\
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135367\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092444,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092444\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.02508596114457966,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.02508596114457966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516302,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516302\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n\
\ \"acc_stderr\": 0.013740797258579825,\n \"acc_norm\": 0.8199233716475096,\n\
\ \"acc_norm_stderr\": 0.013740797258579825\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577605,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577605\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4424581005586592,\n\
\ \"acc_stderr\": 0.016611393687268588,\n \"acc_norm\": 0.4424581005586592,\n\
\ \"acc_norm_stderr\": 0.016611393687268588\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.470013037809648,\n\
\ \"acc_stderr\": 0.012747248967079069,\n \"acc_norm\": 0.470013037809648,\n\
\ \"acc_norm_stderr\": 0.012747248967079069\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6389228886168911,\n\
\ \"mc1_stderr\": 0.016814312844836882,\n \"mc2\": 0.7900424254093005,\n\
\ \"mc2_stderr\": 0.013557770618845038\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8437253354380426,\n \"acc_stderr\": 0.010205351791873518\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6785443517816527,\n \
\ \"acc_stderr\": 0.012864471384836703\n }\n}\n```"
repo_url: https://huggingface.co/bardsai/jaskier-7b-dpo-v3.3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|arc:challenge|25_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|gsm8k|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hellaswag|10_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T15-49-58.893408.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-13T15-49-58.893408.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- '**/details_harness|winogrande|5_2024-02-13T15-49-58.893408.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-13T15-49-58.893408.parquet'
- config_name: results
data_files:
- split: 2024_02_13T15_49_58.893408
path:
- results_2024-02-13T15-49-58.893408.parquet
- split: latest
path:
- results_2024-02-13T15-49-58.893408.parquet
---
# Dataset Card for Evaluation run of bardsai/jaskier-7b-dpo-v3.3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [bardsai/jaskier-7b-dpo-v3.3](https://huggingface.co/bardsai/jaskier-7b-dpo-v3.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v3.3",
"harness_winogrande_5",
split="train")
```
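
The aggregated results can be loaded in the same way; the following is a minimal sketch that uses the "results" configuration and the "latest" split listed in this card's configuration section:

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run (config "results", split "latest",
# both taken from the configuration listing above).
results = load_dataset(
    "open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v3.3",
    "results",
    split="latest",
)
print(results[0])
```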
## Latest results
These are the [latest results from run 2024-02-13T15:49:58.893408](https://huggingface.co/datasets/open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v3.3/blob/main/results_2024-02-13T15-49-58.893408.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the timestamped splits and in the "latest" split of each configuration):
```python
{
"all": {
"acc": 0.6493302568511367,
"acc_stderr": 0.032148382244220834,
"acc_norm": 0.648903728648512,
"acc_norm_stderr": 0.03281862542360137,
"mc1": 0.6389228886168911,
"mc1_stderr": 0.016814312844836882,
"mc2": 0.7900424254093005,
"mc2_stderr": 0.013557770618845038
},
"harness|arc:challenge|25": {
"acc": 0.7005119453924915,
"acc_stderr": 0.013385021637313572,
"acc_norm": 0.7226962457337884,
"acc_norm_stderr": 0.013082095839059374
},
"harness|hellaswag|10": {
"acc": 0.7126070503883688,
"acc_stderr": 0.004516215206715359,
"acc_norm": 0.8888667596096396,
"acc_norm_stderr": 0.003136547276689888
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266344,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266344
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.02537952491077839,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.02537952491077839
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644237,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644237
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.024121125416941197,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.024121125416941197
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683512,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683512
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135367,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135367
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092444,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092444
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.02508596114457966,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.02508596114457966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579825,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579825
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577605,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577605
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4424581005586592,
"acc_stderr": 0.016611393687268588,
"acc_norm": 0.4424581005586592,
"acc_norm_stderr": 0.016611393687268588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.02582916327275748,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.02582916327275748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.470013037809648,
"acc_stderr": 0.012747248967079069,
"acc_norm": 0.470013037809648,
"acc_norm_stderr": 0.012747248967079069
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.02553843336857833,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.02553843336857833
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6389228886168911,
"mc1_stderr": 0.016814312844836882,
"mc2": 0.7900424254093005,
"mc2_stderr": 0.013557770618845038
},
"harness|winogrande|5": {
"acc": 0.8437253354380426,
"acc_stderr": 0.010205351791873518
},
"harness|gsm8k|5": {
"acc": 0.6785443517816527,
"acc_stderr": 0.012864471384836703
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
BlinkJc/Llama-novelpersonal-7b | ---
license: openrail
---
# I am still working on it |
nguyenthanhdo/ultrachat-aem-v1.0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: data
sequence: string
splits:
- name: train
num_bytes: 311481287.8581631
num_examples: 54411
download_size: 169997532
dataset_size: 311481287.8581631
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ultrachat-aem-v1.0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
winvoker/lvis | ---
viewer: true
annotations_creators: []
language: []
language_creators: []
license:
- cc-by-4.0
pretty_name: lvis
size_categories:
- 1M<n<10M
source_datasets: []
tags:
- segmentation
- coco
task_categories:
- image-segmentation
task_ids:
- instance-segmentation
---
# LVIS
### Dataset Summary
This dataset is an implementation of the LVIS dataset in the Hugging Face `datasets` format. Please visit the original website for more information.
- https://www.lvisdataset.org/
### Loading
This code loads the dataset and returns a `DatasetDict` containing the train, validation and test splits.
```python
from datasets import load_dataset
dataset = load_dataset("winvoker/lvis")
```
The `objects` field is a dictionary which contains annotation information such as bounding boxes, classes and segmentation masks.
```
DatasetDict({
train: Dataset({
features: ['id', 'image', 'height', 'width', 'objects'],
num_rows: 100170
})
validation: Dataset({
features: ['id', 'image', 'height', 'width', 'objects'],
num_rows: 4809
})
test: Dataset({
features: ['id', 'image', 'height', 'width', 'objects'],
num_rows: 19822
})
})
```
### Access Splits
```python
train = dataset["train"]
validation = dataset["validation"]
test = dataset["test"]
```
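
A single example can then be indexed directly from a split; the following is a minimal sketch that assumes the field names shown in the structure above:

```python
# Inspect one training example (field names follow the features listed above).
sample = train[0]
print(sample["image"], sample["height"], sample["width"])

# Per-instance annotations live in the "objects" dictionary.
objects = sample["objects"]
print(objects["classes"])             # class ids
print(objects["bboxes"])              # bounding boxes
print(len(objects["segmentation"]))   # number of segmentation polygons
```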
An example row is as follows.
```json
{ 'id': 0,
'image': '000000437561.jpg',
'height': 480,
'width': 640,
'objects': {
    'bboxes': [[392, 271, 14, 3]],
'classes': [117],
'segmentation': [[376, 272, 375, 270, 372, 269, 371, 269, 373, 269, 373]]
}
}
``` |
Peihao/test-dateset | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- en
license:
- odc-by
multilinguality:
- multilingual
size_categories:
- 100M<n<1B
source_datasets:
- original
task_categories:
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
paperswithcode_id: c4
pretty_name: C4
---
# Dataset Card for C4
## Table of Contents
- [Dataset Card for C4](#dataset-card-for-c4)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://huggingface.co/datasets/allenai/c4
- **Paper:** https://arxiv.org/abs/1910.10683
### Dataset Summary
A colossal, cleaned version of Common Crawl's web crawl corpus. Based on Common Crawl dataset: "https://commoncrawl.org".
This is the version prepared by AllenAI, hosted at this address: https://huggingface.co/datasets/allenai/c4
It comes in four variants:
- `en`: 305GB in JSON format
- `en.noblocklist`: 380GB in JSON format
- `en.noclean`: 2.3TB in JSON format
- `realnewslike`: 15GB in JSON format
The `en.noblocklist` variant is exactly the same as the `en` variant, except we turned off the so-called "badwords filter", which removes all documents that contain words from the lists at https://github.com/LDNOOBW/List-of-Dirty-Naughty-Obscene-and-Otherwise-Bad-Words.
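As a rough sketch (not part of the original card), a specific variant can be loaded with `datasets`; streaming avoids downloading the full corpus, and the exact config name may vary with the `datasets` version:
```python
# Hedged sketch: stream the `en` variant from the AllenAI repository so the
# ~305GB corpus is not downloaded in full. The config name is an assumption
# and may differ depending on the `datasets` version and repo layout.
from datasets import load_dataset

c4_en = load_dataset("allenai/c4", "en", split="train", streaming=True)

first = next(iter(c4_en))          # each record has url, text, timestamp
print(first["url"], first["timestamp"])
print(first["text"][:200])
```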
### Supported Tasks and Leaderboards
C4 is mainly intended to pretrain language models and word representations.
### Languages
The dataset is in English.
## Dataset Structure
### Data Instances
An example from the `en` config is:
```
{
'url': 'https://klyq.com/beginners-bbq-class-taking-place-in-missoula/',
'text': 'Beginners BBQ Class Taking Place in Missoula!\nDo you want to get better at making delicious BBQ? You will have the opportunity, put this on your calendar now. Thursday, September 22nd join World Class BBQ Champion, Tony Balay from Lonestar Smoke Rangers. He will be teaching a beginner level class for everyone who wants to get better with their culinary skills.\nHe will teach you everything you need to know to compete in a KCBS BBQ competition, including techniques, recipes, timelines, meat selection and trimming, plus smoker and fire information.\nThe cost to be in the class is $35 per person, and for spectators it is free. Included in the cost will be either a t-shirt or apron and you will be tasting samples of each meat that is prepared.',
'timestamp': '2019-04-25T12:57:54Z'
}
```
### Data Fields
The data have several fields:
- `url`: url of the source as a string
- `text`: text content as a string
- `timestamp`: timestamp as a string
### Data Splits
| name | train |validation|
|----------------|--------:|---------:|
| en |364868892| 364608|
| en.noblocklist |393391519| 393226|
| en.noclean | ?| ?|
| realnewslike | 13799838| 13863|
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
C4 dataset is a collection of about 750GB of English-language text sourced from the public Common Crawl web scrape. It includes heuristics to extract only natural language (as opposed to boilerplate and other gibberish) in addition to extensive deduplication. You can find the code that has been used to build this dataset in [c4.py](https://github.com/tensorflow/datasets/blob/5952d3d60d60e1727786fa7a9a23d24bb463d4d6/tensorflow_datasets/text/c4.py) by Tensorflow Datasets.
The dataset was explicitly designed to be English only: any page that was not given a probability of at least 99% of being English by [langdetect](https://github.com/Mimino666/langdetect) was discarded.
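A minimal sketch of such a filter is shown below (an illustration only; the exact logic lives in the `c4.py` script linked above):
```python
# Illustrative sketch of the English-only filter described above; the real
# pipeline in c4.py may differ in details. Requires the `langdetect` package.
from langdetect import detect_langs

def is_probably_english(text: str, threshold: float = 0.99) -> bool:
    try:
        return any(lang.lang == "en" and lang.prob >= threshold
                   for lang in detect_langs(text))
    except Exception:
        # langdetect raises on empty or non-linguistic input; treat as non-English.
        return False
```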
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
AllenAI are releasing this dataset under the terms of ODC-BY. By using this, you are also bound by the Common Crawl terms of use in respect of the content contained in the dataset.
### Citation Information
```
@article{2019t5,
author = {Colin Raffel and Noam Shazeer and Adam Roberts and Katherine Lee and Sharan Narang and Michael Matena and Yanqi Zhou and Wei Li and Peter J. Liu},
title = {Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer},
journal = {arXiv e-prints},
year = {2019},
archivePrefix = {arXiv},
eprint = {1910.10683},
}
```
### Contributions
Thanks to [@dirkgr](https://github.com/dirkgr) and [@lhoestq](https://github.com/lhoestq) for adding this dataset.
|
liuyanchen1015/MULTI_VALUE_stsb_bare_perfect | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 62672
num_examples: 319
- name: test
num_bytes: 38553
num_examples: 205
- name: train
num_bytes: 209177
num_examples: 1136
download_size: 209785
dataset_size: 310402
---
# Dataset Card for "MULTI_VALUE_stsb_bare_perfect"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ademax/metadata-legal-doc-ser | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 18870413203
num_examples: 237467
download_size: 1661208233
dataset_size: 18870413203
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "metadata-legal-doc-ser"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
becerriljc/sentiment-banking | ---
dataset_info:
features:
- name: text
dtype: string
- name: inputs
struct:
- name: text
dtype: string
- name: prediction
list:
- name: label
dtype: string
- name: score
dtype: float64
- name: prediction_agent
dtype: string
- name: annotation
dtype: 'null'
- name: annotation_agent
dtype: 'null'
- name: multi_label
dtype: bool
- name: explanation
dtype: 'null'
- name: id
dtype: 'null'
- name: metadata
struct:
- name: category
dtype: int64
- name: status
dtype: string
- name: event_timestamp
dtype: 'null'
- name: metrics
dtype: 'null'
splits:
- name: train
num_bytes: 1205760
num_examples: 5001
download_size: 451611
dataset_size: 1205760
---
# Dataset Card for "sentiment-banking"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wheart/web3test1 | ---
license: openrail
---
|
tyzhu/find_last_sent_train_100_eval_10_sentbefore | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 434031
num_examples: 320
- name: validation
num_bytes: 10271
num_examples: 10
download_size: 179279
dataset_size: 444302
---
# Dataset Card for "find_last_sent_train_100_eval_10_sentbefore"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lemon-mint/en_ko_translation_purified_v0.1 | ---
license: mit
---
|
neila8/cai | ---
task_categories:
- question-answering
- text-generation
language:
- en
tags:
- finance
size_categories:
- n<1K
--- |
ldhldh/k1 | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_RESMPDEV__Gemma-Wukong1.1-2b | ---
pretty_name: Evaluation run of RESMPDEV/Gemma-Wukong1.1-2b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [RESMPDEV/Gemma-Wukong1.1-2b](https://huggingface.co/RESMPDEV/Gemma-Wukong1.1-2b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_RESMPDEV__Gemma-Wukong1.1-2b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-02T03:11:07.874950](https://huggingface.co/datasets/open-llm-leaderboard/details_RESMPDEV__Gemma-Wukong1.1-2b/blob/main/results_2024-03-02T03-11-07.874950.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.42112511469658714,\n\
\ \"acc_stderr\": 0.03440418776662734,\n \"acc_norm\": 0.42765526900902845,\n\
\ \"acc_norm_stderr\": 0.03534441824634386,\n \"mc1\": 0.22276621787025705,\n\
\ \"mc1_stderr\": 0.014566506961396756,\n \"mc2\": 0.47695573426393867,\n\
\ \"mc2_stderr\": 0.01699726842754026\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.31143344709897613,\n \"acc_stderr\": 0.013532472099850952,\n\
\ \"acc_norm\": 0.33447098976109213,\n \"acc_norm_stderr\": 0.013787460322441391\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3217486556462856,\n\
\ \"acc_stderr\": 0.004661924314756087,\n \"acc_norm\": 0.42421828321051586,\n\
\ \"acc_norm_stderr\": 0.004932137126625398\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.4222222222222222,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4407894736842105,\n \"acc_stderr\": 0.04040311062490437,\n\
\ \"acc_norm\": 0.4407894736842105,\n \"acc_norm_stderr\": 0.04040311062490437\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.49433962264150944,\n \"acc_stderr\": 0.030770900763851302,\n\
\ \"acc_norm\": 0.49433962264150944,\n \"acc_norm_stderr\": 0.030770900763851302\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4583333333333333,\n\
\ \"acc_stderr\": 0.04166666666666666,\n \"acc_norm\": 0.4583333333333333,\n\
\ \"acc_norm_stderr\": 0.04166666666666666\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3872832369942196,\n\
\ \"acc_stderr\": 0.037143259063020656,\n \"acc_norm\": 0.3872832369942196,\n\
\ \"acc_norm_stderr\": 0.037143259063020656\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.0379328118530781,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.0379328118530781\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3508771929824561,\n\
\ \"acc_stderr\": 0.044895393502706986,\n \"acc_norm\": 0.3508771929824561,\n\
\ \"acc_norm_stderr\": 0.044895393502706986\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.29365079365079366,\n \"acc_stderr\": 0.023456037383982026,\n \"\
acc_norm\": 0.29365079365079366,\n \"acc_norm_stderr\": 0.023456037383982026\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n\
\ \"acc_stderr\": 0.03970158273235172,\n \"acc_norm\": 0.2698412698412698,\n\
\ \"acc_norm_stderr\": 0.03970158273235172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.49032258064516127,\n\
\ \"acc_stderr\": 0.028438677998909558,\n \"acc_norm\": 0.49032258064516127,\n\
\ \"acc_norm_stderr\": 0.028438677998909558\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3891625615763547,\n \"acc_stderr\": 0.03430462416103872,\n\
\ \"acc_norm\": 0.3891625615763547,\n \"acc_norm_stderr\": 0.03430462416103872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.48484848484848486,\n \"acc_stderr\": 0.03902551007374448,\n\
\ \"acc_norm\": 0.48484848484848486,\n \"acc_norm_stderr\": 0.03902551007374448\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5151515151515151,\n \"acc_stderr\": 0.03560716516531061,\n \"\
acc_norm\": 0.5151515151515151,\n \"acc_norm_stderr\": 0.03560716516531061\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.538860103626943,\n \"acc_stderr\": 0.035975244117345775,\n\
\ \"acc_norm\": 0.538860103626943,\n \"acc_norm_stderr\": 0.035975244117345775\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.37948717948717947,\n \"acc_stderr\": 0.024603626924097406,\n\
\ \"acc_norm\": 0.37948717948717947,\n \"acc_norm_stderr\": 0.024603626924097406\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.22962962962962963,\n \"acc_stderr\": 0.02564410863926763,\n \
\ \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.02564410863926763\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.39915966386554624,\n \"acc_stderr\": 0.03181110032413925,\n\
\ \"acc_norm\": 0.39915966386554624,\n \"acc_norm_stderr\": 0.03181110032413925\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5357798165137615,\n \"acc_stderr\": 0.021382364775701893,\n \"\
acc_norm\": 0.5357798165137615,\n \"acc_norm_stderr\": 0.021382364775701893\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3055555555555556,\n \"acc_stderr\": 0.031415546294025445,\n \"\
acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.031415546294025445\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.49019607843137253,\n \"acc_stderr\": 0.035086373586305716,\n \"\
acc_norm\": 0.49019607843137253,\n \"acc_norm_stderr\": 0.035086373586305716\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5063291139240507,\n \"acc_stderr\": 0.03254462010767859,\n \
\ \"acc_norm\": 0.5063291139240507,\n \"acc_norm_stderr\": 0.03254462010767859\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5246636771300448,\n\
\ \"acc_stderr\": 0.03351695167652628,\n \"acc_norm\": 0.5246636771300448,\n\
\ \"acc_norm_stderr\": 0.03351695167652628\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.44274809160305345,\n \"acc_stderr\": 0.043564472026650695,\n\
\ \"acc_norm\": 0.44274809160305345,\n \"acc_norm_stderr\": 0.043564472026650695\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5867768595041323,\n \"acc_stderr\": 0.04495087843548408,\n \"\
acc_norm\": 0.5867768595041323,\n \"acc_norm_stderr\": 0.04495087843548408\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.46296296296296297,\n\
\ \"acc_stderr\": 0.04820403072760627,\n \"acc_norm\": 0.46296296296296297,\n\
\ \"acc_norm_stderr\": 0.04820403072760627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3803680981595092,\n \"acc_stderr\": 0.03814269893261837,\n\
\ \"acc_norm\": 0.3803680981595092,\n \"acc_norm_stderr\": 0.03814269893261837\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5048543689320388,\n \"acc_stderr\": 0.049505043821289195,\n\
\ \"acc_norm\": 0.5048543689320388,\n \"acc_norm_stderr\": 0.049505043821289195\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6452991452991453,\n\
\ \"acc_stderr\": 0.031342504862454025,\n \"acc_norm\": 0.6452991452991453,\n\
\ \"acc_norm_stderr\": 0.031342504862454025\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5683269476372924,\n\
\ \"acc_stderr\": 0.017712228939299794,\n \"acc_norm\": 0.5683269476372924,\n\
\ \"acc_norm_stderr\": 0.017712228939299794\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4190751445086705,\n \"acc_stderr\": 0.026564178111422625,\n\
\ \"acc_norm\": 0.4190751445086705,\n \"acc_norm_stderr\": 0.026564178111422625\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23128491620111732,\n\
\ \"acc_stderr\": 0.014102223623152579,\n \"acc_norm\": 0.23128491620111732,\n\
\ \"acc_norm_stderr\": 0.014102223623152579\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.028358956313423552,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.028358956313423552\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4180064308681672,\n\
\ \"acc_stderr\": 0.028013651891995072,\n \"acc_norm\": 0.4180064308681672,\n\
\ \"acc_norm_stderr\": 0.028013651891995072\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.47530864197530864,\n \"acc_stderr\": 0.02778680093142745,\n\
\ \"acc_norm\": 0.47530864197530864,\n \"acc_norm_stderr\": 0.02778680093142745\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.34397163120567376,\n \"acc_stderr\": 0.028338017428611317,\n \
\ \"acc_norm\": 0.34397163120567376,\n \"acc_norm_stderr\": 0.028338017428611317\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.34028683181225555,\n\
\ \"acc_stderr\": 0.012101217610223772,\n \"acc_norm\": 0.34028683181225555,\n\
\ \"acc_norm_stderr\": 0.012101217610223772\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3272058823529412,\n \"acc_stderr\": 0.02850145286039656,\n\
\ \"acc_norm\": 0.3272058823529412,\n \"acc_norm_stderr\": 0.02850145286039656\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4395424836601307,\n \"acc_stderr\": 0.020079420408087918,\n \
\ \"acc_norm\": 0.4395424836601307,\n \"acc_norm_stderr\": 0.020079420408087918\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661896,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661896\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.46938775510204084,\n \"acc_stderr\": 0.031949171367580624,\n\
\ \"acc_norm\": 0.46938775510204084,\n \"acc_norm_stderr\": 0.031949171367580624\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5024875621890548,\n\
\ \"acc_stderr\": 0.03535490150137289,\n \"acc_norm\": 0.5024875621890548,\n\
\ \"acc_norm_stderr\": 0.03535490150137289\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5672514619883041,\n \"acc_stderr\": 0.037999786443706066,\n\
\ \"acc_norm\": 0.5672514619883041,\n \"acc_norm_stderr\": 0.037999786443706066\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22276621787025705,\n\
\ \"mc1_stderr\": 0.014566506961396756,\n \"mc2\": 0.47695573426393867,\n\
\ \"mc2_stderr\": 0.01699726842754026\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5824782951854776,\n \"acc_stderr\": 0.013859978264440248\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/RESMPDEV/Gemma-Wukong1.1-2b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|arc:challenge|25_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|arc:challenge|25_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|gsm8k|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|gsm8k|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hellaswag|10_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hellaswag|10_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T02-51-42.614031.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T03-11-07.874950.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-02T03-11-07.874950.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- '**/details_harness|winogrande|5_2024-03-02T02-51-42.614031.parquet'
- split: 2024_03_02T03_11_07.874950
path:
- '**/details_harness|winogrande|5_2024-03-02T03-11-07.874950.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-02T03-11-07.874950.parquet'
- config_name: results
data_files:
- split: 2024_03_02T02_51_42.614031
path:
- results_2024-03-02T02-51-42.614031.parquet
- split: 2024_03_02T03_11_07.874950
path:
- results_2024-03-02T03-11-07.874950.parquet
- split: latest
path:
- results_2024-03-02T03-11-07.874950.parquet
---
# Dataset Card for Evaluation run of RESMPDEV/Gemma-Wukong1.1-2b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [RESMPDEV/Gemma-Wukong1.1-2b](https://huggingface.co/RESMPDEV/Gemma-Wukong1.1-2b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_RESMPDEV__Gemma-Wukong1.1-2b",
"harness_winogrande_5",
split="train")
```
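The aggregated scores live in the `results` configuration; here is a minimal sketch (using the `results` config and the `latest` split declared in the YAML header above) to inspect them:
```python
from datasets import load_dataset

# "results" config and "latest" split are both declared in the YAML header above.
agg = load_dataset("open-llm-leaderboard/details_RESMPDEV__Gemma-Wukong1.1-2b",
	"results",
	split="latest")
# One row per run is an assumption; print the first row to see the aggregated metrics.
print(agg[0])
```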
## Latest results
These are the [latest results from run 2024-03-02T03:11:07.874950](https://huggingface.co/datasets/open-llm-leaderboard/details_RESMPDEV__Gemma-Wukong1.1-2b/blob/main/results_2024-03-02T03-11-07.874950.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.42112511469658714,
"acc_stderr": 0.03440418776662734,
"acc_norm": 0.42765526900902845,
"acc_norm_stderr": 0.03534441824634386,
"mc1": 0.22276621787025705,
"mc1_stderr": 0.014566506961396756,
"mc2": 0.47695573426393867,
"mc2_stderr": 0.01699726842754026
},
"harness|arc:challenge|25": {
"acc": 0.31143344709897613,
"acc_stderr": 0.013532472099850952,
"acc_norm": 0.33447098976109213,
"acc_norm_stderr": 0.013787460322441391
},
"harness|hellaswag|10": {
"acc": 0.3217486556462856,
"acc_stderr": 0.004661924314756087,
"acc_norm": 0.42421828321051586,
"acc_norm_stderr": 0.004932137126625398
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4222222222222222,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.4222222222222222,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4407894736842105,
"acc_stderr": 0.04040311062490437,
"acc_norm": 0.4407894736842105,
"acc_norm_stderr": 0.04040311062490437
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.49433962264150944,
"acc_stderr": 0.030770900763851302,
"acc_norm": 0.49433962264150944,
"acc_norm_stderr": 0.030770900763851302
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.04166666666666666,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.04166666666666666
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3872832369942196,
"acc_stderr": 0.037143259063020656,
"acc_norm": 0.3872832369942196,
"acc_norm_stderr": 0.037143259063020656
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.0379328118530781,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.0379328118530781
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.43829787234042555,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.43829787234042555,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.044895393502706986,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.044895393502706986
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.023456037383982026,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.023456037383982026
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235172,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.49032258064516127,
"acc_stderr": 0.028438677998909558,
"acc_norm": 0.49032258064516127,
"acc_norm_stderr": 0.028438677998909558
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3891625615763547,
"acc_stderr": 0.03430462416103872,
"acc_norm": 0.3891625615763547,
"acc_norm_stderr": 0.03430462416103872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.48484848484848486,
"acc_stderr": 0.03902551007374448,
"acc_norm": 0.48484848484848486,
"acc_norm_stderr": 0.03902551007374448
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5151515151515151,
"acc_stderr": 0.03560716516531061,
"acc_norm": 0.5151515151515151,
"acc_norm_stderr": 0.03560716516531061
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.538860103626943,
"acc_stderr": 0.035975244117345775,
"acc_norm": 0.538860103626943,
"acc_norm_stderr": 0.035975244117345775
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.37948717948717947,
"acc_stderr": 0.024603626924097406,
"acc_norm": 0.37948717948717947,
"acc_norm_stderr": 0.024603626924097406
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.02564410863926763,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.02564410863926763
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.39915966386554624,
"acc_stderr": 0.03181110032413925,
"acc_norm": 0.39915966386554624,
"acc_norm_stderr": 0.03181110032413925
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5357798165137615,
"acc_stderr": 0.021382364775701893,
"acc_norm": 0.5357798165137615,
"acc_norm_stderr": 0.021382364775701893
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.031415546294025445,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.031415546294025445
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.49019607843137253,
"acc_stderr": 0.035086373586305716,
"acc_norm": 0.49019607843137253,
"acc_norm_stderr": 0.035086373586305716
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5063291139240507,
"acc_stderr": 0.03254462010767859,
"acc_norm": 0.5063291139240507,
"acc_norm_stderr": 0.03254462010767859
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5246636771300448,
"acc_stderr": 0.03351695167652628,
"acc_norm": 0.5246636771300448,
"acc_norm_stderr": 0.03351695167652628
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.44274809160305345,
"acc_stderr": 0.043564472026650695,
"acc_norm": 0.44274809160305345,
"acc_norm_stderr": 0.043564472026650695
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5867768595041323,
"acc_stderr": 0.04495087843548408,
"acc_norm": 0.5867768595041323,
"acc_norm_stderr": 0.04495087843548408
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.04820403072760627,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.04820403072760627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3803680981595092,
"acc_stderr": 0.03814269893261837,
"acc_norm": 0.3803680981595092,
"acc_norm_stderr": 0.03814269893261837
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.5048543689320388,
"acc_stderr": 0.049505043821289195,
"acc_norm": 0.5048543689320388,
"acc_norm_stderr": 0.049505043821289195
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6452991452991453,
"acc_stderr": 0.031342504862454025,
"acc_norm": 0.6452991452991453,
"acc_norm_stderr": 0.031342504862454025
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5683269476372924,
"acc_stderr": 0.017712228939299794,
"acc_norm": 0.5683269476372924,
"acc_norm_stderr": 0.017712228939299794
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4190751445086705,
"acc_stderr": 0.026564178111422625,
"acc_norm": 0.4190751445086705,
"acc_norm_stderr": 0.026564178111422625
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23128491620111732,
"acc_stderr": 0.014102223623152579,
"acc_norm": 0.23128491620111732,
"acc_norm_stderr": 0.014102223623152579
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.028358956313423552,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.028358956313423552
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4180064308681672,
"acc_stderr": 0.028013651891995072,
"acc_norm": 0.4180064308681672,
"acc_norm_stderr": 0.028013651891995072
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.47530864197530864,
"acc_stderr": 0.02778680093142745,
"acc_norm": 0.47530864197530864,
"acc_norm_stderr": 0.02778680093142745
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.34397163120567376,
"acc_stderr": 0.028338017428611317,
"acc_norm": 0.34397163120567376,
"acc_norm_stderr": 0.028338017428611317
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.34028683181225555,
"acc_stderr": 0.012101217610223772,
"acc_norm": 0.34028683181225555,
"acc_norm_stderr": 0.012101217610223772
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3272058823529412,
"acc_stderr": 0.02850145286039656,
"acc_norm": 0.3272058823529412,
"acc_norm_stderr": 0.02850145286039656
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4395424836601307,
"acc_stderr": 0.020079420408087918,
"acc_norm": 0.4395424836601307,
"acc_norm_stderr": 0.020079420408087918
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661896,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661896
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.46938775510204084,
"acc_stderr": 0.031949171367580624,
"acc_norm": 0.46938775510204084,
"acc_norm_stderr": 0.031949171367580624
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5024875621890548,
"acc_stderr": 0.03535490150137289,
"acc_norm": 0.5024875621890548,
"acc_norm_stderr": 0.03535490150137289
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5672514619883041,
"acc_stderr": 0.037999786443706066,
"acc_norm": 0.5672514619883041,
"acc_norm_stderr": 0.037999786443706066
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22276621787025705,
"mc1_stderr": 0.014566506961396756,
"mc2": 0.47695573426393867,
"mc2_stderr": 0.01699726842754026
},
"harness|winogrande|5": {
"acc": 0.5824782951854776,
"acc_stderr": 0.013859978264440248
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
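If you prefer working with the raw file instead of the `datasets` API, here is a minimal sketch for downloading the results JSON linked above (only the filename is taken from that link; the internal layout of the file is not guaranteed):
```python
import json
from huggingface_hub import hf_hub_download

# Fetch the raw results file for the latest run of this model.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_RESMPDEV__Gemma-Wukong1.1-2b",
    repo_type="dataset",
    filename="results_2024-03-02T03-11-07.874950.json",
)
with open(path) as f:
    run = json.load(f)
# The nested structure is an assumption, so start by inspecting the top-level keys.
print(sorted(run.keys()))
```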
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
AdamGrzesik/Samantha-PL-AG-axolotl | ---
license: apache-2.0
---
|
pandora-s/neural-bridge-rag-dataset-12000-google-translated | ---
tags:
- rag
- synthetic data
license: apache-2.0
language:
- fr
datasets:
- neural-bridge/rag-dataset-12000
---
# Overview
This is a repository where I will slowly translate [neural-bridge/rag-dataset-12000](https://huggingface.co/datasets/neural-bridge/rag-dataset-12000) into different languages with Google Translate.
As RAG datasets are quite scarce, I felt that this could be useful for many who seek to add RAG capabilities to their models!
# How?
There are no secrets; these are raw translations that might not be 100% reliable. I literally run the entire dataset through Google Translate overnight.
I'm prioritizing "quantity" over "quality" here. As previously stated, there is a lack of diverse datasets. Better to have some to play with than none... so here I am!
I do have the intention of doing proper and cleaner translations in the future... we will see.
# Languages:
| Language | Code | Status |
| ----------- | ----- | ----- |
| English (OG) | EN | ✔️ |
| French | FR | ✔️ |
| Spanish | ES | ✔️ |
| German | DE | ✔️ |
| Italian | IT | 〽️ |
| Portuguese | PT | 〽️ |
| Russian | RU | ❌ |
| Chinese | ZH | ❌ |
| Japanese | JA | ❌ |
| Arabic | AR | ❌ |
| Hindi | HI | ❌ |
| Korean | KO | ❌ |
| Dutch | NL | ❌ |
| ... | ... | ... |
PS: A few entries might be lost because of the simple way I'm doing this, but it's only a few.
# The Script:
In case anyone wants to know how I am doing this, here is a sample.
```py
import time
from googletrans import Translator
import pandas as pd
from tqdm import tqdm
## import logging
## timestamp = time.time()
## logging.basicConfig(filename=f'logs/{timestamp}.log', filemode='w', level = logging.DEBUG)
## logger = logging.getLogger("DatasetTranslator")
## logging.info("Logger Ready.")
def trans(path_og, path_save, src: str = 'en', dest: str = 'fr'):
translator = Translator()
## logging.info("Reading OG.")
## Edit this if you desire to read a different dataset format
df = pd.read_parquet(path_og)
#############################################################
new_df = []
## logging.info("Looping Translator...")
for i, row in tqdm(df.iterrows()):
for _ in range(5):
try:
## logging.debug(f"({i+1}) Translating from '{src}' to '{dest}': {row['context'][:20]} | {row['question'][:20]} | {row['answer'][:20]}")
ctx = translator.translate(row['context'], dest=dest, src=src).text
q = translator.translate(row['question'], dest=dest, src=src).text
a = translator.translate(row['answer'], dest=dest, src=src).text
new_row = {'language': dest, 'context': ctx, 'question': q, 'answer': a}
new_df.append(new_row)
## logging.debug(f"({i+1}) Translated from '{src}' to '{dest}': {ctx[:20]} | {q[:20]} | {a[:20]}")
break
except Exception as e:
## logging.error(e)
print(e)
time.sleep(1)
df = pd.DataFrame(data=new_df)
df.to_csv(path_save)
trans("test.parquet","test.csv")
``` |
heliosprime/twitter_dataset_1713000112 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 11496
num_examples: 25
download_size: 9941
dataset_size: 11496
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713000112"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/morikubo_nono_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of morikubo_nono/森久保乃々/모리쿠보노노 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of morikubo_nono/森久保乃々/모리쿠보노노 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags.
The core tags of this character are `brown_eyes, bangs, light_brown_hair, long_hair, drill_hair, earrings, brown_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 583.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/morikubo_nono_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 323.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/morikubo_nono_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1042 | 678.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/morikubo_nono_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 504.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/morikubo_nono_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1042 | 997.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/morikubo_nono_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/morikubo_nono_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
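The pre-packaged `IMG+TXT` archives from the table above can also be used without waifuc. Below is a minimal sketch (the image extensions and the one-`.txt`-per-image naming are assumptions about the archive layout):
```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/morikubo_nono_idolmastercinderellagirls',
    repo_type='dataset',
    filename='dataset-800.zip',
)
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair each image with its tag file (naming convention is an assumption)
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() in ('.png', '.jpg', '.jpeg', '.webp'):
        tag_path = os.path.join(dataset_dir, stem + '.txt')
        if os.path.exists(tag_path):
            with open(tag_path, encoding='utf-8') as f:
                print(name, f.read().strip())
```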
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | 1girl, solo, blue_dress, simple_background, ringlets, upper_body, white_background, puffy_short_sleeves, medium_hair, open_mouth, blush, sweat, @_@, stud_earrings, tears, hair_ornament |
| 1 | 5 |  |  |  |  |  | 1girl, blue_dress, blue_footwear, chibi, puffy_short_sleeves, ringlets, solo, collared_dress, shoes, closed_mouth, holding, standing, :3, blush, full_body, outdoors, sitting, white_background, white_socks |
| 2 | 5 |  |  |  |  |  | 1girl, blonde_hair, blush, looking_at_viewer, solo, upper_body, holding, flower, jewelry, long_sleeves, ascot, medium_hair, open_mouth, simple_background |
| 3 | 7 |  |  |  |  |  | 1girl, blush, simple_background, solo, white_background, bare_shoulders, smile, collarbone, looking_at_viewer, medium_hair, ringlets, sleeveless_dress, upper_body, blonde_hair, choker, jewelry |
| 4 | 13 |  |  |  |  |  | 1girl, jewelry, solo, blush, sleeveless, black_gloves, looking_at_viewer, smile, corset, green_dress, holding_microphone, open_mouth, simple_background, white_background, blonde_hair, frills, lace, @_@, hair_bow, hair_ornament, thighhighs |
| 5 | 5 |  |  |  |  |  | 1boy, 1girl, blush, hetero, solo_focus, nipples, nude, open_mouth, simple_background, sweat, @_@, bar_censor, cum, medium_hair, penis, small_breasts, tears, white_background, blonde_hair, handjob, navel, pussy, ringlets, sex |
| 6 | 5 |  |  |  |  |  | 1girl, hair_bow, ringlets, solo, blush, floral_print, looking_at_viewer, upper_body, wide_sleeves, frills, holding, jewelry, long_sleeves, open_mouth, print_kimono, ribbon, smile, blonde_hair |
| 7 | 15 |  |  |  |  |  | 1girl, solo, blush, frilled_bikini, floral_print, hair_flower, navel, ringlets, white_bikini, collarbone, wavy_mouth, looking_at_viewer, outdoors, print_bikini, small_breasts, water, bare_shoulders, blonde_hair, open_mouth |
| 8 | 9 |  |  |  |  |  | 1boy, 1girl, blush, hetero, solo_focus, open_mouth, penis, pussy, tears, sweat, vaginal, long_sleeves, mosaic_censoring, on_back, short_hair, wavy_mouth, blonde_hair, clothed_sex, dress, hair_flower, jaggy_lines, jewelry, oekaki, panties, spread_legs |
| 9 | 5 |  |  |  |  |  | 1girl, blush, one_side_up, pleated_skirt, school_uniform, solo, hair_scrunchie, long_sleeves, stud_earrings, white_background, cardigan, kogal, looking_at_viewer, nail_polish, necklace, simple_background, blue_skirt, cellphone, charm_(object), flying_sweatdrops, from_below, green_bow, green_scrunchie, hairclip, holding_phone, loose_bowtie, loose_socks, medium_breasts, open_mouth, pantyshot, school_bag, sitting, striped_panties, white_shirt, white_socks |
| 10 | 5 |  |  |  |  |  | 1girl, cosplay, ringlets, simple_background, solo, white_background, belt_buckle, black_footwear, shoes, sweat, full_body, long_sleeves, nose_blush, open_jacket, white_shirt, >_<, @_@, black_shirt, blue_pants, boots, brown_footwear, chain, closed_eyes, collared_shirt, fingerless_gloves, holding_weapon, open_mouth, parted_lips, standing, tears, wavy_mouth, white_gloves, white_jacket |
| 11 | 13 |  |  |  |  |  | 1girl, solo, bow, jewelry, blush, cape, long_sleeves, star_(symbol), fur-trimmed_cloak, looking_at_viewer, mini_crown, shorts, smile, side_ponytail, white_background |
| 12 | 9 |  |  |  |  |  | maid_headdress, blush, enmaided, frills, 1girl, black_dress, jewelry, solo, long_sleeves, open_mouth, puffy_sleeves, ringlets, blonde_hair, bow, looking_at_viewer, maid_apron, wavy_mouth, simple_background, white_apron, white_background |
| 13 | 5 |  |  |  |  |  | 1girl, @_@, fake_animal_ears, rabbit_ears, ringlets, solo, white_background, detached_collar, nose_blush, simple_background, sweat, bare_shoulders, black_bowtie, black_jacket, hand_up, navel, white_collar, wing_collar, wrist_cuffs, arm_behind_back, black_hairband, black_leotard, breasts, closed_mouth, hair_between_eyes, hands_up, long_sleeves, looking_away, no_pants, pantyhose, playboy_bunny, rabbit_tail, strapless_leotard, wavy_mouth, white_gloves, white_panties |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | blue_dress | simple_background | ringlets | upper_body | white_background | puffy_short_sleeves | medium_hair | open_mouth | blush | sweat | @_@ | stud_earrings | tears | hair_ornament | blue_footwear | chibi | collared_dress | shoes | closed_mouth | holding | standing | :3 | full_body | outdoors | sitting | white_socks | blonde_hair | looking_at_viewer | flower | jewelry | long_sleeves | ascot | bare_shoulders | smile | collarbone | sleeveless_dress | choker | sleeveless | black_gloves | corset | green_dress | holding_microphone | frills | lace | hair_bow | thighhighs | 1boy | hetero | solo_focus | nipples | nude | bar_censor | cum | penis | small_breasts | handjob | navel | pussy | sex | floral_print | wide_sleeves | print_kimono | ribbon | frilled_bikini | hair_flower | white_bikini | wavy_mouth | print_bikini | water | vaginal | mosaic_censoring | on_back | short_hair | clothed_sex | dress | jaggy_lines | oekaki | panties | spread_legs | one_side_up | pleated_skirt | school_uniform | hair_scrunchie | cardigan | kogal | nail_polish | necklace | blue_skirt | cellphone | charm_(object) | flying_sweatdrops | from_below | green_bow | green_scrunchie | hairclip | holding_phone | loose_bowtie | loose_socks | medium_breasts | pantyshot | school_bag | striped_panties | white_shirt | cosplay | belt_buckle | black_footwear | nose_blush | open_jacket | >_< | black_shirt | blue_pants | boots | brown_footwear | chain | closed_eyes | collared_shirt | fingerless_gloves | holding_weapon | parted_lips | white_gloves | white_jacket | bow | cape | star_(symbol) | fur-trimmed_cloak | mini_crown | shorts | side_ponytail | maid_headdress | enmaided | black_dress | puffy_sleeves | maid_apron | white_apron | fake_animal_ears | rabbit_ears | detached_collar | black_bowtie | black_jacket | hand_up | white_collar | wing_collar | wrist_cuffs | arm_behind_back | black_hairband | black_leotard | breasts | hair_between_eyes | hands_up | looking_away | no_pants | pantyhose | playboy_bunny | rabbit_tail | strapless_leotard | white_panties |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:-------|:-------------|:--------------------|:-----------|:-------------|:-------------------|:----------------------|:--------------|:-------------|:--------|:--------|:------|:----------------|:--------|:----------------|:----------------|:--------|:-----------------|:--------|:---------------|:----------|:-----------|:-----|:------------|:-----------|:----------|:--------------|:--------------|:--------------------|:---------|:----------|:---------------|:--------|:-----------------|:--------|:-------------|:-------------------|:---------|:-------------|:---------------|:---------|:--------------|:---------------------|:---------|:-------|:-----------|:-------------|:-------|:---------|:-------------|:----------|:-------|:-------------|:------|:--------|:----------------|:----------|:--------|:--------|:------|:---------------|:---------------|:---------------|:---------|:-----------------|:--------------|:---------------|:-------------|:---------------|:--------|:----------|:-------------------|:----------|:-------------|:--------------|:--------|:--------------|:---------|:----------|:--------------|:--------------|:----------------|:-----------------|:-----------------|:-----------|:--------|:--------------|:-----------|:-------------|:------------|:-----------------|:--------------------|:-------------|:------------|:------------------|:-----------|:----------------|:---------------|:--------------|:-----------------|:------------|:-------------|:------------------|:--------------|:----------|:--------------|:-----------------|:-------------|:--------------|:------|:--------------|:-------------|:--------|:-----------------|:--------|:--------------|:-----------------|:--------------------|:-----------------|:--------------|:---------------|:---------------|:------|:-------|:----------------|:--------------------|:-------------|:---------|:----------------|:-----------------|:-----------|:--------------|:----------------|:-------------|:--------------|:-------------------|:--------------|:------------------|:---------------|:---------------|:----------|:---------------|:--------------|:--------------|:------------------|:-----------------|:----------------|:----------|:--------------------|:-----------|:---------------|:-----------|:------------|:----------------|:--------------|:--------------------|:----------------|
| 0 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | | X | | X | X | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | | X | | X | | | X | X | X | | | | | | | | | | | X | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | X | | X | X | X | X | | X | | X | | | | | | | | | | | | | | | | | | X | X | | X | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 13 |  |  |  |  |  | X | X | | X | | | X | | | X | X | | X | | | X | | | | | | | | | | | | | X | X | | X | | | | X | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | | X | X | | X | | X | X | X | X | X | | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | X | | | X | X | | | | X | X | | | | | | | | | | | X | | | | | | | X | X | | X | X | | | X | | | | | | | | | X | | X | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 15 |  |  |  |  |  | X | X | | | X | | | | | X | X | | | | | | | | | | | | | | | X | | | X | X | | | | | X | | X | | | | | | | | | | | | | | | | | | | | X | | X | | | X | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 9 |  |  |  |  |  | X | | | | | | | | | X | X | X | | | X | | | | | | | | | | | | | | X | | | X | X | | | | | | | | | | | | | | | | X | X | X | | | | | X | | | | X | | | | | | | X | | X | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 5 |  |  |  |  |  | X | X | | X | | | X | | | X | X | | | X | | | | | | | | | | | | | X | X | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 5 |  |  |  |  |  | X | X | | X | X | | X | | | X | | X | X | | X | | | | | X | | | X | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 11 | 13 |  |  |  |  |  | X | X | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | X | | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 12 | 9 |  |  |  |  |  | X | X | | X | X | | X | | | X | X | | | | | | | | | | | | | | | | | | X | X | | X | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 13 | 5 |  |  |  |  |  | X | X | | X | X | | X | | | | | X | X | | | | | | | | X | | | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
phyloforfun/HLT_Kew_WCVP_SLTPvA_v1-0_small__T20-OCR-C25-L25-E50-R10 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 1045913
num_examples: 1000
download_size: 150662
dataset_size: 1045913
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_CorticalStack__mistral-7b-dolphin-sft | ---
pretty_name: Evaluation run of CorticalStack/mistral-7b-dolphin-sft
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CorticalStack/mistral-7b-dolphin-sft](https://huggingface.co/CorticalStack/mistral-7b-dolphin-sft)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CorticalStack__mistral-7b-dolphin-sft\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-16T14:55:12.739347](https://huggingface.co/datasets/open-llm-leaderboard/details_CorticalStack__mistral-7b-dolphin-sft/blob/main/results_2024-02-16T14-55-12.739347.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6226391506011404,\n\
\ \"acc_stderr\": 0.032752871244970075,\n \"acc_norm\": 0.6284357429187831,\n\
\ \"acc_norm_stderr\": 0.033420600664784014,\n \"mc1\": 0.32802937576499386,\n\
\ \"mc1_stderr\": 0.01643563293281503,\n \"mc2\": 0.4891471279958395,\n\
\ \"mc2_stderr\": 0.014787543186222349\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5486348122866894,\n \"acc_stderr\": 0.014542104569955269,\n\
\ \"acc_norm\": 0.5725255972696246,\n \"acc_norm_stderr\": 0.014456862944650647\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6243776140211114,\n\
\ \"acc_stderr\": 0.0048329345291207955,\n \"acc_norm\": 0.8301135232025493,\n\
\ \"acc_norm_stderr\": 0.0037476555337545158\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.042320736951515885,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.042320736951515885\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.0387813988879761,\n\
\ \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.0387813988879761\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.03656343653353159,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.03656343653353159\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055266,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055266\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7419354838709677,\n\
\ \"acc_stderr\": 0.024892469172462836,\n \"acc_norm\": 0.7419354838709677,\n\
\ \"acc_norm_stderr\": 0.024892469172462836\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.035145285621750094,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.035145285621750094\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338641,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338641\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.02578772318072387,\n\
\ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.02578772318072387\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6205128205128205,\n \"acc_stderr\": 0.024603626924097417,\n\
\ \"acc_norm\": 0.6205128205128205,\n \"acc_norm_stderr\": 0.024603626924097417\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066475,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066475\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6260504201680672,\n \"acc_stderr\": 0.03142946637883708,\n \
\ \"acc_norm\": 0.6260504201680672,\n \"acc_norm_stderr\": 0.03142946637883708\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8073394495412844,\n \"acc_stderr\": 0.01690927688493609,\n \"\
acc_norm\": 0.8073394495412844,\n \"acc_norm_stderr\": 0.01690927688493609\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n\
\ \"acc_stderr\": 0.02812597226565437,\n \"acc_norm\": 0.7990196078431373,\n\
\ \"acc_norm_stderr\": 0.02812597226565437\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467617,\n\
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467617\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7994891443167306,\n\
\ \"acc_stderr\": 0.014317653708594202,\n \"acc_norm\": 0.7994891443167306,\n\
\ \"acc_norm_stderr\": 0.014317653708594202\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.0246853168672578,\n\
\ \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.0246853168672578\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3139664804469274,\n\
\ \"acc_stderr\": 0.01552192393352364,\n \"acc_norm\": 0.3139664804469274,\n\
\ \"acc_norm_stderr\": 0.01552192393352364\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.02617390850671858,\n\
\ \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.02617390850671858\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.026082700695399665,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.026082700695399665\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.025483115601195455,\n\
\ \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.025483115601195455\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43741851368970014,\n\
\ \"acc_stderr\": 0.012669813464935722,\n \"acc_norm\": 0.43741851368970014,\n\
\ \"acc_norm_stderr\": 0.012669813464935722\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.02873932851398357,\n\
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.02873932851398357\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6290849673202614,\n \"acc_stderr\": 0.019542101564854128,\n \
\ \"acc_norm\": 0.6290849673202614,\n \"acc_norm_stderr\": 0.019542101564854128\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.028920583220675606,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.028920583220675606\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454132,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454132\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32802937576499386,\n\
\ \"mc1_stderr\": 0.01643563293281503,\n \"mc2\": 0.4891471279958395,\n\
\ \"mc2_stderr\": 0.014787543186222349\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7750591949486977,\n \"acc_stderr\": 0.011735043564126735\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.35784685367702807,\n \
\ \"acc_stderr\": 0.013204142536119939\n }\n}\n```"
repo_url: https://huggingface.co/CorticalStack/mistral-7b-dolphin-sft
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|arc:challenge|25_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|gsm8k|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hellaswag|10_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T14-55-12.739347.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-16T14-55-12.739347.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- '**/details_harness|winogrande|5_2024-02-16T14-55-12.739347.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-16T14-55-12.739347.parquet'
- config_name: results
data_files:
- split: 2024_02_16T14_55_12.739347
path:
- results_2024-02-16T14-55-12.739347.parquet
- split: latest
path:
- results_2024-02-16T14-55-12.739347.parquet
---
# Dataset Card for Evaluation run of CorticalStack/mistral-7b-dolphin-sft
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [CorticalStack/mistral-7b-dolphin-sft](https://huggingface.co/CorticalStack/mistral-7b-dolphin-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CorticalStack__mistral-7b-dolphin-sft",
"harness_winogrande_5",
split="train")
```
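If you are interested in the aggregated metrics rather than the per-example details, the same pattern applies to the "results" configuration. The following is a minimal sketch, assuming the configuration and split names declared in this card ("results" and "latest"):
```python
from datasets import load_dataset

# Load the aggregated results; the "latest" split points to the most recent run
results = load_dataset(
    "open-llm-leaderboard/details_CorticalStack__mistral-7b-dolphin-sft",
    "results",
    split="latest",
)

# One row per evaluation run, containing the aggregated scores
print(results[0])
```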
## Latest results
These are the [latest results from run 2024-02-16T14:55:12.739347](https://huggingface.co/datasets/open-llm-leaderboard/details_CorticalStack__mistral-7b-dolphin-sft/blob/main/results_2024-02-16T14-55-12.739347.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.6226391506011404,
"acc_stderr": 0.032752871244970075,
"acc_norm": 0.6284357429187831,
"acc_norm_stderr": 0.033420600664784014,
"mc1": 0.32802937576499386,
"mc1_stderr": 0.01643563293281503,
"mc2": 0.4891471279958395,
"mc2_stderr": 0.014787543186222349
},
"harness|arc:challenge|25": {
"acc": 0.5486348122866894,
"acc_stderr": 0.014542104569955269,
"acc_norm": 0.5725255972696246,
"acc_norm_stderr": 0.014456862944650647
},
"harness|hellaswag|10": {
"acc": 0.6243776140211114,
"acc_stderr": 0.0048329345291207955,
"acc_norm": 0.8301135232025493,
"acc_norm_stderr": 0.0037476555337545158
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.042320736951515885,
"acc_norm": 0.6,
"acc_norm_stderr": 0.042320736951515885
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.0387813988879761,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.0387813988879761
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.03656343653353159,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.03656343653353159
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055266,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055266
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7419354838709677,
"acc_stderr": 0.024892469172462836,
"acc_norm": 0.7419354838709677,
"acc_norm_stderr": 0.024892469172462836
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.035145285621750094,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.035145285621750094
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885417,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885417
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.02985751567338641,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.02985751567338641
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.02578772318072387,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.02578772318072387
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6205128205128205,
"acc_stderr": 0.024603626924097417,
"acc_norm": 0.6205128205128205,
"acc_norm_stderr": 0.024603626924097417
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066475,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066475
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6260504201680672,
"acc_stderr": 0.03142946637883708,
"acc_norm": 0.6260504201680672,
"acc_norm_stderr": 0.03142946637883708
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8073394495412844,
"acc_stderr": 0.01690927688493609,
"acc_norm": 0.8073394495412844,
"acc_norm_stderr": 0.01690927688493609
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.02812597226565437,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.02812597226565437
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.02675082699467617,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.02675082699467617
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7994891443167306,
"acc_stderr": 0.014317653708594202,
"acc_norm": 0.7994891443167306,
"acc_norm_stderr": 0.014317653708594202
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.0246853168672578,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.0246853168672578
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3139664804469274,
"acc_stderr": 0.01552192393352364,
"acc_norm": 0.3139664804469274,
"acc_norm_stderr": 0.01552192393352364
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.02617390850671858,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.02617390850671858
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.026082700695399665,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.026082700695399665
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.025483115601195455,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.025483115601195455
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43741851368970014,
"acc_stderr": 0.012669813464935722,
"acc_norm": 0.43741851368970014,
"acc_norm_stderr": 0.012669813464935722
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.02873932851398357,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.02873932851398357
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6290849673202614,
"acc_stderr": 0.019542101564854128,
"acc_norm": 0.6290849673202614,
"acc_norm_stderr": 0.019542101564854128
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.028920583220675606,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.028920583220675606
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454132,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454132
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32802937576499386,
"mc1_stderr": 0.01643563293281503,
"mc2": 0.4891471279958395,
"mc2_stderr": 0.014787543186222349
},
"harness|winogrande|5": {
"acc": 0.7750591949486977,
"acc_stderr": 0.011735043564126735
},
"harness|gsm8k|5": {
"acc": 0.35784685367702807,
"acc_stderr": 0.013204142536119939
}
}
```
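To work with the raw results file linked above rather than the parquet configurations, one option is to download and parse it directly. This is a sketch assuming the `huggingface_hub` library is installed; since the top-level layout of the JSON file may differ from the excerpt shown above, the sketch only inspects the keys before reading any metric:
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file referenced in the "Latest results" link above
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_CorticalStack__mistral-7b-dolphin-sft",
    filename="results_2024-02-16T14-55-12.739347.json",
    repo_type="dataset",
)

with open(path) as f:
    data = json.load(f)

# Inspect the structure first; the per-task scores shown above may be nested
# under an extra key rather than sitting at the top level.
print(list(data.keys()))
```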
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
AdapterOcean/med_alpaca_standardized_cluster_47_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 12127916
num_examples: 24264
download_size: 6213302
dataset_size: 12127916
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_47_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
worldboss/nia_faq_chat | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 36318
num_examples: 66
download_size: 20689
dataset_size: 36318
---
# Dataset Card for "nia_faq_chat"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_189 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1297981352.0
num_examples: 254906
download_size: 1323128010
dataset_size: 1297981352.0
---
# Dataset Card for "chunk_189"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-squad_v2-squad_v2-552ce2-1507654811 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- squad_v2
eval_info:
task: extractive_question_answering
model: navteca/roberta-large-squad2
metrics: []
dataset_name: squad_v2
dataset_config: squad_v2
dataset_split: validation
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: navteca/roberta-large-squad2
* Dataset: squad_v2
* Config: squad_v2
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
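As a rough sketch (the split names and column layout of the prediction files are not documented in this card, so inspect them rather than assuming a fixed schema), the stored predictions can be loaded with the `datasets` library:
```python
from datasets import load_dataset

# Load the prediction records stored in this repository; split names and
# columns are discovered at load time rather than assumed up front.
preds = load_dataset("autoevaluate/autoeval-eval-squad_v2-squad_v2-552ce2-1507654811")

for split_name, split in preds.items():
    print(split_name, split.num_rows, split.column_names)
```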
## Contributions
Thanks to [@tvdermeer](https://huggingface.co/tvdermeer) for evaluating this model. |
yentinglin/TaiwanChat | ---
license: cc-by-nc-4.0
task_categories:
- conversational
- text-generation
- text2text-generation
language:
- zh
pretty_name: Traditional Chinese Instruction-tuning Set
size_categories:
- 100K<n<1M
---
<img src="https://cdn-uploads.huggingface.co/production/uploads/5df9c78eda6d0311fd3d541f/CmusIT5OlSXvFrbTJ7l-C.png" alt="Taiwan LLM Logo" width="800" style="margin-left:'auto' margin-right:'auto' display:'block'"/>
## Performance

## Citation
If you find Taiwan LLM useful in your work, please cite it with:
```
@misc{lin2023taiwan,
title={Taiwan LLM: Bridging the Linguistic Divide with a Culturally Aligned Language Model},
author={Yen-Ting Lin and Yun-Nung Chen},
year={2023},
eprint={2311.17487},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
GraphWiz/GraphInstruct-Test | ---
license: apache-2.0
task_categories:
- text-generation
- graph-ml
language:
- en
size_categories:
- 1K<n<10K
configs:
- config_name: cycle
data_files:
- split: test
path: cycle_test.json
- config_name: connectivity
data_files:
- split: test
path: connectivity_test.json
- config_name: flow
data_files:
- split: test
path: flow_test.json
- config_name: bipartite
data_files:
- split: test
path: bipartite_test.json
- config_name: hamilton
data_files:
- split: test
path: hamilton_test.json
- config_name: shortest
data_files:
- split: test
path: shortest_test.json
- config_name: topology
data_files:
- split: test
path: topology_test.json
- config_name: substructure
data_files:
- split: test
path: substructure_test.json
- config_name: triangle
data_files:
- split: test
path: triangle_test.json
---
|
projetosoclts/bruna | ---
license: openrail
---
|
deepachalapathi/msrc_data | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 1434434
num_examples: 5801
- name: validation
num_bytes: 287084.61885881744
num_examples: 1161
download_size: 923273
dataset_size: 1721518.6188588175
---
# Dataset Card for "msrc_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Hack90/ncbi_genbank_part_65 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: sequence
dtype: string
- name: name
dtype: string
- name: description
dtype: string
- name: features
dtype: int64
- name: seq_length
dtype: int64
splits:
- name: train
num_bytes: 19753145244
num_examples: 1476991
download_size: 8576777094
dataset_size: 19753145244
---
# Dataset Card for "ncbi_genbank_part_65"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_jingyeom__freeze_KoSoLAR-10.7B-v0.2_1.4_dedup | ---
pretty_name: Evaluation run of jingyeom/freeze_KoSoLAR-10.7B-v0.2_1.4_dedup
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jingyeom/freeze_KoSoLAR-10.7B-v0.2_1.4_dedup](https://huggingface.co/jingyeom/freeze_KoSoLAR-10.7B-v0.2_1.4_dedup)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jingyeom__freeze_KoSoLAR-10.7B-v0.2_1.4_dedup\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-09T19:47:27.132798](https://huggingface.co/datasets/open-llm-leaderboard/details_jingyeom__freeze_KoSoLAR-10.7B-v0.2_1.4_dedup/blob/main/results_2024-02-09T19-47-27.132798.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6427828139999391,\n\
\ \"acc_stderr\": 0.03180043003386348,\n \"acc_norm\": 0.6500272402365154,\n\
\ \"acc_norm_stderr\": 0.03244696674206044,\n \"mc1\": 0.3047735618115055,\n\
\ \"mc1_stderr\": 0.016114124156882455,\n \"mc2\": 0.4449971855988083,\n\
\ \"mc2_stderr\": 0.01491170317496814\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5426621160409556,\n \"acc_stderr\": 0.01455810654392406,\n\
\ \"acc_norm\": 0.5844709897610921,\n \"acc_norm_stderr\": 0.01440136664121639\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5994821748655647,\n\
\ \"acc_stderr\": 0.0048900193560210865,\n \"acc_norm\": 0.8125871340370444,\n\
\ \"acc_norm_stderr\": 0.0038944505016930368\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.03639057569952929,\n\
\ \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.03639057569952929\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.69,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337142,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337142\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\"\
: 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n\
\ \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n\
\ \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n\
\ \"acc_stderr\": 0.04810840148082636,\n \"acc_norm\": 0.37254901960784315,\n\
\ \"acc_norm_stderr\": 0.04810840148082636\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n\
\ \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n\
\ \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.04685473041907789,\n\
\ \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.04685473041907789\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.6,\n \"acc_stderr\": 0.040824829046386284,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.040824829046386284\n },\n \"harness|hendrycksTest-elementary_mathematics|5\"\
: {\n \"acc\": 0.4576719576719577,\n \"acc_stderr\": 0.025658868862058336,\n\
\ \"acc_norm\": 0.4576719576719577,\n \"acc_norm_stderr\": 0.025658868862058336\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.02354079935872329,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.02354079935872329\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.0315841532404771,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.0315841532404771\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8383838383838383,\n \"acc_stderr\": 0.026225919863629283,\n \"\
acc_norm\": 0.8383838383838383,\n \"acc_norm_stderr\": 0.026225919863629283\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6410256410256411,\n \"acc_stderr\": 0.024321738484602354,\n\
\ \"acc_norm\": 0.6410256410256411,\n \"acc_norm_stderr\": 0.024321738484602354\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083025,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083025\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886786,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886786\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461756,\n \"\
acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461756\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6296296296296297,\n \"acc_stderr\": 0.03293377139415191,\n \"\
acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.03293377139415191\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553353,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553353\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8354430379746836,\n \"acc_stderr\": 0.024135736240566932,\n \
\ \"acc_norm\": 0.8354430379746836,\n \"acc_norm_stderr\": 0.024135736240566932\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7399103139013453,\n\
\ \"acc_stderr\": 0.029442495585857483,\n \"acc_norm\": 0.7399103139013453,\n\
\ \"acc_norm_stderr\": 0.029442495585857483\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8160919540229885,\n\
\ \"acc_stderr\": 0.01385372417092253,\n \"acc_norm\": 0.8160919540229885,\n\
\ \"acc_norm_stderr\": 0.01385372417092253\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577605,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577605\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2670391061452514,\n\
\ \"acc_stderr\": 0.014796502622562548,\n \"acc_norm\": 0.2670391061452514,\n\
\ \"acc_norm_stderr\": 0.014796502622562548\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7712418300653595,\n \"acc_stderr\": 0.02405102973991225,\n\
\ \"acc_norm\": 0.7712418300653595,\n \"acc_norm_stderr\": 0.02405102973991225\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.02465968518596729,\n\
\ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.02465968518596729\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4934810951760104,\n\
\ \"acc_stderr\": 0.012769150688867503,\n \"acc_norm\": 0.4934810951760104,\n\
\ \"acc_norm_stderr\": 0.012769150688867503\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7095588235294118,\n \"acc_stderr\": 0.027576468622740533,\n\
\ \"acc_norm\": 0.7095588235294118,\n \"acc_norm_stderr\": 0.027576468622740533\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6617647058823529,\n \"acc_stderr\": 0.019139943748487036,\n \
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.019139943748487036\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960238,\n\
\ \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960238\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.02484575321230604,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.02484575321230604\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \
\ \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3047735618115055,\n\
\ \"mc1_stderr\": 0.016114124156882455,\n \"mc2\": 0.4449971855988083,\n\
\ \"mc2_stderr\": 0.01491170317496814\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7908445146014207,\n \"acc_stderr\": 0.011430450045881573\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.32221379833206976,\n \
\ \"acc_stderr\": 0.012872435481188778\n }\n}\n```"
repo_url: https://huggingface.co/jingyeom/freeze_KoSoLAR-10.7B-v0.2_1.4_dedup
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|arc:challenge|25_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|gsm8k|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hellaswag|10_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T19-47-27.132798.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T19-47-27.132798.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- '**/details_harness|winogrande|5_2024-02-09T19-47-27.132798.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-09T19-47-27.132798.parquet'
- config_name: results
data_files:
- split: 2024_02_09T19_47_27.132798
path:
- results_2024-02-09T19-47-27.132798.parquet
- split: latest
path:
- results_2024-02-09T19-47-27.132798.parquet
---
# Dataset Card for Evaluation run of jingyeom/freeze_KoSoLAR-10.7B-v0.2_1.4_dedup
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jingyeom/freeze_KoSoLAR-10.7B-v0.2_1.4_dedup](https://huggingface.co/jingyeom/freeze_KoSoLAR-10.7B-v0.2_1.4_dedup) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jingyeom__freeze_KoSoLAR-10.7B-v0.2_1.4_dedup",
"harness_winogrande_5",
split="train")
```
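Similarly, the aggregated metrics reported below are stored in the `results` configuration; a minimal sketch, using only the `latest` split declared in the configuration list above, is:
```python
from datasets import load_dataset

# The "results" config aggregates the metrics of all tasks for this run;
# the "latest" split points to the most recent results parquet file.
results = load_dataset(
    "open-llm-leaderboard/details_jingyeom__freeze_KoSoLAR-10.7B-v0.2_1.4_dedup",
    "results",
    split="latest",
)
print(results[0])
```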
## Latest results
These are the [latest results from run 2024-02-09T19:47:27.132798](https://huggingface.co/datasets/open-llm-leaderboard/details_jingyeom__freeze_KoSoLAR-10.7B-v0.2_1.4_dedup/blob/main/results_2024-02-09T19-47-27.132798.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6427828139999391,
"acc_stderr": 0.03180043003386348,
"acc_norm": 0.6500272402365154,
"acc_norm_stderr": 0.03244696674206044,
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882455,
"mc2": 0.4449971855988083,
"mc2_stderr": 0.01491170317496814
},
"harness|arc:challenge|25": {
"acc": 0.5426621160409556,
"acc_stderr": 0.01455810654392406,
"acc_norm": 0.5844709897610921,
"acc_norm_stderr": 0.01440136664121639
},
"harness|hellaswag|10": {
"acc": 0.5994821748655647,
"acc_stderr": 0.0048900193560210865,
"acc_norm": 0.8125871340370444,
"acc_norm_stderr": 0.0038944505016930368
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.042763494943765995,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.042763494943765995
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7236842105263158,
"acc_stderr": 0.03639057569952929,
"acc_norm": 0.7236842105263158,
"acc_norm_stderr": 0.03639057569952929
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337142,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337142
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.040824829046386284,
"acc_norm": 0.6,
"acc_norm_stderr": 0.040824829046386284
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4576719576719577,
"acc_stderr": 0.025658868862058336,
"acc_norm": 0.4576719576719577,
"acc_norm_stderr": 0.025658868862058336
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.02354079935872329,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.02354079935872329
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.0315841532404771,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.0315841532404771
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8383838383838383,
"acc_stderr": 0.026225919863629283,
"acc_norm": 0.8383838383838383,
"acc_norm_stderr": 0.026225919863629283
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6410256410256411,
"acc_stderr": 0.024321738484602354,
"acc_norm": 0.6410256410256411,
"acc_norm_stderr": 0.024321738484602354
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083025,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083025
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886786,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461756,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461756
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.03293377139415191,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.03293377139415191
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553353,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553353
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8354430379746836,
"acc_stderr": 0.024135736240566932,
"acc_norm": 0.8354430379746836,
"acc_norm_stderr": 0.024135736240566932
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7399103139013453,
"acc_stderr": 0.029442495585857483,
"acc_norm": 0.7399103139013453,
"acc_norm_stderr": 0.029442495585857483
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8160919540229885,
"acc_stderr": 0.01385372417092253,
"acc_norm": 0.8160919540229885,
"acc_norm_stderr": 0.01385372417092253
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577605,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577605
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2670391061452514,
"acc_stderr": 0.014796502622562548,
"acc_norm": 0.2670391061452514,
"acc_norm_stderr": 0.014796502622562548
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7712418300653595,
"acc_stderr": 0.02405102973991225,
"acc_norm": 0.7712418300653595,
"acc_norm_stderr": 0.02405102973991225
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.02465968518596729,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.02465968518596729
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4934810951760104,
"acc_stderr": 0.012769150688867503,
"acc_norm": 0.4934810951760104,
"acc_norm_stderr": 0.012769150688867503
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7095588235294118,
"acc_stderr": 0.027576468622740533,
"acc_norm": 0.7095588235294118,
"acc_norm_stderr": 0.027576468622740533
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.019139943748487036,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.019139943748487036
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7510204081632653,
"acc_stderr": 0.027682979522960238,
"acc_norm": 0.7510204081632653,
"acc_norm_stderr": 0.027682979522960238
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.02484575321230604,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.02484575321230604
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882455,
"mc2": 0.4449971855988083,
"mc2_stderr": 0.01491170317496814
},
"harness|winogrande|5": {
"acc": 0.7908445146014207,
"acc_stderr": 0.011430450045881573
},
"harness|gsm8k|5": {
"acc": 0.32221379833206976,
"acc_stderr": 0.012872435481188778
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mask-distilled-one-sec-cv12/chunk_127 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1182591540
num_examples: 232245
download_size: 1207344898
dataset_size: 1182591540
---
# Dataset Card for "chunk_127"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fishytorts/new_dataset_test | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: audio_names
dtype: string
- name: labels
dtype: int64
splits:
- name: train
num_bytes: 12388426.0
num_examples: 6
download_size: 12391206
dataset_size: 12388426.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "new_dataset_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Neel-Gupta/minipile-processed_768 | ---
dataset_info:
features:
- name: text
sequence:
sequence:
sequence: int64
splits:
- name: train
num_bytes: 16663477824
num_examples: 1764
- name: test
num_bytes: 160589072
num_examples: 17
download_size: 1664616997
dataset_size: 16824066896
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
liuyanchen1015/MULTI_VALUE_cola_standing_stood | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 46
num_examples: 1
- name: train
num_bytes: 92
num_examples: 1
download_size: 3816
dataset_size: 138
---
# Dataset Card for "MULTI_VALUE_cola_standing_stood"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
guriko/autotrain-data-cv-sentiment | ---
language:
- en
task_categories:
- text-classification
---
# AutoTrain Dataset for project: cv-sentiment
## Dataset Description
This dataset has been automatically processed by AutoTrain for project cv-sentiment.
### Languages
The BCP-47 code for the dataset's language is en.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"text": "I have an educational background in the Information Technology, I graduated from Informatics Engineering at Parahyangan Catholic University in Bandung. I made a final project about Development of BPMS in Mobile Cordova Platform (Coordova Tasklist). I really excited learning new things such as my final project of learning about cordova and test the effectiveness and reusability in the business process management system.",
"target": 1
},
{
"text": "A college student who love technology and create projects about web and multi-platform apps.",
"target": 0
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"text": "Value(dtype='string', id=None)",
"target": "ClassLabel(names=['0', '1', '2', '3'], id=None)"
}
```
### Dataset Splits
This dataset is split into train and validation splits. The split sizes are as follows (a short loading sketch is given after the table):
| Split name | Num samples |
| ------------ | ------------------- |
| train | 77 |
| valid | 22 |
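As a minimal, hedged sketch of how one might load this dataset with the 🤗 `datasets` library — the repository id is taken from this card's header and the split names from the table above, so treat both as assumptions rather than verified values:
```python
from datasets import load_dataset

# Assumption: the AutoTrain data is hosted under this repo id and exposes the
# "train" and "valid" splits described in the table above.
dataset = load_dataset("guriko/autotrain-data-cv-sentiment")

train = dataset["train"]   # 77 samples with "text" and "target" fields
valid = dataset["valid"]   # 22 samples
print(train[0]["text"], train[0]["target"])
```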
|
HydraLM/partitioned_v3_standardized_025 | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
- name: unique_id
dtype: string
splits:
- name: train
num_bytes: 10684436.312722515
num_examples: 19870
download_size: 6109603
dataset_size: 10684436.312722515
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "partitioned_v3_standardized_025"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
thomasavare/waste-classification-v3 | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: Phrase
dtype: string
- name: Class
dtype: string
- name: Class_index
dtype: float64
splits:
- name: train
num_bytes: 1289389.2
num_examples: 16146
- name: validation
num_bytes: 429796.4
num_examples: 5382
- name: test
num_bytes: 429796.4
num_examples: 5382
download_size: 668259
dataset_size: 2148982.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
jester20/data0 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245921
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Tropy0/Tropy96485 | ---
license: bigcode-openrail-m
---
|
distilabel-internal-testing/airoboros-3.2-writing-oai-style-mini | ---
dataset_info:
features:
- name: id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 565952.1665728756
num_examples: 100
download_size: 323614
dataset_size: 565952.1665728756
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_fierysurf__Kan-LLaMA-7B-base | ---
pretty_name: Evaluation run of fierysurf/Kan-LLaMA-7B-base
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [fierysurf/Kan-LLaMA-7B-base](https://huggingface.co/fierysurf/Kan-LLaMA-7B-base)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fierysurf__Kan-LLaMA-7B-base\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-18T13:48:16.932348](https://huggingface.co/datasets/open-llm-leaderboard/details_fierysurf__Kan-LLaMA-7B-base/blob/main/results_2024-01-18T13-48-16.932348.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.37263074581051026,\n\
\ \"acc_stderr\": 0.0338849247942205,\n \"acc_norm\": 0.3774408949562487,\n\
\ \"acc_norm_stderr\": 0.03480722110246682,\n \"mc1\": 0.2484700122399021,\n\
\ \"mc1_stderr\": 0.015127427096520672,\n \"mc2\": 0.3957474692508163,\n\
\ \"mc2_stderr\": 0.014345144003847196\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4069965870307167,\n \"acc_stderr\": 0.014356399418009128,\n\
\ \"acc_norm\": 0.439419795221843,\n \"acc_norm_stderr\": 0.014503747823580127\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5163314080860386,\n\
\ \"acc_stderr\": 0.004987119003151497,\n \"acc_norm\": 0.7075283808006373,\n\
\ \"acc_norm_stderr\": 0.004539680764142161\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.31851851851851853,\n\
\ \"acc_stderr\": 0.040247784019771096,\n \"acc_norm\": 0.31851851851851853,\n\
\ \"acc_norm_stderr\": 0.040247784019771096\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.375,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.44,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4037735849056604,\n \"acc_stderr\": 0.03019761160019795,\n\
\ \"acc_norm\": 0.4037735849056604,\n \"acc_norm_stderr\": 0.03019761160019795\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.039420826399272135,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.039420826399272135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.32947976878612717,\n\
\ \"acc_stderr\": 0.03583901754736411,\n \"acc_norm\": 0.32947976878612717,\n\
\ \"acc_norm_stderr\": 0.03583901754736411\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.040233822736177476,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.040233822736177476\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3276595744680851,\n \"acc_stderr\": 0.030683020843231004,\n\
\ \"acc_norm\": 0.3276595744680851,\n \"acc_norm_stderr\": 0.030683020843231004\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3448275862068966,\n \"acc_stderr\": 0.03960933549451208,\n\
\ \"acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.03960933549451208\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24338624338624337,\n \"acc_stderr\": 0.02210112878741543,\n \"\
acc_norm\": 0.24338624338624337,\n \"acc_norm_stderr\": 0.02210112878741543\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.36129032258064514,\n\
\ \"acc_stderr\": 0.02732754844795754,\n \"acc_norm\": 0.36129032258064514,\n\
\ \"acc_norm_stderr\": 0.02732754844795754\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.23645320197044334,\n \"acc_stderr\": 0.02989611429173355,\n\
\ \"acc_norm\": 0.23645320197044334,\n \"acc_norm_stderr\": 0.02989611429173355\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03825460278380025,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03825460278380025\n },\n\
\ \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.4595959595959596,\n\
\ \"acc_stderr\": 0.03550702465131343,\n \"acc_norm\": 0.4595959595959596,\n\
\ \"acc_norm_stderr\": 0.03550702465131343\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\"\
: {\n \"acc\": 0.45595854922279794,\n \"acc_stderr\": 0.035944137112724366,\n\
\ \"acc_norm\": 0.45595854922279794,\n \"acc_norm_stderr\": 0.035944137112724366\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.358974358974359,\n \"acc_stderr\": 0.024321738484602354,\n \
\ \"acc_norm\": 0.358974358974359,\n \"acc_norm_stderr\": 0.024321738484602354\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3487394957983193,\n \"acc_stderr\": 0.030956636328566545,\n\
\ \"acc_norm\": 0.3487394957983193,\n \"acc_norm_stderr\": 0.030956636328566545\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.44220183486238535,\n \"acc_stderr\": 0.021293613207520216,\n \"\
acc_norm\": 0.44220183486238535,\n \"acc_norm_stderr\": 0.021293613207520216\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.25925925925925924,\n \"acc_stderr\": 0.02988691054762695,\n \"\
acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02988691054762695\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4117647058823529,\n \"acc_stderr\": 0.0345423658538061,\n \"acc_norm\"\
: 0.4117647058823529,\n \"acc_norm_stderr\": 0.0345423658538061\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.4008438818565401,\n \"acc_stderr\": 0.03190080389473236,\n \"\
acc_norm\": 0.4008438818565401,\n \"acc_norm_stderr\": 0.03190080389473236\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.40358744394618834,\n\
\ \"acc_stderr\": 0.03292802819330314,\n \"acc_norm\": 0.40358744394618834,\n\
\ \"acc_norm_stderr\": 0.03292802819330314\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.44274809160305345,\n \"acc_stderr\": 0.043564472026650695,\n\
\ \"acc_norm\": 0.44274809160305345,\n \"acc_norm_stderr\": 0.043564472026650695\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.4793388429752066,\n \"acc_stderr\": 0.045604560863872344,\n \"\
acc_norm\": 0.4793388429752066,\n \"acc_norm_stderr\": 0.045604560863872344\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.42592592592592593,\n\
\ \"acc_stderr\": 0.047803436269367894,\n \"acc_norm\": 0.42592592592592593,\n\
\ \"acc_norm_stderr\": 0.047803436269367894\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.37423312883435583,\n \"acc_stderr\": 0.03802068102899616,\n\
\ \"acc_norm\": 0.37423312883435583,\n \"acc_norm_stderr\": 0.03802068102899616\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.39805825242718446,\n \"acc_stderr\": 0.04846748253977239,\n\
\ \"acc_norm\": 0.39805825242718446,\n \"acc_norm_stderr\": 0.04846748253977239\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5427350427350427,\n\
\ \"acc_stderr\": 0.03263622596380688,\n \"acc_norm\": 0.5427350427350427,\n\
\ \"acc_norm_stderr\": 0.03263622596380688\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5019157088122606,\n\
\ \"acc_stderr\": 0.017879832259026677,\n \"acc_norm\": 0.5019157088122606,\n\
\ \"acc_norm_stderr\": 0.017879832259026677\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3959537572254335,\n \"acc_stderr\": 0.02632981334194624,\n\
\ \"acc_norm\": 0.3959537572254335,\n \"acc_norm_stderr\": 0.02632981334194624\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217892,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217892\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.02791405551046803,\n\
\ \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.02791405551046803\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4694533762057878,\n\
\ \"acc_stderr\": 0.02834504586484069,\n \"acc_norm\": 0.4694533762057878,\n\
\ \"acc_norm_stderr\": 0.02834504586484069\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.37962962962962965,\n \"acc_stderr\": 0.02700252103451649,\n\
\ \"acc_norm\": 0.37962962962962965,\n \"acc_norm_stderr\": 0.02700252103451649\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.29432624113475175,\n \"acc_stderr\": 0.027187127011503796,\n \
\ \"acc_norm\": 0.29432624113475175,\n \"acc_norm_stderr\": 0.027187127011503796\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.30964797913950454,\n\
\ \"acc_stderr\": 0.011808598262503321,\n \"acc_norm\": 0.30964797913950454,\n\
\ \"acc_norm_stderr\": 0.011808598262503321\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.028418208619406797,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.028418208619406797\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.32189542483660133,\n \"acc_stderr\": 0.018901015322093085,\n \
\ \"acc_norm\": 0.32189542483660133,\n \"acc_norm_stderr\": 0.018901015322093085\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4090909090909091,\n\
\ \"acc_stderr\": 0.04709306978661896,\n \"acc_norm\": 0.4090909090909091,\n\
\ \"acc_norm_stderr\": 0.04709306978661896\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.37551020408163266,\n \"acc_stderr\": 0.03100120903989484,\n\
\ \"acc_norm\": 0.37551020408163266,\n \"acc_norm_stderr\": 0.03100120903989484\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5174129353233831,\n\
\ \"acc_stderr\": 0.03533389234739245,\n \"acc_norm\": 0.5174129353233831,\n\
\ \"acc_norm_stderr\": 0.03533389234739245\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956913,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956913\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n\
\ \"acc_stderr\": 0.0362933532994786,\n \"acc_norm\": 0.3192771084337349,\n\
\ \"acc_norm_stderr\": 0.0362933532994786\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.543859649122807,\n \"acc_stderr\": 0.03820042586602966,\n\
\ \"acc_norm\": 0.543859649122807,\n \"acc_norm_stderr\": 0.03820042586602966\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2484700122399021,\n\
\ \"mc1_stderr\": 0.015127427096520672,\n \"mc2\": 0.3957474692508163,\n\
\ \"mc2_stderr\": 0.014345144003847196\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6850828729281768,\n \"acc_stderr\": 0.013054277568469231\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/fierysurf/Kan-LLaMA-7B-base
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|arc:challenge|25_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|gsm8k|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hellaswag|10_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T13-48-16.932348.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-18T13-48-16.932348.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- '**/details_harness|winogrande|5_2024-01-18T13-48-16.932348.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-18T13-48-16.932348.parquet'
- config_name: results
data_files:
- split: 2024_01_18T13_48_16.932348
path:
- results_2024-01-18T13-48-16.932348.parquet
- split: latest
path:
- results_2024-01-18T13-48-16.932348.parquet
---
# Dataset Card for Evaluation run of fierysurf/Kan-LLaMA-7B-base
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [fierysurf/Kan-LLaMA-7B-base](https://huggingface.co/fierysurf/Kan-LLaMA-7B-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fierysurf__Kan-LLaMA-7B-base",
"harness_winogrande_5",
split="train")
```
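Likewise, a minimal sketch for pulling the aggregated metrics, assuming the `results` configuration and `latest` split listed in the YAML header above:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run;
# the "latest" split points to the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_fierysurf__Kan-LLaMA-7B-base",
                       "results",
                       split="latest")

# Each row corresponds to one evaluation run; inspect the first (and only) one.
print(results[0])
```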
## Latest results
These are the [latest results from run 2024-01-18T13:48:16.932348](https://huggingface.co/datasets/open-llm-leaderboard/details_fierysurf__Kan-LLaMA-7B-base/blob/main/results_2024-01-18T13-48-16.932348.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.37263074581051026,
"acc_stderr": 0.0338849247942205,
"acc_norm": 0.3774408949562487,
"acc_norm_stderr": 0.03480722110246682,
"mc1": 0.2484700122399021,
"mc1_stderr": 0.015127427096520672,
"mc2": 0.3957474692508163,
"mc2_stderr": 0.014345144003847196
},
"harness|arc:challenge|25": {
"acc": 0.4069965870307167,
"acc_stderr": 0.014356399418009128,
"acc_norm": 0.439419795221843,
"acc_norm_stderr": 0.014503747823580127
},
"harness|hellaswag|10": {
"acc": 0.5163314080860386,
"acc_stderr": 0.004987119003151497,
"acc_norm": 0.7075283808006373,
"acc_norm_stderr": 0.004539680764142161
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.040247784019771096,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.040247784019771096
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.375,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.375,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4037735849056604,
"acc_stderr": 0.03019761160019795,
"acc_norm": 0.4037735849056604,
"acc_norm_stderr": 0.03019761160019795
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.039420826399272135,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.039420826399272135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.32947976878612717,
"acc_stderr": 0.03583901754736411,
"acc_norm": 0.32947976878612717,
"acc_norm_stderr": 0.03583901754736411
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.040233822736177476,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.040233822736177476
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3276595744680851,
"acc_stderr": 0.030683020843231004,
"acc_norm": 0.3276595744680851,
"acc_norm_stderr": 0.030683020843231004
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322004,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322004
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.03960933549451208,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.03960933549451208
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24338624338624337,
"acc_stderr": 0.02210112878741543,
"acc_norm": 0.24338624338624337,
"acc_norm_stderr": 0.02210112878741543
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.36129032258064514,
"acc_stderr": 0.02732754844795754,
"acc_norm": 0.36129032258064514,
"acc_norm_stderr": 0.02732754844795754
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.23645320197044334,
"acc_stderr": 0.02989611429173355,
"acc_norm": 0.23645320197044334,
"acc_norm_stderr": 0.02989611429173355
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4,
"acc_stderr": 0.03825460278380025,
"acc_norm": 0.4,
"acc_norm_stderr": 0.03825460278380025
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4595959595959596,
"acc_stderr": 0.03550702465131343,
"acc_norm": 0.4595959595959596,
"acc_norm_stderr": 0.03550702465131343
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.45595854922279794,
"acc_stderr": 0.035944137112724366,
"acc_norm": 0.45595854922279794,
"acc_norm_stderr": 0.035944137112724366
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.358974358974359,
"acc_stderr": 0.024321738484602354,
"acc_norm": 0.358974358974359,
"acc_norm_stderr": 0.024321738484602354
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3487394957983193,
"acc_stderr": 0.030956636328566545,
"acc_norm": 0.3487394957983193,
"acc_norm_stderr": 0.030956636328566545
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.44220183486238535,
"acc_stderr": 0.021293613207520216,
"acc_norm": 0.44220183486238535,
"acc_norm_stderr": 0.021293613207520216
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02988691054762695,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02988691054762695
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.0345423658538061,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.0345423658538061
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4008438818565401,
"acc_stderr": 0.03190080389473236,
"acc_norm": 0.4008438818565401,
"acc_norm_stderr": 0.03190080389473236
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.40358744394618834,
"acc_stderr": 0.03292802819330314,
"acc_norm": 0.40358744394618834,
"acc_norm_stderr": 0.03292802819330314
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.44274809160305345,
"acc_stderr": 0.043564472026650695,
"acc_norm": 0.44274809160305345,
"acc_norm_stderr": 0.043564472026650695
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4793388429752066,
"acc_stderr": 0.045604560863872344,
"acc_norm": 0.4793388429752066,
"acc_norm_stderr": 0.045604560863872344
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.047803436269367894,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.047803436269367894
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.37423312883435583,
"acc_stderr": 0.03802068102899616,
"acc_norm": 0.37423312883435583,
"acc_norm_stderr": 0.03802068102899616
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.39805825242718446,
"acc_stderr": 0.04846748253977239,
"acc_norm": 0.39805825242718446,
"acc_norm_stderr": 0.04846748253977239
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5427350427350427,
"acc_stderr": 0.03263622596380688,
"acc_norm": 0.5427350427350427,
"acc_norm_stderr": 0.03263622596380688
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5019157088122606,
"acc_stderr": 0.017879832259026677,
"acc_norm": 0.5019157088122606,
"acc_norm_stderr": 0.017879832259026677
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3959537572254335,
"acc_stderr": 0.02632981334194624,
"acc_norm": 0.3959537572254335,
"acc_norm_stderr": 0.02632981334194624
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217892,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217892
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.02791405551046803,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.02791405551046803
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4694533762057878,
"acc_stderr": 0.02834504586484069,
"acc_norm": 0.4694533762057878,
"acc_norm_stderr": 0.02834504586484069
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.37962962962962965,
"acc_stderr": 0.02700252103451649,
"acc_norm": 0.37962962962962965,
"acc_norm_stderr": 0.02700252103451649
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.29432624113475175,
"acc_stderr": 0.027187127011503796,
"acc_norm": 0.29432624113475175,
"acc_norm_stderr": 0.027187127011503796
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.30964797913950454,
"acc_stderr": 0.011808598262503321,
"acc_norm": 0.30964797913950454,
"acc_norm_stderr": 0.011808598262503321
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.028418208619406797,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.028418208619406797
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.32189542483660133,
"acc_stderr": 0.018901015322093085,
"acc_norm": 0.32189542483660133,
"acc_norm_stderr": 0.018901015322093085
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4090909090909091,
"acc_stderr": 0.04709306978661896,
"acc_norm": 0.4090909090909091,
"acc_norm_stderr": 0.04709306978661896
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.37551020408163266,
"acc_stderr": 0.03100120903989484,
"acc_norm": 0.37551020408163266,
"acc_norm_stderr": 0.03100120903989484
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5174129353233831,
"acc_stderr": 0.03533389234739245,
"acc_norm": 0.5174129353233831,
"acc_norm_stderr": 0.03533389234739245
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3192771084337349,
"acc_stderr": 0.0362933532994786,
"acc_norm": 0.3192771084337349,
"acc_norm_stderr": 0.0362933532994786
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.543859649122807,
"acc_stderr": 0.03820042586602966,
"acc_norm": 0.543859649122807,
"acc_norm_stderr": 0.03820042586602966
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2484700122399021,
"mc1_stderr": 0.015127427096520672,
"mc2": 0.3957474692508163,
"mc2_stderr": 0.014345144003847196
},
"harness|winogrande|5": {
"acc": 0.6850828729281768,
"acc_stderr": 0.013054277568469231
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
calaisc/288-demo | ---
license: pddl
---
|
pankajemplay/llama-intent-1615 | ---
dataset_info:
features:
- name: User Query
dtype: string
- name: Intent
dtype: string
- name: id type
dtype: string
- name: id value
dtype: string
- name: id slot filled
dtype: bool
- name: Task
dtype: string
- name: task slot filled
dtype: bool
- name: Bot Response
dtype: string
- name: text
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 759033
num_examples: 1615
download_size: 221927
dataset_size: 759033
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama-intent-1615"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
McGill-NLP/statcan-dialogue-dataset | ---
task_categories:
- conversational
- table-question-answering
language:
- en
- fr
extra_gated_prompt: "You agree to not attempt to determine the identity of individuals in this dataset"
extra_gated_fields:
Full Name: text
Affiliation: text
Country: text
Academic/Work Email Address: text
I agree to follow the terms of use: checkbox
I have read and will respect the restrictions: checkbox
pretty_name: Statcan Dialogue Dataset
size_categories:
- 1K<n<10K
---
# Statcan Dialogue Dataset
<div align="center">
[**💻Code**](https://github.com/mcGill-NLP/statcan-dialogue-dataset) | [**📄Paper**](https://arxiv.org/abs/2304.01412) | [**🌐Homepage**](https://mcgill-nlp.github.io/statcan-dialogue-dataset) | [**🤗Huggingface**](https://huggingface.co/datasets/McGill-NLP/statcan-dialogue-dataset) | [**🐦Tweets**](https://twitter.com/xhluca/status/1648728708142727180) | [**📺Video**](https://aclanthology.org/2023.eacl-main.206.mp4) |
| :--: | :--: | :--: | :--: | :--: | :--: |
[**The StatCan Dialogue Dataset: Retrieving Data Tables through Conversations with Genuine Intents**](https://arxiv.org/abs/2304.01412)\
[*Xing Han Lu*](https://xinghanlu.com), [*Siva Reddy*](https://sivareddy.in), [*Harm de Vries*](https://www.harmdevries.com/)\
EACL 2023

</div>
## Access
To access this dataset, you must read and accept the following terms of use and restrictions, then request access with your academic or professional email.
We will manually review each request. To ensure your request is not rejected, make sure that:
- Your Hugging Face account is linked to your professional/research website, which we may review to ensure the dataset will be used for the intended purpose
- Your request is made with an academic (e.g. `.edu`) or professional email (e.g. `@servicenow.com`). To do this, you have to set your primary email to your academic/professional email, or create a new Hugging Face account.
If your academic institution's email does not end with `.edu`, or you are part of a professional group that does not have an email address, please contact us (see the email in the paper).
### Terms of use
Researchers must agree to the following terms:
1. These data represent anonymized (de-identified) data from individuals. Best efforts have been implemented to ensure that all directly and indirectly identifiable information has been removed. Researchers who download this dataset must agree to notify Graeme Gilmour (`graeme.gilmour <at> statcan.gc.ca`) and Harm de Vries (`harm.devries <at> servicenow.com`) if any inadvertently remaining identifiable information is discovered during the process of re-using this dataset. Researchers must agree to destroy any version of this dataset containing identifiable information.
2. The terms of this dataset require that reusers give credit to the creators. It allows reusers to distribute, remix, adapt, and build upon the material in any medium or format, even for commercial purposes.
3. Have read and acknowledged the Appendix B (Dataset Card) of the latest version of the paper prior to using the dataset.
### Restrictions
Downloaders cannot:
1. obtain information from the dataset that results in the researcher or any third party(ies) directly or indirectly identifying any participant with the aid of other information acquired elsewhere;
2. produce connections or links among or between the information included in the dataset and other third-party information that could be used to identify any individuals; and
3. extract information from the dataset that could aid researchers (downloaders) in gaining knowledge about or obtaining any means of contacting any individuals already known to the downloader/researcher
## Quickstart
Quickstart code is available in the Readme and on the user guide (see [documentation](https://mcgill-nlp.github.io/statcan-dialogue-dataset/docs)).
## Dataset Card
Please refer to Appendix B of the manuscript.
## Usage on Huggingface `datasets`
It is recommended to use the `statcan-dialogue-dataset` library to access the dataset, which you can install with `pip install statcan-dialogue-dataset` and learn about in the [documentation](https://mcgill-nlp.github.io/statcan-dialogue-dataset/docs).
However, it is possible to load certain files directly with Hugging Face `datasets` (for other files, you will need to use the `statcan-dialogue-dataset` library):
```python
from datasets import load_dataset
# Load retrieval task data (without bm25 hard negatives)
ds_ret = load_dataset("McGill-NLP/statcan-dialogue-dataset", data_dir="retrieval")
# Load generation task data (without retrieval augmentations)
ds_gen = load_dataset("McGill-NLP/statcan-dialogue-dataset", data_dir="generation")
# Load french version of datasets
ds_ret_fr = load_dataset("McGill-NLP/statcan-dialogue-dataset", data_dir="retrieval_fr")
ds_gen_fr = load_dataset("McGill-NLP/statcan-dialogue-dataset", data_dir="generation_fr")
```
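As a quick sanity check, a minimal sketch (making no assumptions about split or column names) for inspecting what was loaded:
```python
# Print the splits, row counts, and columns of the retrieval task data loaded above.
for split_name, split in ds_ret.items():
    print(split_name, split.num_rows, split.column_names)
```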
> **IMPORTANT NOTE**: Do not download the content of this repository into `~/.statcan_dialogue_dataset/`, as this will cause conflicts with the `statcan-dialogue-dataset` library. As you may have noticed, the file names and paths are different: the files and directories here have been modified from the original files located in `task_data.zip`. If you need to cache the files, please use the default Hugging Face cache directory.
## Citation
If you use our dataset, please cite as follows:
```bibtex
@inproceedings{lu-etal-2023-statcan,
title = "The {S}tat{C}an Dialogue Dataset: Retrieving Data Tables through Conversations with Genuine Intents",
author = "Lu, Xing Han and
Reddy, Siva and
de Vries, Harm",
booktitle = "Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics",
month = may,
year = "2023",
address = "Dubrovnik, Croatia",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/2304.01412",
pages = "2799--2829",
}
```
|
AnhTong/vi_dataset | ---
dataset_info:
features:
- name: title
dtype: string
- name: link
dtype: string
- name: content
dtype: string
splits:
- name: astronomy
num_bytes: 5509853
num_examples: 1163
- name: cacnuoc
num_bytes: 1849582
num_examples: 373
- name: hocvan12
num_bytes: 3700549
num_examples: 584
- name: marketing
num_bytes: 1395360
num_examples: 304
- name: molympiad
num_bytes: 11949913
num_examples: 4488
- name: sinhhocvn
num_bytes: 1201768
num_examples: 142
- name: vansudia
num_bytes: 85849474
num_examples: 9045
- name: kimca
num_bytes: 2126678
num_examples: 902
- name: toidicodedao
num_bytes: 3045055
num_examples: 498
download_size: 57946392
dataset_size: 116628232
---
# Dataset Card for "vi_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lkh9908/CleanedCombinedHub | ---
dataset_info:
features:
- name: id
dtype: string
- name: abstract
dtype: string
- name: highlights
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 43140090
num_examples: 27175
download_size: 23291566
dataset_size: 43140090
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|