Columns: `datasetId` (string, 2–117 characters) and `card` (string, 19 characters–1.01M).
xezpeleta/ccmatrix
---
annotations_creators:
- found
language_creators:
- found
language:
- af
- am
- ar
- ast
- az
- be
- bg
- bn
- br
- ca
- ceb
- cs
- cy
- da
- de
- el
- en
- eo
- es
- et
- eu
- fa
- fi
- fr
- fy
- ga
- gd
- gl
- ha
- he
- hi
- hr
- hu
- hy
- id
- ig
- ilo
- is
- it
- ja
- jv
- ka
- kk
- km
- ko
- la
- lb
- lg
- lt
- lv
- mg
- mk
- ml
- mr
- ms
- my
- ne
- nl
- 'no'
- oc
- om
- or
- pl
- pt
- ro
- ru
- sd
- si
- sk
- sl
- so
- sq
- sr
- su
- sv
- sw
- ta
- tl
- tr
- tt
- uk
- ur
- uz
- vi
- wo
- xh
- yi
- yo
- zh
- zu
- se
license:
- unknown
multilinguality:
- multilingual
size_categories:
- 100M<n<1B
source_datasets:
- original
task_categories:
- text2text-generation
- translation
task_ids: []
paperswithcode_id: ccmatrix
pretty_name: CCMatrixV1
tags:
- conditional-text-generation
---

# Dataset Card for CCMatrix v1

## Table of Contents

- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)

## Dataset Description

- **Homepage:** https://opus.nlpl.eu/CCMatrix.php
- **Repository:** None
- **Paper:** https://arxiv.org/abs/1911.04944

### Dataset Summary

This corpus has been extracted from web crawls using the margin-based bitext mining techniques described at https://github.com/facebookresearch/LASER/tree/master/tasks/CCMatrix.

* 90 languages, 1,197 bitexts
* total number of files: 90
* total number of tokens: 112.14G
* total number of sentence fragments: 7.37G

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

Configs are generated for all language pairs in both directions. You can find the valid pairs in the Homepage section of the Dataset Description: https://opus.nlpl.eu/CCMatrix.php

E.g.

```python
from datasets import load_dataset

dataset = load_dataset("yhavinga/ccmatrix", "en-nl", streaming=True)
```

This will open the `en-nl` dataset in streaming mode. Without streaming, download and preparation will take tens of minutes. You can inspect elements with:

```python
print(next(iter(dataset['train'])))
{'id': 0, 'score': 1.2499677, 'translation': {'en': 'They come from all parts of Egypt, just like they will at the day of His coming.', 'nl': 'Zij kwamen uit alle delen van Egypte, evenals zij op de dag van Zijn komst zullen doen.'}}
```

## Dataset Structure

### Data Instances

For example:

```json
{
  "id": 1,
  "score": 1.2498379,
  "translation": {
    "nl": "En we moeten elke waarheid vals noemen die niet minstens door een lach vergezeld ging.”",
    "en": "And we should call every truth false which was not accompanied by at least one laugh.”"
  }
}
```

### Data Fields

Each example contains an integer `id` starting at 0, a mining `score`, and a `translation` dictionary with the texts for the two languages of the pair.

### Data Splits

Only a `train` split is provided.

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

[More Information Needed]

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

[More Information Needed]

#### Annotation process

[More Information Needed]

#### Who are the annotators?
[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

IMPORTANT: Please cite references [2] and [3] if you use this data.

1. **[CCNet: Extracting High Quality Monolingual Datasets from Web Crawl Data](https://arxiv.org/abs/1911.00359)** by *Guillaume Wenzek, Marie-Anne Lachaux, Alexis Conneau, Vishrav Chaudhary, Francisco Guzmán, Armand Joulin and Edouard Grave*.
2. **[CCMatrix: Mining Billions of High-Quality Parallel Sentences on the WEB](https://arxiv.org/abs/1911.04944)** by *Holger Schwenk, Guillaume Wenzek, Sergey Edunov, Edouard Grave and Armand Joulin*.
3. **[Beyond English-Centric Multilingual Machine Translation](https://arxiv.org/abs/2010.11125)** by *Angela Fan, Shruti Bhosale, Holger Schwenk, Zhiyi Ma, Ahmed El-Kishky, Siddharth Goyal, Mandeep Baines, Onur Celebi, Guillaume Wenzek, Vishrav Chaudhary, Naman Goyal, Tom Birch, Vitaliy Liptchinsky, Sergey Edunov, Edouard Grave, Michael Auli, and Armand Joulin*.

This HuggingFace CCMatrix dataset is a wrapper around the service and files prepared and hosted by OPUS:

* **[Parallel Data, Tools and Interfaces in OPUS](https://www.aclweb.org/anthology/L12-1246/)** by *Jörg Tiedemann*.

### Contributions
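The fields described above (an integer `id`, a margin-based mining `score`, and a two-key `translation` dict) lend themselves to score filtering. A minimal sketch on hand-written sample records mimicking the card's schema; the threshold value 1.06 is an arbitrary illustration, not a recommendation from the card:

```python
# Filter CCMatrix-style records by their margin-based mining score.
# Each record follows the card's schema:
#   {"id": int, "score": float, "translation": {lang1: str, lang2: str}}
def filter_by_score(records, min_score):
    """Keep only sentence pairs whose mining score is at least min_score."""
    return [r for r in records if r["score"] >= min_score]

records = [
    {"id": 0, "score": 1.2499677,
     "translation": {"en": "They come from all parts of Egypt.",
                     "nl": "Zij kwamen uit alle delen van Egypte."}},
    {"id": 1, "score": 1.04,
     "translation": {"en": "A lower-confidence pair.",
                     "nl": "Een paar met lagere score."}},
]

# Keep only the higher-confidence pair (illustrative threshold).
kept = filter_by_score(records, 1.06)
print([r["id"] for r in kept])  # [0]
```

The same predicate can be applied lazily to the streaming iterator shown earlier.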
dim/leetcodesolutions_en_2k
---
license: mit
dataset_info:
  features:
  - name: instruction
    dtype: string
  - name: input
    dtype: string
  - name: output
    dtype: string
  splits:
  - name: train
    num_bytes: 4847444
    num_examples: 2048
  download_size: 937266
  dataset_size: 4847444
---
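Each row carries `instruction`, `input`, and `output` strings, the usual instruction-tuning layout. A minimal sketch of joining one row into a prompt/completion pair; the joining template is an assumption for illustration, not part of the dataset:

```python
def to_prompt(example):
    """Join an instruction-tuning row into a (prompt, completion) pair.

    The "\n\n" template is a hypothetical choice, not prescribed by the dataset.
    """
    parts = [example["instruction"]]
    if example["input"]:  # `input` may be empty for instruction-only rows
        parts.append(example["input"])
    return "\n\n".join(parts), example["output"]

# Hypothetical example row in the card's schema.
prompt, completion = to_prompt({
    "instruction": "Solve the two-sum problem.",
    "input": "nums = [2, 7, 11, 15], target = 9",
    "output": "def two_sum(nums, target): ...",
})
print(prompt)
```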
CyberHarem/sockdolager_neuralcloud
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---

# Dataset of sockdolager/サックダラジャー/莎克拉戈 (Neural Cloud)

This is the dataset of sockdolager/サックダラジャー/莎克拉戈 (Neural Cloud), containing 13 images and their tags. The core tags of this character are `brown_hair, long_hair, brown_eyes, red_eyes, ribbon`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------|:-----------|:------------|
| raw | 13 | 16.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sockdolager_neuralcloud/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 13 | 8.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sockdolager_neuralcloud/resolve/main/dataset-800.zip) | IMG+TXT | Dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 29 | 18.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sockdolager_neuralcloud/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 13 | 14.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sockdolager_neuralcloud/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 29 | 29.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sockdolager_neuralcloud/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/sockdolager_neuralcloud',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; some outfits may be mined here.
### Raw Text Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:------|:------|:------|:------|:------|:-----|
| 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, smile, open_mouth, looking_at_viewer, blush, brown_gloves, holding, puffy_sleeves |

### Table Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | smile | open_mouth | looking_at_viewer | blush | brown_gloves | holding | puffy_sleeves |
|----:|----------:|:------|:------|:------|:------|:------|:------|:-----|:------|:-----------|:------------------|:------|:-------------|:--------|:--------------|
| 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X |
qazisaad/llama_2-product-titles-esci-train-all-temp
---
dataset_info:
  features:
  - name: index
    dtype: int64
  - name: query
    dtype: string
  - name: average_score
    dtype: float64
  - name: total_score
    dtype: float64
  - name: text
    dtype: string
  - name: label
    dtype: string
  - name: preds
    dtype: string
  - name: __index_level_0__
    dtype: int64
  splits:
  - name: train
    num_bytes: 6964939
    num_examples: 3060
  download_size: 1087545
  dataset_size: 6964939
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---

# Dataset Card for "llama_2-product-titles-esci-train-all-temp"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
liuyanchen1015/MULTI_VALUE_wnli_definite_for_indefinite_articles
---
dataset_info:
  features:
  - name: sentence1
    dtype: string
  - name: sentence2
    dtype: string
  - name: label
    dtype: int64
  - name: idx
    dtype: int64
  - name: value_score
    dtype: int64
  splits:
  - name: dev
    num_bytes: 5997
    num_examples: 29
  - name: test
    num_bytes: 27482
    num_examples: 94
  - name: train
    num_bytes: 40847
    num_examples: 195
  download_size: 32454
  dataset_size: 74326
---

# Dataset Card for "MULTI_VALUE_wnli_definite_for_indefinite_articles"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Yamei/NER_VISBank
---
dataset_info:
  features:
  - name: tokens
    sequence: string
  - name: ner_tags
    sequence:
      class_label:
        names:
          '0': B_DATA
          '1': I_DATA
          '2': B_APPLICATION
          '3': I_APPLICATION
          '4': B_METHOD
          '5': I_METHOD
          '6': B_VISUALIZATION
          '7': I_VISUALIZATION
          '8': B_EVALUATION
          '9': I_EVALUATION
          '10': O
  - name: id
    dtype: string
  splits:
  - name: train
    num_bytes: 1627524.584712372
    num_examples: 4568
  - name: test
    num_bytes: 90497.20764381402
    num_examples: 254
  - name: valid
    num_bytes: 90497.20764381402
    num_examples: 254
  download_size: 397990
  dataset_size: 1808518.9999999998
---

# Dataset Card for "NER_VIS_5076"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
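The `ner_tags` feature stores class indices into the `class_label` names listed above. A small sketch that decodes them back to BIO-style strings in plain Python (no `datasets` dependency assumed; the example tokens and tag ids are made up for illustration):

```python
# Label list copied from the card's ClassLabel definition (index = class id).
NER_LABELS = [
    "B_DATA", "I_DATA", "B_APPLICATION", "I_APPLICATION", "B_METHOD",
    "I_METHOD", "B_VISUALIZATION", "I_VISUALIZATION", "B_EVALUATION",
    "I_EVALUATION", "O",
]

def decode_tags(tag_ids):
    """Map integer ner_tags back to their string labels."""
    return [NER_LABELS[i] for i in tag_ids]

# Hypothetical example row: tokens with aligned tag ids.
tokens = ["scatter", "plot", "of", "gene", "expression"]
tags = decode_tags([6, 7, 10, 0, 1])
print(list(zip(tokens, tags)))
```

When loading through the `datasets` library, the same mapping is available via the feature's `int2str` method.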
Aleavka/Kaifei_29
---
license: odbl
---
CVasNLPExperiments/TinyImagenet_200_validation_google_flan_t5_xxl_mode_T_SPECIFIC_A_ns_200
---
dataset_info:
  features:
  - name: id
    dtype: int64
  - name: prompt
    dtype: string
  - name: true_label
    dtype: string
  - name: prediction
    dtype: string
  splits:
  - name: fewshot_0__Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_clip_tags_LAION_ViT_H_14_2B_simple_specific_rices
    num_bytes: 88117
    num_examples: 200
  download_size: 37414
  dataset_size: 88117
---

# Dataset Card for "TinyImagenet_200_validation_google_flan_t5_xxl_mode_T_SPECIFIC_A_ns_200"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
ServiceNow/hotpot_test_pos__1_2
---
dataset_info:
  features:
  - name: context
    dtype: string
  - name: contexts_list
    sequence: string
  - name: titles_list
    sequence: string
  - name: useful_contexts
    sequence: int64
  - name: question
    dtype: string
  - name: answer
    dtype: string
  - name: sample_idx
    dtype: int64
  - name: dataset
    dtype: string
  splits:
  - name: test
    num_bytes: 255384789
    num_examples: 22215
  download_size: 150776605
  dataset_size: 255384789
configs:
- config_name: default
  data_files:
  - split: test
    path: data/test-*
---
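The schema above keeps `titles_list`, `contexts_list`, and the 0/1 indicator sequence `useful_contexts` aligned by position. A minimal sketch, on a hand-made record, of keeping only the contexts flagged as useful (the field names come from the schema; the sample values are invented):

```python
def supporting_contexts(example):
    """Pair titles with contexts and keep those flagged 1 in useful_contexts."""
    return [
        (title, ctx)
        for title, ctx, useful in zip(
            example["titles_list"],
            example["contexts_list"],
            example["useful_contexts"],
        )
        if useful == 1
    ]

# Hypothetical record following the schema's parallel-sequence layout.
example = {
    "titles_list": ["A", "B", "C"],
    "contexts_list": ["ctx a", "ctx b", "ctx c"],
    "useful_contexts": [1, 0, 1],
}
print(supporting_contexts(example))
```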
JB/mimic-cxr-rrg
---
configs:
- config_name: default
  data_files:
  - split: test
    path: data/test-*
dataset_info:
  features:
  - name: id
    dtype: int64
  - name: image
    dtype: image
  - name: impression
    dtype: string
  splits:
  - name: test
    num_bytes: 14124813.0
    num_examples: 100
  download_size: 14118845
  dataset_size: 14124813.0
---

# Dataset Card for "mimic-cxr-rrg"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
breno30/MeiaNoite
---
license: openrail
---
AITECHPRODUCTS/WellRates.csv
---
license: mit
---
Svenni551/toxic_conversations-toxic_only
---
dataset_info:
  features:
  - name: text
    dtype: string
  - name: label
    dtype: int64
  - name: label_text
    dtype: string
  - name: all_labels
    list:
    - name: label
      dtype: string
    - name: score
      dtype: float64
  - name: max_label
    dtype: string
  splits:
  - name: train
    num_bytes: 116994221
    num_examples: 140379
  download_size: 57452504
  dataset_size: 116994221
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
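In the schema above, `all_labels` is a list of `{label, score}` records and `max_label` names the top-scoring one. A minimal sketch recomputing that relationship on an invented row (the field names follow the schema; the values are illustrative):

```python
def max_label(all_labels):
    """Return the label with the highest score from an all_labels list."""
    return max(all_labels, key=lambda d: d["score"])["label"]

# Hypothetical row following the card's schema.
row = {
    "text": "example comment",
    "all_labels": [
        {"label": "toxic", "score": 0.92},
        {"label": "insult", "score": 0.41},
    ],
}
print(max_label(row["all_labels"]))  # toxic
```

This is handy for sanity-checking that the stored `max_label` column agrees with the per-label scores.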
UmaDiffusion/ULTIMA-prompts
---
license: other
---

This document is a part of the [ULTIMA Dataset](https://huggingface.co/datasets/UmaDiffusion/ULTIMA).

You don't need to write *the feature tags* to generate an image of a character.

## Included Costumes

### Common Clothes

| costume | prompts |
|:--------|:-|
| Tracen Stage Uniform | official alternate costume, choker, long sleeves, wristband, cropped jacket, white shirt, white jacket, red vest, midriff, waist cape, overskirt, red shorts, short shorts, garter straps, purple thighhighs, white footwear |
| Tracen Summer Uniform | tracen school uniform, summer uniform, serafuku, puffy short sleeves, purple bowtie, horseshoe ornament, sailor collar, sailor shirt, purple shirt, white skirt, pleated skirt, frilled skirt, miniskirt, zettai ryouiki, white thighhighs, brown footwear, loafers |
| Tracen Winter Uniform | tracen school uniform, winter uniform, puffy long sleeves, sailor collar, purple shirt, white bowtie, horseshoe ornament, purple skirt, pleated skirt, purple thighhighs, brown footwear, loafers |
| Tracen Training Uniform | tracen training uniform, track jacket, track pants |
| Gaze on Me | official alternate costume, white shirt, wrist scrunchie, short sleeves, white shirt, tied shirt, navel, midriff, blue shorts, floral print, swim trunks, sandals, flip-flops, blue footwear |
| Glorious Azure | official alternate costume, open clothes, open jacket, wing collar, blue jacket, cropped jacket, puffy short sleeves, blue short necktie, white shirt, underbust, buttons, double-breasted, white gloves, gold trim, back yellow bow, belt, horseshoe ornament, red bow, high-waist blue skirt, white shorts, zettai ryouiki, white thighhighs under knee boots, high heel boots, white footwear |
| New Star Rose | official alternate costume, puffy short sleeves, white jacket, brooch, neck ribbon, white shirt, white gloves, buttons, medal, gold trim, white over-kneehighs, purple skirt, fringe trim, white shorts, short shorts, high heel boots, white footwear |
| Snowy Integrity | official alternate costume, single epaulette, light blue coat, gold trim, white shirt, pink bowtie, brooch, long sleeves, waist cape, white shorts, white thighhighs, white footwear, knee boots, high heels, high heel boots |

### Original Clothes

| abbr. | Full Name | costume | prompts |
|:-:|:-|:-|:--------------|
| agt | Agens Tachyon | the feature tags | 1girl, jewelry, purple eyes, purple hair, messy hair, animal ears, orange eyes, solo, horse ears, red eyes, bangs, medium hair, short hair, brown eyes, ahoge, horse tail, brown hair, green hair, earrings, horse girl, hair between eyes |
| | | race | single earring, long sleeves, sleeves past wrists, sleeves past fingers, collared shirt, black shirt, yellow sweater, sweater vest, black necktie, short necktie, open coat, white coat, labcoat, test tube, vial, zipper, black pantyhose, white footwear, high heel boots |
| | | casual | single earring, casual, off-shoulder shirt, bare shoulders, purple shirt, purple sweater, bra strap, long sleeves, sleeves past wrists, necklace, pendant, black pants, white footwear, high heels |
| agv | Air Groove | the feature tags | 1girl, parted bangs, horse tail, horse girl, black hair, makeup, solo, parted lips, animal ears, long hair, medium hair, hair between eyes, short hair, parted bangs, bangs, grey eyes, horse ears, blue eyes, purple eyes, eyeshadow, brown hair |
| | | race | yellow bow, ear bow, choker, shoulder cutout, clothing cutout, sailor collar, cape, two-tone shirt, yellow shirt, black shirt, wide sleeves, o-ring, belt, blue skirt, pleated skirt, thigh strap, single thighhigh, black thighhighs |
| | | wedding | official alternate costume, hair ornament, purple flower, hair flower, purple rose, bridal gauntlets, see-through, bare shoulders, wedding dress, white dress, white pants, high heels |
| | | casual | casual, blue jacket, black shirt, necklace, brown belt, brown pants |
| air | Air Shakur | the feature tags | 1girl, tail, animal ears, brown hair, yellow eyes, long hair, horse tail, fingernails, sharp teeth, orange eyes, horse girl, parted bangs, eyebrow piercing, horse ears, bangs, asymmetrical bangs, brown eyes, piercing, short hair, green eyes, ear piercing, red eyes, solo, black hair |
| | | original art | hair ornament, arm tattoo, arm belt, ring, blue sailor collar, blue neckerchief, yellow shirt, sleeveless shirt, bare shoulders, spiked bracelet, studded bracelet, midriff, belt, nail polish, white shorts, short shorts, cutoffs, thigh strap, white footwear |
| | | race | yellow shirt, yellow tank top, side cutout, black choker, necklace, short sleeves, shoulder cutout, crop top, blue collar, cropped jacket, black jacket, open jacket, zipper, wristband, bracelet, belt, black shorts, thigh strap, single thighhigh, blue thighhighs, black footwear |
| | | SSR Speed | official alternate costume, beanie, purple headwear, animal ear headphones, ears through headwear, choker, chain necklace, long sleeves, purple shirt, purple jacket, multiple rings, eyewear, orange-tinted eyewear |
| | | casual | choker, hood down, purple hoodie, hooded jacket, long sleeves, sleeves past wrists, off shoulder, purple shirt, purple jacket, multicolored jacket, open jacket, black pants, sneakers |
| aky | Akikawa Yayoi | the feature tags | 1girl, multicolored hair, brown hair, two-tone hair, fangs, white hair, very long hair, fang, purple eyes, streaked hair, blunt bangs, orange hair, long hair, blue eyes, skin fang, blonde hair, solo, bangs |
| | | original art | white headwear, hat flower, blue rose, vertical stripes, striped ascot, vertical-striped shirt, open clothes, open jacket, blue jacket, cropped jacket, puffy long sleeves, buttons, high-waist dress, white dress, socks, shoes, blue footwear |
| amv | Admire Vega | the feature tags | 1girl, ear covers, purple eyes, brown hair, medium hair, horse ears, pink eyes, multicolored hair, orange eyes, solo, tail, very long hair, horse girl, low ponytail, long hair, horse tail, ponytail, hair between eyes, bangs, brown eyes, red eyes, animal ears, black hair, two-tone hair, 1other, |
| | | original art | hair ribbon, white ribbon, single ear cover, black shirt, collared shirt, white necktie, juliet sleeves, layered sleeves, puffy long sleeves, blue dress, buttons, corset, pleated skirt, white skirt, black pantyhose, white footwear, black footwear, mismatched footwear, asymmetrical footwear |
| | | race | hair ribbon, white ribbon, single ear cover, capelet, puffy long sleeves, black glove, fingerless glove, white necktie, blue dress, buttons, corset, pleated skirt, black pantyhose, white footwear, black footwear, mismatched footwear, asymmetrical footwear |
| | | casual | casual, hair ribbon, white ribbon, cable knit, necklace, long sleeves, aran sweater, blue sweater, turtleneck sweater, ribbed sweater, denim, jeans, blue pants |
| and | Agnes Digital | the feature tags | 1girl, pink eyes, blue eyes, horse ears, brown hair, horse tail, purple eyes, bangs, long hair, pink hair, horse girl, animal ears, very long hair, solo, two side up, braid |
| | | original art | hair bow, red bow, sailor dress, sleeveless dress, yellow dress, frills, blue necktie, single glove, white gloves, garter straps, frilled thighhighs, fishnet thighhighs |
| | | race | hair bow, yellow bow, bead necklace, jewelry, sailor collar, sleeveless, frills, white shirt, multicolored clothes, crop top, wrist cuffs, bracelet, red belt, multicolored skirt, frilled skirt, layered skirt, red bow, single thighhigh, socks, asymmetrical legwear, mismatched legwear |
| | | halloween | official alternate costume, qing guanmao, talisman, ofuda, twin braids, hair bun, double bun, chinese dress, pink dress, frilled dress, detached sleeves, double sleeves past fingers, white thighhighs, jiangshi |
| | | casual | casual, hair bow, red bow, choker, pink jacket, open jacket, yellow shirt, wrist scrunchie, belt, blue skirt, pleated skirt, white thighhighs |
| atm | Aston Machan | the feature tags | 1girl, brown hair, medium hair, horse ears, simple background, solo, horse girl, green eyes, long hair, horse tail, short hair, hair between eyes, one side up, bangs, yellow eyes, brown eyes, animal ears, parted bangs, side ponytail, ponytail, twintails |
| | | race | mini crown, red scrunchie, black bowtie, long sleeves, collared shirt, red jacket, white shirt, white dress, white skirt, black thighhighs, white loafer, white footwear |
| | | casual | casual, red scrunchie, neck ribbon, frilled shirt collar, white shirt, long sleeves, button, green skirt, suspender skirt, black thighhighs |
| azs | Anshinzawa Sasami | the feature tags | 1girl, ahoge, pink hair, long hair, blue eyes, blonde hair, solo, swept bangs, very long hair, bangs, hair between eyes, lipstick |
| | | original art | holding needle, sunglasses, white gloves, pearl necklace, long sleeves, wide sleeves, open clothes, open coat, white coat, red dress, pencil dress, short dress, needle, thigh strap, shoes, red footwear, high heels |
| bbm | Bamboo Memory | the feature tags | 1girl, blue eyes, green eyes, two-tone hair, animal ears, streaked hair, headband, brown hair, solo, horse ears, multicolored hair, hair between eyes, horse tail, short hair, white hair, bangs, horse girl, green eyes |
| | | race | ear bow, hairband, neck ribbon, white vest, grey shirt, collared shirt, open jacket, black jacket, white gloves, armband, long sleeves, green skirt, black shorts, bike shorts, black socks, knee boots, high heel boots, white footwear |
| | | SSR Guts | official alternate costume, ear bow, hachimaki, black necktie, white shirt, collared shirt, cardigan, gray shirt, black jacket, open jacket, armband, white gloves, long sleeves, pleated skirt, brown skirt, bike shorts under skirt, black shorts, miniskirt, white socks, boots, black footwear |
| bkp | Biko Pegasus | the feature tags | 1girl, blue eyes, multicolored hair, animal ears, horse tail, short hair, brown eyes, white hair, grey eyes, horse ears, grey hair, black hair, pink hair, bangs, brown hair, purple eyes, two-tone hair, solo, aqua eyes, long hair, horse girl |
| | | original art | star ear ornament, school uniform, long sleeves, orange bowtie, brooch, backpack, bag, wings, double-breasted, buttons, white jacket, purple skirt, pleated skirt, orange thighhighs, boots, white footwear |
| | | race | star ear ornament, fingerless gloves, white gloves, puffy short sleeves, purple jacket, hooded jacket, cropped jacket, belt, short shorts, white shorts, sneakers, white footwear |
| blt | Byerley Turk | the feature tags | 1girl, horse tail, two-tone hair, bangs, scar on cheek, brown hair, scar on face, black hair, long hair, solo, short hair, pink eyes, horse girl, red eyes, horse ears, animal ears, scar across eye, multicolored hair, purple eyes, hat |
| | | original art | ear ornament, garrison cap, black gloves, white neckerchief, crop top, bandeau, long sleeves, yellow jacket, cropped jacket, open jacket, open clothes, suspenders, black pants, black footwear, high heel boots |
| btg | Bitter Glace | the feature tags | 1girl, solo, horse ears, abs, animal ears, brown hair, horse girl, horse tail, short hair, red shorts, breasts, dark skin, tail, bangs, muscular female, muscular, dark-skinned female, parted lips, brown eyes, hair between eyes |
| | | race | ear ornament, arm strap, black choker, fingerless gloves, red gloves, white shirt, crop top, hooded jacket, hood down, sleeveless jacket, red jacket, cropped jacket, open clothes, brown belt, red shorts, short shorts, bandaid on leg, kneehighs, black socks, black footwear, boots |
| bwh | Biwa Hayahide | the feature tags | 1girl, long hair, solo, horse ears, animal ears, horse tail, tail, horse girl, white hair, very long hair, yellow eyes, ahoge |
| | | race | semi-rimless eyewear, under-rim eyewear, red-framed eyewear, long sleeves, neck ribbon, red ribbon, puffy sleeves, juliet sleeves, buttons, purple shirt, dress, jacket, purple skirt, bell, miniskirt, zettai ryouiki, garter straps, black thighhighs, purple footwear, high heels |
| | | christmas | official alternate costume, semi-rimless eyewear, under-rim eyewear, red-framed eyewear, santa costume, necklace, tiara, fur-trimmed dress, red dress, bare shoulders, detached sleeves, jewelry, christmas ornaments, black thighhighs, red footwear, high heels |
| | | casual | casual, black shirt, short sleeves, neckerchief, skirt |
| cpr | Copano Rickey | the feature tags | 1girl, solo, horse girl, horse tail, horse ears, animal ears, double bun, hair bun, ahoge, purple eyes, brown hair, long hair, ear covers, multicolored hair |
| | | race | bare shoulder, off-shoulder, blue gloves, bead bracelet, fingerless gloves, detached sleeves, wide sleeves, red ribbon, neck ribbon, cleavage, yellow dress, frilled dress, yellow footwear, white footwear, boots, mismatched footwear, asymmetrical footwear |
| | | casual | casual, yellow dress, yellow bowtie, belt, puffy sleeves, short sleeves, plaid, white socks |
| crc | Curren Chan | the feature tags | 1girl, white hair, brown hair, animal ears, pink hair, horse ears, solo, brown eyes, red eyes, purple eyes, hair between eyes, bangs, grey hair, pink eyes, short hair, horse tail, horse girl |
| | | race | hairband, red bow, ear bow, collared dress, vertical-striped dress, bare shoulders, off-shoulder jacket, black jacket, long sleeves, white gloves, frills, red thighhighs, thighhighs under boots, knee boots, black footwear |
| | | wedding | official alternate costume, ear covers, ear ornament, wedding dress, black dress, bare shoulders, sleeveless dress, elbow gloves, black gloves, bracelet, frills, dress bow, black footwear, high heels |
| | | SSR Wisdom | official alternate costume, braid, hair ornament, hair flower, red bowtie, japanese clothes, sleeveless kimono, bare shoulders, detached sleeves, frilled sleeves, wide sleeves, red nails, obi, sash, frills |
| | | RTTT Pajamas | official alternate costume, hairband, red bow, ear bow, frills, pink shirt, pink shorts, pajamas |
| | | casual | casual, hairband, red bow, ear bow, necklace, jewelry, white shirt, long sleeves, see-through sleeves, sleeves past wrists, belt, red skirt, buttons, black pantyhose |
| cvg | Cheval Grand | the feature tags | 1girl, solo, white hair, black hair, hair between eyes, multicolored hair, blonde hair, brown hair, short hair with long locks, green eyes, long sleeves, simple background, aqua eyes, animal ears, two-tone hair, streaked hair, medium hair, purple eyes, bangs, ponytail, horse ears, horse girl, blue eyes, horse tail, long hair, short hair |
| | | race | white headwear, ears through headwear, peaked cap, cape, sailor collar, collared shirt, white shirt, yellow ascot, white jacket, long sleeves, white gloves, belt, black shorts, black socks, shoes, white footwear |
| | | BoC'z | official alternate costume, baseball cap, ears through headwear, black headwear, black choker, grey shirt, collared shirt, long sleeves, high-waist shorts, black shorts, kneehighs, black footwear |
| dbz | Twin Turbo | the feature tags | 1girl, @_@, long hair, twintails, crossed bangs, red eyes, hair ribbon, blue eyes, very long hair, hair between eyes, aqua eyes, pink eyes, heterochromia, purple hair, two-tone hair, aqua hair, blue hair, sharp teeth, gradient hair, ahoge, bangs, horse girl, green hair, solo, multicolored hair, horse tail, purple eyes, horse ears, animal ears, striped ribbon, white ribbon, ribbon, black ribbon, ear ribbon, hair bow |
| | | race | hoodie, hood down, hooded coat, puffy long sleeves, multicolored clothes, multicolored jacket, drawstring, nail polish, black bodysuit, black pantyhose, ankle boots, yellow footwear |
| | | SSR Guts | official alternate costume, hair flower, hair ornament, black choker, off-shoulder dress, sleeveless dress, bare shoulders, scrunchie, wrist cuffs, black dress, smile, frilled dress, frills, white pantyhose |
| | | casual | casual, blue hoodie, hood down, blue hoodie, black hoodie, drawstring, multicolored clothes, blue skirt, pleated skirt, plaid skirt, mismatched legwear, asymmetrical legwear, striped thighhighs |
| dir | Daiichi Ruby | the feature tags | 1girl, solo, animal ears, horse ears, black hair, long hair, braid, purple eyes, hair bow, horse girl, drill hair, bangs |
| | | race | red ascot, puffy long sleeves, collared dress, frilled dress, purple dress, white pantyhose, mary janes, black footwear |
| | | casual | casual, long sleeves, blue shirt, center frilled, collared shirt, blue skirt, long skirt, white pantyhose, blue footwear, loafer |
| dla | Darley Arabian | the feature tags | 1girl, horse girl, horse tail, bangs, multicolored hair, solo, animal ears, dark skin, medium hair, brown hair, horse ears, white hair, streaked hair, hair between eyes, long hair, dark-skinned female, ahoge, aqua eyes, two-tone hair, red hair, pink hair, green eyes |
| | | original art | ear ornament, jewelry, necklace, bra strap, crop top, horseshoe ornament, frilled shirt, red shirt, bare shoulders, off-shoulder shirt, detached sleeves, puffy long sleeves, frilled sleeves, bracelet, highleg panties, underwear, jeans, blue pants, sandals, white footwear |
| drt | Daring Tact | the feature tags | 1girl, green eyes, tail, long hair, aqua eyes, hair ornament, horse tail, multicolored hair, animal ears, medium hair, horse girl, black hair, hair between eyes, streaked hair, ahoge, solo, blue eyes, horse ears, brown hair, star hair ornament |
| | | race | hair ornament, blue ascot, halterneck shirt, collared shirt, white jacket, white shirt, open jacket, black gloves, wrist cuffs, long sleeves, belt, white skirt, layered skirt, frills, black pantyhose |
| dth | Daitaku Helios | the feature tags | 1girl, brown hair, horse girl, bangs, horse tail, black hair, green eyes, blue hair, swept bangs, solo, side ponytail, ponytail, long hair, multicolored hair, brown eyes, yellow eyes, streaked hair, horse ears, colored inner hair, animal ears, hair between eyes, medium hair, two-tone hair |
| | | race | ear ornament, ear piercing, ear covers, star hair ornament, hairclip, necklace, short sleeves, blue shirt, tied shirt, off-shoulder shirt, bare shoulders, necklace, shoulder cutout, clothing cutout, bead bracelet, wristband, belt buckle, two-tone belt, denim shorts, white shorts, short shorts, cutoffs, blue footwear, boots |
| | | casual | casual, ear ornament, ear piercing, ear covers, star hair ornament, hairclip, bead necklace, layered sleeves, white shirt, clothes writing, crop top, orange belt, denim shorts, blue shorts, torn clothes |
| dws | Daiwa Scarlet | the feature tags | 1girl, horse tail, twintails, horse girl, red hair, animal ears, brown eyes, very long hair, tail, bangs, fang, skin fang, hair between eyes, tail through clothes, horse ears, hair intakes, long hair, solo, pink eyes, brown hair, red eyes, antenna hair |
| | | race | hair bow, red bow, tiara, epaulettes, puffy long sleeves, juliet sleeves, black bowtie, white shirt, framed breasts, blue jacket, blue dress, center frills, underbust, layered skirt, garter straps, white thighhighs, white footwear |
| | | SSR Power | official alternate costume, official alternate hairstyle, low twintails, hair ornament, hair bobbles, navel, red bikini, frilled bikini, bikini skirt, swimsuit, frills |
| | | christmas | official alternate costume, official alternate hairstyle, single ear cover, christmas, tiara, bare shoulders, detached sleeves, jewelry, fur trim, brooch, red gloves, fur-trimmed gloves, red dress, fur-trimmed dress, plaid dress, black skirt, black thighhighs, thigh boots, high heel boots, black footwear |
| | | casual | hair bow, red bow, tiara, puffy short sleeves, bare shoulders, off-shoulder shirt, collarbone, white shirt, frills, blue skirt, plaid skirt, pleated skirt |
| ecp | El Condor Pasa | the feature tags | 1girl, horse tail, very long hair, black hair, green eyes, white hair, horse ears, horse girl, solo, bangs, hair intakes, animal ears, ponytail, hair between eyes, brown hair, long hair, blue eyes, aqua eyes |
| | | race | hair scrunchie, domino mask, sailor collar, long sleeves, yellow shirt, buttons, red coat, open coat, belt, blue skirt, pleated skirt, miniskirt, black thighhighs, red footwear, high heels |
| | | fantasy | official alternate costume, domino mask, ear ornament, bare shoulders, detached sleeves, fur-trimmed sleeves, black shirt, sleeveless shirt, clothing cutout, cleavage cutout, gauntlets, fur-trimmed corset, black pantyhose, orange footwear, knee boots |
| | | casual | casual, hair scrunchie, domino mask, polo shirt, yellow shirt, wristband, short sleeves, collared shirt, brown skirt, plaid skirt, brown footwear |
| esf | Eishin Flash | the feature tags | 1girl, short hair, brown hair, blue eyes, horse tail, horse girl, aqua eyes, purple hair, horse ears, purple eyes, hair between eyes, bangs, black hair, solo, animal ears, medium hair, bob cut, blue hair, swept bangs |
| | | race | hair ornament, ear ribbon, white scrunchie, german clothes, frills, red choker, bare shoulders, black sleeves, wide sleeves, frilled sleeves, long sleeves, detached sleeves, shirt, dirndl, bodice, brown belt, white apron, waist apron, vertical-striped skirt, red skirt, frilled skirt, zettai ryouiki, black thighhighs, mary janes, blue footwear |
| | | valentine | official alternate costume, valentine, low twintails, hair ornament, white headwear, tilted headwear, mini hat, chef hat, hairclip, long sleeves, brooch, striped bow, pink bow, brown bow, neckerchief, red ascot, frilled cuffs, wrist cuffs, frills, white dress, apron, buttons, double-breasted, pocket watch, zettai ryouiki, white thighhighs, brown footwear, high heels |
| | | casual | casual, hair ornament, ear ribbon, white scrunchie, wide sleeves, long sleeves, brown jacket, open clothes, open jacket, sash, buttons, black dress, sleeveless dress |
| fjk | Fuji Kiseki | the feature tags | 1girl, solo, 
short hair, two-tone hair, white hair, horse tail, blue eyes, brown hair, hair between eyes, horse ears, multicolored hair, ahoge, horse girl, tail, bangs, animal ears, streaked hair, green eyes, blue hair, black hair | | | | race | ear piercing, ear ornament, black necktie, long sleeves, cleavage, black gloves, no bra, white shirt, center opening, open clothes, collared shirt, center frills, frilled shirt, half gloves, open jacket, black jacket, underbust, brown belt, chain, black pants, black footwear, high heels | | | | ballroom | official alternate costume, hair ornament, hair flower, white flower, nail polish, jewelry, necklace, bracelet, bare shoulders, off-shoulder dress, purple dress | | | | casual | casual, ear piercing, ear ornament, long sleeves, turtleneck sweater, black sweater, ribbed sweater, wristwatch, belt, bag, plaid skirt, brown skirt, black footwear, boots | | fnm | Fine Motion | the feature tags | 1girl, white hair, solo, folded ponytail, horse girl, bangs, green eyes, yellow eyes, hair bun, streaked hair, brown hair, short hair, black hair, horse ears, multicolored hair, hair between eyes, animal ears, brown eyes, tail, single hair bun, medium hair, horse tail, two-tone hair | | | | race | clover hair ornament, hair ribbon, white ascot, black bowtie, long sleeves, white gloves, white jacket, collared shirt, green shirt, buttons, open jacket, open clothes, black skirt, pleated skirt, white socks, white footwear, high heel boots | | | | wedding | official alternate costume, tiara, ear covers, bridal veil, necklace, jewelry, wedding dress, white dress | | | | casual | casual, clover hair ornament, hair ribbon, collared shit, white shirt, shirt tucked in, belt, plaid skirt, green skirt | | gdc | Gold City | the feature tags | 1girl, solo, horse girl, bangs, green eyes, yellow eyes, very long hair, blonde hair, streaked hair, brown hair, purple eyes, horse ears, multicolored hair, parted bangs, long hair, animal ears, tail, swept bangs, blue eyes, 
horse tail | | | | race | ear bow, blue bow, black choker, chain, strapless, bandeau, tube top, open jacket, black jacket, long sleeves, single glove, black belt, denim shorts, short shorts, cutoffs, single thigh strap, high heel boots, mismatched footwear, asymmetrical footwear | | | | original art | ear bow, blue bow, blue necktie, white shirt, collared shirt, grey jacket, sleeves rolled up, single glove, open jacket, open clothes, red dress, pleated skirt, black pantyhose | | | | festival | official alternate costume, hair ornament, hair flower, bare shoulders, cleavage cutout, clothing cutout, japanese clothes, frilled kimono, purple kimono, wide sleeves, detached sleeves, black gloves, sash, obi, purple footwear | | | | casual | casual, necklace, pendant, plaid jacket, grey jacket, open clothes, red sweater, turtleneck sweater, ribbed sweater, black belt, blue skirt | | gds | Gold Ship | the feature tags | 1girl, solo, long hair, animal ears, horse ears, bangs, purple eyes, breasts, horse girl, grey hair, blunt bangs, horse tail | | | | race | pillbox hat, brown headwear, red bowtie, bare shoulders, red dress, sleeveless dress, armband, white gloves, buttons, pouch, white pantyhose, thigh strap, white footwear, knee boots, high heel boots | | | | SSR Speed | purple crown, cape, red bowtie, red dress, sleeveless dress, white gloves, buttons, pouch, white pantyhose, thigh strap, white footwear, knee boots, high heel boots | | | | summer | official alternate costume, sunglasses, necklace, jewelry, black one-piece swimsuit, casual one-piece swimsuit, cleavage, bare shoulders, arm tattoo, bracelet, thigh strap, black footwear, sandals | | | | casual | casual, pillbox hat, brown headwear, red jacket, open jacket, open clothes, ribbed shirt, white shirt, long sleeves, jeans, blue pants | | gpb | Godolphin Barb | the feature tags | 1girl, horse tail, gradient hair, horse ears, blue eyes, long hair, hair intakes, blue hair, bangs, aqua eyes, animal ears, horse girl, 
multicolored hair, solo, purple hair, parted bangs, green eyes, very long hair, tail, ear ornament | | | | original art | hair ornament, ear covers, collar, track jacket, blue jacket, long sleeves, black shorts, short shorts, black socks, sneakers, blue footwear | | grw | Grass Wonder | the feature tags | 1girl, horse ears, bangs, very long hair, horse tail, multicolored hair, brown hair, long hair, horse girl, blonde hair, orange hair, blue eyes, solo, white hair, parted bangs, animal ears, tail, two-tone hair, sidelocks | | | | race | ear ornament, striped bow, white sailor collar, armband, blue jacket, long sleeves, white bow, white skirt, brown pantyhose, brown footwear, high heel boots, mismatched footwear, asymmetrical footwear | | | | fantasy | official alternate costume, ear ornament, ear ribbon, beret, green headwear, white gloves, long sleeves, white dress, layered dress, white thighhighs, green footwear, high heels | | | | SSR Power | official alternate costume, hair ornament, single hair bun, hair flower, fur trim, long sleeves, wide sleeves, yukata, print kimono, blue kimono, japanese clothes, floral print, sash, obi | | | | Rapper | official alternate costume, black headwear, baseball cap, white shirt, black jacket, long sleeves, open clothes, blue pants, denim | | | | casual | casual, ear ornament, striped bow, white shirt, frilled shirt, puffy short sleeves, blue skirt | | hak | Hishi Akebono | the feature tags | 1girl, horse girl, brown eyes, bangs, brown hair, long hair, hair between eyes, red eyes, pink eyes, solo, horse ears, animal ears, horse tail, very long hair, medium hair, twintails, purple eyes, side ponytail, sidelocks | | | | race | hair ribbon, hair bobbles, mini hat, white chocker, striped dress, blue dress, white apron, buttons, badge, wrist cuffs, puffy short sleeves, waist apron, frilled apron, frilled dress, blue ribbon, skirt, white socks, white footwear, brown footwear, mismatched footwear, asymmetrical footwear, sneakers | | | 
| original art | hair ribbon, hair bobbles, blue bowtie, white jacket, double-breasted, buttons, long sleeves, blue skirt, pleated skirt, miniskirt, bandaid on knee, white socks, brown footwear, loafers | | | | SSR Guts | official alternate costume, hair bun, double bun, hair flower, pink flower, necklace, jewelry, collarbone, cleavage, bare shoulders, strapless dress, white dress, layered dress, wedding dress, frilled dress, elbow gloves, white gloves, dress flower, white bow, white thighhighs | | | | casual | casual, hair ribbon, hair bobbles, necklace, heart, jewelry, white shirt, off-shoulder shirt, bare shoulders, center frills, white bow, long sleeves, suspenders, suspender skirt, brown skirt, plaid skirt, white socks, shoes | | haz | Hishi Amazon | the feature tags | 1girl, dark blue hair, purple hair, dark skin, horse ears, orange eyes, dark-skinned female, solo, very long hair, horse girl, long hair, horse tail, fang, hair between eyes, bangs, blue hair, yellow eyes, brown eyes, red eyes, animal ears, black hair, tan, fang out, skin fang, fangs, toned | | | | race | hair scrunchie, red scrunchie, hairclip, white choker, jewelry, sailor collar, single bare shoulder, single sleeve, long sleeves, arm ribbon, crop top, two-tone bandeau, frills, white skirt, blue skirt, two-tone skirt, pleated skirt, miniskirt, bikini skirt, layered skirt, single thigh strap, leg ribbon, sandals, blue footwear | | | | original art uncensored | dress, off shoulder, off-shoulder dress, swimsuit, thighhighs, school uniform, hair scrunchie, red scrunchie, hairclip, blue sailor collar, blue neckerchief, wide sleeves, white shirt, loose shirt, bare shoulders, off-shoulder shirt, single thigh strap, blue footwear, boots, tanlines, bikini tan | | | | original art censored | dress, off shoulder, off-shoulder dress, swimsuit, thighhighs, school uniform, hair scrunchie, red scrunchie, hairclip, blue sailor collar, blue neckerchief, wide sleeves, white shirt, loose shirt, bare shoulders, 
off-shoulder shirt, single thigh strap, blue footwear, boots, bra strap, bikini under clothes, pleated skirt, blue skirt | | | | casual | casual, hair scrunchie, red scrunchie, hairclip, choker, off-shoulder shirt, red shirt, long sleeves, crop top, frilled shirt, bare shoulders, bracelet, jewelry, denim, torn pants, sneakers | | hkt | Hayakawa Tazuna | the feature tags | 1girl, multicolored hair, horse girl, black hair, purple hair, brown hair, green hair, two-tone hair, ponytail, red hair, split ponytail, green eyes, aqua eyes, very long hair, low ponytail, long hair, solo, short hair, bangs, low twintails, twintails | | | | original art | green headwear, bowler hat, hair ribbon, yellow necktie, wristwatch, formal, long sleeves, collared shirt, green jacket, buttons, double-breasted, green skirt, skirt suit, pencil skirt, black pantyhose, shoes, green footwear | | hmr | Hishi Miracle | the feature tags | 1girl, white hair, ahoge, solo, short hair, horse girl, green eyes, blue eyes, brown eyes, grey hair, horse tail, blue hair, long hair, animal ears, medium hair, purple eyes, horse ears, grey eyes, red eyes, yellow eyes, bangs, light purple hair | | | | race | ear covers, ear ornament, collared shirt, white shirt, sleeveless shirt, necklace, bare shoulders, detached sleeves, puffy long sleeves, white sleeves, blue dress, white skirt, black pantyhose, boots, white footwear, black footwear | | | | casual | casual, ear covers, ear ornament, white dress, long sleeves, jeans, blue pants, brown footwear, shoes | | hpm | Happy Meek | the feature tags | 1girl, red eyes, pink hair, solo, tail, braid, white hair, long hair, brown eyes, horse ears, purple eyes, horse girl, brown hair, blonde hair, bangs, horse tail, blunt bangs, animal ears, pink eyes, yellow eyes, short hair, grey hair, light brown hair, ponytail, medium hair, bob cut | | | | race | hair flower, hairclip, striped bowtie, blue bowtie, brooch, jewelry, collared shirt, dress shirt, white shirt, blue vest, 
white jacket, open clothes, open jacket, puffy short sleeves, frilled sleeves, white gloves, pleated skirt, blue skirt, boots, white footwear | | htm | Hokko Tarumae | the feature tags | 1girl, solo, green eyes, black hair, blunt bangs, animal ears, horse tail, multicolored hair, streaked hair, brown eyes, braided ponytail, blue eyes, white hair, very long hair, two-tone hair, long hair, horse ears, purple eyes, bangs, twin braids, horse girl, brown hair | | | | race | ear covers, beret, hair bow, striped bow, sailor collar, white jacket, bowtie, long sleeves, layered sleeves, half gloves, asymmetrical gloves, mismatched gloves, black gloves, white gloves, white skirt, white socks, shoes, sneakers | | | | BoC'z | hair bow, official alternate costume, baseball cap, ears through headwear, black headwear, black jacket, zipper, long sleeves, crop top, grey pants | | | | casual | casual, beret, ear covers, hair bow, striped bow, collared shirt, white shirt, blue sweater, neck ribbon, long sleeves, pleated skirt, blue skirt, white socks, shoes, blue footwear, sneakers | | hur | Haru Urara | the feature tags | 1girl, horse tail, pink eyes, horse girl, pink hair, bangs, solo, purple eyes, hair intakes, animal ears, red eyes, horse ears | | | | race | ponytail, ear covers, hair bow, long hair, red headband, fingerless gloves, red gloves, gym uniform, white shirt, track jacket, open jacket, red buruma, bandaid on leg, striped socks, kneehighs, white footwear | | | | new year | official alternate costume, hair bun, hair ornament, ear covers, wide sleeves, long sleeves, floral print, japanese clothes, red kimono, sash, obi, pink hakama, hakama skirt, frills | | | | casual | casual, ponytail, ear covers, hair bow, long hair, red headband, pink vest, puffy short sleeves, white sleeves, neck ribbon, red ribbon, collared shirt, overall shorts, blue overalls, sneakers, pink footwear | | ikd | Ikuno Dictus | the feature tags | 1girl, streaked hair, red eyes, horse ears, hair between 
eyes, white hair, very long hair, horse girl, brown hair, long hair, green eyes, two-tone hair, bangs, purple eyes, multicolored hair, horse tail, short hair, brown eyes, grey eyes, yellow eyes, animal ears, orange hair, braided ponytail, single braid, solo, blue eyes | | | | race | round eyewear, ear ornament, white shirt, collared shirt, red necktie, long sleeves, dress shirt, two-tone vest, green vest, white vest, armband, open clothes, green coat, belt, tailcoat, coattails, white pants, thighhighs, thigh boots, high heel boots, black footwear | | | | SSR Guts | round eyewear, long sleeves, turtleneck shirt, white shirt, ribbed shirt, suspenders, shirt stuck in pants, blue pants, jeans, denim | | | | SSR Stamina | round eyewear, hat, white headwear, hair ornament, white dress, sleeveless dress, bare shoulders | | | | casual | round eyewear, white shirt, long sleeves, green jacket, open clothes | | inf | Ines Fujin | the feature tags | 1girl, blunt bangs, medium hair, horse tail, bangs, horse girl, animal ears, long hair, brown hair, diagonal bangs, solo, black hair, horse ears, short hair, side ponytail, green eyes, sidelocks, yellow eyes, one side up, parted bangs | | | | race | single ear cover, visor cap, freckles, choker, collarbone, pink jacket, crop top, cropped jacket, short sleeves, wristband, midriff, navel, white belt, pink skirt, miniskirt, bike shorts, shorts under skirt, black shorts, black socks, shoes, pink footwear | | | | original art | single ear cover, visor cap, freckles, track jacket, cleavage, pleated skirt, bike shorts, shorts under skirt, black socks, sneakers | | | | valentine | official alternate costume, maid headdress, hair ornament, necklace, jewelry, white shirt, clothing cutout, shoulder cutout, short sleeves, heart, wrist cuffs, red apron, plaid skirt, white socks, red footwear, mary janes | | | | casual | casual, single ear cover, brown headwear, baseball cap, freckles, hair through headwear, ears through headwear, hood down, 
orange hoodie, drawstring, cropped hoodie, crop top overhang, bodystocking, long sleeves, wirstwatch, black pants, track pants, brown footwear | | ino | Inari One | the feature tags | 1girl, hair between eyes, fang, animal ears, grey eyes, brown hair, bangs, horse tail, horse ears, medium hair, aqua eyes, solo, blue eyes, horse girl, brown eyes, black hair, green eyes, twintails, skin fang, long hair, thick eyebrows | | | | race | fox mask, mask on head, hair bow, yellow bow, choker, jewelry, japanese clothes, purple kimono, short kimono, tasuki, wrist scrunchie, short sleeves, wide sleeves, chest sarashi, ribbon trim, obi, sash, o-ring, shimenawa, kouhaku nawa, rope, kimono skirt, blue skirt, shorts under skirt, bike shorts, white shorts, black socks, zouri, ribbon-trimmed legwear, sandals, brown footwear, mismatched footwear, mismatched legwear | | | | original | fox mask, mask on head, hair bow, yellow bow, school uniform, collared shirt, white shirt, purple necktie, short sleeves, necklace, bead bracelet, clothes around waist, purple skirt, pleated skirt, socks, red footwear | | | | festival | official alternate costume, two side up, ear covers, hair ornament, choker, japanese clothes, bare shoulders, detached sleeves, wide sleeves, ribbon-trimmed sleeves, long sleeves, white kimono, sleeveless kimono, taut clothes, sash, obi, skirt, black pantyhose, toeless legwear, sandals | | | | casual | casual, mask on head, fox mask, hair bow, yellow bow, drawstring, purple hoodie, long sleeves, belt, purple bow, black shorts, sneakers, white footwear | | jgp | Jungle Pocket | the feature tags | 1girl, brown eyes, white hair, short hair, orange hair, medium hair, multicolored hair, blonde hair, animal ears, streaked hair, horse ears, hair between eyes, green eyes, antenna hair, horse girl, yellow eyes, long hair, ahoge, horse tail, bangs, brown hair, solo, braid | | | | race | ear ornament, hairclip, yellow jacket, crop top, cropped jacket, open clothes, open jacket, 
strapless, tube top, bandeau, long sleeves, midriff, navel, suspenders, animal print, belt, suspender skirt, black skirt, pleated skirt, green socks, black footwear, boots | | | | casual | casual, ear ornament, hairclip, necklace, jewelry, orange hoodie, hooded jacket, hood down, open clothes, open jacket, animal print, sleeveless shirt, white shirt, clothes writing, long sleeves, black shorts, black footwear, sneakers | | kho | King Halo | the feature tags | 1girl, red eyes, brown eyes, purple eyes, bangs, pink eyes, long hair, blue eyes, blonde hair, horse tail, solo, orange eyes, brown hair, black hair, horse ears, parted bangs, yellow eyes, medium hair, horse girl, animal ears | | | | race | ear covers, green bow, hair bow, purple gloves, off shoulder, bare shoulders, short sleeves, sleeveless, off-shoulder dress, collared dress, green dress, double-breasted, buttons, frills, garter straps, frilled thighhighs, purple thighhighs, white footwear, high heels | | | | cheerleader | ear covers, official alternate costume, side ponytail, hairband, long sleeves, armband, white gloves, white jacket, buttons, double-breasted, pleated skirt, white dress, layered dress, blue skirt, long skirt, black socks, brown footwear | | | | SSR Power | official alternate costume, ear covers, red headwear, hat feather, bare shoulders, off shoulder, necklace, jewelry, wide sleeves, long sleeves, black gloves, lace gloves, red dress, off-shoulder dress, frilled dress, layered dress, lace trim, frills, fishnets, lace-trimmed legwear, black thighhighs | | | | casual | casual, ear covers, green bow, hair bow, one side up, short sleeves, bracelet, vertical stripes, striped dress, black dress, see-through, high heels, black footwear | | kmr | Kashimoto Riko | the feature tags | 1girl, hair between eyes, bangs, yellow eyes, purple hair, ahoge, green hair, short hair, green eyes, brown eyes, red eyes, brown hair, hair behind ear, black eyes, solo, dark blue hair, blue hair, blue eyes, black 
hair, grey eyes, wavy hair, medium hair, purple eyes, long hair | | | | original art | formal, long sleeves, collared shirt, wing collar, white shirt, dress shirt, vertical-striped shirt, buttons, striped, pinstripe pattern, pinstripe suit, black suit, black jacket, open clothes, open jacket, frilled sleeves, black belt, belt buckle, pant suit, black pants, black footwear, high heels | | kra | Kiryuin Aoi | the feature tags | 1girl, purple eyes, hair ornament, brown eyes, blunt bangs, blue hair, blue eyes, dark blue hair, black hair, hairclip, ponytail, grey hair, purple hair, asymmetrical bangs, solo, straight hair, short hair, medium hair, bangs | | | | original art | dress shirt, white shirt, collared shirt, frilled shirt collar, long sleeves, black vest, buttons, blue pants, black footwear, shoes | | ksb | Kitasan Black | the feature tags | 1girl, red eyes, white hair, solo, brown eyes, horse tail, orange eyes, short hair, ahoge, two-tone hair, streaked hair, hair between eyes, yellow eyes, horse ears, medium hair, brown hair, tail, animal ears, black hair, multicolored hair, horse girl | | | | race | hair ribbon, hair ornament, fingerless gloves, yellow gloves, japanese clothes, long sleeves, wide sleeves, detached sleeves, bare shoulders, clothing cutout, cleavage cutout, kouhaku nawa, brown skirt, pleated skirt, zettai ryouiki, black thighhighs, sandals, red footwear | | | | new year | official alternate costume, braid, fur collar, ear ornament, hair ornament, detached collar, bare shoulders, detached sleeves, strapless dress, wide sleeves, frilled sleeves, ribbon trim, feathers, white dress, thigh strap, thighhighs, tabi, white footwear, platform footwear, sandals | | | | young | aged down, ponytail, hair ribbon, hair ornament, long sleeves, white shirt, track jacket, black jacket, open clothes, open jacket, food print, yellow shorts, black footwear, sneakers | | | | casual | casual, hair ribbon, hair ornament, long sleeves, white sleeves, hoodie, hood 
down, hooded jacket, blue jacket, black shirt, yellow shorts, short shorts, black thighhighs, sneakers, shoes, red footwear | | ksm | K.S.Miracle | the feature tags | 1girl, horse ears, solo, short hair, blue eyes, horse girl, purple hair, bangs, purple eyes, long bangs, blue hair, medium hair, light blue hair, black hair, horse tail, animal ears, low ponytail, sidelocks | | | | race | ear covers, blue shirt, collared shirt, blue coat, open coat, white ascot, brooch, jewelry, white gloves, half gloves, long sleeves, belt, black pants, thigh strap, tailcoat, boots | | | | SSR Guts | official alternate costume, ear covers, collared shirt, white shirt, dress shirt, sweater vest, yellow vest, black pants | | | | casual | casual, ear covers, ear ornament, white dress, long sleeves, jeans, blue pants, brown footwear, shoes | | kta | Katsuragi Ace | the feature tags | 1girl, solo, horse girl, ponytail, animal ears, horse ears, multicolored hair, black hair, streaked hair, white hair, long hair, hair between eyes, blue eyes, bangs | | | | race | ear covers, hair ornament, black gloves, red gloves, two-tone gloves, half gloves, high collar, blue shirt, frills, clothing cutout, cleavage cutout, long sleeves, black coat, belt, black shorts, short shorts, kneehighs, black socks, white footwear, sneakers | | kwp | Kawakami Princess | the feature tags | 1girl, very long hair, braid, horse ears, aqua eyes, brown hair, asymmetrical bangs, hair intakes, bangs, green eyes, orange hair, french braid, pink hair, parted bangs, long hair, red hair, ahoge, grey eyes, crown braid, sidelocks, blue eyes, horse tail, animal ears, horse girl, solo, forehead, ponytail, twintails | | | | race | ear ornament, pink dress, bare shoulders, cleavage, sleeveless dress, detached sleeves, pink sleeves, white thighhighs, pink footwear, boots | | | | original art | ear ornament, puffy short sleeves, black dress, white jacket, cropped jacket, white gloves, frills, white thighhighs, white footwear, boots | 
| | | SSR Speed | official alternate costume, hair ornament, hair flower, tiara, bridal veil, frilled choker, blue choker, pearl necklace, jewelry, collarbone, wedding dress, strapless dress, aqua dress, bare shoulders, cleavage, elbow gloves, wrist cuffs, white gloves, bridal garter, aqua footwear, high heels | | | | casual | casual, ear ornament, red dress, plaid dress, white jacket, open clothes, puffy short sleeves, cropped jacket, black bow, belt, black socks, red footwear | | ltc | Little Cocon | the feature tags | 1girl, horse girl, horse ears, swept bangs, blonde hair, green hair, streaked hair, horse tail, short hair, animal ears, solo, orange hair, bangs, multicolored hair, brown hair, hair between eyes, blue hair, green eyes, two-tone hair | | | | race | hair bow, striped bow, hood down, black hoodie, black choker, black collar, frilled sailor collar, drawstring, puffy long sleeves, sleeves past wrists, blue dress, chest harness, o-ring, short dress, frills, kneehighs, knee boots, striped socks, blue footwear, fishnets, asymmetrical legwear, mismatched legwear | | lth | Light Hello | the feature tags | 1girl, brown hair, horse girl, short hair, red eyes, brown eyes, horse tail, solo, long hair, animal ears, purple eyes, bangs, horse ears, braid, medium hair, pink eyes, hair intakes, black hair | | | | original art | casual, hair ornament, hairclip, necklace, jewelry, blue shirt, long sleeves, shirt tucked in, belt, white skirt, blue footwear, high heels | | mcb | Mr. C.B. 
| the feature tags | 1girl, very long hair, ahoge, tail, aqua eyes, swept bangs, long hair, blue eyes, animal ears, horse tail, solo, yellow eyes, bangs, horse ears, brown hair, black hair, horse girl, hair between eyes, green eyes | | | | race | mini top hat, hairclip, yellow choker, single bare shoulder, strapless shirt, green shirt, crop top, single sleeve, white jacket, open jacket, open clothes, cropped jacket, long sleeves, wrist cuffs, arm strap, white pants, high heels, black footwear | | | | BoC'z | hairclip, official alternate costume, baseball cap, ears through headwear, black headwear, black shirt, long sleeves, crop top, collared shirt, belt, black shorts, black socks, high heels, high heel boots | | | | casual | casual, jewelry, necklace, white shirt, bra, long sleeves, brown jacket, open jacket, partially unbuttoned, shirt tucked in, brown belt, denim, blue shorts, torn clothes, white socks, white footwear | | mdt | Meisho Doto | the feature tags | 1girl, multicolored hair, tail, purple hairband, short hair, horse tail, hair between eyes, bangs, streaked hair, horse girl, ahoge, horse ears, solo, medium hair, brown eyes, brown hair, animal ears, white hair, purple eyes, two-tone hair, red eyes, @_@ | | | | race | pink hairband, ear ribbon, blue ribbon, white shirt, collared shirt, white gloves, strap between breasts, long sleeves, center frills, blue skirt, lace trim, white thighhighs, high heels, white footwear | | | | halloween | official alternate costume, halloween costume, single ear cover, pumpkin hat, orange headwear, frilled hairband, high collar, black dress, white ascot, orange ribbon, neck ribbon, juliet sleeves, puffy long sleeves, frilled sleeves, lace trim, black pantyhose | | | | SSR Stamina | official alternate costume, hair ornament, hair flower, pink hairband, japanese clothes, floral print, purple kimono, long sleeves, wide sleeves, obi, sash, hakama skirt | | | | casual | casual, ear ribbon, blue ribbon, pink hairband, white 
sweater, turtleneck sweater, ribbed sweater, long sleeves, floral print, yellow skirt, long pleated skirt | | mhb | Mihono Bourbon | the feature tags | 1girl, animal ears, hair between eyes, horse ears, long hair, very long hair, brown hair, bangs, solo, tail, horse tail, purple eyes, blue eyes, horse girl, ahoge | | | | race | grey hairband, glowing hair ornament, detached sleeves, bare shoulders, clothing cutout, pink necktie, white leotard, highleg leotard, pleated skirt, grey skirt, lowleg skirt, miniskirt, white thighhighs, grey footwear | | | | valentine | official alternate costume, valentine, white gloves, low-tied long hair, brown hairband, frilled hairband, hair ornament, jewelry, brooch, shoulder cutout, bare shoulders, red ascot, off-shoulder dress, sleeveless dress, wrist cuffs, brown dress, frills, waist apron, red footwear | | | | SSR Wisdom | official alternate costume, ghost costume, halloween, mini top hat, black headwear, jack-o'-lantern on head, purple bow, hood up, white shirt, long sleeves, white jacket, chain | | | | casual | casual, grey hairband, white hoodie, clothes writing, long sleeves, white jacket, open clothes, open jacket, yoga pants, blue pants, leggings, black socks, shoes, white footwear | | mht | Manhattan Cafe | the feature tags | 1girl, solo, brown eyes, horse girl, yellow eyes, blue eyes, white hair, streaked hair, bangs, very long hair, horse ears, horse tail, animal ears, hair between eyes, black hair, multicolored hair, long hair, ahoge, long bangs | | | | race | black choker, black gloves, long sleeves, collared shirt, yellow necktie, black vest, black coat, belt, black skirt, pleated skirt, black pantyhose, shoes, white footwear | | | | SSR Stamina | official alternate costume, alternate hairstyle, ponytail, single earring, long sleeves, layered sleeves, green apron, collared shirt, white shirt, sweater vest, neck ribbon, white ribbon | | | | casual | casual, neck ribbon, black ribbon, puffy long sleeves, bracelet, 
yellow shirt, collared shirt, frilled shirt collar, shirt tucked in, black belt, belt buckle, green skirt, plaid skirt, long skirt, black pantyhose, black footwear |
| mja | Mejiro Ardan | the feature tags | 1girl, horse tail, horse ears, animal ears, blue eyes, horse girl, very long hair, purple eyes, tail, long hair, solo, bangs, blue hair, grey hair, crown braid, light blue hair, braid |
| | | race | ear ornament, black gloves, white gloves, asymmetrical gloves, mismatched gloves, detached sleeves, puffy sleeves, short sleeves, long sleeves, asymmetrical sleeves, uneven sleeves, ribbon, bare shoulders, off-shoulder dress, black dress, frilled dress, center frills, knee boots, asymmetrical footwear, mismatched footwear |
| | | SSR Speed | official alternate costume, ear ornament, choker, detached sleeves, wrist cuffs, off-shoulder dress, white dress, strapless dress, bare shoulders, collarbone, blue thighhighs |
| | | casual | jewelry, necklace, long sleeves, white shirt, belt, blue skirt, long skirt, floral print, socks |
| mjb | Mejiro Bright | the feature tags | 1girl, brown hair, brown eyes, wavy hair, ahoge, side braids, twin braids, horse girl, grey eyes, horse ears, solo, side braid, orange hair, very long hair, hair between eyes, yellow eyes, long hair, animal ears, horse tail, bangs, multicolored hair, green eyes, purple eyes, red eyes, tail |
| | | race | hair bow, collared shirt, white shirt, puffy short sleeves, center frills, frilled dress, green dress, striped, belt, green skirt, frilled skirt, black thighhighs, mismatched footwear, black footwear, white footwear, asymmetrical footwear, knee boots |
| | | casual | hair bow, casual, frills, green dress, puffy short sleeves |
| mjd | Mejiro Dober | the feature tags | 1girl, green hair, parted bangs, hair between eyes, brown eyes, black hair, animal ears, horse ears, horse tail, long hair, bangs, very long hair, purple eyes, blue eyes, purple hair, solo, horse girl, brown hair |
| | | race | ear ornament, hairclip, collared shirt, white shirt, sleeveless shirt, green bowtie, white gloves, belt, green skirt, center frills, frilled skirt, single thigh strap, kneehighs, white socks, high heels |
| | | original art | ear ornament, hairclip, collared shirt, white shirt, sleeveless shirt, green bowtie, white gloves, belt, suspenders, green skirt, pleated skirt, kneehighs, white socks, high heels |
| | | camping | official alternate costume, straw hat, ears through headwear, criss-cross halter, sleeveless blue dress, bead bracelet, vertical-striped dress, white bow, frilled dress, sandals |
| | | SSR Wisdom | official alternate costume, hair ornament, hair flower, veil, wedding dress, jewelry, necklace, off-shoulder dress, strapless dress, white dress, short sleeves, detached sleeves, white gloves |
| | | casual | casual, ear ornament, hairclip, necklace, jewelry, white dress, shoulder cutout, bare shoulders, belt, black pantyhose |
| mjm | Mejiro McQueen | the feature tags | 1girl, solo, horse ears, long hair, animal ears, horse girl, purple eyes, tail, horse tail, purple hair, swept bangs |
| | | race | green bowtie, armband, long sleeves, frilled sleeves, frills, striped shirt, black coat, pleated skirt, black skirt, kneehighs, socks, black footwear, lace-up boots, cross-laced footwear |
| | | Anime Collab | official alternate costume, mini top hat, mini hat, white headwear, braid, ascot, clothing cutout, shoulder cutout, short sleeves, white gloves, cropped jacket, white jacket, belt, navel, midriff, short shorts, white shorts, white footwear, knee boots, lace-up boots, cross-laced footwear |
| | | race | official alternate costume, low twintails, hair ornament, hair flower, necklace, bracelet, green ribbon, swimsuit, see-through dress, white dress, bare shoulders, off-shoulder dress, frills, frilled dress, green bikini, sandals |
| | | casual | casual, neck ribbon, pinafore dress, sleeveless dress, blue dress, white shirt, long sleeves |
| mjr | Mejiro Ryan | the feature tags | 1girl, streaked hair, bangs, white hair, brown hair, animal ears, horse tail, short hair, black hair, solo, blue eyes, two-tone hair, brown eyes, horse ears, purple eyes, grey eyes, multicolored hair, very short hair, horse girl |
| | | race | ear piercing, white gloves, studded bracelet, long sleeves, sleeveless shirt, collared shirt, clothes writing, vertical-striped shirt, tied shirt, striped jacket, white jacket, open jacket, open clothes, cropped jacket, green necktie, crop top, navel, midriff, vertical stripes, brown belt, short shorts, black shorts, thigh strap, high heel boots, white footwear |
| | | valentine | hair flower, official alternate costume, long sleeves, collared shirt, white shirt, brown vest, striped bowtie, brown belt, frills, chain, gold trim, brown pants, brown footwear |
| | | SSR Guts | ear piercing, official alternate costume, bare shoulders, short sleeves, crop top, blue shirt, camisole, open clothes, open jacket, blue jacket, see-through, belt, blue shorts, short shorts, blue footwear, sneakers |
| | | casual | casual, ear piercing, necklace, wristband, long sleeves, white shirt, blue dress, open jacket, open clothes, black jacket, capri pants, black pants, blue footwear, sneakers |
| mkk | Matikanefukukitaru | the feature tags | 1girl, horse tail, horse ears, tail, solo, horse girl, animal ears, +_+, hair ornament, short hair, orange hair, flipped hair, yellow eyes, blue skirt |
| | | race | school uniform, serafuku, blue sailor collar, clothing cutout, shoulder cutout, short sleeves, white shirt, red neckerchief, backpack, bead bracelet, maneki-neko, blue skirt, pleated skirt, ema, ribbon trim, ribbon-trimmed legwear, zettai ryouiki, white thighhighs, shoes, brown footwear, loafers |
| | | full armor | official alternate costume, yellow headwear, beret, jewelry, necklace, bag, bag charm, backpack, puffy long sleeves, bead bracelet, collared dress, yellow dress, frills, charm \(object\), white socks, green shoes |
| | | SSR Speed | official alternate costume, hair flower, choker, frilled sleeves, wide sleeves, long sleeves, frills, japanese clothes, green kimono, floral print, hakama, sash, obi, shide, yellow skirt, plaid skirt, green socks, ema |
| | | casual | collared shirt, pink shirt, long sleeves, open clothes, green jacket, brown pants, brown shorts, brown skirt, plaid skirt |
| mkt | Matikanetannhauser | the feature tags | 1girl, horse ears, animal ears, solo, horse tail, horse girl, brown hair, hair ornament, hairclip, medium hair, yellow eyes |
| | | race | cabbie hat, blue headwear, blue bowtie, cape, white shirt, collared shirt, shoulder cutout, puffy long sleeves, center frills, corset, belt pouch, belt, blue skirt, frilled skirt, brown footwear, lace-up boots, knee boots, high heel boots |
| | | BoC'z | official alternate costume, baseball cap, ears through headwear, black headwear, hairclip, off shoulder, crop top, sports bra, tank top, black jacket, open clothes, open jacket, belt, black pants, boots |
| | | casual | casual, beret, red headwear, neck ribbon, long sleeves, collared shirt, pink shirt, open clothes, blue cardigan, shirt tucked in, plaid skirt, brown skirt |
| mpm | Mejiro Palmer | the feature tags | 1girl, aqua eyes, bangs, long hair, green eyes, horse tail, wavy hair, grey eyes, streaked hair, white hair, blue eyes, hair between eyes, multicolored hair, brown hair, medium hair, two-tone hair, horse girl, horse ears, ponytail, solo, animal ears, very long hair, sidelocks, ahoge, parted bangs |
| | | race | ear ornament, fur trim, fur-trimmed jacket, white jacket, yellow shirt, necklace, jewelry, open jacket, open clothes, long sleeves, white gloves, single glove, fingerless gloves, crop top, midriff, cropped jacket, navel, belt, green skirt, miniskirt, shoes, sneakers, asymmetrical footwear, mismatched footwear, black footwear, white footwear |
| | | SSR Stamina | halloween costume, official alternate costume, ear ornament, demon horns, jewelry, crop top, bare shoulders, sleeveless, cleavage, navel, midriff, single glove, elbow gloves, black gloves, asymmetrical gloves, fur trim, bracelet, nail polish, multicolored nails, fingerless gloves, demon wings, brown belt, black shorts, short shorts, thigh strap |
| | | casual | ear ornament, casual, green shirt, green bowtie, long sleeves, belt, shirt tucked in, grey pants, plaid, black footwear, high heels |
| mrm | Mejiro Ramonu | the feature tags | 1girl, multicolored hair, black hair, solo, yellow eyes, grey hair, mole under eye, parted bangs, single hair bun, forehead, red eyes, braid, black eyes, animal ears, horse tail, braided bun, grey eyes, purple hair, bangs, blue eyes, horse girl, purple eyes, two-tone hair, hair bun, medium hair, pink eyes, brown eyes, streaked hair, horse ears, brown hair, white hair |
| | | race | ear piercing, white glove, single glove, wristband, white shirt, blue ascot, open clothes, white jacket, cropped jacket, long sleeves, high-waist skirt, black skirt, frills, single thighhigh, black thighhighs, asymmetrical legwear, white footwear, black footwear, high heels, asymmetrical footwear, mismatched footwear |
| mrz | Maruzensky | the feature tags | 1girl, blonde hair, drill hair, horse tail, brown hair, horse girl, brown eyes, multicolored hair, green eyes, orange hair, horse ears, animal ears, bangs, very long hair, hair between eyes, grey eyes, aqua eyes, solo, blue eyes, long hair |
| | | race | ear bow, hair bow, ribbon, pendant choker, red shirt, brown sailor collar, white bowtie, red jacket, cropped jacket, long sleeves, red skirt, pleated skirt, thigh strap, red thighhighs, high heel boots, brown footwear |
| | | summer | official alternate costume, ponytail, ear ornament, tinted eyewear, heart-shaped eyewear, eyewear on head, sunglasses, bracelet, necklace, jewelry, bare shoulders, black bikini, frilled bikini, chain, red flower, purple flower, single thigh strap, sandals, brown footwear |
| | | casual | casual, ear ornament, blue bow, hair bow, red neckerchief, floral print, white dress, sleeveless dress |
| mvs | Marvelous Sunday | the feature tags | 1girl, horse tail, animal ears, horse ears, horse girl, tail, solo, fang, black hair, twintails, yellow eyes, +_+, medium hair |
| | | race | hair flower, hair ornament, red hairband, necklace, jewelry, puffy short sleeves, white shirt, center frills, underbust, wrist cuffs, multicolored dress, pink dress, layered skirt, frilled dress, frills, white pantyhose, purple footwear, high heels |
| | | original art | hair flower, hair ornament, red hairband, black gloves, puffy short sleeves, neck ribbon, brooch, collared shirt, white shirt, center frills, underbust, black bow, high-waist skirt, black skirt, layered skirt, frilled skirt, black dress, frilled dress, white pantyhose, mary janes, black footwear |
| | | SSR Power | official alternate costume, hair flower, hair ornament, red hairband, red scarf, fringe trim, white gloves, long sleeves, open clothes, brown coat, white dress, ribbon, brown shorts, boots, brown footwear |
| nkf | Nakayama Festa | the feature tags | 1girl, red eyes, hat, hair between eyes, brown hair, multicolored hair, horse girl, pink eyes, long hair, orange hair, medium hair, purple eyes, horse ears, horse tail, bangs, brown eyes, streaked hair, solo, blue eyes, animal ears |
| | | race | beanie, grey headwear, jewelry, dice necklace, chain necklace, striped shirt, red shirt, torn shirt, long sleeves, open coat, blue coat, black gloves, fingerless gloves, black belt, black pants, torn jeans, shoes, black footwear, sneakers |
| | | original art | beanie, grey headwear, headphones around neck, long sleeves, collared shirt, white shirt, blue jacket, letterman jacket, fingerless gloves, black gloves, single glove, bracelet, red sweater vest, white skirt, pleated skirt, miniskirt, black thighhighs |
| | | SSR Wisdom | official alternate costume, ahoge, bare shoulders, jewelry, necklace, halterneck, bead bracelet, black bikini, black shorts, bikini shorts, sandals |
| | | casual | casual, beanie, grey headwear, long sleeves, white hoodie, hood down, drawstring, denim jacket, blue jacket, open clothes, open jacket, black shorts, short shorts |
| nnt | Nice Nature | the feature tags | 1girl, twintails, bangs, blue eyes, streaked hair, horse tail, grey eyes, green eyes, red eyes, horse girl, solo, brown hair, hair between eyes, multicolored hair, two-tone hair, tail, medium hair, animal ears, yellow eyes, brown eyes, horse ears |
| | | race | ear covers, green bowtie, diagonal-striped bowtie, striped puffy sleeves, puffy long sleeves, juliet sleeves, grey shirt, double-breasted, buttons, black dress, frilled dress, pinafore dress, thigh strap, o-ring, red socks, brown footwear, knee boots, cross-laced footwear, lace-up boots, kneehighs |
| | | cheerleader | official alternate costume, cheerleader, ponytail, long sleeves, crop top, white shirt, sailor collar, midriff, navel, blue jacket, open jacket, belt, layered skirt, pleated skirt, white skirt, miniskirt, thigh strap, shorts under skirt, short shorts, orange shorts, white socks, red footwear, sneakers, shoes |
| | | casual | casual, ear covers, long sleeves, wide sleeves, red shirt, white shirt, denim skirt, handbag, shoulder bag |
| nrb | Narita Brian | the feature tags | 1girl, solo, animal ears, horse ears, yellow eyes, long hair, horse girl, ponytail, black hair, horse tail, bangs, tail, hair between eyes |
| | | race | ear ornament, bandaid on nose, rope, shimenawa, fingerless gloves, pink gloves, crop top, chest sarashi, torn clothes, white coat, open coat, long sleeves, brown belt, miniskirt, plaid skirt, pleated skirt, black skirt, bandaged leg, single black thighhigh, boots, asymmetrical footwear, black footwear |
| | | blaze | official alternate costume, single ear cover, ear ornament, bandaid on nose, asymmetrical clothes, crop top, short sleeves, black gloves, elbow gloves, black skirt, purple pants, knee boots, black footwear |
| | | SSR Stamina | official alternate costume, ear ornament, bandaid on nose, single sleeve, single bare shoulder, detached sleeves, asymmetrical sleeves, chest sarashi, japanese clothes, asymmetrical clothes, obi, sash, long sleeves, purple kimono, strapless, floral print, rope, shimenawa, shorts |
| | | casual | casual, ear ornament, bandaid on nose, white shirt, long sleeves, shirt tucked in, high-waist pants, purple pants |
| nrt | Narita Taishin | the feature tags | 1girl, blue eyes, long bangs, horse girl, brown hair, horse tail, animal ears, pink hair, short hair, hair between eyes, asymmetrical bangs, parted bangs, tail, green eyes, horse ears, swept bangs, black hair, multicolored hair, grey eyes, purple eyes, bangs, solo |
| | | race | ear ornament, horseshoe ornament, yellow shirt, tied shirt, hood, fur-trimmed jacket, pink jacket, long sleeves, open jacket, brown belt, plaid jacket around waist, asymmetrical clothes, single pantsleg, blue pants, torn jeans, asymmetrical footwear, mismatched footwear |
| | | steampunk | official alternate costume, steampunk, ear ornament, ear covers, monocle, under-rim eyewear, high collar, frilled shirt, white shirt, striped bowtie, long sleeves, brooch, fingerless gloves, black gloves, corset, brown belt, pouch, short shorts, purple shorts, garter straps, black thighhighs, brown footwear |
| | | SSR Wisdom | official alternate costume, ear ornament, jewelry, necklace, white shirt, clothes writing, open clothes, green coat, fur-trimmed coat, denim shorts, short shorts, black pantyhose |
| | | casual | casual, ear ornament, beret, black headwear, white shirt, clothes writing, hood, red jacket, long sleeves, open clothes, blue pants, torn jeans |
| nsf | Nishino Flower | the feature tags | 1girl, solo, horse ears, animal ears, purple eyes, x hair ornament, short hair, horse girl, bangs, pink hairband |
| | | race | purple bowtie, striped bowtie, bare shoulders, wrist cuffs, white shirt, buttons, white dress, sleeveless dress, white skirt, yellow skirt, frilled skirt, thigh strap, yellow footwear, knee boots |
| | | original art | red ribbon, collared shirt, striped shirt, white shirt, long sleeves, pleated dress, yellow dress, pleated skirt, belt, black footwear, shoes |
| | | casual | casual, long sleeves, neck ribbon, collared shirt, blue shirt, blue dress, pink jacket |
| ntr | Narita Top Road | the feature tags | 1girl, multicolored eyes, bangs, horse tail, brown eyes, horse girl, yellow eyes, tail, orange eyes, parted bangs, purple eyes, blonde hair, pink hair, medium hair, solo, long hair, multicolored hair, streaked hair, short hair, animal ears, white hair, horse ears, pink eyes, light brown hair, red eyes, brown hair |
| | | race | ear cover, star hair ornament, brooch, sleeveless dress, purple dress, white bow, mismatched gloves, white gloves, black gloves, shoulder cutout, puffy long sleeves, detached sleeves, see-through sleeves, frills, shorts under dress, short shorts, purple shorts, thigh strap, knee boots, purple footwear |
| ogr | Oguri Cap | the feature tags | 1girl, bangs, grey hair, horse tail, horse girl, yellow eyes, hair between eyes, horse ears, blue eyes, long hair, purple eyes, ahoge, white hair, grey eyes, animal ears, multicolored hair, solo, very long hair, two-tone hair |
| | | race | ear ornament, hairband, blue sailor collar, white jacket, white shirt, red neckerchief, long sleeves, midriff peek, belt buckle, grey belt, blue skirt, pleated skirt, black pantyhose, fur-trimmed boots, grey footwear |
| | | christmas | official alternate costume, hair bell, christmas hair ornaments, striped bowtie, brooch, white shirt, center frills, puffy short sleeves, cropped jacket, white jacket, white gloves, belt, frilled skirt, layered skirt, black pantyhose, knee boots, black footwear |
| | | SSR Power | official alternate costume, ear ornament, hairband, brown shirt, puffy short sleeves, belt, shirt tucked in, denim skirt |
| | | casual | casual, white shirt, open clothes, brown jacket, brown belt, blue pants, denim, jeans |
| rcs | Rice Shower | the feature tags | 1girl, pink hair, brown hair, black hair, horse ears, bangs, horse tail, very long hair, purple eyes, pink eyes, animal ears, red eyes, hair over one eye, blue eyes, long hair, horse girl, solo |
| | | race | blue headwear, tilted headwear, hat flower, blue rose, long sleeves, purple sleeves, sleeves past wrists, fur collar, off-shoulder dress, bare shoulders, blue dress, dress bow, sheathed dagger, lace trim, lace-trimmed legwear, brown thighhighs, shoes, black footwear |
| | | halloween | official alternate costume, halloween costume, hair flower, blue rose, hair ornament, frilled hairband, purple gloves, striped bowtie, black bowtie, brooch, puffy short sleeves, bat wings, collared shirt, white shirt, center frills, jack-o'-lantern, orange bow, high-waist skirt, medium skirt, purple skirt, spider web print, black skirt, purple footwear |
| | | SSR Power | casual, beret, brown headwear, hair ribbon, black ribbon, neck ribbon, long sleeves, brown jacket, pink dress, open clothes, open jacket, cropped jacket, striped dress, vertical-striped skirt, black pantyhose |
| | | SSR Wisdom | official alternate costume, bride, braided ponytail, blue bow, hair flower, blue rose, hair ornament, bridal veil, jewelry, necklace, white gloves, wedding dress, white dress, sleeveless dress, garter straps, white thighhighs, high heels, white footwear |
| | | BoC'z | official alternate costume, baseball cap, ears through headwear, black headwear, hair over one eye, single glove, covered collarbone, black shirt, sleeveless shirt, turtleneck, single long sleeves, grey skirt, pleated skirt, black socks, boots, black footwear |
| | | casual | casual, hairband, hair ribbon, brown ribbon, long sleeves, collared dress, brown dress, frilled dress, frills, white socks, brown footwear, mary janes |
| sbo | Sakura Bakushin O | the feature tags | 1girl, horse ears, solo, animal ears, long hair, brown hair, horse girl, horse tail, ponytail, asymmetrical bangs, purple eyes, pink eyes |
| | | race | epaulettes, sleeveless shirt, pink shirt, sailor collar, capelet, yellow neckerchief, asymmetrical gloves, mismatched gloves, black gloves, white gloves, black shorts, white thighhighs, asymmetrical footwear, mismatched footwear, high heel boots |
| | | casual | casual, denim jacket, open clothes, blue jacket, open jacket, black shirt, long sleeves, shirt tucked in, white shorts, floral print, short shorts |
| sbr | Symboli Rudolf | the feature tags | 1girl, solo, animal ears, horse ears, horse tail, tail, horse girl, brown hair, long hair, multicolored hair, streaked hair, purple eyes, white hair, bangs |
| | | race | red cape, epaulettes, aiguillette, medal, green jacket, long sleeves, white gloves, white ascot, buttons, double-breasted, belt, green skirt, frilled skirt, dress, zettai ryouiki, gold trim, black thighhighs, black footwear |
| | | festival | official alternate costume, hadanugi dousa, japanese clothes, chest sarashi, open kimono, detached sleeves, single sleeve, ponytail, single bare shoulder, long sleeves, obi, sash, tabi, sandals |
| | | casual | casual, handbag, jewelry, green shirt, long sleeves, sweater, glasses, shoulder bag, belt, adjusting eyewear, collarbone, white pants |
| sco | Sakura Chiyono O | the feature tags | 1girl, solo, eyes, animal ears, horse ears, horse girl, horse tail, hair ornament, pink hair, hair flower, hair between eyes, medium hair, hair flaps |
| | | race | fingerless gloves, black gloves, collarbone, detached sleeves, wide sleeves, long sleeves, sleeveless, sailor collar, bare shoulders, japanese clothes, pink kimono, sash, obi, ribbon trim, pleated skirt, white skirt, white thighhighs, sandals |
| | | SSR Guts | official alternate costume, swimsuit, navel, bare shoulders, off shoulder, collarbone, pink bikini, frills, cleavage, off-shoulder bikini, holding water gun, frilled bikini, bridal garter, wrist scrunchie |
| | | casual | casual, pink sweater, long sleeves, white shirt, floral print, pantyhose, shoulder bag, shorts, checkered skirt |
| sis | Silence Suzuka | the feature tags | 1girl, orange hair, green eyes, blunt bangs, long hair, brown hair, solo, horse tail, aqua eyes, animal ears, blue eyes, very long hair, horse girl, horse ears, hime cut |
| | | race | ear covers, hair ornament, white hairband, black bowtie, black gloves, layered sleeves, short over long sleeves, puffy short sleeves, green sailor collar, white jacket, white skirt, pleated skirt, black pantyhose, loafers, asymmetrical footwear, mismatched footwear |
| | | casual | ear covers, hair ornament, white hairband, casual, short sleeves, white shirt, green ribbon, neck ribbon, blue skirt, pleated skirt, brown pantyhose |
| siu | Seiun Sky | the feature tags | 1girl, short hair, hair between eyes, purple eyes, blue eyes, grey eyes, white hair, horse ears, green eyes, bangs, horse girl, grey hair, solo, light green hair, green hair, horse tail, animal ears |
| | | race | single ear cover, hair flower, hairclip, brown choker, green sailor collar, white shirt, center frills, jewelry, bracelet, layered sleeves, short over long sleeves, wide sleeves, white dress, green shorts, short shorts, single thigh strap, fur-trimmed boots, grey footwear, high heels |
| | | ballroom | official alternate costume, single ear cover, neck ribbon, white jacket, flower, cropped jacket, open jacket, long sleeves, white gloves, collared shirt, white shirt, blue vest, buttons, underbust, blue pants, thigh strap, legwear garter, kneehighs, blue footwear |
| | | SSR Wisdom | official alternate costume, single ear cover, hood up, ears through headwear, red flower, head wreath, jewelry, bracelet, fur-trimmed hood, hooded cloak, robe, choker, long wide sleeves, frilled sleeves, black shorts, thigh strap |
| | | casual | casual, single ear cover, hair flower, hairclip, black shirt, t-shirt, clothes writing, one-side off shoulder, short sleeves, overalls, suspenders pants, white footwear, sneakers |
| skl | Sakura Laurel | the feature tags | 1girl, animal ears, multicolored hair, red hair, solo, red eyes, black hair, purple eyes, pink eyes, horse girl, parted bangs, brown hair, two-tone hair, orange hair, white hair, horse ears, bangs, short shorts, horse tail, short hair, pink hair, medium hair |
| | | race | ear bow, symbol in eye, symbol-shaped pupils, sailor collar, pink ascot, white shirt, pink vest, puffy short sleeves, wrist cuffs, white shorts, single thighhigh, black thighhighs, knee boots, asymmetrical footwear, white footwear, mismatched footwear, high heels |
| | | SSR Stamina | official alternate costume, christmas, hair ornament, ear covers, symbol-shaped pupils, symbol in eye, halterneck, halter dress, fur trim, fur-trimmed dress, red dress, santa dress, bare shoulders, detached sleeves, elbow gloves, fur-trimmed gloves, red gloves, back bow, green bow, black thighhighs |
| | | casual | casual, ear bow, symbol in eye, symbol-shaped pupils, collarbone, pink cardigan, necklace, jewelry, white shirt, long sleeves, grey skirt, plaid skirt, black pantyhose, pink footwear, sneakers, shoes, shoulder bag |
| skp | Seeking the Pearl | the feature tags | 1girl, animal ears, solo, purple eyes, horse ears, grey eyes, mole under eye, black hair, horse tail, very long hair, green eyes, two-tone hair, lips, hair between eyes, blue eyes, brown hair, bangs, horse girl, parted lips, long hair, red lips, multicolored hair |
| | | race | ear ornament, eyewear on head, sunglasses, red-framed eyewear, bracelet, jewelry, necklace, black gloves, off shoulder, bare shoulders, detached sleeves, long sleeves, red jacket, star \(symbol\), belt, pleated skirt, white skirt, sandals, high heels |
| | | SSR Speed | official alternate costume, ear ornament, eyewear on head, sunglasses, red-framed eyewear, military uniform, white gloves, long sleeves, white jacket, white skirt, pleated skirt |
| | | casual | casual, ear ornament, eyewear on head, sunglasses, red-framed eyewear, jewelry, long sleeves, vertical stripes, striped shirt, bracelet, belt, clothes around waist, black skirt, pencil skirt, high heels, red footwear |
| sks | Symboli Kris S | the feature tags | 1girl, dark skin, tail, aqua eyes, purple hair, long hair, multicolored hair, purple eyes, horse girl, brown hair, horse tail, hair between eyes, bangs, horse ears, ponytail, green eyes, earrings, dark-skinned female, blue eyes, very long hair, animal ears, black hair, ear ornament, solo, braid, ahoge, mole under eye |
| | | race | single earring, hair ornament, high collar, single shoulder armor, black gloves, white ascot, clothing cutout, cleavage cutout, crop top, white shirt, gold trim, green coat, open clothes, long sleeves, navel, midriff, black belt, green shorts, short shorts, single thigh strap, black footwear, knee boots |
| | | casual | casual, single earring, hair ornament, necklace, black shirt, taut shirt, turtleneck, green jacket, long sleeves, open clothes, open jacket, crop top, taut clothes, navel, midriff, white pants |
| skw | Shinko Windy | the feature tags | 1girl, brown eyes, ahoge, brown hair, horse girl, medium hair, horse tail, long hair, ponytail, red eyes, white hair, sidelocks, two-tone hair, animal ears, grey eyes, horse ears, hair between eyes, orange hair, purple eyes, short hair, multicolored hair, bangs, streaked hair, solo, pink eyes, sharp teeth |
| | | race | ear ornament, black choker, spiked choker, fur-trimmed jacket, sleeveless jacket, cropped jacket, open jacket, black shirt, sleeveless shirt, torn clothes, arm strap, fur-trimmed gloves, fingerless gloves, black gloves, midriff, navel, belt, fur-trimmed shorts, black shorts, short shorts, garter straps, thigh strap, black thighhighs, torn thighhighs, fur-trimmed boots, black footwear |
| | | original art | ear ornament, school uniform, collared shirt, white shirt, black jacket, hoodie, hooded jacket, hood down, open jacket, blue necktie, drawstring, partially unzipped, zipper, backpack, long sleeves, black skirt, pleated skirt, miniskirt, black socks, black footwear, sneakers |
| | | SSR Guts | official alternate costume, crown, demon horns, brooch, jewelry, black bowtie, chain, fur-trimmed cape, two-sided cape, black cape, red cape, purple shirt, long sleeves, frilled sleeves, black shorts, short shorts, frills, black thighhighs |
| | | casual | casual, ear ornament, white shirt, clothes writing, long sleeves, red shorts, white socks, black footwear, sneakers |
| smf | Smart Falcon | the feature tags | 1girl, bangs, horse ears, red eyes, solo, twintails, medium hair, orange eyes, horse girl, yellow eyes, animal ears, green eyes, hair between eyes, brown hair, horse tail, brown eyes |
| | | race | hair bow, hair ribbon, puffy short sleeves, wrist cuffs, wrist scrunchie, collared shirt, white shirt, suspenders, black bowtie, center frills, back bow, suspender skirt, pink skirt, high-waist skirt, frilled skirt, dress, bridal garter, thigh strap, frilled socks, red socks, red footwear, sneakers |
| | | original art | hair bow, hair ribbon, puffy short sleeves, wristband, wrist cuffs, argyle, plaid bowtie, sailor collar, yellow shirt, collared shirt, pleated skirt, frilled skirt, zettai ryouiki, white thighhighs, brown footwear, high heels |
| | | grand live | official alternate costume, hair bow, star hair ornament, choker, fingerless gloves, yellow gloves, crop top, sleeveless jacket, midriff, navel, belt, yellow bow, white skirt, pleated skirt, shorts under skirt, black shorts, bike shorts, boots |
| | | SR Guts | hair bow, hair ribbon, sleeveless, strap slip, pink shirt, pink camisole, tank top, bare shoulders, sweatpants, black pants, leggings, yoga pants |
| | | casual | casual, hair bow, sleeves past wrists, puffy long sleeves, frilled collar, pink sweater, clothes writing, brown skirt, pleated skirt, plaid skirt, zettai ryouiki, miniskirt, white thighhighs, loafers, brown footwear |
| spc | Super Creek | the feature tags | 1girl, solo, animal ears, horse ears, horse girl, horse tail, long hair, tail, brown hair, blue eyes, very long hair, hair between eyes, bangs, braided ponytail |
| | | race | blue scarf, detached sleeves, bare shoulders, o-ring, clothing cutout, wrist cuffs, scrunchie, strap between breasts, handbag, shoulder bag, blue dress, white skirt, zettai ryouiki, thigh boots |
| | | halloween | official alternate costume, halloween costume, mummy costume, ear ornament, choker, cleavage, wrist cuffs, bare shoulders, bandages, bandaged arm, fingernails, nail polish, torn clothes, navel, skirt, frills, bandaged leg |
| | | original art | detached sleeves, long sleeves, bare shoulders, sleeveless sweater, ribbed sweater, white sweater, turtleneck sweater, wrist cuffs, strap between breasts, handbag, shoulder bag, blue dress, thighhighs, zettai ryouiki, thigh boots, mismatched footwear, asymmetrical footwear |
| | | casual | puffy long sleeves, sailor collar, yellow shirt, brown shirt, white shirt, pink bowtie, necklace, belt, pleated skirt, blue skirt, brown footwear, high heel boots |
| spw | Special Week | the feature tags | 1girl, streaked hair, braid, two-tone hair, red eyes, white hair, bangs, hair between eyes, solo, animal ears, horse ears, multicolored hair, pink eyes, tail, brown hair, black hair, short hair, purple eyes, horse tail, horse girl |
| | | race | ear bow, purple bow, puffy short sleeves, neck ribbon, blue ribbon, cropped jacket, white jacket, two-tone jacket, collared shirt, white shirt, purple vest, wristband, wrist cuffs, white skirt, pleated skirt, two-tone skirt, frilled skirt, frills, zettai ryouiki, white thighhighs, white footwear, purple footwear, asymmetrical footwear, mismatched footwear, high heels |
| | | summer | official alternate costume, hair ornament, hair flower, hibiscus, bead necklace, detached sleeves, arm garter, puffy short sleeves, bead bracelet, swimsuit, front-tie top, orange bikini, plaid bikini, frilled bikini, bikini skirt, bridal garter, thigh strap, brown footwear, sandals |
| | | commander | official alternate costume, hair ornament, white gloves, long sleeves, white shirt, collared shirt, red jacket, red coat, sash, red skirt, red dress, frills, pleated skirt, waist cape, black footwear, white footwear, mismatched footwear, asymmetrical footwear, knee boots, high heel boots |
| | | casual | ear bow, purple bow, casual, blue jacket, white dress, long sleeves, open clothes, open jacket, floral print |
| ssb | Sirius Symboli | the feature tags | 1girl, streaked hair, very long hair, long hair, tail, pink eyes, white hair, horse girl, brown hair, solo, horse tail, multicolored hair, bangs, hair between eyes, yellow eyes, horse ears, purple eyes, brown eyes, animal ears, red eyes, two-tone hair |
| | | race | black choker, jewelry, cleavage, long sleeves, black gloves, fingerless gloves, crop top, grey shirt, green jacket, open jacket, open clothes, navel, midriff, belt, shirt, green pants, brown footwear, high heel boots |
| | | SSR Wisdom | official alternate costume, formal, tuxedo, blue bowtie, long sleeves, white gloves, butler, black jacket, collared shirt, white shirt, black pants, pant suit, tailcoat, black footwear, high heels |
| stc | Satono Crown | the feature tags | 1girl, solo, black hair, short hair, horse girl, crown, hair between eyes, horse ears, grey eyes, very long hair, horse tail, grey hair, multicolored hair, animal ears, white hair, twintails, long hair, mini crown, bangs, asymmetrical bangs, green eyes, purple hair, blue eyes, side ponytail |
| | | race | ear ornament, black gloves, elbow gloves, collared shirt, suspenders, bare shoulders, yellow ascot, green shirt, sleeveless shirt, black shorts, thighhighs, thigh boots, black footwear |
| std | Satono Diamond | the feature tags | 1girl, ear ornament, solo, long hair, horse ears, brown hair, animal ears, half updo, very long hair, horse girl, hair between eyes, braid, brown eyes, bangs |
| | | race | sleeves past wrists, sleeves past fingers, frilled sleeves, frills, long sleeves, corset, green dress, braid, ascot, green jacket, green skirt, black thighhighs, black pantyhose, boots, white footwear |
| | | new year | official alternate costume, hair flower, frills, red flower, japanese clothes, white kimono, blue skirt, open mouth, wide sleeves, long sleeves, thighhighs |
| | | casual | green shirt, long sleeves, green skirt, closed mouth, frills, sleeves ribbon, long skirt |
| swt | Sweep Tosho | the feature tags | 1girl, solo, brown hair, animal ears, horse tail, purple eyes, horse girl, twintails, horse ears, long hair, tail, hair rings |
| | | race | witch hat, black headwear, black necktie, collared shirt, red shirt, white gloves, wide sleeves, black jacket, black robe, buckle, belt, white skirt, pleated skirt, black thighhighs, thigh boots, black footwear, high heel boots |
| | | SSR Speed | official alternate costume, one-piece swimsuit, hair ornament, hair bow, collarbone, hair flower, casual one-piece swimsuit, frilled swimsuit |
| | | casual | casual, hair bow, headband, collarbone, puffy long sleeves, juliet sleeves, purple necktie, short necktie, collared dress, red dress, white pantyhose, black footwear, mary janes |
| tdc | Tap Dance City | the feature tags | 1girl, black hair, short hair, horse girl, streaked hair, wavy hair, multicolored hair, hair between eyes, brown hair, red eyes, two-tone hair, brown eyes, solo, pink eyes, medium hair, bangs, horse ears, horse tail, animal ears, purple eyes, blue eyes, grey hair, braid |
| | | race | ear ornament, jewelry, detached sleeves, sleeveless dress, bare shoulders, red gloves, off shoulder, clothing cutout, halterneck, striped, crop top, red shirt, bodice, white skirt, very long skirt, black pantyhose, white footwear, high heels |
| tks | Taiki Shuttle | the feature tags | 1girl, solo, animal ears, horse ears, horse tail, horse girl, brown hair, blue eyes, ponytail, bangs, long hair, star hair ornament |
| | | race | green shirt, front-tie top, bare shoulders, detached sleeves, bandeau, bikini under clothes, tube top, red scarf, navel, midriff, stomach, armband, brown gloves, cowboy hat, bandana, green skirt, miniskirt, brown belt, cowboy boots, holstered weapon, star print, zettai ryouiki, purple thighhighs, brown footwear |
| | | camping | official alternate costume, suspender shorts, midriff, suspenders, jewelry, white shorts, hand on hip, bracelet, off-shoulder shirt, crop top, short shorts, bare shoulders, off shoulder, pink shirt, necklace, stomach, belt, hairband, frills, nail polish, thighs, puffy short sleeves, bow, sidelocks, single braid, collarbone, detached sleeves |
| | | casual | yellow shirt, plaid shirt, collared shirt, revolver, brown belt, jeans, jewelry, denim, bracelet, blue pants, sleeves rolled up |
| tkt | Tokai Teio | the feature tags | 1girl, solo, horse ears, brown hair, animal ears, long hair, horse tail, horse girl, blue eyes, white hair, streaked hair, multicolored hair, two-tone hair, high ponytail |
| | | race | hair ribbon, pink ribbon, single epaulette, pink ascot, red capelet, long sleeves, asymmetrical gloves, mismatched gloves, white glove, blue glove, multicolored clothes, two-tone jacket, white jacket, blue jacket, shirt, buttons, double-breasted, white skirt, pleated skirt, two-tone skirt, miniskirt, white footwear, knee boots |
| | | Anime Collab | official alternate costume, feather hair ornament, ear ornament, feathers, choker, necklace, red cape, navel, midriff, fingerless gloves, black gloves, puffy short sleeves, hood down, red jacket, hooded jacket, cropped jacket, open jacket, open clothes, crop top, jewelry, shirt, belt, red skirt, brown skirt, miniskirt, frilled skirt, frills, brown thighhighs, thigh boots, cross-laced footwear, lace-up boots, brown footwear |
| | | SSR Wisdom | official alternate costume, steampunk, hair ribbon, blue ribbon, eyewear on head, goggles on head, sunglasses, long sleeves, fingerless gloves, black gloves, asymmetrical gloves, blue scarf, black jacket, corset, belt, black pants |
| | | casual | casual, hair ribbon, pink ribbon, bra strap, off shoulder, bare shoulders, off-shoulder shirt, short sleeves, striped shirt, multicolored shirt, blue shorts, short shorts |
| tmc | Tamamo Cross | the feature tags | 1girl, purple eyes, white hair, horse ears, grey hair, multicolored hair, solo, very long hair, horse girl, long hair, horse tail, hair between eyes, bangs, animal ears, blue eyes, skin fang, fangs, fang |
| | | race | ear covers, blue hairband, red headband, blue jacket, open jacket, open clothes, long sleeves, lightning bolt symbol, sports bra, crop top, fingerless gloves, white gloves, red belt, belt buckle, white pants, capri pants, blue footwear, sneakers |
| | | festival | official alternate costume, high ponytail, hair tie, hair ornament, ear covers, jewelry, necklace, bare shoulders, sleeveless shirt, black shirt, detached sleeves, fingerless gloves, elbow gloves, black gloves, bracelet, shimenawa, rope, hip vent, side cutout, high-waist pants, blue pants, shoes |
| | | original art | ear covers, blue hairband, red headband, school uniform, serafuku, sailor collar, two-tone neckerchief, wide sleeves, long sleeves, belt buckle, black skirt, pleated skirt, grey thighhighs, brown footwear, loafers |
| | | SSR Power | official alternate costume, halloween costume, mummy costume, hair ornament, hair flower, bandaged head, bandages, bandaged arm, torn clothes, center frills, dress |
| | | casual | casual, ear covers, blue hairband, red headband, long sleeves, off shoulder, white shirt, clothes writing, denim shorts |
| | | young | aged down, ears through headwear, kindergarten uniform, school hat, yellow headwear, blue shirt, long sleeves, bag, black shorts, bandaid on knee,
white socks, white footwear | | tmo | T.M. Opera O | the feature tags | 1girl, hair over one eye, horse girl, horse tail, solo, hair between eyes, animal ears, bangs, purple eyes, brown hair, short hair, horse ears, orange hair, multicolored hair, blonde hair | | | | race | mini crown, ear piercing, white shirt, shoulder armor, pink cape, brooch, puffy long sleeves, jewelry, fingerless gloves, single glove, white gloves, multiple rings, corset, white skirt, pink skirt, two-tone skirt, pleated skirt, white thighhighs, zettai ryouiki, high heel boots, yellow footwear | | | | new year | official alternate costume, hair flower, blue flower, japanese clothes, kimono, blue jacket, cropped jacket, epaulettes, aiguillette, long sleeves, wide sleeves, white gloves, sash, hakama skirt, obi, floral print, high heel boots | | | | casual | casual, ear piercing, white shirt, dress shirt, collared shirt, black jacket, open clothes, open jacket, long sleeves, shirt tucked in, belt, plaid pants | | tng | Tanino Gimlet | the feature tags | 1girl, pink hair, medium hair, horse ears, grey eyes, animal ears, blue hair, purple hair, horse girl, streaked hair, short hair, bangs, black hair, yellow eyes, red eyes, multicolored hair, two-tone hair, green eyes, white hair, solo, brown hair, horse tail, black eyes, blue eyes, brown eyes | | | | race | eyepatch, black choker, half gloves, asymmetrical gloves, mismatched gloves, black glove, white glove, shoulder cutout, bare shoulders, detached sleeves, long sleeves, sleeveless shirt, blue necktie, open clothes, black jacket, collared shirt, yellow shirt, striped vest, belt, chain, white shorts, black skirt, thigh strap, black pantyhose, fishnet pantyhose, knee boots, high heel boots, black footwear | | | | casual | eyepatch, black choker, off shoulder, tank top, purple shirt, black jacket, open jacket, short shorts, denim shorts, cutoffs, fishnet pantyhose | | tpg | Mayano Topgun | the feature tags | 1girl, horse girl, orange hair, orange
eyes, long hair, brown hair, solo, horse ears, horse tail, brown eyes, animal ears, hair between eyes, bangs, very long hair, two side up, yellow eyes, sidelocks | | | | race | ear ribbon, black ribbon, dog tags, bomber jacket, green jacket, fur-trimmed jacket, open jacket, open clothes, long sleeves, crop top, yellow shirt, green belt, white shorts, short shorts, black thighhighs, brown footwear | | | | wedding | official alternate costume, hair ornament, hair flower, choker, necklace, white gloves, bare shoulders, white dress, strapless dress, wedding dress, frilled dress, layered dress, white pantyhose, yellow footwear, high heels | | | | SSR Speed | official alternate costume, santa costume, santa hat, red headwear, hair ornament, fingerless gloves, fur-trimmed gloves, fur-trimmed capelet, red capelet, red dress, black belt, fur-trimmed dress, white pantyhose | | | | casual | casual, ear ribbon, black ribbon, turtleneck, necklace, jewelry, white shirt, crop top, yellow vest, long sleeves, white belt, belt buckle, plaid skirt, yellow skirt | | tsj | Tosen Jordan | the feature tags | 1girl, solo, animal ears, horse ears, long hair, horse girl, horse tail, twintails, brown hair, blue eyes, ear ornament, bangs, ribbon, hair ribbon, sidelocks | | | | race | black choker, collar, crop top, fishnets, chain necklace, cropped jacket, open jacket, long sleeves, navel, midriff, nail polish, watch, bracelet, jewelry, purple belt, black skirt, gold chain, thigh strap, purple thighhighs, fishnet thighhighs, mismatched thighhighs, high heels | | | | casual | black choker, ring, collarbone, blue neckerchief, sailor collar, long sleeves, school uniform, green jacket, white skirt, belt, thigh strap, black footwear | | | | casual | bra strap, white shirt, tied shirt, off-shoulder shirt, long sleeves, nail polish, ring, bracelet, jewelry, denim shorts, short shorts, bare legs, cutoffs | | tst | Tsurumaru Tsuyoshi | the feature tags | 1girl, parted bangs, medium hair, two-tone hair,
blonde hair, horse tail, horse girl, horse ears, white hair, grey eyes, long hair, brown hair, purple eyes, blank eyes, green hair, brown eyes, multicolored hair, streaked hair, red eyes, solo, short hair, orange hair, animal ears, pink eyes, bangs | | | | race | ear ornament, hair ornament, hairclip, wide sleeves, long sleeves, detached sleeves, bare shoulders, cleavage cutout, clothing cutout, strapless, sleeveless coat, purple coat, white shorts, single thigh strap, brown footwear, toeless footwear, sandals | | unv | Neo Universe | the feature tags | 1girl, yellow eyes, gradient hair, two-tone hair, blonde hair, streaked hair, long hair, very long hair, aqua hair, horse girl, animal ears, horse ears, hair between eyes, ahoge, multicolored hair, purple eyes, green hair, blue hair, aqua eyes, parted bangs, crossed bangs, brown hair, grey hair, colored inner hair, blue eyes, horse tail, bangs, grey eyes, solo | | | | race | single ear cover, puffy long sleeves, sleeves past fingers, sleeves past wrists, oversized clothes, cleavage cutout, clothing cutout, crop top, open jacket, open clothes, white jacket, yellow jacket, two-tone jacket, white leotard, boots, white footwear | | | | casual | casual, collared dress, white dress, puffy long sleeves, white shirt, white skirt, black footwear, sneakers | | vdk | Vodka | the feature tags | 1girl, brown hair, horse ears, multicolored hair, solo, grey eyes, horse girl, long hair, horse tail, ponytail, short hair, hair over one eye, bangs, yellow eyes, brown eyes, animal ears, parted bangs, two-tone hair, very long hair, white hair, streaked hair, twintails, low twintails, low ponytail | | | | race | ear ornament, jewelry, necklace, black jacket, open clothes, long sleeves, yellow shirt, crop top, belt, black shorts, short shorts, thigh boots, black footwear | | | | christmas | official alternate costume, santa costume, ear covers, ear ornament, green necktie, red jacket, fur trim, fur-trimmed jacket, collared shirt, long 
sleeves, white gloves, belt, two-tone pants, white pants, red pants, boots, red footwear | | | | casual | casual, black jacket, long sleeves, yellow shirt, black pants, black footwear, high heel boots | | wat | Wonder Acute | the feature tags | 1girl, horse tail, tail, grey hair, horse girl, brown hair, bangs, multicolored hair, green eyes, blue eyes, solo, streaked hair, very long hair, two-tone hair, animal ears, horse ears, long hair, white hair, hair between eyes | | | | race | choker, black gloves, fingerless gloves, detached sleeves, puffy long sleeves, frilled sleeves, pink dress, frills, pink jacket, striped skirt, pantyhose, brown footwear, knee boots, high heel boots | | | | casual | casual, white shirt, necklace, long sleeves, white shirt, white sweater, belt, green skirt, plaid skirt, long skirt, black pantyhose, shoes, sneakers | | wnt | Winning Ticket | the feature tags | 1girl, red eyes, horse ears, horse girl, tail, short hair, horse tail, solo, swept bangs, animal ears, brown hair, parted bangs, black hair, bangs | | | | race | ear ornament, hairclip, bandaid on face, necklace, bandeau, bracelet, open clothes, sleeveless jacket, hooded jacket, red jacket, wristband, partially unzipped, blue skirt, pleated skirt, shorts under skirt, bike shorts, sneakers | | | | steampunk | official alternate costume, steampunk, single ear cover, feather ear ornament, goggles on headwear, low twintails, short twintails, bandaid on face, collared shirt, white shirt, crop top, red bandana, shoulder cutout, frilled sleeves, puffy short sleeves, fingerless gloves, brown gloves, navel, midriff, belt, pouch, brown skirt, frilled skirt, miniskirt, black thighhighs, thighhighs under boots, brown footwear, knee boots | | | | casual | casual, beanie, red headwear, ears through headwear, ear ornament, hairclip, bandaid on face, white shirt, long sleeves, sleeves past wrists, open jacket, red jacket, plaid jacket, fanny pack, denim shorts, short shorts, white footwear, sneakers 
| | ykb | Yukino Bijin | the feature tags | 1girl, blunt bangs, medium hair, horse tail, bangs, horse girl, animal ears, long hair, brown hair, diagonal bangs, solo, black hair, horse ears, short hair, side ponytail, green eyes, sidelocks, yellow eyes, one side up, parted bangs, hair ornament, hairband, white hairband, multicolored hair, bob cut | | | | race | ear covers, headband, white choker, white dress, long sleeves, fingerless glove, white glove, single glove, blue collar, necklace, buttons, brown belt, belt buckle, layered dress, white footwear, boots | | | | original art | ear covers, headband, white scarf, winter clothes, winter coat, black coat, long sleeves, fur trim, mittens, pleated skirt, blue skirt, miniskirt, zettai ryouiki, white thighhighs, white footwear, boots, ice skates | | | | casual | casual, ear covers, headband, collared shirt, pink sweater, cardigan, cable knit, long sleeves, blue skirt, socks, white footwear | | ymt | Yaeno Muteki | the feature tags | 1girl, hair between eyes, short hair, tail, brown hair, solo, horse ears, streaked hair, animal ears, bangs, horse girl, light brown hair, white hair, horse tail, multicolored hair, brown eyes, red eyes, two-tone hair | | | | race | japanese clothes, sleeveless, detached sleeves, fingerless gloves, red gloves, cleavage cutout, clothing cutout, sash, obi, hakama, ribbon trim, black skirt, kneehighs, socks, sandals | | | | casual | casual, hair ornament, long sleeves, black jacket, yellow jacket, two-tone jacket, track jacket, track suit, black pants, track pants, sneakers | | ymz | Yamanin Zephyr | the feature tags | 1girl, grey hair, ear ornament, multicolored hair, solo, streaked hair, white hair, brown hair, black hair, purple hair, orange eyes, animal ears, two-tone hair, horse girl, red eyes, hair scrunchie, horse tail, very long hair, horse ears, bangs, yellow eyes, brown eyes, long hair, hair between eyes, low twintails, hair bobbles | | | | race | hair ornament, red scrunchie, 
choker, necklace, blue gloves, detached sleeves, long sleeves, bare shoulders, blue sleeves, white dress, strapless dress, frilled dress, flower, frills, blue thighhighs, black footwear | | | | casual | casual, hair ornament, red scrunchie, white shirt, brown bow, collared shirt, puffy long sleeves, open clothes, open jacket, brown jacket, layered sleeves, center frills, short over long sleeves, dress shirt, belt buckle, brown belt, short shorts, blue shorts | | zrr | Zenno Rob Roy | the feature tags | 1girl, animal ears, horse ears, horse girl, long hair, single braid, crown braid, purple hair, bangs, short hair, green eyes, braided ponytail, blue eyes, solo, horse tail, braid, medium hair, grey hair, black hair, blue hair, brown hair, low ponytail, blunt bangs | | | | race | ear ornament, hairclip, beret, hat, green headwear, glasses, black-framed eyewear, green jacket, scarf, long sleeves, brown belt, green skirt, frilled skirt, white socks, brown footwear, white footwear, shoes | | | | SSR Speed | official alternate costume, halloween costume, hairclip, hair bow, blue bow, witch hat, star print, ears through headwear, black headwear, glasses, black-framed eyewear, black robe, white shirt, orange bowtie, center frills, long sleeves, black skirt, striped thighhighs, black footwear, boots | | | | casual | casual, ear ornament, hairclip, glasses, black-framed eyewear, collared shirt, white shirt, sweater vest, blue sweater, long sleeves, shoulder bag, handbag, grey skirt, blue socks, brown footwear | |
notrichardren/azaria-mitchell
--- configs: - config_name: default data_files: - split: combined path: data/combined-* - split: train path: data/train-* - split: test path: data/test-* dataset_info: features: - name: claim dtype: string - name: label dtype: int64 - name: dataset dtype: string - name: qa_type dtype: int64 - name: ind dtype: int64 splits: - name: combined num_bytes: 1553103 num_examples: 17092 - name: train num_bytes: 1244045 num_examples: 13673 - name: test num_bytes: 309058 num_examples: 3419 download_size: 1228770 dataset_size: 3106206 --- # Dataset Card for "azaria-mitchell" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
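Given the schema above (`claim`, `label`, `dataset`), a common first step is to check the label balance per source dataset. The sketch below does this over a handful of made-up records that only mimic the schema; they are not taken from the actual dataset.

```python
from collections import Counter

# Illustrative records mirroring the card's schema (claim, label, dataset).
# These examples are invented for the sketch, not real dataset rows.
records = [
    {"claim": "The sky is blue.", "label": 1, "dataset": "color_facts"},
    {"claim": "The sky is green.", "label": 0, "dataset": "color_facts"},
    {"claim": "Paris is the capital of France.", "label": 1, "dataset": "geography"},
]

# Tally (source dataset, label) pairs to inspect class balance.
counts = Counter((r["dataset"], r["label"]) for r in records)
for (name, label), n in sorted(counts.items()):
    print(f"{name}: label={label} -> {n} claims")
```

The same loop works unchanged on the real `train`/`test` splits once loaded, since it only touches the three fields declared in the card's `dataset_info`.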
CyberHarem/jessie_neuralcloud
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of jessie/ジェシー/洁西 (Neural Cloud) This is the dataset of jessie/ジェシー/洁西 (Neural Cloud), containing 20 images and their tags. The core tags of this character are `blue_eyes, short_hair, blonde_hair, bangs`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 20 | 21.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jessie_neuralcloud/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 20 | 15.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jessie_neuralcloud/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 41 | 26.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jessie_neuralcloud/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 20 | 20.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jessie_neuralcloud/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 41 | 33.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jessie_neuralcloud/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. 
| ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/jessie_neuralcloud', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------| | 0 | 20 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, boots, gloves, open_mouth, looking_at_viewer, full_body, frills, blush, shirt, shotgun, thighhighs | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | boots | gloves | open_mouth | looking_at_viewer | full_body | frills | blush | shirt | shotgun | thighhighs | 
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:---------|:-------------|:--------------------|:------------|:---------|:--------|:--------|:----------|:-------------| | 0 | 20 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X |
izumi-lab/wikipedia-ja-20230720
--- dataset_info: features: - name: curid dtype: string - name: title dtype: string - name: text dtype: string splits: - name: train num_bytes: 3653518687 num_examples: 1362415 download_size: 2130533065 dataset_size: 3653518687 license: cc-by-sa-3.0 language: - ja --- # Dataset Card for "wikipedia-ja-20230720" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
chentong00/factoid-wiki
--- license: apache-2.0 ---
ola13/small-roots_en
--- dataset_info: features: - name: text dtype: string - name: meta struct: - name: perplexity_score dtype: float64 splits: - name: train num_bytes: 1165939191 num_examples: 100000 download_size: 640582687 dataset_size: 1165939191 --- # Dataset Card for "small-roots_en" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
DopeorNope/20000sample_COT
--- dataset_info: features: - name: source dtype: string - name: target dtype: string - name: rationale dtype: string - name: task dtype: string - name: type dtype: string splits: - name: train num_bytes: 23066106 num_examples: 21297 download_size: 9606299 dataset_size: 23066106 --- # Dataset Card for "20000sample_COT" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
maritaca-ai/enem
--- license: apache-2.0 configs: - config_name: '2022' data_files: 2022.jsonl - config_name: '2023' data_files: 2023.jsonl default: true dataset_info: features: - name: id dtype: string - name: exam dtype: string - name: IU dtype: bool - name: ledor dtype: bool - name: question dtype: string - name: alternatives sequence: string - name: figures sequence: string - name: description sequence: string - name: label dtype: string task_categories: - visual-question-answering - multiple-choice language: - pt pretty_name: ENEM size_categories: - n<1K --- The enem 2022 and enem 2023 datasets encompass all multiple-choice questions from the last two editions of the [Exame Nacional do Ensino Médio (ENEM)](https://www.gov.br/inep/pt-br/areas-de-atuacao/avaliacao-e-exames-educacionais/enem), the main standardized entrance examination adopted by Brazilian universities. The datasets have been created to allow the evaluation of both textual-only and textual-visual language models. To evaluate textual-only models, we incorporated into the datasets the textual descriptions of the images that appear in the questions' statements from the orange ENEM exam booklet, a particular booklet that offers accessibility to people with visual impairments. A repository containing the essential code for utilizing this dataset is accessible [here](https://github.com/piresramon/gpt-4-enem). 
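To evaluate a textual-only model, each record can be rendered as a plain multiple-choice prompt. The sketch below assumes only the schema fields listed above (`question`, `alternatives`, `label`); the sample record is invented for illustration and is not an actual ENEM question.

```python
def format_question(record):
    # Render a record as a multiple-choice prompt: statement,
    # lettered alternatives (A-E), then an answer cue.
    letters = "ABCDE"
    lines = [record["question"]]
    for letter, alt in zip(letters, record["alternatives"]):
        lines.append(f"{letter}) {alt}")
    lines.append("Resposta:")
    return "\n".join(lines)

# Hypothetical record following the card's schema, not real exam data.
sample = {
    "question": "Qual é a capital do Brasil?",
    "alternatives": ["Rio de Janeiro", "Brasília", "Salvador", "Recife", "Manaus"],
    "label": "B",
}
print(format_question(sample))
```

For textual-visual models, the `figures` and `description` fields would additionally be interleaved into the prompt; the repository linked above shows the full evaluation setup.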
If you use this dataset in your research, please acknowledge the papers below by citing them: ```bibtex @misc{pires2023evaluating, title={Evaluating GPT-4's Vision Capabilities on Brazilian University Admission Exams}, author={Ramon Pires and Thales Sales Almeida and Hugo Abonizio and Rodrigo Nogueira}, year={2023}, eprint={2311.14169}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` ```bibtex @misc{nunes2023evaluating, title={Evaluating GPT-3.5 and GPT-4 Models on Brazilian University Admission Exams}, author={Desnes Nunes and Ricardo Primi and Ramon Pires and Roberto Lotufo and Rodrigo Nogueira}, year={2023}, eprint={2303.17003}, archivePrefix={arXiv}, primaryClass={cs.CL} } ```
CyberHarem/savage_arknights
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of savage/サベージ/暴行 (Arknights) This is the dataset of savage/サベージ/暴行 (Arknights), containing 140 images and their tags. The core tags of this character are `animal_ears, rabbit_ears, long_hair, grey_hair, breasts, grey_eyes, two_side_up, large_breasts, very_long_hair`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 140 | 212.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/savage_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 1200 | 140 | 181.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/savage_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 364 | 352.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/savage_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. 
If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/savage_arknights', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, bare_shoulders, black_gloves, looking_at_viewer, shirt, sleeveless, solo, upper_body, elbow_gloves, hair_between_eyes, blue_eyes, grey_background, smile, white_background, dress, medium_breasts, simple_background | | 1 | 15 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, black_gloves, pantyhose, solo, thighhighs, thigh_boots, black_footwear, looking_at_viewer, smile, elbow_gloves, simple_background, dress, cape, white_background, cloak, 
holding_weapon, closed_mouth, hair_between_eyes, open_mouth, standing | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | black_gloves | looking_at_viewer | shirt | sleeveless | solo | upper_body | elbow_gloves | hair_between_eyes | blue_eyes | grey_background | smile | white_background | dress | medium_breasts | simple_background | pantyhose | thighhighs | thigh_boots | black_footwear | cape | cloak | holding_weapon | closed_mouth | open_mouth | standing | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:---------------|:--------------------|:--------|:-------------|:-------|:-------------|:---------------|:--------------------|:------------|:------------------|:--------|:-------------------|:--------|:-----------------|:--------------------|:------------|:-------------|:--------------|:-----------------|:-------|:--------|:-----------------|:---------------|:-------------|:-----------| | 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | 1 | 15 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | X | X | | | X | | X | X | | | X | X | X | | X | X | X | X | X | X | X | X | X | X | X |
vgoldberg/6k-longform-summ
--- dataset_info: features: - name: text dtype: string - name: summary dtype: string splits: - name: train num_bytes: 331108572.999906 num_examples: 6711 download_size: 108573322 dataset_size: 331108572.999906 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "6k-longform-summ" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Minata/mistral_method2test_v0
--- dataset_info: features: - name: input_ids sequence: int32 - name: attention_mask sequence: int8 - name: labels sequence: int64 splits: - name: test num_bytes: 590655920 num_examples: 352420 download_size: 77022203 dataset_size: 590655920 configs: - config_name: default data_files: - split: test path: data/test-* ---
shi3z/ja_conv_wikipedia_orion14B_100K
--- task_categories: - conversational language: - ja size_categories: - 100K<n<1M --- # Abstract This is a multi-turn conversation dataset generated from the Japanese Wikipedia dataset using Orion14B-Chat. Commercial use is possible, but the license is complicated, so please read it carefully before using it. Generation took about half a week on 200 machines, each with four V100 GPUs. # License 【Orion-14B Series】 Models Community License Agreement https://huggingface.co/OrionStarAI/Orion-14B-Chat/blob/main/ModelsCommunityLicenseAgreement # Computing ABCI https://abci.ai/ja/
ibranze/araproje_hellaswag_tr_conf_bestcore
--- dataset_info: features: - name: ind dtype: int32 - name: activity_label dtype: string - name: ctx_a dtype: string - name: ctx_b dtype: string - name: ctx dtype: string - name: endings sequence: string - name: source_id dtype: string - name: split dtype: string - name: split_type dtype: string - name: label dtype: string splits: - name: validation num_bytes: 162703.0 num_examples: 250 download_size: 87097 dataset_size: 162703.0 configs: - config_name: default data_files: - split: validation path: data/validation-* --- # Dataset Card for "araproje_hellaswag_tr_conf_bestcore" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Msun/dota
--- license: apache-2.0 ---
polinaeterna/default_config
--- dataset_info: - config_name: default features: - name: x dtype: int64 - name: 'y' dtype: string splits: - name: train num_bytes: 93 num_examples: 6 - name: test num_bytes: 28 num_examples: 2 download_size: 1703 dataset_size: 121 - config_name: v2 features: - name: x dtype: int64 - name: 'y' dtype: string splits: - name: train num_bytes: 56 num_examples: 4 - name: test num_bytes: 14 num_examples: 1 download_size: 0 dataset_size: 70 pretty_name: traktor_dodik --- # Dataset Card for "default_config" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
AlekseyKorshuk/gpt4all-jazzy-chatml
--- dataset_info: features: - name: conversation list: - name: content dtype: string - name: do_train dtype: bool - name: role dtype: string splits: - name: train num_bytes: 1484028130 num_examples: 711126 download_size: 768135582 dataset_size: 1484028130 --- # Dataset Card for "gpt4all-jazzy-chatml" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
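Each row stores a `conversation` list of `{content, do_train, role}` turns. A minimal sketch of rendering one such record into ChatML text (the sample turns are illustrative, not taken from the dataset):

```python
def to_chatml(conversation):
    """Render a list of {role, content, do_train} turns as ChatML text."""
    return "\n".join(
        f"<|im_start|>{turn['role']}\n{turn['content']}<|im_end|>"
        for turn in conversation
    )

# Illustrative record matching the card's schema.
sample = [
    {"role": "user", "content": "What is the capital of France?", "do_train": False},
    {"role": "assistant", "content": "The capital of France is Paris.", "do_train": True},
]
text = to_chatml(sample)
print(text)
```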
lmms-lab/ICON-QA
--- dataset_info: features: - name: question_id dtype: string - name: question dtype: string - name: choices dtype: string - name: answer dtype: string - name: query_image dtype: image - name: choice_image_0 dtype: image - name: choice_image_1 dtype: image - name: ques_type dtype: string - name: label dtype: string - name: grade dtype: string - name: skills dtype: string splits: - name: val num_bytes: 329185883.464 num_examples: 21488 - name: test num_bytes: 333201645.625 num_examples: 21489 download_size: 667286379 dataset_size: 662387529.089 configs: - config_name: default data_files: - split: val path: data/val-* - split: test path: data/test-* --- <p align="center" width="100%"> <img src="https://i.postimg.cc/g0QRgMVv/WX20240228-113337-2x.png" width="100%" height="80%"> </p> # Large-scale Multi-modality Models Evaluation Suite > Accelerating the development of large-scale multi-modality models (LMMs) with `lmms-eval` 🏠 [Homepage](https://lmms-lab.github.io/) | 📚 [Documentation](docs/README.md) | 🤗 [Huggingface Datasets](https://huggingface.co/lmms-lab) # This Dataset This is a formatted version of [ICONQA](https://iconqa.github.io/). It is used in our `lmms-eval` pipeline to allow for one-click evaluations of large multi-modality models. ``` @inproceedings{lu2021iconqa, title = {IconQA: A New Benchmark for Abstract Diagram Understanding and Visual Language Reasoning}, author = {Lu, Pan and Qiu, Liang and Chen, Jiaqi and Xia, Tony and Zhao, Yizhou and Zhang, Wei and Yu, Zhou and Liang, Xiaodan and Zhu, Song-Chun}, booktitle = {The 35th Conference on Neural Information Processing Systems (NeurIPS) Track on Datasets and Benchmarks}, year = {2021} } ```
CasperLD/Pizza_Dataset_Extra_Detailed
--- dataset_info: features: - name: image dtype: image - name: text dtype: string splits: - name: train num_bytes: 3791807.0 num_examples: 80 download_size: 3782195 dataset_size: 3791807.0 --- # Dataset Card for "Pizza_Dataset_Extra_Detailed" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tabtoyou/KoLLaVA-v1.5-Instruct-581k
--- license: cc-by-nc-4.0 task_categories: - visual-question-answering language: - ko size_categories: - 100K<n<1M --- # KoLLaVA-v1.5 Visual Instruct 581K Dataset Card This dataset was created by filtering the required data from the instruction-following data of [LLaVA-v1.5](https://huggingface.co/datasets/liuhaotian/LLaVA-Instruct-150K/blob/main/llava_v1_5_mix665k.json) and translating it into Korean (via DeepL). For image downloads and usage instructions, please refer to the [KoLLaVA](https://github.com/tabtoyou/KoLLaVA) repo.
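A minimal sketch of the record layout used by the upstream LLaVA-v1.5 instruction-mix JSON that this dataset is derived from (field names follow that upstream file and may differ here; the Korean values are illustrative, not from the dataset):

```python
# Illustrative LLaVA-style instruction record; not an actual dataset row.
sample = {
    "id": "000000123456",
    "image": "coco/train2017/000000123456.jpg",
    "conversations": [
        {"from": "human", "value": "<image>\n이 이미지에 무엇이 보이나요?"},
        {"from": "gpt", "value": "탁자 위에 피자가 놓여 있습니다."},
    ],
}

# Count the human/assistant turn pairs in one record.
n_turns = len(sample["conversations"]) // 2
print(n_turns)
```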
raeidsaqur/Hansard
--- license: mit language: - en - fr task_categories: - translation pretty_name: hansard size_categories: - 100K<n<1M --- <h1> <img alt="RH" src="./icon.png" style="display:inline-block; vertical-align:middle" /> Pedagogical Machine Translation (Dialect) dataset: the filtered Canadian Hansard Dataset. </h1> The Canadian [Hansard](https://www.ourcommons.ca/documentviewer/en/35-2/house/hansard-index) is an archive of parliamentary sessions in the two official languages of Canada - English and French. ## 📋 Table of Contents - [🧩 Hansard Dataset](#-hansard-dataset) - [📋 Table of Contents](#-table-of-contents) - [📖 Usage](#-usage) - [Downloading the dataset](#downloading-the-dataset) - [Dataset structure](#dataset-structure) - [Loading the dataset](#loading-the-dataset) <!--- [Evaluating](#evaluating) - [Running the baselines](#running-the-baselines) - [Word Embeddings and Pre-trained Language Models](#word-embeddings-and-pre-trained-language-models) - [Large Language Models](#large-language-models) --> - [✍️ Contributing](#️-contributing) - [📝 Citing](#-citing) - [🙏 Acknowledgements](#-acknowledgements) ## 📖 Usage ### Downloading the dataset The Hansard dataset can be downloaded from [here](https://www.cs.toronto.edu/~raeidsaqur/hansard/hansard.tar.gz) or with a bash script: ```bash bash download_hansard.sh ``` ### Dataset structure The dataset is provided as csv (and parquet) files, one for each partition: `train.[csv|parquet]` and `test.[csv|parquet]`. We also provide a `hansard.[csv|parquet]` file that contains all examples across all splits. The splits are sized as follows: <!-- | Split | # Walls | |:-------|:---------:| | `train` | 311K | | `test` | 49K | Here is an example of the dataset's structure: ```csv ``` --> ### Loading the dataset The three partitions can be loaded the same way as any other csv file. 
For example, using Python's built-in `csv` module: ```python import csv dataset = { "train": list(csv.DictReader(open("./Hansard/train.csv", "r"))), "test": list(csv.DictReader(open("./Hansard/test.csv", "r"))), } ``` However, it is likely easiest to work with the dataset using the [HuggingFace Datasets](https://huggingface.co/datasets) library: ```python # pip install datasets from datasets import load_dataset # The dataset can be used like any other HuggingFace dataset dataset = load_dataset("raeidsaqur/hansard") ``` <!-- > __Note__ --> <!-- ### Evaluating We provide a script for evaluating the performance of a model on the dataset. Before running, make sure you have installed the requirements and package: ```bash pip install -r requirements.txt pip install -e . ``` To run the evaluation script: ### Running the baselines --> ## ✍️ Contributing We welcome contributions to this repository (noticed a typo? a bug?). To propose a change: ``` git clone https://github.com/raeidsaqur/hansard cd hansard git checkout -b my-branch pip install -r requirements.txt pip install -e . ``` Once your changes are made, make sure to lint and format the code (addressing any warnings or errors): ``` isort . black . flake8 . ``` Then, submit your change as a pull request. ## 📝 Citing If you use the Canadian Hansard dataset in your work, please consider citing our paper: ``` @article{raeidsaqur2024Hansard, title = {The Canadian Hansard Dataset for Analyzing Dialect Efficiencies in Language Models}, author = {Raeid Saqur}, year = 2024, journal = {ArXiv}, url = {} } ``` ## 🙏 Acknowledgements The entire CSC401/2511 teaching team at the Dept. of Computer Science at the University of Toronto.
aryaman/irumozhi
--- license: mit task_categories: - text-classification language: - ta tags: - diglossia pretty_name: IruMozhi size_categories: - n<1K --- **IruMozhi** is a human-translated dataset of parallel text in Literary and Spoken Tamil, using sentences taken from Wikipedia. For more details, see the [paper](https://arxiv.org/abs/2311.07804).
norwegian_ner
--- annotations_creators: - expert-generated language_creators: - crowdsourced language: - 'no' license: - unknown multilinguality: - monolingual size_categories: - 10K<n<100K source_datasets: - original task_categories: - token-classification task_ids: - named-entity-recognition pretty_name: Norwegian NER dataset_info: - config_name: bokmaal features: - name: idx dtype: string - name: text dtype: string - name: tokens sequence: string - name: lemmas sequence: string - name: pos_tags sequence: class_label: names: '0': NOUN '1': PUNCT '2': ADP '3': NUM '4': SYM '5': SCONJ '6': ADJ '7': PART '8': DET '9': CCONJ '10': PROPN '11': PRON '12': X '13': ADV '14': INTJ '15': VERB '16': AUX - name: ner_tags sequence: class_label: names: '0': O '1': B-OTH '2': I-OTH '3': E-OTH '4': S-OTH '5': B-ORG '6': I-ORG '7': E-ORG '8': S-ORG '9': B-PRS '10': I-PRS '11': E-PRS '12': S-PRS '13': B-GEO '14': I-GEO '15': E-GEO '16': S-GEO splits: - name: train num_bytes: 9859760 num_examples: 15696 - name: validation num_bytes: 1475216 num_examples: 2410 - name: test num_bytes: 1212939 num_examples: 1939 download_size: 8747760 dataset_size: 12547915 - config_name: nynorsk features: - name: idx dtype: string - name: text dtype: string - name: tokens sequence: string - name: lemmas sequence: string - name: pos_tags sequence: class_label: names: '0': NOUN '1': PUNCT '2': ADP '3': NUM '4': SYM '5': SCONJ '6': ADJ '7': PART '8': DET '9': CCONJ '10': PROPN '11': PRON '12': X '13': ADV '14': INTJ '15': VERB '16': AUX - name: ner_tags sequence: class_label: names: '0': O '1': B-OTH '2': I-OTH '3': E-OTH '4': S-OTH '5': B-ORG '6': I-ORG '7': E-ORG '8': S-ORG '9': B-PRS '10': I-PRS '11': E-PRS '12': S-PRS '13': B-GEO '14': I-GEO '15': E-GEO '16': S-GEO splits: - name: train num_bytes: 9916338 num_examples: 14174 - name: validation num_bytes: 1257235 num_examples: 1890 - name: test num_bytes: 1006733 num_examples: 1511 download_size: 8484545 dataset_size: 12180306 - config_name: samnorsk features: - 
name: idx dtype: string - name: text dtype: string - name: tokens sequence: string - name: lemmas sequence: string - name: pos_tags sequence: class_label: names: '0': NOUN '1': PUNCT '2': ADP '3': NUM '4': SYM '5': SCONJ '6': ADJ '7': PART '8': DET '9': CCONJ '10': PROPN '11': PRON '12': X '13': ADV '14': INTJ '15': VERB '16': AUX - name: ner_tags sequence: class_label: names: '0': O '1': B-OTH '2': I-OTH '3': E-OTH '4': S-OTH '5': B-ORG '6': I-ORG '7': E-ORG '8': S-ORG '9': B-PRS '10': I-PRS '11': E-PRS '12': S-PRS '13': B-GEO '14': I-GEO '15': E-GEO '16': S-GEO splits: - name: train num_bytes: 22508485 num_examples: 34170 - name: validation num_bytes: 2732419 num_examples: 4300 - name: test num_bytes: 2219640 num_examples: 3450 download_size: 19133049 dataset_size: 27460544 --- # Dataset Card for Norwegian NER ## Table of Contents - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contributions](#contributions) ## Dataset Description - **Homepage:** [Github](https://github.com/ljos/navnkjenner) - **Repository:** [Github](https://github.com/ljos/navnkjenner) - 
**Paper:** - **Leaderboard:** - **Point of Contact:** ### Dataset Summary [More Information Needed] ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions Thanks to [@jplu](https://github.com/jplu) for adding this dataset.
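The `ner_tags` feature stores integer class labels using the BIOES scheme listed in the schema above. A minimal sketch of decoding them back to label names (the tokens and tag ids here are illustrative, not from the dataset):

```python
# Label names in the order given by the card's `ner_tags` class_label schema.
NER_NAMES = [
    "O",
    "B-OTH", "I-OTH", "E-OTH", "S-OTH",
    "B-ORG", "I-ORG", "E-ORG", "S-ORG",
    "B-PRS", "I-PRS", "E-PRS", "S-PRS",
    "B-GEO", "I-GEO", "E-GEO", "S-GEO",
]

# Illustrative example; not an actual row from the dataset.
tokens = ["Jens", "Stoltenberg", "besøkte", "Oslo", "."]
ner_tags = [9, 11, 0, 16, 0]  # B-PRS, E-PRS, O, S-GEO, O

labeled = [(tok, NER_NAMES[tag]) for tok, tag in zip(tokens, ner_tags)]
print(labeled)
```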
open-llm-leaderboard/details_Tensoic__Gemma-2B-Samvaad
--- pretty_name: Evaluation run of Tensoic/Gemma-2B-Samvaad dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Tensoic/Gemma-2B-Samvaad](https://huggingface.co/Tensoic/Gemma-2B-Samvaad) on\ \ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Tensoic__Gemma-2B-Samvaad\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-02-29T16:22:57.666384](https://huggingface.co/datasets/open-llm-leaderboard/details_Tensoic__Gemma-2B-Samvaad/blob/main/results_2024-02-29T16-22-57.666384.json) (note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3360458197941767,\n\ \ \"acc_stderr\": 0.03323760993199669,\n \"acc_norm\": 0.3391455847920635,\n\ \ \"acc_norm_stderr\": 0.034017762568725546,\n \"mc1\": 0.25703794369645044,\n\ \ \"mc1_stderr\": 0.01529807750948508,\n \"mc2\": 0.39950374063892946,\n\ \ \"mc2_stderr\": 0.014392561951779027\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.45733788395904434,\n \"acc_stderr\": 0.014558106543924068,\n\ \ \"acc_norm\": 0.4658703071672355,\n \"acc_norm_stderr\": 0.014577311315231097\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5046803425612428,\n\ \ \"acc_stderr\": 0.004989562798280525,\n \"acc_norm\": 0.681736705835491,\n\ \ \"acc_norm_stderr\": 0.004648503177353945\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \ \ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37037037037037035,\n\ \ \"acc_stderr\": 0.04171654161354543,\n \"acc_norm\": 0.37037037037037035,\n\ \ \"acc_norm_stderr\": 0.04171654161354543\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.03690677986137283,\n\ \ \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.03690677986137283\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.41,\n\ \ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \ \ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.35094339622641507,\n \"acc_stderr\": 0.029373646253234686,\n\ \ \"acc_norm\": 0.35094339622641507,\n \"acc_norm_stderr\": 0.029373646253234686\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3680555555555556,\n\ \ \"acc_stderr\": 0.04032999053960719,\n \"acc_norm\": 0.3680555555555556,\n\ \ \"acc_norm_stderr\": 
0.04032999053960719\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \ \ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n\ \ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \ \ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.28901734104046245,\n\ \ \"acc_stderr\": 0.034564257450869995,\n \"acc_norm\": 0.28901734104046245,\n\ \ \"acc_norm_stderr\": 0.034564257450869995\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.040925639582376536,\n\ \ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.040925639582376536\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n\ \ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.37872340425531914,\n \"acc_stderr\": 0.031709956060406545,\n\ \ \"acc_norm\": 0.37872340425531914,\n \"acc_norm_stderr\": 0.031709956060406545\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\ \ \"acc_stderr\": 0.04142439719489362,\n \"acc_norm\": 0.2631578947368421,\n\ \ \"acc_norm_stderr\": 0.04142439719489362\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.3724137931034483,\n \"acc_stderr\": 0.040287315329475576,\n\ \ \"acc_norm\": 0.3724137931034483,\n \"acc_norm_stderr\": 0.040287315329475576\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.24867724867724866,\n \"acc_stderr\": 0.022261817692400168,\n \"\ acc_norm\": 
0.24867724867724866,\n \"acc_norm_stderr\": 0.022261817692400168\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\ \ \"acc_stderr\": 0.042407993275749234,\n \"acc_norm\": 0.3412698412698413,\n\ \ \"acc_norm_stderr\": 0.042407993275749234\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.29354838709677417,\n\ \ \"acc_stderr\": 0.02590608702131929,\n \"acc_norm\": 0.29354838709677417,\n\ \ \"acc_norm_stderr\": 0.02590608702131929\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.030108330718011625,\n\ \ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.030108330718011625\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\ : 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.2909090909090909,\n \"acc_stderr\": 0.03546563019624336,\n\ \ \"acc_norm\": 0.2909090909090909,\n \"acc_norm_stderr\": 0.03546563019624336\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.3434343434343434,\n \"acc_stderr\": 0.03383201223244442,\n \"\ acc_norm\": 0.3434343434343434,\n \"acc_norm_stderr\": 0.03383201223244442\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.31088082901554404,\n \"acc_stderr\": 0.03340361906276586,\n\ \ \"acc_norm\": 0.31088082901554404,\n \"acc_norm_stderr\": 0.03340361906276586\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.2923076923076923,\n \"acc_stderr\": 0.023060438380857733,\n\ \ \"acc_norm\": 0.2923076923076923,\n \"acc_norm_stderr\": 0.023060438380857733\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.24814814814814815,\n \"acc_stderr\": 0.026335739404055803,\n \ \ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.026335739404055803\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.2605042016806723,\n \"acc_stderr\": 0.02851025151234193,\n \ \ \"acc_norm\": 0.2605042016806723,\n \"acc_norm_stderr\": 0.02851025151234193\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.23841059602649006,\n \"acc_stderr\": 0.03479185572599661,\n \"\ acc_norm\": 0.23841059602649006,\n \"acc_norm_stderr\": 0.03479185572599661\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.3963302752293578,\n \"acc_stderr\": 0.02097146994790052,\n \"\ acc_norm\": 0.3963302752293578,\n \"acc_norm_stderr\": 0.02097146994790052\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.24074074074074073,\n \"acc_stderr\": 0.02915752218460561,\n \"\ acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.02915752218460561\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.3235294117647059,\n \"acc_stderr\": 0.03283472056108567,\n \"\ acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.03283472056108567\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.38396624472573837,\n \"acc_stderr\": 0.031658678064106674,\n \ \ \"acc_norm\": 0.38396624472573837,\n \"acc_norm_stderr\": 0.031658678064106674\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4260089686098655,\n\ \ \"acc_stderr\": 0.033188332862172806,\n \"acc_norm\": 0.4260089686098655,\n\ \ \"acc_norm_stderr\": 0.033188332862172806\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.33587786259541985,\n \"acc_stderr\": 0.041423137719966634,\n\ \ \"acc_norm\": 0.33587786259541985,\n \"acc_norm_stderr\": 0.041423137719966634\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.38016528925619836,\n \"acc_stderr\": 0.04431324501968432,\n \"\ acc_norm\": 0.38016528925619836,\n \"acc_norm_stderr\": 0.04431324501968432\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3148148148148148,\n\ \ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.3148148148148148,\n\ \ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.2883435582822086,\n \"acc_stderr\": 0.035590395316173425,\n\ \ \"acc_norm\": 0.2883435582822086,\n \"acc_norm_stderr\": 0.035590395316173425\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\ \ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\ \ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.36893203883495146,\n \"acc_stderr\": 0.047776151811567386,\n\ \ \"acc_norm\": 0.36893203883495146,\n \"acc_norm_stderr\": 0.047776151811567386\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5299145299145299,\n\ \ \"acc_stderr\": 0.03269741106812442,\n \"acc_norm\": 0.5299145299145299,\n\ \ \"acc_norm_stderr\": 0.03269741106812442\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \ \ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4240102171136654,\n\ \ \"acc_stderr\": 0.017672263329084226,\n \"acc_norm\": 0.4240102171136654,\n\ \ \"acc_norm_stderr\": 0.017672263329084226\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.30057803468208094,\n \"acc_stderr\": 0.0246853168672578,\n\ \ \"acc_norm\": 0.30057803468208094,\n \"acc_norm_stderr\": 0.0246853168672578\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2536312849162011,\n\ \ \"acc_stderr\": 0.014551553659369922,\n 
\"acc_norm\": 0.2536312849162011,\n\ \ \"acc_norm_stderr\": 0.014551553659369922\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.369281045751634,\n \"acc_stderr\": 0.02763417668960266,\n\ \ \"acc_norm\": 0.369281045751634,\n \"acc_norm_stderr\": 0.02763417668960266\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.35691318327974275,\n\ \ \"acc_stderr\": 0.02721042037593402,\n \"acc_norm\": 0.35691318327974275,\n\ \ \"acc_norm_stderr\": 0.02721042037593402\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.31790123456790126,\n \"acc_stderr\": 0.025910063528240868,\n\ \ \"acc_norm\": 0.31790123456790126,\n \"acc_norm_stderr\": 0.025910063528240868\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.2765957446808511,\n \"acc_stderr\": 0.026684564340461,\n \ \ \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.026684564340461\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27444589308996087,\n\ \ \"acc_stderr\": 0.011397043163078154,\n \"acc_norm\": 0.27444589308996087,\n\ \ \"acc_norm_stderr\": 0.011397043163078154\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.2610294117647059,\n \"acc_stderr\": 0.026679252270103117,\n\ \ \"acc_norm\": 0.2610294117647059,\n \"acc_norm_stderr\": 0.026679252270103117\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.3055555555555556,\n \"acc_stderr\": 0.018635594034423972,\n \ \ \"acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.018635594034423972\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.42727272727272725,\n\ \ \"acc_stderr\": 0.04738198703545483,\n \"acc_norm\": 0.42727272727272725,\n\ \ \"acc_norm_stderr\": 0.04738198703545483\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.27346938775510204,\n \"acc_stderr\": 0.02853556033712844,\n\ \ \"acc_norm\": 0.27346938775510204,\n \"acc_norm_stderr\": 0.02853556033712844\n\ \ 
},\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.34328358208955223,\n\ \ \"acc_stderr\": 0.03357379665433431,\n \"acc_norm\": 0.34328358208955223,\n\ \ \"acc_norm_stderr\": 0.03357379665433431\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \ \ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39156626506024095,\n\ \ \"acc_stderr\": 0.03799857454479636,\n \"acc_norm\": 0.39156626506024095,\n\ \ \"acc_norm_stderr\": 0.03799857454479636\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.03834234744164993,\n\ \ \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.03834234744164993\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25703794369645044,\n\ \ \"mc1_stderr\": 0.01529807750948508,\n \"mc2\": 0.39950374063892946,\n\ \ \"mc2_stderr\": 0.014392561951779027\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.6164167324388319,\n \"acc_stderr\": 0.013666275889539019\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.058377558756633814,\n \ \ \"acc_stderr\": 0.00645808355783246\n }\n}\n```" repo_url: https://huggingface.co/Tensoic/Gemma-2B-Samvaad leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|arc:challenge|25_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-02-29T16-22-57.666384.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|gsm8k|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_02_29T16_22_57.666384 path: 
- '**/details_harness|hellaswag|10_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T16-22-57.666384.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-management|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-02-29T16-22-57.666384.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T16-22-57.666384.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T16-22-57.666384.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-management|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T16-22-57.666384.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-02-29T16-22-57.666384.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T16-22-57.666384.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-international_law|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-management|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-marketing|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-sociology|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-virology|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T16-22-57.666384.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|truthfulqa:mc|0_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-02-29T16-22-57.666384.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_02_29T16_22_57.666384 path: - '**/details_harness|winogrande|5_2024-02-29T16-22-57.666384.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-02-29T16-22-57.666384.parquet' - config_name: results data_files: - split: 
2024_02_29T16_22_57.666384
    path:
    - results_2024-02-29T16-22-57.666384.parquet
  - split: latest
    path:
    - results_2024-02-29T16-22-57.666384.parquet
---

# Dataset Card for Evaluation run of Tensoic/Gemma-2B-Samvaad

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [Tensoic/Gemma-2B-Samvaad](https://huggingface.co/Tensoic/Gemma-2B-Samvaad) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_Tensoic__Gemma-2B-Samvaad",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2024-02-29T16:22:57.666384](https://huggingface.co/datasets/open-llm-leaderboard/details_Tensoic__Gemma-2B-Samvaad/blob/main/results_2024-02-29T16-22-57.666384.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.3360458197941767, "acc_stderr": 0.03323760993199669, "acc_norm": 0.3391455847920635, "acc_norm_stderr": 0.034017762568725546, "mc1": 0.25703794369645044, "mc1_stderr": 0.01529807750948508, "mc2": 0.39950374063892946, "mc2_stderr": 0.014392561951779027 }, "harness|arc:challenge|25": { "acc": 0.45733788395904434, "acc_stderr": 0.014558106543924068, "acc_norm": 0.4658703071672355, "acc_norm_stderr": 0.014577311315231097 }, "harness|hellaswag|10": { "acc": 0.5046803425612428, "acc_stderr": 0.004989562798280525, "acc_norm": 0.681736705835491, "acc_norm_stderr": 0.004648503177353945 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.22, "acc_stderr": 0.04163331998932269, "acc_norm": 0.22, "acc_norm_stderr": 0.04163331998932269 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.37037037037037035, "acc_stderr": 0.04171654161354543, "acc_norm": 0.37037037037037035, "acc_norm_stderr": 0.04171654161354543 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.2894736842105263, "acc_stderr": 0.03690677986137283, "acc_norm": 0.2894736842105263, "acc_norm_stderr": 0.03690677986137283 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.35094339622641507, "acc_stderr": 0.029373646253234686, "acc_norm": 0.35094339622641507, "acc_norm_stderr": 0.029373646253234686 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.3680555555555556, "acc_stderr": 0.04032999053960719, "acc_norm": 0.3680555555555556, "acc_norm_stderr": 0.04032999053960719 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.29, "acc_stderr": 0.04560480215720684, "acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, 
"acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.22, "acc_stderr": 0.041633319989322695, "acc_norm": 0.22, "acc_norm_stderr": 0.041633319989322695 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.28901734104046245, "acc_stderr": 0.034564257450869995, "acc_norm": 0.28901734104046245, "acc_norm_stderr": 0.034564257450869995 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.21568627450980393, "acc_stderr": 0.040925639582376536, "acc_norm": 0.21568627450980393, "acc_norm_stderr": 0.040925639582376536 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.37872340425531914, "acc_stderr": 0.031709956060406545, "acc_norm": 0.37872340425531914, "acc_norm_stderr": 0.031709956060406545 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2631578947368421, "acc_stderr": 0.04142439719489362, "acc_norm": 0.2631578947368421, "acc_norm_stderr": 0.04142439719489362 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.3724137931034483, "acc_stderr": 0.040287315329475576, "acc_norm": 0.3724137931034483, "acc_norm_stderr": 0.040287315329475576 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.24867724867724866, "acc_stderr": 0.022261817692400168, "acc_norm": 0.24867724867724866, "acc_norm_stderr": 0.022261817692400168 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3412698412698413, "acc_stderr": 0.042407993275749234, "acc_norm": 0.3412698412698413, "acc_norm_stderr": 0.042407993275749234 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.29354838709677417, "acc_stderr": 0.02590608702131929, "acc_norm": 0.29354838709677417, "acc_norm_stderr": 0.02590608702131929 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.2413793103448276, "acc_stderr": 0.030108330718011625, "acc_norm": 0.2413793103448276, "acc_norm_stderr": 0.030108330718011625 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.2909090909090909, "acc_stderr": 0.03546563019624336, "acc_norm": 0.2909090909090909, "acc_norm_stderr": 0.03546563019624336 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.3434343434343434, "acc_stderr": 0.03383201223244442, "acc_norm": 0.3434343434343434, "acc_norm_stderr": 0.03383201223244442 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.31088082901554404, "acc_stderr": 0.03340361906276586, "acc_norm": 0.31088082901554404, "acc_norm_stderr": 0.03340361906276586 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.2923076923076923, "acc_stderr": 0.023060438380857733, "acc_norm": 0.2923076923076923, "acc_norm_stderr": 0.023060438380857733 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.24814814814814815, "acc_stderr": 0.026335739404055803, "acc_norm": 0.24814814814814815, "acc_norm_stderr": 0.026335739404055803 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.2605042016806723, "acc_stderr": 0.02851025151234193, "acc_norm": 0.2605042016806723, "acc_norm_stderr": 0.02851025151234193 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.23841059602649006, "acc_stderr": 0.03479185572599661, "acc_norm": 0.23841059602649006, "acc_norm_stderr": 0.03479185572599661 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.3963302752293578, "acc_stderr": 0.02097146994790052, "acc_norm": 0.3963302752293578, "acc_norm_stderr": 0.02097146994790052 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.24074074074074073, "acc_stderr": 
0.02915752218460561, "acc_norm": 0.24074074074074073, "acc_norm_stderr": 0.02915752218460561 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.3235294117647059, "acc_stderr": 0.03283472056108567, "acc_norm": 0.3235294117647059, "acc_norm_stderr": 0.03283472056108567 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.38396624472573837, "acc_stderr": 0.031658678064106674, "acc_norm": 0.38396624472573837, "acc_norm_stderr": 0.031658678064106674 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.4260089686098655, "acc_stderr": 0.033188332862172806, "acc_norm": 0.4260089686098655, "acc_norm_stderr": 0.033188332862172806 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.33587786259541985, "acc_stderr": 0.041423137719966634, "acc_norm": 0.33587786259541985, "acc_norm_stderr": 0.041423137719966634 }, "harness|hendrycksTest-international_law|5": { "acc": 0.38016528925619836, "acc_stderr": 0.04431324501968432, "acc_norm": 0.38016528925619836, "acc_norm_stderr": 0.04431324501968432 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.3148148148148148, "acc_stderr": 0.04489931073591312, "acc_norm": 0.3148148148148148, "acc_norm_stderr": 0.04489931073591312 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.2883435582822086, "acc_stderr": 0.035590395316173425, "acc_norm": 0.2883435582822086, "acc_norm_stderr": 0.035590395316173425 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3392857142857143, "acc_stderr": 0.04493949068613539, "acc_norm": 0.3392857142857143, "acc_norm_stderr": 0.04493949068613539 }, "harness|hendrycksTest-management|5": { "acc": 0.36893203883495146, "acc_stderr": 0.047776151811567386, "acc_norm": 0.36893203883495146, "acc_norm_stderr": 0.047776151811567386 }, "harness|hendrycksTest-marketing|5": { "acc": 0.5299145299145299, "acc_stderr": 0.03269741106812442, "acc_norm": 0.5299145299145299, "acc_norm_stderr": 0.03269741106812442 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.47, 
"acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.4240102171136654, "acc_stderr": 0.017672263329084226, "acc_norm": 0.4240102171136654, "acc_norm_stderr": 0.017672263329084226 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.30057803468208094, "acc_stderr": 0.0246853168672578, "acc_norm": 0.30057803468208094, "acc_norm_stderr": 0.0246853168672578 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2536312849162011, "acc_stderr": 0.014551553659369922, "acc_norm": 0.2536312849162011, "acc_norm_stderr": 0.014551553659369922 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.369281045751634, "acc_stderr": 0.02763417668960266, "acc_norm": 0.369281045751634, "acc_norm_stderr": 0.02763417668960266 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.35691318327974275, "acc_stderr": 0.02721042037593402, "acc_norm": 0.35691318327974275, "acc_norm_stderr": 0.02721042037593402 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.31790123456790126, "acc_stderr": 0.025910063528240868, "acc_norm": 0.31790123456790126, "acc_norm_stderr": 0.025910063528240868 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.2765957446808511, "acc_stderr": 0.026684564340461, "acc_norm": 0.2765957446808511, "acc_norm_stderr": 0.026684564340461 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.27444589308996087, "acc_stderr": 0.011397043163078154, "acc_norm": 0.27444589308996087, "acc_norm_stderr": 0.011397043163078154 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.2610294117647059, "acc_stderr": 0.026679252270103117, "acc_norm": 0.2610294117647059, "acc_norm_stderr": 0.026679252270103117 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.3055555555555556, "acc_stderr": 0.018635594034423972, "acc_norm": 0.3055555555555556, "acc_norm_stderr": 0.018635594034423972 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.42727272727272725, 
"acc_stderr": 0.04738198703545483, "acc_norm": 0.42727272727272725, "acc_norm_stderr": 0.04738198703545483 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.27346938775510204, "acc_stderr": 0.02853556033712844, "acc_norm": 0.27346938775510204, "acc_norm_stderr": 0.02853556033712844 }, "harness|hendrycksTest-sociology|5": { "acc": 0.34328358208955223, "acc_stderr": 0.03357379665433431, "acc_norm": 0.34328358208955223, "acc_norm_stderr": 0.03357379665433431 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.49, "acc_stderr": 0.05024183937956912, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-virology|5": { "acc": 0.39156626506024095, "acc_stderr": 0.03799857454479636, "acc_norm": 0.39156626506024095, "acc_norm_stderr": 0.03799857454479636 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.49122807017543857, "acc_stderr": 0.03834234744164993, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.03834234744164993 }, "harness|truthfulqa:mc|0": { "mc1": 0.25703794369645044, "mc1_stderr": 0.01529807750948508, "mc2": 0.39950374063892946, "mc2_stderr": 0.014392561951779027 }, "harness|winogrande|5": { "acc": 0.6164167324388319, "acc_stderr": 0.013666275889539019 }, "harness|gsm8k|5": { "acc": 0.058377558756633814, "acc_stderr": 0.00645808355783246 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
cce112/1233
--- license: mit ---
niv-al/sq-babi_nli_single-supporting-fact
--- dataset_info: features: - name: sentence1 dtype: string - name: sentence2 dtype: string - name: labels dtype: class_label: names: '0': not-entailed '1': entailed splits: - name: train num_bytes: 214663 num_examples: 1000 - name: validation num_bytes: 31319 num_examples: 144 - name: test num_bytes: 30966 num_examples: 144 download_size: 50131 dataset_size: 276948 language: - sq --- # Dataset Card for "sq-babi_nli_single-supporting-fact" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
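The `labels` column is a `class_label` feature: rows store integer ids that index into the names declared in the `dataset_info` above (`0` for `not-entailed`, `1` for `entailed`). A minimal decoding sketch (the helper below is illustrative, not part of the dataset):

```python
# Integer label ids map to the names declared in the card's dataset_info.
LABEL_NAMES = ["not-entailed", "entailed"]

def decode_label(label_id: int) -> str:
    """Turn a class_label integer id into its human-readable name."""
    return LABEL_NAMES[label_id]

print(decode_label(1))  # -> entailed
```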
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/0160ba77
--- dataset_info: features: - name: result dtype: string - name: id dtype: int64 splits: - name: train num_bytes: 180 num_examples: 10 download_size: 1330 dataset_size: 180 --- # Dataset Card for "0160ba77" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
bartoszmaj/vader_sentiment_full
--- license: openrail dataset_info: features: - name: sub dtype: string - name: created_utc dtype: int64 - name: score dtype: int64 - name: vader_sentiment dtype: float64 - name: year dtype: int64 - name: sentiment_cat dtype: string splits: - name: train num_bytes: 268134439 num_examples: 4600698 download_size: 86031901 dataset_size: 268134439 ---
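The card does not document how `sentiment_cat` was derived from the `vader_sentiment` score. A plausible sketch using VADER's conventional compound-score cutoffs of ±0.05 (this is an assumption about the pipeline, not something the card confirms):

```python
def categorize(compound: float) -> str:
    """Map a VADER compound score to a category string.

    Uses the conventional +/-0.05 thresholds; an assumption, not
    confirmed by this dataset card.
    """
    if compound >= 0.05:
        return "positive"
    if compound <= -0.05:
        return "negative"
    return "neutral"
```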
datahrvoje/twitter_dataset_1712977296
--- dataset_info: features: - name: id dtype: string - name: tweet_content dtype: string - name: user_name dtype: string - name: user_id dtype: string - name: created_at dtype: string - name: url dtype: string - name: favourite_count dtype: int64 - name: scraped_at dtype: string - name: image_urls dtype: string splits: - name: train num_bytes: 22210 num_examples: 49 download_size: 12332 dataset_size: 22210 configs: - config_name: default data_files: - split: train path: data/train-* ---
vwxyzjn/openhermes-dev__mistralai_Mixtral-8x7B-Instruct-v0.1__1706887930
--- dataset_info: features: - name: topic dtype: string - name: views dtype: 'null' - name: system_prompt dtype: string - name: conversations list: - name: from dtype: string - name: value dtype: string - name: weight dtype: 'null' - name: title dtype: string - name: model_name dtype: string - name: id dtype: string - name: avatarUrl dtype: 'null' - name: hash dtype: 'null' - name: custom_instruction dtype: bool - name: model dtype: 'null' - name: idx dtype: 'null' - name: source dtype: string - name: skip_prompt_formatting dtype: bool - name: category dtype: string - name: language dtype: string - name: prompt dtype: string - name: chosen_policy dtype: string - name: chosen list: - name: content dtype: string - name: role dtype: string - name: rejected list: - name: content dtype: string - name: role dtype: string - name: rejected_policy dtype: string splits: - name: train_prefs num_bytes: 135599 num_examples: 29 - name: test_prefs num_bytes: 1820 num_examples: 1 download_size: 112553 dataset_size: 137419 configs: - config_name: default data_files: - split: train_prefs path: data/train_prefs-* - split: test_prefs path: data/test_prefs-* ---
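The `chosen` and `rejected` columns hold message lists with `role` and `content` keys, the shape preference-tuning (DPO-style) trainers typically consume. A minimal sketch of flattening one such list into a single string (the renderer and the sample messages are illustrative, not drawn from the dataset):

```python
def render(messages):
    """Join a list of {"role", "content"} dicts into one prompt string."""
    return "\n".join(f"{m['role']}: {m['content']}" for m in messages)

# Hypothetical sample in the same shape as a `chosen` entry.
chosen = [
    {"role": "user", "content": "Hi"},
    {"role": "assistant", "content": "Hello!"},
]
print(render(chosen))  # -> user: Hi\nassistant: Hello!
```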
autoevaluate/autoeval-staging-eval-project-e1907042-7494835
--- type: predictions tags: - autotrain - evaluation datasets: - clinc_oos eval_info: task: multi_class_classification model: jackmleitch/distilbert-base-uncased-distilled-clinc metrics: [] dataset_name: clinc_oos dataset_config: small dataset_split: test col_mapping: text: text target: intent --- # Dataset Card for AutoTrain Evaluator This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset: * Task: Multi-class Text Classification * Model: jackmleitch/distilbert-base-uncased-distilled-clinc * Dataset: clinc_oos To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator). ## Contributions Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model.
open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v18_B-7B
--- pretty_name: Evaluation run of PeanutJar/LLaMa-2-PeanutButter_v18_B-7B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [PeanutJar/LLaMa-2-PeanutButter_v18_B-7B](https://huggingface.co/PeanutJar/LLaMa-2-PeanutButter_v18_B-7B)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v18_B-7B\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-09-22T18:47:12.642745](https://huggingface.co/datasets/open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v18_B-7B/blob/main/results_2023-09-22T18-47-12.642745.json)(note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0050335570469798654,\n\ \ \"em_stderr\": 0.0007247385547751907,\n \"f1\": 0.060973154362416224,\n\ \ \"f1_stderr\": 0.0014562854103949273,\n \"acc\": 0.40513399869433026,\n\ \ \"acc_stderr\": 0.009524554979348756\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.0050335570469798654,\n \"em_stderr\": 0.0007247385547751907,\n\ \ \"f1\": 0.060973154362416224,\n \"f1_stderr\": 0.0014562854103949273\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06520090978013647,\n \ \ \"acc_stderr\": 0.006800302989321091\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.745067087608524,\n \"acc_stderr\": 0.012248806969376422\n\ \ }\n}\n```" repo_url: https://huggingface.co/PeanutJar/LLaMa-2-PeanutButter_v18_B-7B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|arc:challenge|25_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-09-03T22:06:17.603163.parquet' - config_name: harness_drop_3 data_files: - split: 2023_09_22T18_47_12.642745 path: - '**/details_harness|drop|3_2023-09-22T18-47-12.642745.parquet' - split: latest path: - '**/details_harness|drop|3_2023-09-22T18-47-12.642745.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_09_22T18_47_12.642745 path: - '**/details_harness|gsm8k|5_2023-09-22T18-47-12.642745.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-09-22T18-47-12.642745.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hellaswag|10_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T22:06:17.603163.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T22:06:17.603163.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T22:06:17.603163.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T22:06:17.603163.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-management|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T22:06:17.603163.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-09-03T22:06:17.603163.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T22:06:17.603163.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-management|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-virology|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T22:06:17.603163.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_09_03T22_06_17.603163 path: - '**/details_harness|truthfulqa:mc|0_2023-09-03T22:06:17.603163.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-09-03T22:06:17.603163.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_09_22T18_47_12.642745 path: - '**/details_harness|winogrande|5_2023-09-22T18-47-12.642745.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-09-22T18-47-12.642745.parquet' - config_name: results data_files: - split: 2023_09_03T22_06_17.603163 path: - results_2023-09-03T22:06:17.603163.parquet - split: 2023_09_22T18_47_12.642745 path: - results_2023-09-22T18-47-12.642745.parquet - split: latest path: - results_2023-09-22T18-47-12.642745.parquet --- # Dataset Card for Evaluation run of PeanutJar/LLaMa-2-PeanutButter_v18_B-7B ## Dataset 
Description

- **Homepage:**
- **Repository:** https://huggingface.co/PeanutJar/LLaMa-2-PeanutButter_v18_B-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model [PeanutJar/LLaMa-2-PeanutButter_v18_B-7B](https://huggingface.co/PeanutJar/LLaMa-2-PeanutButter_v18_B-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v18_B-7B",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-22T18:47:12.642745](https://huggingface.co/datasets/open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v18_B-7B/blob/main/results_2023-09-22T18-47-12.642745.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.0050335570469798654, "em_stderr": 0.0007247385547751907, "f1": 0.060973154362416224, "f1_stderr": 0.0014562854103949273, "acc": 0.40513399869433026, "acc_stderr": 0.009524554979348756 }, "harness|drop|3": { "em": 0.0050335570469798654, "em_stderr": 0.0007247385547751907, "f1": 0.060973154362416224, "f1_stderr": 0.0014562854103949273 }, "harness|gsm8k|5": { "acc": 0.06520090978013647, "acc_stderr": 0.006800302989321091 }, "harness|winogrande|5": { "acc": 0.745067087608524, "acc_stderr": 0.012248806969376422 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
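For a quick sanity check, the aggregate numbers reported in the "Latest results" section above can be handled as a plain Python dict. The sketch below uses the literal values shown above rather than a live download, and the helper name `format_metrics` is our own illustration, not part of any library:

```python
# Aggregate metrics copied verbatim from the "all" block of the latest results above.
latest_all = {
    "em": 0.0050335570469798654,
    "f1": 0.060973154362416224,
    "acc": 0.40513399869433026,
}

def format_metrics(metrics):
    """Render each metric as a percentage with two decimals, sorted by name."""
    return "\n".join(f"{name}: {value * 100:.2f}%" for name, value in sorted(metrics.items()))

print(format_metrics(latest_all))
# acc: 40.51%
# em: 0.50%
# f1: 6.10%
```

The same pattern applies to the per-task blocks (`harness|drop|3`, `harness|gsm8k|5`, `harness|winogrande|5`) if you want a compact summary of a run.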
tinyBenchmarks/tinyAlpacaEval
---
configs:
- config_name: default
  data_files:
  - split: test
    path: data/test-*
dataset_info:
  features:
  - name: instruction
    dtype: string
  - name: output
    dtype: string
  - name: generator
    dtype: string
  - name: dataset
    dtype: string
  splits:
  - name: test
    num_bytes: 177527
    num_examples: 100
  download_size: 120300
  dataset_size: 177527
---

# tinyAlpacaEval

Welcome to tinyAlpacaEval! This dataset serves as a concise version of AlpacaEval 2.0, offering a curated subset of 100 data points selected from the original compilation of 805 examples.

## Features

- **Compact Dataset:** With only 100 data points, tinyAlpacaEval provides a swift and efficient way to evaluate your LLM's performance against a benchmark set, maintaining the essence of AlpacaEval 2.0.
- **Benchmarking with gpt4_turbo:** We include outputs from `gpt4_turbo` (also known as `gpt4_1106_preview`), allowing users to compare their LLM's performance directly. The evaluation should be conducted using the `weighted_alpaca_eval_gpt4_turbo` method for a consistent assessment.

## Model Evaluation

Users looking to evaluate a new model with tinyAlpacaEval should refer to the instructions provided at [AlpacaEval GitHub](https://github.com/tatsu-lab/alpaca_eval). To download the data, please run

```python
from datasets import load_dataset
tiny_data = load_dataset('tinyBenchmarks/tinyAlpacaEval', 'default')['test']
```

For each one of the 100 examples in `tiny_data`, the [AlpacaEval 2.0 code](https://github.com/tatsu-lab/alpaca_eval) will generate a score between 1 and 2. You need to organize those scores in a unidimensional numpy vector \(y\). To estimate the final performance of your model ("win rate"), you first need to translate the individual scores so that every number is between 0 and 1:

```python
# Score translation example
y = ...  # your original numpy vector of judge scores, each in [1, 2]
y = y - 1
```

Then, to estimate your LLM's performance using tinyAlpacaEval, you can use the following Python code.
First, ensure you have the tinyBenchmarks package installed:

```shell
pip install git+https://github.com/felipemaiapolo/tinyBenchmarks
```

Then, use the code snippet below for the evaluation:

```python
import numpy as np
import tinyBenchmarks as tb

### Parameters
benchmark = 'alpaca'

### Evaluation
tb.evaluate(y, benchmark)
```

This process will help you estimate the performance of your LLM against the tinyAlpacaEval dataset, providing a streamlined approach to benchmarking. For more detailed instructions on evaluating new models and computing scores, please refer to the comprehensive guides available at [AlpacaEval GitHub](https://github.com/tatsu-lab/alpaca_eval) and [tinyBenchmarks GitHub](https://github.com/felipemaiapolo/tinyBenchmarks).

Happy benchmarking!

## More tinyBenchmarks

**Open LLM leaderboard**: [tiny MMLU](https://huggingface.co/datasets/tinyBenchmarks/tinyMMLU), [tiny Arc-Challenge](https://huggingface.co/datasets/tinyBenchmarks/tinyAI2_arc), [tiny Winogrande](https://huggingface.co/datasets/tinyBenchmarks/tinyWinogrande), [tiny Hellaswag](https://huggingface.co/datasets/tinyBenchmarks/tinyHellaswag), [tiny TruthfulQA](https://huggingface.co/datasets/tinyBenchmarks/tinyTruthfulQA), [tiny GSM8k](https://huggingface.co/datasets/tinyBenchmarks/tinyGSM8k)

**HELM-lite**: _work-in-progress_

## Citation

```bibtex
@article{polo2024tinybenchmarks,
  title={tinyBenchmarks: evaluating LLMs with fewer examples},
  author={Felipe Maia Polo and Lucas Weber and Leshem Choshen and Yuekai Sun and Gongjun Xu and Mikhail Yurochkin},
  year={2024},
  eprint={2402.14992},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}

@misc{alpaca_eval,
  author = {Xuechen Li and Tianyi Zhang and Yann Dubois and Rohan Taori and Ishaan Gulrajani and Carlos Guestrin and Percy Liang and Tatsunori B. Hashimoto},
  title = {AlpacaEval: An Automatic Evaluator of Instruction-following Models},
  year = {2023},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/tatsu-lab/alpaca_eval}}
}
```
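The score-translation step described above can be sketched in plain Python. This is a minimal sketch with made-up example scores (`raw_scores` stands in for your own per-example judge scores); the naive mean at the end is only a sanity check, since `tb.evaluate(y, 'alpaca')` computes the proper tinyBenchmarks win-rate estimate:

```python
# Hypothetical per-example scores from the AlpacaEval 2.0 judge, each in [1, 2].
raw_scores = [1.0, 1.8, 2.0, 1.2, 1.5]

# Translate to [0, 1] as described above: subtract 1 from every score.
translated = [s - 1 for s in raw_scores]
assert all(0.0 <= s <= 1.0 for s in translated)

# A naive mean of the translated scores; tinyBenchmarks' tb.evaluate
# would instead produce the IRT-corrected win-rate estimate.
naive_win_rate = sum(translated) / len(translated)
print(f"naive win rate: {naive_win_rate:.2f}")
```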
alexandrainst/m_arc
--- configs: - config_name: ar data_files: - split: train path: data/ar/train.jsonl - split: val path: data/ar/val.jsonl - split: test path: data/ar/test.jsonl - config_name: bn data_files: - split: train path: data/bn/train.jsonl - split: val path: data/bn/val.jsonl - split: test path: data/bn/test.jsonl - config_name: ca data_files: - split: train path: data/ca/train.jsonl - split: val path: data/ca/val.jsonl - split: test path: data/ca/test.jsonl - config_name: da data_files: - split: train path: data/da/train.jsonl - split: val path: data/da/val.jsonl - split: test path: data/da/test.jsonl - config_name: de data_files: - split: train path: data/de/train.jsonl - split: val path: data/de/val.jsonl - split: test path: data/de/test.jsonl - config_name: en data_files: - split: train path: data/en/train.jsonl - split: val path: data/en/val.jsonl - split: test path: data/en/test.jsonl - config_name: es data_files: - split: train path: data/es/train.jsonl - split: val path: data/es/val.jsonl - split: test path: data/es/test.jsonl - config_name: eu data_files: - split: train path: data/eu/train.jsonl - split: val path: data/eu/val.jsonl - split: test path: data/eu/test.jsonl - config_name: fr data_files: - split: train path: data/fr/train.jsonl - split: val path: data/fr/val.jsonl - split: test path: data/fr/test.jsonl - config_name: gu data_files: - split: train path: data/gu/train.jsonl - split: val path: data/gu/val.jsonl - split: test path: data/gu/test.jsonl - config_name: hi data_files: - split: train path: data/hi/train.jsonl - split: val path: data/hi/val.jsonl - split: test path: data/hi/test.jsonl - config_name: hr data_files: - split: train path: data/hr/train.jsonl - split: val path: data/hr/val.jsonl - split: test path: data/hr/test.jsonl - config_name: hu data_files: - split: train path: data/hu/train.jsonl - split: val path: data/hu/val.jsonl - split: test path: data/hu/test.jsonl - config_name: hy data_files: - split: train path: data/hy/train.jsonl - 
split: val path: data/hy/val.jsonl - split: test path: data/hy/test.jsonl - config_name: id data_files: - split: train path: data/id/train.jsonl - split: val path: data/id/val.jsonl - split: test path: data/id/test.jsonl - config_name: is data_files: - split: train path: data/is/train.jsonl - split: val path: data/is/val.jsonl - split: test path: data/is/test.jsonl - config_name: it data_files: - split: train path: data/it/train.jsonl - split: val path: data/it/val.jsonl - split: test path: data/it/test.jsonl - config_name: kn data_files: - split: train path: data/kn/train.jsonl - split: val path: data/kn/val.jsonl - split: test path: data/kn/test.jsonl - config_name: ml data_files: - split: train path: data/ml/train.jsonl - split: val path: data/ml/val.jsonl - split: test path: data/ml/test.jsonl - config_name: mr data_files: - split: train path: data/mr/train.jsonl - split: val path: data/mr/val.jsonl - split: test path: data/mr/test.jsonl - config_name: nb data_files: - split: train path: data/nb/train.jsonl - split: val path: data/nb/val.jsonl - split: test path: data/nb/test.jsonl - config_name: ne data_files: - split: train path: data/ne/train.jsonl - split: val path: data/ne/val.jsonl - split: test path: data/ne/test.jsonl - config_name: nl data_files: - split: train path: data/nl/train.jsonl - split: val path: data/nl/val.jsonl - split: test path: data/nl/test.jsonl - config_name: pt data_files: - split: train path: data/pt/train.jsonl - split: val path: data/pt/val.jsonl - split: test path: data/pt/test.jsonl - config_name: ro data_files: - split: train path: data/ro/train.jsonl - split: val path: data/ro/val.jsonl - split: test path: data/ro/test.jsonl - config_name: ru data_files: - split: train path: data/ru/train.jsonl - split: val path: data/ru/val.jsonl - split: test path: data/ru/test.jsonl - config_name: sk data_files: - split: train path: data/sk/train.jsonl - split: val path: data/sk/val.jsonl - split: test path: data/sk/test.jsonl - config_name: 
sr data_files: - split: train path: data/sr/train.jsonl - split: val path: data/sr/val.jsonl - split: test path: data/sr/test.jsonl - config_name: sv data_files: - split: train path: data/sv/train.jsonl - split: val path: data/sv/val.jsonl - split: test path: data/sv/test.jsonl - config_name: ta data_files: - split: train path: data/ta/train.jsonl - split: val path: data/ta/val.jsonl - split: test path: data/ta/test.jsonl - config_name: te data_files: - split: train path: data/te/train.jsonl - split: val path: data/te/val.jsonl - split: test path: data/te/test.jsonl - config_name: uk data_files: - split: train path: data/uk/train.jsonl - split: val path: data/uk/val.jsonl - split: test path: data/uk/test.jsonl - config_name: vi data_files: - split: train path: data/vi/train.jsonl - split: val path: data/vi/val.jsonl - split: test path: data/vi/test.jsonl - config_name: zh data_files: - split: train path: data/zh/train.jsonl - split: val path: data/zh/val.jsonl - split: test path: data/zh/test.jsonl license: cc-by-nc-4.0 task_categories: - question-answering task_ids: - multiple-choice-qa size_categories: - 10K<n<100K language: - ar - bn - ca - da - de - en - es - eu - fr - gu - hi - hr - hu - hy - id - is - it - kn - ml - mr - nb - 'no' - ne - nl - pt - ro - ru - sk - sr - sv - ta - te - uk - vi - zh --- # Multilingual ARC ## Dataset Summary This dataset is a machine-translated version of the [ARC dataset](https://huggingface.co/datasets/ai2_arc). The Icelandic (is) part was translated with [Miðeind](https://mideind.is/english.html)'s Greynir model and Norwegian (nb) was translated with [DeepL](https://deepl.com/). The rest of the languages were translated using GPT-3.5-turbo by the University of Oregon, and this part of the dataset was originally uploaded to [this GitHub repository](https://github.com/nlp-uoregon/mlmm-evaluation).
mteb/msmarco-v2
--- language: - en multilinguality: - monolingual task_categories: - text-retrieval source_datasets: - msmarco-v2 task_ids: - document-retrieval config_names: - corpus tags: - text-retrieval dataset_info: - config_name: default features: - name: query-id dtype: string - name: corpus-id dtype: string - name: score dtype: float64 splits: - name: train num_bytes: 9631462 num_examples: 284212 - name: dev num_bytes: 136961 num_examples: 4009 - name: dev2 num_bytes: 150735 num_examples: 4411 - config_name: corpus features: - name: _id dtype: string - name: title dtype: string - name: text dtype: string splits: - name: corpus num_bytes: 50691069190 num_examples: 138364198 - config_name: queries features: - name: _id dtype: string - name: text dtype: string splits: - name: queries num_bytes: 13379527 num_examples: 285328 configs: - config_name: default data_files: - split: train path: qrels/train.jsonl - split: dev path: qrels/dev.jsonl - split: dev2 path: qrels/dev2.jsonl - config_name: corpus data_files: - split: corpus path: corpus.jsonl.gz - config_name: queries data_files: - split: queries path: queries.jsonl ---
spr1916/building_detection
--- dataset_info: features: - name: image_id dtype: int64 - name: image dtype: string - name: width dtype: int64 - name: height dtype: int64 - name: objects struct: - name: area sequence: int64 - name: bbox sequence: sequence: float64 - name: category sequence: int64 - name: id sequence: int64 splits: - name: train num_bytes: 1427880 num_examples: 5000 download_size: 547367 dataset_size: 1427880 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "building_detection" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
liuyanchen1015/MULTI_VALUE_mrpc_a_participle
--- dataset_info: features: - name: sentence1 dtype: string - name: sentence2 dtype: string - name: label dtype: int64 - name: idx dtype: int64 - name: value_score dtype: int64 splits: - name: test num_bytes: 257296 num_examples: 920 - name: train num_bytes: 567260 num_examples: 2021 - name: validation num_bytes: 61730 num_examples: 220 download_size: 583834 dataset_size: 886286 --- # Dataset Card for "MULTI_VALUE_mrpc_a_participle" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
autoevaluate/autoeval-staging-eval-multi_nli-default-68c6a6-14415975
--- type: predictions tags: - autotrain - evaluation datasets: - multi_nli eval_info: task: natural_language_inference model: MoritzLaurer/DeBERTa-v3-base-mnli-fever-anli metrics: [] dataset_name: multi_nli dataset_config: default dataset_split: validation_matched col_mapping: text1: premise text2: hypothesis target: label --- # Dataset Card for AutoTrain Evaluator This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset: * Task: Natural Language Inference * Model: MoritzLaurer/DeBERTa-v3-base-mnli-fever-anli * Dataset: multi_nli * Config: default * Split: validation_matched To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator). ## Contributions Thanks to [@MoritzLaurer](https://huggingface.co/MoritzLaurer) for evaluating this model.
AFZALS/data
--- dataset_info: features: - name: audio dtype: string splits: - name: train num_bytes: 23 num_examples: 1 download_size: 865 dataset_size: 23 configs: - config_name: default data_files: - split: train path: data/train-* ---
netcat420/quiklogik
---
license: mit
---

A quick and light dataset designed to PEFT fine-tune Mistral 7B and improve its reasoning skills.

A model fine-tuned and quantized using this dataset can be found at netcat420/MHENN (a successor is coming soon).
yagmurx/ataturk_voice_teknosa_genAI
--- license: unknown ---
Tippawan/TCI-5k-v1
--- dataset_info: features: - name: en dtype: string - name: th dtype: string - name: translation struct: - name: en dtype: string - name: th dtype: string splits: - name: train num_bytes: 3531364 num_examples: 4603 - name: validation num_bytes: 390418 num_examples: 578 - name: test num_bytes: 443036 num_examples: 578 download_size: 1759878 dataset_size: 4364818 configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* - split: test path: data/test-* ---
CyberHarem/dark_jeanne_granbluefantasy
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---

# Dataset of Jeanne d'Arc (Dark) (Granblue Fantasy)

This is the dataset of Jeanne d'Arc (Dark) (Granblue Fantasy), containing 74 images and their tags. The core tags of this character are `long_hair, hair_ornament, breasts, white_hair, red_eyes, large_breasts, bangs, hair_flower, medium_breasts, very_long_hair, wings`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name | Images | Size | Download | Type | Description |
|:-----|-------:|:-----|:---------|:-----|:------------|
| raw | 74 | 96.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dark_jeanne_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 74 | 62.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dark_jeanne_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 171 | 123.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dark_jeanne_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 74 | 88.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dark_jeanne_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels.
| | stage3-p480-1200 | 171 | 163.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dark_jeanne_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/dark_jeanne_granbluefantasy',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, cleavage, looking_at_viewer, smile, solo, bare_shoulders, collarbone, navel, black_bikini, blush, feather_hair_ornament, flower, hair_between_eyes, official_alternate_costume, see-through, simple_background, white_background | | 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, bare_shoulders, cleavage, smile, solo, looking_at_viewer, simple_background, armor, black_gloves, collarbone, dress, feather_hair_ornament, white_background | | 2 | 11 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, looking_at_viewer, smile, solo, armor, feathers, holding_sword, cleavage, bare_shoulders, collarbone, black_gloves, boots, ahoge, open_mouth, single_glove, skirt | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | looking_at_viewer | smile | solo | bare_shoulders | collarbone | navel | black_bikini | blush | feather_hair_ornament | flower | hair_between_eyes | official_alternate_costume | see-through | simple_background | white_background | armor | black_gloves | dress | feathers | holding_sword 
| boots | ahoge | open_mouth | single_glove | skirt | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:--------------------|:--------|:-------|:-----------------|:-------------|:--------|:---------------|:--------|:------------------------|:---------|:--------------------|:-----------------------------|:--------------|:--------------------|:-------------------|:--------|:---------------|:--------|:-----------|:----------------|:--------|:--------|:-------------|:---------------|:--------| | 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | X | | | | X | | | | | X | X | X | X | X | | | | | | | | | 2 | 11 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | X | X | X | | | | | | | | | | | X | X | | X | X | X | X | X | X | X |
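For the IMG+TXT packages listed above, each image file is typically accompanied by a same-named `.txt` file holding its comma-separated tags. Below is a minimal sketch for iterating image/tag pairs after extracting one of the zips; note that the side-by-side `.txt` convention is assumed here, not guaranteed by this card:

```python
import os


def iter_img_txt_pairs(dataset_dir):
    """Yield (image_path, tags) pairs from an extracted IMG+TXT package.

    Assumes the common convention: each image sits next to a .txt file
    with the same stem containing its comma-separated tags.
    """
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
            continue
        txt_path = os.path.join(dataset_dir, stem + '.txt')
        if not os.path.exists(txt_path):
            # image without a tag file -> skip it
            continue
        with open(txt_path, encoding='utf-8') as f:
            tags = [t.strip() for t in f.read().split(',') if t.strip()]
        yield os.path.join(dataset_dir, name), tags
```

Calling `iter_img_txt_pairs('dataset_dir')` then yields each image path together with its parsed tag list.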
huggingartists/rage-against-the-machine
--- language: - en tags: - huggingartists - lyrics --- # Dataset Card for "huggingartists/rage-against-the-machine" ## Table of Contents - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [How to use](#how-to-use) - [Dataset Structure](#dataset-structure) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [About](#about) ## Dataset Description - **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists) - **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists) - **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) - **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) - **Size of the generated dataset:** 0.216212 MB <div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: 
url(&#39;https://images.genius.com/2158957823960c84c7890b8fa5e6d479.1000x1000x1.jpg&#39;)">
        </div>
    </div>
    <a href="https://huggingface.co/huggingartists/rage-against-the-machine">
        <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
    </a>
    <div style="text-align: center; font-size: 16px; font-weight: 800">Rage Against the Machine</div>
    <a href="https://genius.com/artists/rage-against-the-machine">
        <div style="text-align: center; font-size: 14px;">@rage-against-the-machine</div>
    </a>
</div>

### Dataset Summary

The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists. The model is available [here](https://huggingface.co/huggingartists/rage-against-the-machine).

### Supported Tasks and Leaderboards

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Languages

en

## How to use

Load this dataset directly with the `datasets` library:

```python
from datasets import load_dataset

dataset = load_dataset("huggingartists/rage-against-the-machine")
```

## Dataset Structure

An example of 'train' looks as follows.

```
This example was too long and was cropped:

{
    "text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```

### Data Fields

The data fields are the same among all splits.

- `text`: a `string` feature.
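Before running the card's split recipe on real data, the `np.split` boundary arithmetic can be sanity-checked on a toy list of 100 strings (no download needed; the 90/7/3 proportions below match the ones this card uses):

```python
import numpy as np

# Toy stand-in for the 100 lyric texts in the train split.
texts = [f"song {i}" for i in range(100)]

train_percentage = 0.9
validation_percentage = 0.07  # the remaining 3% becomes the test split

# np.split takes the cut indices, not the chunk sizes.
cut_train = int(len(texts) * train_percentage)                                 # 90
cut_validation = int(len(texts) * (train_percentage + validation_percentage))  # 97
train, validation, test = np.split(np.array(texts), [cut_train, cut_validation])

print(len(train), len(validation), len(test))  # 90 7 3
```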
### Data Splits

| train | validation | test |
|------:|-----------:|-----:|
|   100 |          - |    - |

The 'train' split can easily be divided into 'train', 'validation' & 'test' with a few lines of code:

```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np

datasets = load_dataset("huggingartists/rage-against-the-machine")

train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03

train, validation, test = np.split(
    datasets['train']['text'],
    [int(len(datasets['train']['text']) * train_percentage),
     int(len(datasets['train']['text']) * (train_percentage + validation_percentage))],
)

datasets = DatasetDict(
    {
        'train': Dataset.from_dict({'text': list(train)}),
        'validation': Dataset.from_dict({'text': list(validation)}),
        'test': Dataset.from_dict({'text': list(test)})
    }
)
```

## Dataset Creation

### Curation Rationale

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

#### Who are the source language producers?

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Annotations

#### Annotation process

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Personal and Sensitive Information

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Discussion of Biases

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Other Known Limitations

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

## Additional Information

### Dataset Curators

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Licensing Information

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Citation Information

```
@InProceedings{huggingartists,
    author={Aleksey Korshuk},
    year={2021}
}
```

## About

*Built by Aleksey Korshuk*

[![Follow](https://img.shields.io/github/followers/AlekseyKorshuk?style=social)](https://github.com/AlekseyKorshuk)

[![Follow](https://img.shields.io/twitter/follow/alekseykorshuk?style=social)](https://twitter.com/intent/follow?screen_name=alekseykorshuk)

[![Follow](https://img.shields.io/badge/dynamic/json?color=blue&label=Telegram%20Channel&query=%24.result&url=https%3A%2F%2Fapi.telegram.org%2Fbot1929545866%3AAAFGhV-KKnegEcLiyYJxsc4zV6C-bdPEBtQ%2FgetChatMemberCount%3Fchat_id%3D-1001253621662&style=social&logo=telegram)](https://t.me/joinchat/_CQ04KjcJ-4yZTky)

For more details, visit the project repository.
[![GitHub stars](https://img.shields.io/github/stars/AlekseyKorshuk/huggingartists?style=social)](https://github.com/AlekseyKorshuk/huggingartists)
open-llm-leaderboard/details_flemmingmiguel__MarcMistral-7B
---
pretty_name: Evaluation run of flemmingmiguel/MarcMistral-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
  \ [flemmingmiguel/MarcMistral-7B](https://huggingface.co/flemmingmiguel/MarcMistral-7B)\
  \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of\
  \ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
  \ be found as a specific split in each configuration, the split being named using\
  \ the timestamp of the run. The \"train\" split is always pointing to the latest\
  \ results.\n\nAn additional configuration \"results\" stores all the aggregated\
  \ results of the run (and is used to compute and display the aggregated metrics\
  \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
  \nTo load the details from a run, you can for instance do the following:\n```python\n\
  from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_flemmingmiguel__MarcMistral-7B\"\
  ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
  These are the [latest results from run 2024-01-16T22:54:28.870994](https://huggingface.co/datasets/open-llm-leaderboard/details_flemmingmiguel__MarcMistral-7B/blob/main/results_2024-01-16T22-54-28.870994.json)\
  \ (note that there might be results for other tasks in the repos if successive evals\
  \ didn't cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6590480473953646,\n\ \ \"acc_stderr\": 0.031791981740818515,\n \"acc_norm\": 0.658591302585081,\n\ \ \"acc_norm_stderr\": 0.032449325118685084,\n \"mc1\": 0.48592411260709917,\n\ \ \"mc1_stderr\": 0.017496563717042793,\n \"mc2\": 0.6492388181500066,\n\ \ \"mc2_stderr\": 0.015458622413425438\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6885665529010239,\n \"acc_stderr\": 0.013532472099850942,\n\ \ \"acc_norm\": 0.71160409556314,\n \"acc_norm_stderr\": 0.013238394422428175\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.709520015933081,\n\ \ \"acc_stderr\": 0.004530560646902539,\n \"acc_norm\": 0.8778131846245768,\n\ \ \"acc_norm_stderr\": 0.0032683212609136273\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\ \ \"acc_stderr\": 0.040943762699967926,\n \"acc_norm\": 0.6592592592592592,\n\ \ \"acc_norm_stderr\": 0.040943762699967926\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\ \ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\ \ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \ \ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n\ \ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\ \ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\ \ \"acc_norm_stderr\": 0.03476590104304134\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \ \ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\ \ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \ \ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\ \ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\ \ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.049135952012744975,\n\ \ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.049135952012744975\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\ \ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n\ \ \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\ \ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\ \ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\ \ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"\ acc_norm\": 0.41798941798941797,\n 
\"acc_norm_stderr\": 0.02540255550326091\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\ \ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\ \ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268542,\n \"\ acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268542\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\ acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\ : 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\ \ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\ acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n\ \ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n\ \ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \ \ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.03038835355188679,\n \ \ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03038835355188679\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\ acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"\ acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\ acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8578431372549019,\n \"acc_stderr\": 0.024509803921568603,\n \"\ acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.024509803921568603\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.8270042194092827,\n \"acc_stderr\": 0.024621562866768427,\n \ \ \"acc_norm\": 0.8270042194092827,\n \"acc_norm_stderr\": 0.024621562866768427\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\ \ \"acc_stderr\": 0.0306365913486998,\n \"acc_norm\": 0.7040358744394619,\n\ \ \"acc_norm_stderr\": 0.0306365913486998\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\ \ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\ acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\ \ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\ \ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\ \ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\ \ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\ \ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\ \ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n\ \ \"acc_stderr\": 0.019875655027867437,\n \"acc_norm\": 0.8974358974358975,\n\ \ \"acc_norm_stderr\": 0.019875655027867437\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \ \ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\ \ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8390804597701149,\n\ \ \"acc_stderr\": 0.013140225515611729,\n \"acc_norm\": 0.8390804597701149,\n\ \ \"acc_norm_stderr\": 0.013140225515611729\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069363,\n\ \ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069363\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41899441340782123,\n\ \ \"acc_stderr\": 0.016501579306861677,\n \"acc_norm\": 
0.41899441340782123,\n\ \ \"acc_norm_stderr\": 0.016501579306861677\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n\ \ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\ \ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n\ \ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959603,\n\ \ \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959603\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \ \ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4745762711864407,\n\ \ \"acc_stderr\": 0.012753716929101006,\n \"acc_norm\": 0.4745762711864407,\n\ \ \"acc_norm_stderr\": 0.012753716929101006\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \ \ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6830065359477124,\n \"acc_stderr\": 0.018824219512706207,\n \ \ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.018824219512706207\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\ \ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\ \ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\ \ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\ \ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\ \ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \ \ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\ \ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\ \ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\ \ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.48592411260709917,\n\ \ \"mc1_stderr\": 0.017496563717042793,\n \"mc2\": 0.6492388181500066,\n\ \ \"mc2_stderr\": 0.015458622413425438\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.8168902920284136,\n \"acc_stderr\": 0.010869778633168374\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7194844579226687,\n \ \ \"acc_stderr\": 0.01237460849092955\n }\n}\n```" repo_url: https://huggingface.co/flemmingmiguel/MarcMistral-7B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|arc:challenge|25_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-01-16T22-54-28.870994.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|gsm8k|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_01_16T22_54_28.870994 path: - 
'**/details_harness|hellaswag|10_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T22-54-28.870994.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-16T22-54-28.870994.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T22-54-28.870994.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T22-54-28.870994.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T22-54-28.870994.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-16T22-54-28.870994.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T22-54-28.870994.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-management|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-marketing|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-virology|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T22-54-28.870994.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|truthfulqa:mc|0_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-01-16T22-54-28.870994.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_01_16T22_54_28.870994 path: - '**/details_harness|winogrande|5_2024-01-16T22-54-28.870994.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-01-16T22-54-28.870994.parquet' - config_name: results data_files: - split: 
2024_01_16T22_54_28.870994 path: - results_2024-01-16T22-54-28.870994.parquet - split: latest path: - results_2024-01-16T22-54-28.870994.parquet --- # Dataset Card for Evaluation run of flemmingmiguel/MarcMistral-7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [flemmingmiguel/MarcMistral-7B](https://huggingface.co/flemmingmiguel/MarcMistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_flemmingmiguel__MarcMistral-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-16T22:54:28.870994](https://huggingface.co/datasets/open-llm-leaderboard/details_flemmingmiguel__MarcMistral-7B/blob/main/results_2024-01-16T22-54-28.870994.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6590480473953646, "acc_stderr": 0.031791981740818515, "acc_norm": 0.658591302585081, "acc_norm_stderr": 0.032449325118685084, "mc1": 0.48592411260709917, "mc1_stderr": 0.017496563717042793, "mc2": 0.6492388181500066, "mc2_stderr": 0.015458622413425438 }, "harness|arc:challenge|25": { "acc": 0.6885665529010239, "acc_stderr": 0.013532472099850942, "acc_norm": 0.71160409556314, "acc_norm_stderr": 0.013238394422428175 }, "harness|hellaswag|10": { "acc": 0.709520015933081, "acc_stderr": 0.004530560646902539, "acc_norm": 0.8778131846245768, "acc_norm_stderr": 0.0032683212609136273 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6592592592592592, "acc_stderr": 0.040943762699967926, "acc_norm": 0.6592592592592592, "acc_norm_stderr": 0.040943762699967926 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7171052631578947, "acc_stderr": 0.03665349695640767, "acc_norm": 0.7171052631578947, "acc_norm_stderr": 0.03665349695640767 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7132075471698113, "acc_stderr": 0.027834912527544067, "acc_norm": 0.7132075471698113, "acc_norm_stderr": 0.027834912527544067 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7777777777777778, "acc_stderr": 0.03476590104304134, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.03476590104304134 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.57, "acc_stderr": 0.04975698519562428, "acc_norm": 0.57, 
"acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.27, "acc_stderr": 0.044619604333847394, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6820809248554913, "acc_stderr": 0.0355068398916558, "acc_norm": 0.6820809248554913, "acc_norm_stderr": 0.0355068398916558 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4215686274509804, "acc_stderr": 0.049135952012744975, "acc_norm": 0.4215686274509804, "acc_norm_stderr": 0.049135952012744975 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.04292346959909283, "acc_norm": 0.76, "acc_norm_stderr": 0.04292346959909283 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5872340425531914, "acc_stderr": 0.03218471141400351, "acc_norm": 0.5872340425531914, "acc_norm_stderr": 0.03218471141400351 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5175438596491229, "acc_stderr": 0.04700708033551038, "acc_norm": 0.5175438596491229, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5793103448275863, "acc_stderr": 0.0411391498118926, "acc_norm": 0.5793103448275863, "acc_norm_stderr": 0.0411391498118926 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41798941798941797, "acc_stderr": 0.02540255550326091, "acc_norm": 0.41798941798941797, "acc_norm_stderr": 0.02540255550326091 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.46825396825396826, "acc_stderr": 0.04463112720677172, "acc_norm": 0.46825396825396826, "acc_norm_stderr": 0.04463112720677172 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7870967741935484, "acc_stderr": 0.023287665127268542, "acc_norm": 0.7870967741935484, "acc_norm_stderr": 0.023287665127268542 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5123152709359606, "acc_stderr": 0.035169204442208966, "acc_norm": 0.5123152709359606, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7757575757575758, "acc_stderr": 0.03256866661681102, "acc_norm": 0.7757575757575758, "acc_norm_stderr": 0.03256866661681102 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7929292929292929, "acc_stderr": 0.028869778460267042, "acc_norm": 0.7929292929292929, "acc_norm_stderr": 0.028869778460267042 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9067357512953368, "acc_stderr": 0.020986854593289733, "acc_norm": 0.9067357512953368, "acc_norm_stderr": 0.020986854593289733 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6564102564102564, "acc_stderr": 0.024078696580635477, "acc_norm": 0.6564102564102564, "acc_norm_stderr": 0.024078696580635477 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34444444444444444, "acc_stderr": 0.02897264888484427, "acc_norm": 0.34444444444444444, "acc_norm_stderr": 0.02897264888484427 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6764705882352942, "acc_stderr": 0.03038835355188679, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.03038835355188679 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3509933774834437, "acc_stderr": 0.03896981964257375, "acc_norm": 0.3509933774834437, "acc_norm_stderr": 0.03896981964257375 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8422018348623853, "acc_stderr": 0.01563002297009244, "acc_norm": 0.8422018348623853, "acc_norm_stderr": 0.01563002297009244 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5092592592592593, "acc_stderr": 
0.034093869469927006, "acc_norm": 0.5092592592592593, "acc_norm_stderr": 0.034093869469927006 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8578431372549019, "acc_stderr": 0.024509803921568603, "acc_norm": 0.8578431372549019, "acc_norm_stderr": 0.024509803921568603 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8270042194092827, "acc_stderr": 0.024621562866768427, "acc_norm": 0.8270042194092827, "acc_norm_stderr": 0.024621562866768427 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7040358744394619, "acc_stderr": 0.0306365913486998, "acc_norm": 0.7040358744394619, "acc_norm_stderr": 0.0306365913486998 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7938931297709924, "acc_stderr": 0.03547771004159465, "acc_norm": 0.7938931297709924, "acc_norm_stderr": 0.03547771004159465 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8016528925619835, "acc_stderr": 0.03640118271990947, "acc_norm": 0.8016528925619835, "acc_norm_stderr": 0.03640118271990947 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8055555555555556, "acc_stderr": 0.038260763248848646, "acc_norm": 0.8055555555555556, "acc_norm_stderr": 0.038260763248848646 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7791411042944786, "acc_stderr": 0.03259177392742178, "acc_norm": 0.7791411042944786, "acc_norm_stderr": 0.03259177392742178 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4642857142857143, "acc_stderr": 0.04733667890053756, "acc_norm": 0.4642857142857143, "acc_norm_stderr": 0.04733667890053756 }, "harness|hendrycksTest-management|5": { "acc": 0.8058252427184466, "acc_stderr": 0.03916667762822584, "acc_norm": 0.8058252427184466, "acc_norm_stderr": 0.03916667762822584 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8974358974358975, "acc_stderr": 0.019875655027867437, "acc_norm": 0.8974358974358975, "acc_norm_stderr": 0.019875655027867437 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.73, "acc_stderr": 
0.0446196043338474, "acc_norm": 0.73, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8390804597701149, "acc_stderr": 0.013140225515611729, "acc_norm": 0.8390804597701149, "acc_norm_stderr": 0.013140225515611729 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7398843930635838, "acc_stderr": 0.023618678310069363, "acc_norm": 0.7398843930635838, "acc_norm_stderr": 0.023618678310069363 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.41899441340782123, "acc_stderr": 0.016501579306861677, "acc_norm": 0.41899441340782123, "acc_norm_stderr": 0.016501579306861677 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7222222222222222, "acc_stderr": 0.025646863097137897, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.025646863097137897 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7202572347266881, "acc_stderr": 0.02549425935069491, "acc_norm": 0.7202572347266881, "acc_norm_stderr": 0.02549425935069491 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7561728395061729, "acc_stderr": 0.023891879541959603, "acc_norm": 0.7561728395061729, "acc_norm_stderr": 0.023891879541959603 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4787234042553192, "acc_stderr": 0.029800481645628693, "acc_norm": 0.4787234042553192, "acc_norm_stderr": 0.029800481645628693 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4745762711864407, "acc_stderr": 0.012753716929101006, "acc_norm": 0.4745762711864407, "acc_norm_stderr": 0.012753716929101006 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6875, "acc_stderr": 0.02815637344037142, "acc_norm": 0.6875, "acc_norm_stderr": 0.02815637344037142 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6830065359477124, "acc_stderr": 0.018824219512706207, "acc_norm": 0.6830065359477124, "acc_norm_stderr": 0.018824219512706207 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.0449429086625209, 
"acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7306122448979592, "acc_stderr": 0.02840125202902294, "acc_norm": 0.7306122448979592, "acc_norm_stderr": 0.02840125202902294 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8407960199004975, "acc_stderr": 0.025870646766169136, "acc_norm": 0.8407960199004975, "acc_norm_stderr": 0.025870646766169136 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.035887028128263686, "acc_norm": 0.85, "acc_norm_stderr": 0.035887028128263686 }, "harness|hendrycksTest-virology|5": { "acc": 0.5481927710843374, "acc_stderr": 0.03874371556587953, "acc_norm": 0.5481927710843374, "acc_norm_stderr": 0.03874371556587953 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8421052631578947, "acc_stderr": 0.027966785859160893, "acc_norm": 0.8421052631578947, "acc_norm_stderr": 0.027966785859160893 }, "harness|truthfulqa:mc|0": { "mc1": 0.48592411260709917, "mc1_stderr": 0.017496563717042793, "mc2": 0.6492388181500066, "mc2_stderr": 0.015458622413425438 }, "harness|winogrande|5": { "acc": 0.8168902920284136, "acc_stderr": 0.010869778633168374 }, "harness|gsm8k|5": { "acc": 0.7194844579226687, "acc_stderr": 0.01237460849092955 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
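The evaluation JSON earlier in this card reports every score together with a standard error; under the usual normal approximation, those pairs can be read as rough 95% confidence intervals. A minimal sketch using the `harness|gsm8k|5` numbers copied from the results block (the `1.96` z-value is the standard two-sided choice, not something the card specifies):

```python
# Normal-approximation 95% interval from a reported mean and standard error.
# Numbers copied from the "harness|gsm8k|5" entry in the results JSON above.
acc = 0.7194844579226687
stderr = 0.01237460849092955
low, high = acc - 1.96 * stderr, acc + 1.96 * stderr
print(f"gsm8k acc: {acc:.3f} (95% CI {low:.3f}-{high:.3f})")
```

The same arithmetic applies to any of the per-task `acc` / `acc_stderr` pairs in the results block.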
1rsh/tts-rajasthani-ulca
--- dataset_info: features: - name: audio dtype: audio - name: text dtype: string splits: - name: train num_bytes: 5198083099.513551 num_examples: 21242 - name: test num_bytes: 460428062.6684487 num_examples: 1848 download_size: 5668661981 dataset_size: 5658511162.181999 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* ---
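The split metadata above (21242 train vs. 1848 test examples) corresponds to roughly an 8% test split; a quick check of that arithmetic:

```python
# Sanity check on the split sizes stated in the dataset_info above.
train_examples = 21242
test_examples = 1848
test_fraction = test_examples / (train_examples + test_examples)
print(f"test split: {test_fraction:.1%}")  # → test split: 8.0%
```

The dataset itself can be pulled with `load_dataset("1rsh/tts-rajasthani-ulca")` from the `datasets` library, which resolves the `data/train-*` and `data/test-*` paths listed under `configs`.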
bigscience-data/roots_indic-bn_wikipedia
--- language: bn license: cc-by-sa-3.0 extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience Ethical Charter. The charter can be found at: https://hf.co/spaces/bigscience/ethical-charter' extra_gated_fields: I have read and agree to abide by the BigScience Ethical Charter: checkbox --- ROOTS Subset: roots_indic-bn_wikipedia # wikipedia - Dataset uid: `wikipedia` ### Description ### Homepage ### Licensing ### Speaker Locations ### Sizes - 3.2299 % of total - 4.2071 % of en - 5.6773 % of ar - 3.3416 % of fr - 5.2815 % of es - 12.4852 % of ca - 0.4288 % of zh - 0.4286 % of zh - 5.4743 % of indic-bn - 8.9062 % of indic-ta - 21.3313 % of indic-te - 4.4845 % of pt - 4.0493 % of indic-hi - 11.3163 % of indic-ml - 22.5300 % of indic-ur - 4.4902 % of vi - 16.9916 % of indic-kn - 24.7820 % of eu - 11.6241 % of indic-mr - 9.8749 % of id - 9.3489 % of indic-pa - 9.4767 % of indic-gu - 24.1132 % of indic-as - 5.3309 % of indic-or ### BigScience processing steps #### Filters applied to: en - dedup_document - filter_remove_empty_docs - filter_small_docs_bytes_1024 #### Filters applied to: ar - filter_wiki_user_titles - dedup_document - filter_remove_empty_docs - filter_small_docs_bytes_300 #### Filters applied to: fr - dedup_document - filter_remove_empty_docs - filter_small_docs_bytes_1024 #### Filters applied to: es - dedup_document - filter_remove_empty_docs - filter_small_docs_bytes_1024 #### Filters applied to: ca - filter_wiki_user_titles - dedup_document - filter_remove_empty_docs - filter_small_docs_bytes_1024 #### Filters applied to: zh #### Filters applied to: zh #### Filters applied to: indic-bn - filter_wiki_user_titles - dedup_document - filter_remove_empty_docs - filter_small_docs_bytes_300 #### Filters applied to: indic-ta - filter_wiki_user_titles - dedup_document - filter_remove_empty_docs - filter_small_docs_bytes_300 #### Filters applied to: indic-te - filter_wiki_user_titles - dedup_document - filter_remove_empty_docs - 
filter_small_docs_bytes_300 #### Filters applied to: pt - dedup_document - filter_remove_empty_docs - filter_small_docs_bytes_300 #### Filters applied to: indic-hi - filter_wiki_user_titles - dedup_document - filter_remove_empty_docs - filter_small_docs_bytes_300 #### Filters applied to: indic-ml - filter_wiki_user_titles - dedup_document - filter_remove_empty_docs - filter_small_docs_bytes_300 #### Filters applied to: indic-ur - filter_wiki_user_titles - dedup_document - filter_remove_empty_docs - filter_small_docs_bytes_300 #### Filters applied to: vi - dedup_document - filter_remove_empty_docs - filter_small_docs_bytes_300 #### Filters applied to: indic-kn - filter_wiki_user_titles - dedup_document - filter_remove_empty_docs - filter_small_docs_bytes_300 #### Filters applied to: eu - filter_wiki_user_titles - dedup_document - filter_remove_empty_docs #### Filters applied to: indic-mr - filter_wiki_user_titles - dedup_document - filter_remove_empty_docs - filter_small_docs_bytes_300 #### Filters applied to: id - dedup_document - filter_remove_empty_docs - filter_small_docs_bytes_300 #### Filters applied to: indic-pa - filter_wiki_user_titles - dedup_document - filter_remove_empty_docs - filter_small_docs_bytes_300 #### Filters applied to: indic-gu - filter_wiki_user_titles - dedup_document - filter_remove_empty_docs - filter_small_docs_bytes_300 #### Filters applied to: indic-as - filter_wiki_user_titles - dedup_document - filter_remove_empty_docs #### Filters applied to: indic-or - filter_wiki_user_titles - dedup_document - filter_remove_empty_docs
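The card lists filter names but not their code. Purely as an illustration, a byte-threshold filter such as `filter_small_docs_bytes_300` plausibly drops documents whose UTF-8 encoding falls below the stated size; the sketch below is an assumption about that behavior, not the BigScience implementation:

```python
def filter_small_docs_bytes(docs, min_bytes=300):
    """Keep only documents whose UTF-8 encoding is at least `min_bytes` long.

    Hypothetical reconstruction of a filter like `filter_small_docs_bytes_300`.
    """
    return [d for d in docs if len(d.encode("utf-8")) >= min_bytes]

docs = ["too short", "x" * 400]
print(filter_small_docs_bytes(docs))  # keeps only the 400-character document
```

The other listed steps (`dedup_document`, `filter_remove_empty_docs`, `filter_wiki_user_titles`) would compose with a predicate of this shape in the same way.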
open-llm-leaderboard/details_postbot__pythia-160m-hq-emails
--- pretty_name: Evaluation run of postbot/pythia-160m-hq-emails dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [postbot/pythia-160m-hq-emails](https://huggingface.co/postbot/pythia-160m-hq-emails)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_postbot__pythia-160m-hq-emails_public\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-11-19T15:36:28.681873](https://huggingface.co/datasets/open-llm-leaderboard/details_postbot__pythia-160m-hq-emails_public/blob/main/results_2023-11-19T15-36-28.681873.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.264448312099342,\n\ \ \"acc_stderr\": 0.031139481426485385,\n \"acc_norm\": 0.26576656092101425,\n\ \ \"acc_norm_stderr\": 0.03196953249728658,\n \"mc1\": 0.2558139534883721,\n\ \ \"mc1_stderr\": 0.015274176219283349,\n \"mc2\": 0.4550530004497737,\n\ \ \"mc2_stderr\": 0.016187324561962944,\n \"em\": 0.0,\n \"\ em_stderr\": 0.0,\n \"f1\": 0.003122902684563759,\n \"f1_stderr\"\ : 0.00024291228896195137\n },\n \"harness|arc:challenge|25\": {\n \"\ acc\": 0.19880546075085323,\n \"acc_stderr\": 0.011662850198175543,\n \ \ \"acc_norm\": 0.23122866894197952,\n \"acc_norm_stderr\": 0.01232085883477228\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2813184624576778,\n\ \ \"acc_stderr\": 0.004487235657955673,\n \"acc_norm\": 0.3005377414857598,\n\ \ \"acc_norm_stderr\": 0.004575548557275204\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \ \ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34814814814814815,\n\ \ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.34814814814814815,\n\ \ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.033176727875331574,\n\ \ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.033176727875331574\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.2,\n\ \ \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\": 0.2,\n \ \ \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.26037735849056604,\n \"acc_stderr\": 0.027008766090708083,\n\ \ \"acc_norm\": 0.26037735849056604,\n \"acc_norm_stderr\": 0.027008766090708083\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\ 
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\ \ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \ \ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\ : 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \ \ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\ \ \"acc_stderr\": 0.03095289021774988,\n \"acc_norm\": 0.20809248554913296,\n\ \ \"acc_norm_stderr\": 0.03095289021774988\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196177,\n\ \ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196177\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\": 0.22,\n\ \ \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.225531914893617,\n \"acc_stderr\": 0.027321078417387533,\n\ \ \"acc_norm\": 0.225531914893617,\n \"acc_norm_stderr\": 0.027321078417387533\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\ \ \"acc_stderr\": 0.04096985139843671,\n \"acc_norm\": 0.2543859649122807,\n\ \ \"acc_norm_stderr\": 0.04096985139843671\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135303,\n\ \ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135303\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n 
\"acc\"\ : 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\ acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n\ \ \"acc_stderr\": 0.04263906892795132,\n \"acc_norm\": 0.3492063492063492,\n\ \ \"acc_norm_stderr\": 0.04263906892795132\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \ \ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n\ \ \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n\ \ \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.03144712581678242,\n\ \ \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.03144712581678242\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\"\ : 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.032568666616811015,\n\ \ \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.032568666616811015\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.3181818181818182,\n \"acc_stderr\": 0.033184773338453315,\n \"\ acc_norm\": 0.3181818181818182,\n \"acc_norm_stderr\": 0.033184773338453315\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.3471502590673575,\n \"acc_stderr\": 0.03435696168361355,\n\ \ \"acc_norm\": 0.3471502590673575,\n \"acc_norm_stderr\": 0.03435696168361355\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.3128205128205128,\n \"acc_stderr\": 0.023507579020645344,\n\ \ 
\"acc_norm\": 0.3128205128205128,\n \"acc_norm_stderr\": 0.023507579020645344\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085626,\n \ \ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085626\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.31512605042016806,\n \"acc_stderr\": 0.03017680828897434,\n\ \ \"acc_norm\": 0.31512605042016806,\n \"acc_norm_stderr\": 0.03017680828897434\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\ acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.25137614678899084,\n \"acc_stderr\": 0.01859920636028741,\n \"\ acc_norm\": 0.25137614678899084,\n \"acc_norm_stderr\": 0.01859920636028741\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n \"\ acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591362,\n \"\ acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591362\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.189873417721519,\n \"acc_stderr\": 0.025530100460233504,\n \ \ \"acc_norm\": 0.189873417721519,\n \"acc_norm_stderr\": 0.025530100460233504\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.242152466367713,\n\ \ \"acc_stderr\": 0.028751392398694755,\n \"acc_norm\": 0.242152466367713,\n\ \ \"acc_norm_stderr\": 0.028751392398694755\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\ \ \"acc_norm\": 0.2595419847328244,\n 
\"acc_norm_stderr\": 0.03844876139785271\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.3140495867768595,\n \"acc_stderr\": 0.04236964753041018,\n \"\ acc_norm\": 0.3140495867768595,\n \"acc_norm_stderr\": 0.04236964753041018\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n\ \ \"acc_stderr\": 0.041331194402438376,\n \"acc_norm\": 0.24074074074074073,\n\ \ \"acc_norm_stderr\": 0.041331194402438376\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.03259177392742177,\n\ \ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.03259177392742177\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.16071428571428573,\n\ \ \"acc_stderr\": 0.034859460964757415,\n \"acc_norm\": 0.16071428571428573,\n\ \ \"acc_norm_stderr\": 0.034859460964757415\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.21359223300970873,\n \"acc_stderr\": 0.040580420156460344,\n\ \ \"acc_norm\": 0.21359223300970873,\n \"acc_norm_stderr\": 0.040580420156460344\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2222222222222222,\n\ \ \"acc_stderr\": 0.027236013946196687,\n \"acc_norm\": 0.2222222222222222,\n\ \ \"acc_norm_stderr\": 0.027236013946196687\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \ \ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.27458492975734355,\n\ \ \"acc_stderr\": 0.015959829933084046,\n \"acc_norm\": 0.27458492975734355,\n\ \ \"acc_norm_stderr\": 0.015959829933084046\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.021855255263421795,\n\ \ \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.021855255263421795\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 
0.24916201117318434,\n\ \ \"acc_stderr\": 0.014465893829859936,\n \"acc_norm\": 0.24916201117318434,\n\ \ \"acc_norm_stderr\": 0.014465893829859936\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.025646863097137897,\n\ \ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.025646863097137897\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2990353697749196,\n\ \ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.2990353697749196,\n\ \ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.24691358024691357,\n \"acc_stderr\": 0.023993501709042117,\n\ \ \"acc_norm\": 0.24691358024691357,\n \"acc_norm_stderr\": 0.023993501709042117\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.2375886524822695,\n \"acc_stderr\": 0.02538951255272991,\n \ \ \"acc_norm\": 0.2375886524822695,\n \"acc_norm_stderr\": 0.02538951255272991\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2333767926988266,\n\ \ \"acc_stderr\": 0.010803108481179081,\n \"acc_norm\": 0.2333767926988266,\n\ \ \"acc_norm_stderr\": 0.010803108481179081\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.4338235294117647,\n \"acc_stderr\": 0.030105636570016643,\n\ \ \"acc_norm\": 0.4338235294117647,\n \"acc_norm_stderr\": 0.030105636570016643\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.238562091503268,\n \"acc_stderr\": 0.017242385828779582,\n \ \ \"acc_norm\": 0.238562091503268,\n \"acc_norm_stderr\": 0.017242385828779582\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2636363636363636,\n\ \ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.2636363636363636,\n\ \ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.39183673469387753,\n \"acc_stderr\": 0.031251275910891656,\n\ \ \"acc_norm\": 
0.39183673469387753,\n \"acc_norm_stderr\": 0.031251275910891656\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n\ \ \"acc_stderr\": 0.029929415408348384,\n \"acc_norm\": 0.23383084577114427,\n\ \ \"acc_norm_stderr\": 0.029929415408348384\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \ \ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25301204819277107,\n\ \ \"acc_stderr\": 0.03384429155233134,\n \"acc_norm\": 0.25301204819277107,\n\ \ \"acc_norm_stderr\": 0.03384429155233134\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.22807017543859648,\n \"acc_stderr\": 0.03218093795602357,\n\ \ \"acc_norm\": 0.22807017543859648,\n \"acc_norm_stderr\": 0.03218093795602357\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2558139534883721,\n\ \ \"mc1_stderr\": 0.015274176219283349,\n \"mc2\": 0.4550530004497737,\n\ \ \"mc2_stderr\": 0.016187324561962944\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.5027624309392266,\n \"acc_stderr\": 0.014052271211616438\n\ \ },\n \"harness|drop|3\": {\n \"em\": 0.0,\n \"em_stderr\"\ : 0.0,\n \"f1\": 0.003122902684563759,\n \"f1_stderr\": 0.00024291228896195137\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\ : 0.0\n }\n}\n```" repo_url: https://huggingface.co/postbot/pythia-160m-hq-emails leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|arc:challenge|25_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-11-19T15-36-28.681873.parquet' - config_name: harness_drop_3 data_files: - split: 2023_11_19T15_36_28.681873 path: - 
'**/details_harness|drop|3_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|drop|3_2023-11-19T15-36-28.681873.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|gsm8k|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hellaswag|10_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T15-36-28.681873.parquet' - 
'**/details_harness|hendrycksTest-econometrics|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-11-19T15-36-28.681873.parquet' - 
'**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-management|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T15-36-28.681873.parquet' - 
'**/details_harness|hendrycksTest-virology|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T15-36-28.681873.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-management|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-11-19T15-36-28.681873.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-11-19T15-36-28.681873.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - 
'**/details_harness|hendrycksTest-anatomy|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-11-19T15-36-28.681873.parquet' - config_name: 
harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 
2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T15-36-28.681873.parquet' - 
config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-international_law|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T15-36-28.681873.parquet' - config_name: 
harness_hendrycksTest_management_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-management|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-marketing|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-11-19T15-36-28.681873.parquet' 
- split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-sociology|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-virology|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-11-19T15-36-28.681873.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 
2023_11_19T15_36_28.681873 path: - '**/details_harness|truthfulqa:mc|0_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-11-19T15-36-28.681873.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_11_19T15_36_28.681873 path: - '**/details_harness|winogrande|5_2023-11-19T15-36-28.681873.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-11-19T15-36-28.681873.parquet' - config_name: results data_files: - split: 2023_11_19T15_36_28.681873 path: - results_2023-11-19T15-36-28.681873.parquet - split: latest path: - results_2023-11-19T15-36-28.681873.parquet
---

# Dataset Card for Evaluation run of postbot/pythia-160m-hq-emails

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/postbot/pythia-160m-hq-emails
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model [postbot/pythia-160m-hq-emails](https://huggingface.co/postbot/pythia-160m-hq-emails) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
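The timestamped split names replace the `-` and `:` of the run timestamp with underscores so they are valid split identifiers. As a small illustrative sketch (the helper below is hypothetical and not part of this dataset or the `datasets` library), such a split name can be mapped back to a `datetime`:

```python
from datetime import datetime

def split_name_to_datetime(split_name: str) -> datetime:
    # Split names look like "2023_11_19T15_36_28.681873": the date part
    # uses "_" instead of "-", and the time part "_" instead of ":".
    date_part, time_part = split_name.split("T")
    iso = f"{date_part.replace('_', '-')}T{time_part.replace('_', ':')}"
    return datetime.fromisoformat(iso)

run_time = split_name_to_datetime("2023_11_19T15_36_28.681873")
print(run_time)  # 2023-11-19 15:36:28.681873
```

This can be handy for sorting the available splits of a configuration chronologically when more than one run is present.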
To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_postbot__pythia-160m-hq-emails_public",
    "harness_winogrande_5",
    split="latest",
)
```

## Latest results

These are the [latest results from run 2023-11-19T15:36:28.681873](https://huggingface.co/datasets/open-llm-leaderboard/details_postbot__pythia-160m-hq-emails_public/blob/main/results_2023-11-19T15-36-28.681873.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" configuration and under the "latest" split of each eval):

```python
{
    "all": { "acc": 0.264448312099342, "acc_stderr": 0.031139481426485385, "acc_norm": 0.26576656092101425, "acc_norm_stderr": 0.03196953249728658, "mc1": 0.2558139534883721, "mc1_stderr": 0.015274176219283349, "mc2": 0.4550530004497737, "mc2_stderr": 0.016187324561962944, "em": 0.0, "em_stderr": 0.0, "f1": 0.003122902684563759, "f1_stderr": 0.00024291228896195137 },
    "harness|arc:challenge|25": { "acc": 0.19880546075085323, "acc_stderr": 0.011662850198175543, "acc_norm": 0.23122866894197952, "acc_norm_stderr": 0.01232085883477228 },
    "harness|hellaswag|10": { "acc": 0.2813184624576778, "acc_stderr": 0.004487235657955673, "acc_norm": 0.3005377414857598, "acc_norm_stderr": 0.004575548557275204 },
    "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.28, "acc_stderr": 0.04512608598542128, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542128 },
    "harness|hendrycksTest-anatomy|5": { "acc": 0.34814814814814815, "acc_stderr": 0.041153246103369526, "acc_norm": 0.34814814814814815, "acc_norm_stderr": 0.041153246103369526 },
    "harness|hendrycksTest-astronomy|5": { "acc": 0.21052631578947367, "acc_stderr": 0.033176727875331574, "acc_norm": 0.21052631578947367, "acc_norm_stderr": 0.033176727875331574 },
    "harness|hendrycksTest-business_ethics|5": { "acc": 0.2, "acc_stderr": 0.040201512610368445, "acc_norm": 0.2, "acc_norm_stderr": 0.040201512610368445 },
    "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.26037735849056604, "acc_stderr": 0.027008766090708083, "acc_norm": 0.26037735849056604, "acc_norm_stderr": 0.027008766090708083 },
    "harness|hendrycksTest-college_biology|5": { "acc": 0.2569444444444444, "acc_stderr": 0.03653946969442099, "acc_norm": 0.2569444444444444, "acc_norm_stderr": 0.03653946969442099 },
    "harness|hendrycksTest-college_chemistry|5": { "acc": 0.19, "acc_stderr": 0.039427724440366234, "acc_norm": 0.19, "acc_norm_stderr": 0.039427724440366234 },
    "harness|hendrycksTest-college_computer_science|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 },
    "harness|hendrycksTest-college_mathematics|5": { "acc": 0.27, "acc_stderr": 0.044619604333847394, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847394 },
    "harness|hendrycksTest-college_medicine|5": { "acc": 0.20809248554913296, "acc_stderr": 0.03095289021774988, "acc_norm": 0.20809248554913296, "acc_norm_stderr": 0.03095289021774988 },
    "harness|hendrycksTest-college_physics|5": { "acc": 0.3235294117647059, "acc_stderr": 0.046550104113196177, "acc_norm": 0.3235294117647059, "acc_norm_stderr": 0.046550104113196177 },
    "harness|hendrycksTest-computer_security|5": { "acc": 0.22, "acc_stderr": 0.0416333199893227, "acc_norm": 0.22, "acc_norm_stderr": 0.0416333199893227 },
    "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.225531914893617, "acc_stderr": 0.027321078417387533, "acc_norm": 0.225531914893617, "acc_norm_stderr": 0.027321078417387533 },
    "harness|hendrycksTest-econometrics|5": { "acc": 0.2543859649122807, "acc_stderr": 0.04096985139843671, "acc_norm": 0.2543859649122807, "acc_norm_stderr": 0.04096985139843671 },
    "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2413793103448276, "acc_stderr": 0.03565998174135303, "acc_norm": 0.2413793103448276, "acc_norm_stderr": 0.03565998174135303 },
    "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2566137566137566, "acc_stderr": 0.022494510767503154, "acc_norm": 0.2566137566137566, "acc_norm_stderr": 0.022494510767503154 },
    "harness|hendrycksTest-formal_logic|5": { "acc": 0.3492063492063492, "acc_stderr": 0.04263906892795132, "acc_norm": 0.3492063492063492, "acc_norm_stderr": 0.04263906892795132 },
    "harness|hendrycksTest-global_facts|5": { "acc": 0.26, "acc_stderr": 0.04408440022768078, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768078 },
    "harness|hendrycksTest-high_school_biology|5": { "acc": 0.3161290322580645, "acc_stderr": 0.02645087448904277, "acc_norm": 0.3161290322580645, "acc_norm_stderr": 0.02645087448904277 },
    "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.27586206896551724, "acc_stderr": 0.03144712581678242, "acc_norm": 0.27586206896551724, "acc_norm_stderr": 0.03144712581678242 },
    "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.24, "acc_stderr": 0.042923469599092816, "acc_norm": 0.24, "acc_norm_stderr": 0.042923469599092816 },
    "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.22424242424242424, "acc_stderr": 0.032568666616811015, "acc_norm": 0.22424242424242424, "acc_norm_stderr": 0.032568666616811015 },
    "harness|hendrycksTest-high_school_geography|5": { "acc": 0.3181818181818182, "acc_stderr": 0.033184773338453315, "acc_norm": 0.3181818181818182, "acc_norm_stderr": 0.033184773338453315 },
    "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.3471502590673575, "acc_stderr": 0.03435696168361355, "acc_norm": 0.3471502590673575, "acc_norm_stderr": 0.03435696168361355 },
    "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.3128205128205128, "acc_stderr": 0.023507579020645344, "acc_norm": 0.3128205128205128, "acc_norm_stderr": 0.023507579020645344 },
    "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2740740740740741, "acc_stderr": 0.027195934804085626, "acc_norm": 0.2740740740740741, "acc_norm_stderr": 0.027195934804085626 },
    "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.31512605042016806, "acc_stderr": 0.03017680828897434, "acc_norm": 0.31512605042016806, "acc_norm_stderr": 0.03017680828897434 },
    "harness|hendrycksTest-high_school_physics|5": { "acc": 0.31125827814569534, "acc_stderr": 0.03780445850526733, "acc_norm": 0.31125827814569534, "acc_norm_stderr": 0.03780445850526733 },
    "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.25137614678899084, "acc_stderr": 0.01859920636028741, "acc_norm": 0.25137614678899084, "acc_norm_stderr": 0.01859920636028741 },
    "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.38425925925925924, "acc_stderr": 0.03317354514310742, "acc_norm": 0.38425925925925924, "acc_norm_stderr": 0.03317354514310742 },
    "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.24019607843137256, "acc_stderr": 0.02998373305591362, "acc_norm": 0.24019607843137256, "acc_norm_stderr": 0.02998373305591362 },
    "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.189873417721519, "acc_stderr": 0.025530100460233504, "acc_norm": 0.189873417721519, "acc_norm_stderr": 0.025530100460233504 },
    "harness|hendrycksTest-human_aging|5": { "acc": 0.242152466367713, "acc_stderr": 0.028751392398694755, "acc_norm": 0.242152466367713, "acc_norm_stderr": 0.028751392398694755 },
    "harness|hendrycksTest-human_sexuality|5": { "acc": 0.2595419847328244, "acc_stderr": 0.03844876139785271, "acc_norm": 0.2595419847328244, "acc_norm_stderr": 0.03844876139785271 },
    "harness|hendrycksTest-international_law|5": { "acc": 0.3140495867768595, "acc_stderr": 0.04236964753041018, "acc_norm": 0.3140495867768595, "acc_norm_stderr": 0.04236964753041018 },
    "harness|hendrycksTest-jurisprudence|5": { "acc": 0.24074074074074073, "acc_stderr": 0.041331194402438376, "acc_norm": 0.24074074074074073, "acc_norm_stderr": 0.041331194402438376 },
    "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.22085889570552147, "acc_stderr": 0.03259177392742177, "acc_norm": 0.22085889570552147, "acc_norm_stderr": 0.03259177392742177 },
    "harness|hendrycksTest-machine_learning|5": { "acc": 0.16071428571428573, "acc_stderr": 0.034859460964757415, "acc_norm": 0.16071428571428573, "acc_norm_stderr": 0.034859460964757415 },
    "harness|hendrycksTest-management|5": { "acc": 0.21359223300970873, "acc_stderr": 0.040580420156460344, "acc_norm": 0.21359223300970873, "acc_norm_stderr": 0.040580420156460344 },
    "harness|hendrycksTest-marketing|5": { "acc": 0.2222222222222222, "acc_stderr": 0.027236013946196687, "acc_norm": 0.2222222222222222, "acc_norm_stderr": 0.027236013946196687 },
    "harness|hendrycksTest-medical_genetics|5": { "acc": 0.28, "acc_stderr": 0.045126085985421276, "acc_norm": 0.28, "acc_norm_stderr": 0.045126085985421276 },
    "harness|hendrycksTest-miscellaneous|5": { "acc": 0.27458492975734355, "acc_stderr": 0.015959829933084046, "acc_norm": 0.27458492975734355, "acc_norm_stderr": 0.015959829933084046 },
    "harness|hendrycksTest-moral_disputes|5": { "acc": 0.20809248554913296, "acc_stderr": 0.021855255263421795, "acc_norm": 0.20809248554913296, "acc_norm_stderr": 0.021855255263421795 },
    "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24916201117318434, "acc_stderr": 0.014465893829859936, "acc_norm": 0.24916201117318434, "acc_norm_stderr": 0.014465893829859936 },
    "harness|hendrycksTest-nutrition|5": { "acc": 0.2777777777777778, "acc_stderr": 0.025646863097137897, "acc_norm": 0.2777777777777778, "acc_norm_stderr": 0.025646863097137897 },
    "harness|hendrycksTest-philosophy|5": { "acc": 0.2990353697749196, "acc_stderr": 0.02600330111788514, "acc_norm": 0.2990353697749196, "acc_norm_stderr": 0.02600330111788514 },
    "harness|hendrycksTest-prehistory|5": { "acc": 0.24691358024691357, "acc_stderr": 0.023993501709042117, "acc_norm": 0.24691358024691357, "acc_norm_stderr": 0.023993501709042117 },
    "harness|hendrycksTest-professional_accounting|5": { "acc": 0.2375886524822695, "acc_stderr": 0.02538951255272991, "acc_norm": 0.2375886524822695, "acc_norm_stderr": 0.02538951255272991 },
    "harness|hendrycksTest-professional_law|5": { "acc": 0.2333767926988266, "acc_stderr": 0.010803108481179081, "acc_norm": 0.2333767926988266, "acc_norm_stderr": 0.010803108481179081 },
    "harness|hendrycksTest-professional_medicine|5": { "acc": 0.4338235294117647, "acc_stderr": 0.030105636570016643, "acc_norm": 0.4338235294117647, "acc_norm_stderr": 0.030105636570016643 },
    "harness|hendrycksTest-professional_psychology|5": { "acc": 0.238562091503268, "acc_stderr": 0.017242385828779582, "acc_norm": 0.238562091503268, "acc_norm_stderr": 0.017242385828779582 },
    "harness|hendrycksTest-public_relations|5": { "acc": 0.2636363636363636, "acc_stderr": 0.04220224692971987, "acc_norm": 0.2636363636363636, "acc_norm_stderr": 0.04220224692971987 },
    "harness|hendrycksTest-security_studies|5": { "acc": 0.39183673469387753, "acc_stderr": 0.031251275910891656, "acc_norm": 0.39183673469387753, "acc_norm_stderr": 0.031251275910891656 },
    "harness|hendrycksTest-sociology|5": { "acc": 0.23383084577114427, "acc_stderr": 0.029929415408348384, "acc_norm": 0.23383084577114427, "acc_norm_stderr": 0.029929415408348384 },
    "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.27, "acc_stderr": 0.044619604333847394, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847394 },
    "harness|hendrycksTest-virology|5": { "acc": 0.25301204819277107, "acc_stderr": 0.03384429155233134, "acc_norm": 0.25301204819277107, "acc_norm_stderr": 0.03384429155233134 },
    "harness|hendrycksTest-world_religions|5": { "acc": 0.22807017543859648, "acc_stderr": 0.03218093795602357, "acc_norm": 0.22807017543859648, "acc_norm_stderr": 0.03218093795602357 },
    "harness|truthfulqa:mc|0": { "mc1": 0.2558139534883721, "mc1_stderr": 0.015274176219283349, "mc2": 0.4550530004497737, "mc2_stderr": 0.016187324561962944 },
    "harness|winogrande|5": { "acc": 0.5027624309392266, "acc_stderr": 0.014052271211616438 },
    "harness|drop|3": { "em": 0.0, "em_stderr": 0.0,
"f1": 0.003122902684563759, "f1_stderr": 0.00024291228896195137 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
irds/hc4_fa
--- pretty_name: '`hc4/fa`' viewer: false source_datasets: [] task_categories: - text-retrieval --- # Dataset Card for `hc4/fa` The `hc4/fa` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package. For more information about the dataset, see the [documentation](https://ir-datasets.com/hc4#hc4/fa). # Data This dataset provides: - `docs` (documents, i.e., the corpus); count=486,486 ## Usage ```python from datasets import load_dataset docs = load_dataset('irds/hc4_fa', 'docs') for record in docs: record # {'doc_id': ..., 'title': ..., 'text': ..., 'url': ..., 'time': ..., 'cc_file': ...} ``` Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the data in 🤗 Dataset format. ## Citation Information ``` @inproceedings{Lawrie2022HC4, author = {Dawn Lawrie and James Mayfield and Douglas W. Oard and Eugene Yang}, title = {HC4: A New Suite of Test Collections for Ad Hoc CLIR}, booktitle = {Advances in Information Retrieval. 44th European Conference on IR Research (ECIR 2022)}, year = {2022}, month = apr, publisher = {Springer}, series = {Lecture Notes in Computer Science}, site = {Stavanger, Norway}, url = {https://arxiv.org/abs/2201.09992} } ```
middles/kanjiv1
--- dataset_info: features: - name: image dtype: image - name: text dtype: string splits: - name: train num_bytes: 122146551.743 num_examples: 6431 download_size: 143411498 dataset_size: 122146551.743 configs: - config_name: default data_files: - split: train path: data/train-* ---
dmayhem93/self-critiquing-critique-train
--- dataset_info: features: - name: id dtype: string - name: source_id dtype: string - name: split dtype: string - name: time dtype: float64 - name: labeler dtype: string - name: is_topic_based_summarization dtype: bool - name: category dtype: string - name: severity dtype: int64 - name: text_quotes list: - name: begin dtype: int64 - name: end dtype: int64 - name: response_quotes list: - name: begin dtype: int64 - name: end dtype: int64 - name: prompt dtype: string - name: response dtype: string splits: - name: train num_bytes: 262218653 num_examples: 61503 download_size: 0 dataset_size: 262218653 --- # Dataset Card for "self-critiquing-critique-train" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
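The `text_quotes` and `response_quotes` features each carry `begin`/`end` integers. Assuming these are character offsets into the corresponding `prompt`/`response` strings (an assumption; the card does not say), resolving them is a one-liner:

```python
def resolve_quotes(text, quotes):
    """Extract the substrings referenced by a list of {begin, end} offset pairs."""
    return [text[q["begin"]:q["end"]] for q in quotes]

# hypothetical record, purely for illustration
response = "The summary omits the final chapter entirely."
response_quotes = [{"begin": 4, "end": 11}]
print(resolve_quotes(response, response_quotes))  # ['summary']
```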
Atipico1/mrqa-test-final-set
--- dataset_info: features: - name: subset dtype: string - name: qid dtype: string - name: question dtype: string - name: answers sequence: string - name: masked_query dtype: string - name: context dtype: string - name: answer_sent dtype: string - name: answer_in_context sequence: string - name: entity dtype: string - name: similar_entity dtype: string - name: clear_answer_sent dtype: string - name: vague_answer_sent dtype: string - name: adversary dtype: string - name: replace_count dtype: int64 - name: adversarial_passage dtype: string - name: masked_answer_sent dtype: string - name: num_mask_token dtype: int64 - name: entities sequence: string - name: gpt_adv_sent dtype: string - name: is_same dtype: string - name: gpt_adv_sent_passage dtype: string splits: - name: train num_bytes: 1776628.0409416582 num_examples: 684 download_size: 1185285 dataset_size: 1776628.0409416582 configs: - config_name: default data_files: - split: train path: data/train-* ---
CyberHarem/snow_white_nikke
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of snow_white/スノーホワイト/白雪公主/스노우 화이트 (Nikke: Goddess of Victory) This is the dataset of snow_white/スノーホワイト/白雪公主/스노우 화이트 (Nikke: Goddess of Victory), containing 110 images and their tags. The core tags of this character are `bangs, yellow_eyes, long_hair, white_hair`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 110 | 230.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/snow_white_nikke/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 110 | 103.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/snow_white_nikke/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 261 | 214.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/snow_white_nikke/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 110 | 188.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/snow_white_nikke/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. 
| | stage3-p480-1200 | 261 | 344.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/snow_white_nikke/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code: ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/snow_white_nikke', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering results; some outfits may be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, holding_gun, boots, closed_mouth, full_body, looking_at_viewer, simple_background, standing, white_background, armor, assault_rifle, black_gloves, goggles_on_head | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, full_body, solo, weapon, boots, standing, cloak, goggles_on_head, looking_at_viewer, cape, holding, snowing | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | holding_gun | boots | closed_mouth | full_body | looking_at_viewer | simple_background | standing | white_background | armor | assault_rifle | black_gloves | goggles_on_head | weapon | cloak | cape | holding | snowing | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------|:--------|:---------------|:------------|:--------------------|:--------------------|:-----------|:-------------------|:--------|:----------------|:---------------|:------------------|:---------|:--------|:-------|:----------|:----------| | 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | 
![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | X | | X | X | | X | | | | | X | X | X | X | X | X |
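For the IMG+TXT packages, each image is expected to ship with a same-named `.txt` sidecar file holding its tag string. A minimal stdlib sketch for pairing them after unzipping (the stem-matching convention is assumed from the package type, not documented above):

```python
from pathlib import Path

def pair_images_with_tags(dataset_dir, exts=(".png", ".jpg", ".webp")):
    """Yield (image_path, tag_string) pairs from an extracted IMG+TXT package,
    assuming each image has a same-named .txt sidecar with its tags."""
    for img in sorted(Path(dataset_dir).iterdir()):
        if img.suffix.lower() in exts:
            txt = img.with_suffix(".txt")
            if txt.exists():
                yield img, txt.read_text(encoding="utf-8").strip()
```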
Circularmachines/batch_indexing_machine_230529_007
--- dataset_info: features: - name: image dtype: image splits: - name: train num_bytes: 155593564.0 num_examples: 720 download_size: 155604795 dataset_size: 155593564.0 --- # Dataset Card for "batch_indexing_machine_230529_007" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
ixelszy/text_inversion
--- license: creativeml-openrail-m tags: - not-for-all-audiences ---
maxmyn/thank_you_greentext
--- dataset_info: features: - name: 'Unnamed: 0' dtype: int64 - name: greentexts dtype: string splits: - name: train num_bytes: 8074796 num_examples: 58754 download_size: 4299667 dataset_size: 8074796 configs: - config_name: default data_files: - split: train path: data/train-* ---
khalidalt/Ashaar_diac_1
--- dataset_info: features: - name: output dtype: string - name: instruction dtype: string - name: input dtype: string splits: - name: train num_bytes: 12159497 num_examples: 23481 download_size: 6059483 dataset_size: 12159497 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "Ashaar_diac" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
open-llm-leaderboard/details_KnutJaegersberg__Deacon-34b-Adapter
--- pretty_name: Evaluation run of KnutJaegersberg/Deacon-34b-Adapter dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [KnutJaegersberg/Deacon-34b-Adapter](https://huggingface.co/KnutJaegersberg/Deacon-34b-Adapter)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__Deacon-34b-Adapter\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-01-05T02:34:30.689274](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Deacon-34b-Adapter/blob/main/results_2024-01-05T02-34-30.689274.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7572529924782586,\n\ \ \"acc_stderr\": 0.028143579191178096,\n \"acc_norm\": 0.762407110072492,\n\ \ \"acc_norm_stderr\": 0.028665771498963065,\n \"mc1\": 0.40514075887392903,\n\ \ \"mc1_stderr\": 0.017185611727753368,\n \"mc2\": 0.5623662255999308,\n\ \ \"mc2_stderr\": 0.015161958819373697\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6160409556313993,\n \"acc_stderr\": 0.01421244498065189,\n\ \ \"acc_norm\": 0.6476109215017065,\n \"acc_norm_stderr\": 0.01396014260059868\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6563433578968333,\n\ \ \"acc_stderr\": 0.004739575380508865,\n \"acc_norm\": 0.8557060346544513,\n\ \ \"acc_norm_stderr\": 0.0035066942243475764\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \ \ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.762962962962963,\n\ \ \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.762962962962963,\n\ \ \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.9013157894736842,\n \"acc_stderr\": 0.024270227737522715,\n\ \ \"acc_norm\": 0.9013157894736842,\n \"acc_norm_stderr\": 0.024270227737522715\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.78,\n\ \ \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \ \ \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.7886792452830189,\n \"acc_stderr\": 0.025125766484827845,\n\ \ \"acc_norm\": 0.7886792452830189,\n \"acc_norm_stderr\": 0.025125766484827845\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8819444444444444,\n\ \ \"acc_stderr\": 0.026983346503309354,\n \"acc_norm\": 0.8819444444444444,\n\ \ \"acc_norm_stderr\": 
0.026983346503309354\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \ \ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\"\ : 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \ \ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7109826589595376,\n\ \ \"acc_stderr\": 0.03456425745086999,\n \"acc_norm\": 0.7109826589595376,\n\ \ \"acc_norm_stderr\": 0.03456425745086999\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.4803921568627451,\n \"acc_stderr\": 0.04971358884367406,\n\ \ \"acc_norm\": 0.4803921568627451,\n \"acc_norm_stderr\": 0.04971358884367406\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n\ \ \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.7617021276595745,\n \"acc_stderr\": 0.027851252973889778,\n\ \ \"acc_norm\": 0.7617021276595745,\n \"acc_norm_stderr\": 0.027851252973889778\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.543859649122807,\n\ \ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.543859649122807,\n\ \ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.8,\n \"acc_stderr\": 0.0333333333333333,\n \ \ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.0333333333333333\n },\n\ \ \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6534391534391535,\n\ \ \"acc_stderr\": 0.024508777521028424,\n \"acc_norm\": 0.6534391534391535,\n\ \ \"acc_norm_stderr\": 
0.024508777521028424\n },\n \"harness|hendrycksTest-formal_logic|5\"\ : {\n \"acc\": 0.5634920634920635,\n \"acc_stderr\": 0.04435932892851466,\n\ \ \"acc_norm\": 0.5634920634920635,\n \"acc_norm_stderr\": 0.04435932892851466\n\ \ },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.5,\n\ \ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \ \ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-high_school_biology|5\"\ : {\n \"acc\": 0.8838709677419355,\n \"acc_stderr\": 0.018225757949432306,\n\ \ \"acc_norm\": 0.8838709677419355,\n \"acc_norm_stderr\": 0.018225757949432306\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.6403940886699507,\n \"acc_stderr\": 0.03376458246509567,\n \"\ acc_norm\": 0.6403940886699507,\n \"acc_norm_stderr\": 0.03376458246509567\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\"\ : 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.8666666666666667,\n \"acc_stderr\": 0.026544435312706473,\n\ \ \"acc_norm\": 0.8666666666666667,\n \"acc_norm_stderr\": 0.026544435312706473\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.8939393939393939,\n \"acc_stderr\": 0.021938047738853106,\n \"\ acc_norm\": 0.8939393939393939,\n \"acc_norm_stderr\": 0.021938047738853106\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.9792746113989638,\n \"acc_stderr\": 0.010281417011909042,\n\ \ \"acc_norm\": 0.9792746113989638,\n \"acc_norm_stderr\": 0.010281417011909042\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.7923076923076923,\n \"acc_stderr\": 0.020567539567246787,\n\ \ \"acc_norm\": 0.7923076923076923,\n \"acc_norm_stderr\": 0.020567539567246787\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.44814814814814813,\n \"acc_stderr\": 0.030321167196316286,\n \ \ \"acc_norm\": 0.44814814814814813,\n \"acc_norm_stderr\": 0.030321167196316286\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.8571428571428571,\n \"acc_stderr\": 0.02273020811930654,\n \ \ \"acc_norm\": 0.8571428571428571,\n \"acc_norm_stderr\": 0.02273020811930654\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.5231788079470199,\n \"acc_stderr\": 0.04078093859163086,\n \"\ acc_norm\": 0.5231788079470199,\n \"acc_norm_stderr\": 0.04078093859163086\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.9155963302752294,\n \"acc_stderr\": 0.011918819327334877,\n \"\ acc_norm\": 0.9155963302752294,\n \"acc_norm_stderr\": 0.011918819327334877\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.6481481481481481,\n \"acc_stderr\": 0.03256850570293647,\n \"\ acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.03256850570293647\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.9166666666666666,\n \"acc_stderr\": 0.019398452135813905,\n \"\ acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.019398452135813905\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.9240506329113924,\n \"acc_stderr\": 0.017244633251065702,\n \ \ \"acc_norm\": 0.9240506329113924,\n \"acc_norm_stderr\": 0.017244633251065702\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7892376681614349,\n\ \ \"acc_stderr\": 0.02737309550054019,\n \"acc_norm\": 0.7892376681614349,\n\ \ \"acc_norm_stderr\": 0.02737309550054019\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n\ \ \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.9173553719008265,\n \"acc_stderr\": 0.025135382356604227,\n \"\ acc_norm\": 0.9173553719008265,\n \"acc_norm_stderr\": 0.025135382356604227\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8981481481481481,\n\ \ \"acc_stderr\": 0.029239272675632748,\n \"acc_norm\": 0.8981481481481481,\n\ \ \"acc_norm_stderr\": 0.029239272675632748\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.8834355828220859,\n \"acc_stderr\": 0.025212327210507108,\n\ \ \"acc_norm\": 0.8834355828220859,\n \"acc_norm_stderr\": 0.025212327210507108\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n\ \ \"acc_stderr\": 0.04616143075028546,\n \"acc_norm\": 0.6160714285714286,\n\ \ \"acc_norm_stderr\": 0.04616143075028546\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.912621359223301,\n \"acc_stderr\": 0.027960689125970654,\n\ \ \"acc_norm\": 0.912621359223301,\n \"acc_norm_stderr\": 0.027960689125970654\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9273504273504274,\n\ \ \"acc_stderr\": 0.017004368568132342,\n \"acc_norm\": 0.9273504273504274,\n\ \ \"acc_norm_stderr\": 0.017004368568132342\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \ \ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9042145593869731,\n\ \ \"acc_stderr\": 0.010524031079055831,\n \"acc_norm\": 0.9042145593869731,\n\ \ \"acc_norm_stderr\": 0.010524031079055831\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.8323699421965318,\n \"acc_stderr\": 0.02011057991973484,\n\ \ \"acc_norm\": 0.8323699421965318,\n \"acc_norm_stderr\": 0.02011057991973484\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.646927374301676,\n\ \ \"acc_stderr\": 0.01598420454526857,\n 
\"acc_norm\": 0.646927374301676,\n\ \ \"acc_norm_stderr\": 0.01598420454526857\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.8562091503267973,\n \"acc_stderr\": 0.020091188936043697,\n\ \ \"acc_norm\": 0.8562091503267973,\n \"acc_norm_stderr\": 0.020091188936043697\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8392282958199357,\n\ \ \"acc_stderr\": 0.020862388082391888,\n \"acc_norm\": 0.8392282958199357,\n\ \ \"acc_norm_stderr\": 0.020862388082391888\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.8734567901234568,\n \"acc_stderr\": 0.018498600558790913,\n\ \ \"acc_norm\": 0.8734567901234568,\n \"acc_norm_stderr\": 0.018498600558790913\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.6666666666666666,\n \"acc_stderr\": 0.02812163604063989,\n \ \ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.02812163604063989\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5997392438070405,\n\ \ \"acc_stderr\": 0.01251358252913621,\n \"acc_norm\": 0.5997392438070405,\n\ \ \"acc_norm_stderr\": 0.01251358252913621\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.8125,\n \"acc_stderr\": 0.023709788253811766,\n \ \ \"acc_norm\": 0.8125,\n \"acc_norm_stderr\": 0.023709788253811766\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.8235294117647058,\n \"acc_stderr\": 0.015422512066262549,\n \ \ \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.015422512066262549\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n\ \ \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n\ \ \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.8489795918367347,\n \"acc_stderr\": 0.02292300409473685,\n\ \ \"acc_norm\": 0.8489795918367347,\n \"acc_norm_stderr\": 0.02292300409473685\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n\ \ \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n\ \ \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.91,\n \"acc_stderr\": 0.02876234912646613,\n \ \ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.02876234912646613\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n\ \ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n\ \ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015578,\n\ \ \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015578\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40514075887392903,\n\ \ \"mc1_stderr\": 0.017185611727753368,\n \"mc2\": 0.5623662255999308,\n\ \ \"mc2_stderr\": 0.015161958819373697\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.829518547750592,\n \"acc_stderr\": 0.010569021122825895\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6118271417740713,\n \ \ \"acc_stderr\": 0.013423607564002734\n }\n}\n```" repo_url: https://huggingface.co/KnutJaegersberg/Deacon-34b-Adapter leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|arc:challenge|25_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-01-05T02-34-30.689274.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|gsm8k|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_01_05T02_34_30.689274 path: 
- '**/details_harness|hellaswag|10_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T02-34-30.689274.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-05T02-34-30.689274.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T02-34-30.689274.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T02-34-30.689274.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T02-34-30.689274.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-05T02-34-30.689274.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T02-34-30.689274.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-management|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-marketing|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-virology|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T02-34-30.689274.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|truthfulqa:mc|0_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-01-05T02-34-30.689274.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_01_05T02_34_30.689274 path: - '**/details_harness|winogrande|5_2024-01-05T02-34-30.689274.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-01-05T02-34-30.689274.parquet' - config_name: results data_files: - split: 
2024_01_05T02_34_30.689274 path: - results_2024-01-05T02-34-30.689274.parquet - split: latest path: - results_2024-01-05T02-34-30.689274.parquet --- # Dataset Card for Evaluation run of KnutJaegersberg/Deacon-34b-Adapter <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [KnutJaegersberg/Deacon-34b-Adapter](https://huggingface.co/KnutJaegersberg/Deacon-34b-Adapter) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__Deacon-34b-Adapter", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-05T02:34:30.689274](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Deacon-34b-Adapter/blob/main/results_2024-01-05T02-34-30.689274.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.7572529924782586, "acc_stderr": 0.028143579191178096, "acc_norm": 0.762407110072492, "acc_norm_stderr": 0.028665771498963065, "mc1": 0.40514075887392903, "mc1_stderr": 0.017185611727753368, "mc2": 0.5623662255999308, "mc2_stderr": 0.015161958819373697 }, "harness|arc:challenge|25": { "acc": 0.6160409556313993, "acc_stderr": 0.01421244498065189, "acc_norm": 0.6476109215017065, "acc_norm_stderr": 0.01396014260059868 }, "harness|hellaswag|10": { "acc": 0.6563433578968333, "acc_stderr": 0.004739575380508865, "acc_norm": 0.8557060346544513, "acc_norm_stderr": 0.0035066942243475764 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.43, "acc_stderr": 0.049756985195624284, "acc_norm": 0.43, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.762962962962963, "acc_stderr": 0.03673731683969506, "acc_norm": 0.762962962962963, "acc_norm_stderr": 0.03673731683969506 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.9013157894736842, "acc_stderr": 0.024270227737522715, "acc_norm": 0.9013157894736842, "acc_norm_stderr": 0.024270227737522715 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.78, "acc_stderr": 0.04163331998932261, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932261 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7886792452830189, "acc_stderr": 0.025125766484827845, "acc_norm": 0.7886792452830189, "acc_norm_stderr": 0.025125766484827845 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.8819444444444444, "acc_stderr": 0.026983346503309354, "acc_norm": 0.8819444444444444, "acc_norm_stderr": 0.026983346503309354 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.65, "acc_stderr": 0.04793724854411019, "acc_norm": 0.65, 
"acc_norm_stderr": 0.04793724854411019 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.7109826589595376, "acc_stderr": 0.03456425745086999, "acc_norm": 0.7109826589595376, "acc_norm_stderr": 0.03456425745086999 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4803921568627451, "acc_stderr": 0.04971358884367406, "acc_norm": 0.4803921568627451, "acc_norm_stderr": 0.04971358884367406 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.83, "acc_stderr": 0.03775251680686371, "acc_norm": 0.83, "acc_norm_stderr": 0.03775251680686371 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.7617021276595745, "acc_stderr": 0.027851252973889778, "acc_norm": 0.7617021276595745, "acc_norm_stderr": 0.027851252973889778 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.543859649122807, "acc_stderr": 0.046854730419077895, "acc_norm": 0.543859649122807, "acc_norm_stderr": 0.046854730419077895 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.8, "acc_stderr": 0.0333333333333333, "acc_norm": 0.8, "acc_norm_stderr": 0.0333333333333333 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.6534391534391535, "acc_stderr": 0.024508777521028424, "acc_norm": 0.6534391534391535, "acc_norm_stderr": 0.024508777521028424 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5634920634920635, "acc_stderr": 0.04435932892851466, "acc_norm": 0.5634920634920635, "acc_norm_stderr": 0.04435932892851466 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8838709677419355, "acc_stderr": 0.018225757949432306, "acc_norm": 0.8838709677419355, "acc_norm_stderr": 0.018225757949432306 }, "harness|hendrycksTest-high_school_chemistry|5": { 
"acc": 0.6403940886699507, "acc_stderr": 0.03376458246509567, "acc_norm": 0.6403940886699507, "acc_norm_stderr": 0.03376458246509567 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.82, "acc_stderr": 0.038612291966536955, "acc_norm": 0.82, "acc_norm_stderr": 0.038612291966536955 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8666666666666667, "acc_stderr": 0.026544435312706473, "acc_norm": 0.8666666666666667, "acc_norm_stderr": 0.026544435312706473 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8939393939393939, "acc_stderr": 0.021938047738853106, "acc_norm": 0.8939393939393939, "acc_norm_stderr": 0.021938047738853106 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9792746113989638, "acc_stderr": 0.010281417011909042, "acc_norm": 0.9792746113989638, "acc_norm_stderr": 0.010281417011909042 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.7923076923076923, "acc_stderr": 0.020567539567246787, "acc_norm": 0.7923076923076923, "acc_norm_stderr": 0.020567539567246787 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.44814814814814813, "acc_stderr": 0.030321167196316286, "acc_norm": 0.44814814814814813, "acc_norm_stderr": 0.030321167196316286 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.8571428571428571, "acc_stderr": 0.02273020811930654, "acc_norm": 0.8571428571428571, "acc_norm_stderr": 0.02273020811930654 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.5231788079470199, "acc_stderr": 0.04078093859163086, "acc_norm": 0.5231788079470199, "acc_norm_stderr": 0.04078093859163086 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.9155963302752294, "acc_stderr": 0.011918819327334877, "acc_norm": 0.9155963302752294, "acc_norm_stderr": 0.011918819327334877 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.6481481481481481, "acc_stderr": 0.03256850570293647, "acc_norm": 0.6481481481481481, 
"acc_norm_stderr": 0.03256850570293647 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9166666666666666, "acc_stderr": 0.019398452135813905, "acc_norm": 0.9166666666666666, "acc_norm_stderr": 0.019398452135813905 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.9240506329113924, "acc_stderr": 0.017244633251065702, "acc_norm": 0.9240506329113924, "acc_norm_stderr": 0.017244633251065702 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7892376681614349, "acc_stderr": 0.02737309550054019, "acc_norm": 0.7892376681614349, "acc_norm_stderr": 0.02737309550054019 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8702290076335878, "acc_stderr": 0.029473649496907065, "acc_norm": 0.8702290076335878, "acc_norm_stderr": 0.029473649496907065 }, "harness|hendrycksTest-international_law|5": { "acc": 0.9173553719008265, "acc_stderr": 0.025135382356604227, "acc_norm": 0.9173553719008265, "acc_norm_stderr": 0.025135382356604227 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8981481481481481, "acc_stderr": 0.029239272675632748, "acc_norm": 0.8981481481481481, "acc_norm_stderr": 0.029239272675632748 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.8834355828220859, "acc_stderr": 0.025212327210507108, "acc_norm": 0.8834355828220859, "acc_norm_stderr": 0.025212327210507108 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.6160714285714286, "acc_stderr": 0.04616143075028546, "acc_norm": 0.6160714285714286, "acc_norm_stderr": 0.04616143075028546 }, "harness|hendrycksTest-management|5": { "acc": 0.912621359223301, "acc_stderr": 0.027960689125970654, "acc_norm": 0.912621359223301, "acc_norm_stderr": 0.027960689125970654 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9273504273504274, "acc_stderr": 0.017004368568132342, "acc_norm": 0.9273504273504274, "acc_norm_stderr": 0.017004368568132342 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.87, "acc_stderr": 0.033799766898963086, "acc_norm": 0.87, 
"acc_norm_stderr": 0.033799766898963086 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.9042145593869731, "acc_stderr": 0.010524031079055831, "acc_norm": 0.9042145593869731, "acc_norm_stderr": 0.010524031079055831 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.8323699421965318, "acc_stderr": 0.02011057991973484, "acc_norm": 0.8323699421965318, "acc_norm_stderr": 0.02011057991973484 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.646927374301676, "acc_stderr": 0.01598420454526857, "acc_norm": 0.646927374301676, "acc_norm_stderr": 0.01598420454526857 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.8562091503267973, "acc_stderr": 0.020091188936043697, "acc_norm": 0.8562091503267973, "acc_norm_stderr": 0.020091188936043697 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.8392282958199357, "acc_stderr": 0.020862388082391888, "acc_norm": 0.8392282958199357, "acc_norm_stderr": 0.020862388082391888 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.8734567901234568, "acc_stderr": 0.018498600558790913, "acc_norm": 0.8734567901234568, "acc_norm_stderr": 0.018498600558790913 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.6666666666666666, "acc_stderr": 0.02812163604063989, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.02812163604063989 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.5997392438070405, "acc_stderr": 0.01251358252913621, "acc_norm": 0.5997392438070405, "acc_norm_stderr": 0.01251358252913621 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.8125, "acc_stderr": 0.023709788253811766, "acc_norm": 0.8125, "acc_norm_stderr": 0.023709788253811766 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.8235294117647058, "acc_stderr": 0.015422512066262549, "acc_norm": 0.8235294117647058, "acc_norm_stderr": 0.015422512066262549 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7272727272727273, "acc_stderr": 0.04265792110940589, "acc_norm": 0.7272727272727273, 
"acc_norm_stderr": 0.04265792110940589 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.8489795918367347, "acc_stderr": 0.02292300409473685, "acc_norm": 0.8489795918367347, "acc_norm_stderr": 0.02292300409473685 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8955223880597015, "acc_stderr": 0.021628920516700643, "acc_norm": 0.8955223880597015, "acc_norm_stderr": 0.021628920516700643 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.91, "acc_stderr": 0.02876234912646613, "acc_norm": 0.91, "acc_norm_stderr": 0.02876234912646613 }, "harness|hendrycksTest-virology|5": { "acc": 0.572289156626506, "acc_stderr": 0.038515976837185335, "acc_norm": 0.572289156626506, "acc_norm_stderr": 0.038515976837185335 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8771929824561403, "acc_stderr": 0.02517298435015578, "acc_norm": 0.8771929824561403, "acc_norm_stderr": 0.02517298435015578 }, "harness|truthfulqa:mc|0": { "mc1": 0.40514075887392903, "mc1_stderr": 0.017185611727753368, "mc2": 0.5623662255999308, "mc2_stderr": 0.015161958819373697 }, "harness|winogrande|5": { "acc": 0.829518547750592, "acc_stderr": 0.010569021122825895 }, "harness|gsm8k|5": { "acc": 0.6118271417740713, "acc_stderr": 0.013423607564002734 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. 
--> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
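The leaderboard results above report one block per harness task plus an aggregated `"all"` block. As a rough illustration of how such a summary can be derived, here is a minimal sketch that averages a metric across tasks; the task names and values are copied from the results JSON above, and the unweighted-mean aggregation is an assumption for illustration (the real leaderboard pipeline averages every evaluated task, not just this subset):

```python
# Minimal sketch: aggregating per-task harness results into an "all"-style
# summary block. Values below are copied from the results JSON above;
# the unweighted mean is an illustrative assumption.
per_task = {
    "harness|arc:challenge|25": {"acc": 0.6160409556313993, "acc_norm": 0.6476109215017065},
    "harness|hellaswag|10": {"acc": 0.6563433578968333, "acc_norm": 0.8557060346544513},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.43, "acc_norm": 0.43},
}

def aggregate(results: dict, metric: str) -> float:
    """Unweighted mean of one metric across all tasks."""
    return sum(task[metric] for task in results.values()) / len(results)

summary = {m: aggregate(per_task, m) for m in ("acc", "acc_norm")}
print(summary)
```

The per-task `*_stderr` fields are standard errors of each accuracy estimate and are reported alongside the means rather than aggregated this way.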
open-llm-leaderboard/details_BFauber__opt1.3b_10e6
--- pretty_name: Evaluation run of BFauber/opt1.3b_10e6 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [BFauber/opt1.3b_10e6](https://huggingface.co/BFauber/opt1.3b_10e6) on the [Open\ \ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__opt1.3b_10e6\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-02-02T19:21:41.324363](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt1.3b_10e6/blob/main/results_2024-02-02T19-21-41.324363.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2600682096146715,\n\ \ \"acc_stderr\": 0.03101730232799903,\n \"acc_norm\": 0.26160699891348205,\n\ \ \"acc_norm_stderr\": 0.03184452335795072,\n \"mc1\": 0.2386780905752754,\n\ \ \"mc1_stderr\": 0.014922629695456416,\n \"mc2\": 0.4272454038428265,\n\ \ \"mc2_stderr\": 0.015391374654641474\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.22525597269624573,\n \"acc_stderr\": 0.012207839995407312,\n\ \ \"acc_norm\": 0.257679180887372,\n \"acc_norm_stderr\": 0.012780770562768414\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3370842461661024,\n\ \ \"acc_stderr\": 0.00471747833568962,\n \"acc_norm\": 0.41674965146385184,\n\ \ \"acc_norm_stderr\": 0.004920130733271778\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n\ \ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n\ \ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.28289473684210525,\n \"acc_stderr\": 0.03665349695640767,\n\ \ \"acc_norm\": 0.28289473684210525,\n \"acc_norm_stderr\": 0.03665349695640767\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.18,\n\ \ \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \ \ \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.2188679245283019,\n \"acc_stderr\": 0.02544786382510861,\n\ \ \"acc_norm\": 0.2188679245283019,\n \"acc_norm_stderr\": 0.02544786382510861\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\ \ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\ \ \"acc_norm_stderr\": 0.03653946969442099\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \ \ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \"acc_norm\": 0.34,\n\ \ \"acc_norm_stderr\": 0.047609522856952365\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \ \ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24855491329479767,\n\ \ \"acc_stderr\": 0.03295304696818317,\n \"acc_norm\": 0.24855491329479767,\n\ \ \"acc_norm_stderr\": 0.03295304696818317\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\ \ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.2,\n \"acc_stderr\": 0.04020151261036844,\n \"acc_norm\": 0.2,\n\ \ \"acc_norm_stderr\": 0.04020151261036844\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.20425531914893616,\n \"acc_stderr\": 0.026355158413349424,\n\ \ \"acc_norm\": 0.20425531914893616,\n \"acc_norm_stderr\": 0.026355158413349424\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\ \ \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n\ \ \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.296551724137931,\n \"acc_stderr\": 0.03806142687309993,\n\ \ \"acc_norm\": 0.296551724137931,\n \"acc_norm_stderr\": 0.03806142687309993\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708617,\n \"\ acc_norm\": 0.26455026455026454,\n 
\"acc_norm_stderr\": 0.022717467897708617\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n\ \ \"acc_stderr\": 0.03200686497287392,\n \"acc_norm\": 0.15079365079365079,\n\ \ \"acc_norm_stderr\": 0.03200686497287392\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \ \ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.24838709677419354,\n \"acc_stderr\": 0.02458002892148101,\n \"\ acc_norm\": 0.24838709677419354,\n \"acc_norm_stderr\": 0.02458002892148101\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n \"\ acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\"\ : 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.03287666758603488,\n\ \ \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.03287666758603488\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.2878787878787879,\n \"acc_stderr\": 0.03225883512300992,\n \"\ acc_norm\": 0.2878787878787879,\n \"acc_norm_stderr\": 0.03225883512300992\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.35751295336787564,\n \"acc_stderr\": 0.034588160421810045,\n\ \ \"acc_norm\": 0.35751295336787564,\n \"acc_norm_stderr\": 0.034588160421810045\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.21794871794871795,\n \"acc_stderr\": 0.020932445774463203,\n\ \ \"acc_norm\": 0.21794871794871795,\n \"acc_norm_stderr\": 0.020932445774463203\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \ \ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.25210084033613445,\n \"acc_stderr\": 0.028205545033277726,\n\ \ \"acc_norm\": 0.25210084033613445,\n \"acc_norm_stderr\": 0.028205545033277726\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360385,\n \"\ acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360385\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.24036697247706423,\n \"acc_stderr\": 0.01832060732096407,\n \"\ acc_norm\": 0.24036697247706423,\n \"acc_norm_stderr\": 0.01832060732096407\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.3194444444444444,\n \"acc_stderr\": 0.031798763421768524,\n \"\ acc_norm\": 0.3194444444444444,\n \"acc_norm_stderr\": 0.031798763421768524\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.23039215686274508,\n \"acc_stderr\": 0.029554292605695053,\n \"\ acc_norm\": 0.23039215686274508,\n \"acc_norm_stderr\": 0.029554292605695053\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.2489451476793249,\n \"acc_stderr\": 0.028146970599422644,\n \ \ \"acc_norm\": 0.2489451476793249,\n \"acc_norm_stderr\": 0.028146970599422644\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.20179372197309417,\n\ \ \"acc_stderr\": 0.026936111912802273,\n \"acc_norm\": 0.20179372197309417,\n\ \ \"acc_norm_stderr\": 0.026936111912802273\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728742,\n\ \ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728742\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.371900826446281,\n \"acc_stderr\": 0.044120158066245044,\n \"\ acc_norm\": 0.371900826446281,\n \"acc_norm_stderr\": 0.044120158066245044\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n\ \ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n\ \ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.2883435582822086,\n \"acc_stderr\": 0.035590395316173425,\n\ \ \"acc_norm\": 0.2883435582822086,\n \"acc_norm_stderr\": 0.035590395316173425\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n\ \ \"acc_stderr\": 0.04059867246952687,\n \"acc_norm\": 0.24107142857142858,\n\ \ \"acc_norm_stderr\": 0.04059867246952687\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.2815533980582524,\n \"acc_stderr\": 0.044532548363264673,\n\ \ \"acc_norm\": 0.2815533980582524,\n \"acc_norm_stderr\": 0.044532548363264673\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2094017094017094,\n\ \ \"acc_stderr\": 0.026655699653922768,\n \"acc_norm\": 0.2094017094017094,\n\ \ \"acc_norm_stderr\": 0.026655699653922768\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \ \ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2707535121328225,\n\ \ \"acc_stderr\": 0.015889888362560486,\n \"acc_norm\": 0.2707535121328225,\n\ \ \"acc_norm_stderr\": 0.015889888362560486\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.02317629820399202,\n\ \ \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.02317629820399202\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\ \ \"acc_stderr\": 0.014893391735249588,\n 
\"acc_norm\": 0.27262569832402234,\n\ \ \"acc_norm_stderr\": 0.014893391735249588\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.024630048979824775,\n\ \ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.024630048979824775\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2829581993569132,\n\ \ \"acc_stderr\": 0.02558306248998484,\n \"acc_norm\": 0.2829581993569132,\n\ \ \"acc_norm_stderr\": 0.02558306248998484\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.2932098765432099,\n \"acc_stderr\": 0.02532988817190092,\n\ \ \"acc_norm\": 0.2932098765432099,\n \"acc_norm_stderr\": 0.02532988817190092\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.26595744680851063,\n \"acc_stderr\": 0.026358065698880592,\n \ \ \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.026358065698880592\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2522816166883963,\n\ \ \"acc_stderr\": 0.011092789056875246,\n \"acc_norm\": 0.2522816166883963,\n\ \ \"acc_norm_stderr\": 0.011092789056875246\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.19852941176470587,\n \"acc_stderr\": 0.024231013370541097,\n\ \ \"acc_norm\": 0.19852941176470587,\n \"acc_norm_stderr\": 0.024231013370541097\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.24836601307189543,\n \"acc_stderr\": 0.017479487001364764,\n \ \ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.017479487001364764\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2636363636363636,\n\ \ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.2636363636363636,\n\ \ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.24081632653061225,\n \"acc_stderr\": 0.027372942201788163,\n\ \ \"acc_norm\": 0.24081632653061225,\n \"acc_norm_stderr\": 
0.027372942201788163\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\ \ \"acc_stderr\": 0.030147775935409217,\n \"acc_norm\": 0.23880597014925373,\n\ \ \"acc_norm_stderr\": 0.030147775935409217\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \ \ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.24096385542168675,\n\ \ \"acc_stderr\": 0.03329394119073529,\n \"acc_norm\": 0.24096385542168675,\n\ \ \"acc_norm_stderr\": 0.03329394119073529\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.27485380116959063,\n \"acc_stderr\": 0.034240429246915824,\n\ \ \"acc_norm\": 0.27485380116959063,\n \"acc_norm_stderr\": 0.034240429246915824\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2386780905752754,\n\ \ \"mc1_stderr\": 0.014922629695456416,\n \"mc2\": 0.4272454038428265,\n\ \ \"mc2_stderr\": 0.015391374654641474\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.5414364640883977,\n \"acc_stderr\": 0.014004146853791902\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\ : 0.0\n }\n}\n```" repo_url: https://huggingface.co/BFauber/opt1.3b_10e6 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|arc:challenge|25_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-02-02T19-21-41.324363.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|gsm8k|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_02_02T19_21_41.324363 path: - 
'**/details_harness|hellaswag|10_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-21-41.324363.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-management|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-21-41.324363.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-21-41.324363.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-21-41.324363.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-management|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-21-41.324363.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-02-02T19-21-41.324363.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-21-41.324363.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-management|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-virology|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-21-41.324363.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|truthfulqa:mc|0_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-02-02T19-21-41.324363.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_02_02T19_21_41.324363 path: - '**/details_harness|winogrande|5_2024-02-02T19-21-41.324363.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-02-02T19-21-41.324363.parquet' - config_name: results data_files: - split: 
2024_02_02T19_21_41.324363
    path:
    - results_2024-02-02T19-21-41.324363.parquet
  - split: latest
    path:
    - results_2024-02-02T19-21-41.324363.parquet
---

# Dataset Card for Evaluation run of BFauber/opt1.3b_10e6

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [BFauber/opt1.3b_10e6](https://huggingface.co/BFauber/opt1.3b_10e6) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_BFauber__opt1.3b_10e6",
                    "harness_winogrande_5",
                    split="train")
```

## Latest results

These are the [latest results from run 2024-02-02T19:21:41.324363](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt1.3b_10e6/blob/main/results_2024-02-02T19-21-41.324363.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "acc": 0.2600682096146715,
        "acc_stderr": 0.03101730232799903,
        "acc_norm": 0.26160699891348205,
        "acc_norm_stderr": 0.03184452335795072,
        "mc1": 0.2386780905752754,
        "mc1_stderr": 0.014922629695456416,
        "mc2": 0.4272454038428265,
        "mc2_stderr": 0.015391374654641474
    },
    "harness|arc:challenge|25": {
        "acc": 0.22525597269624573,
        "acc_stderr": 0.012207839995407312,
        "acc_norm": 0.257679180887372,
        "acc_norm_stderr": 0.012780770562768414
    },
    "harness|hellaswag|10": {
        "acc": 0.3370842461661024,
        "acc_stderr": 0.00471747833568962,
        "acc_norm": 0.41674965146385184,
        "acc_norm_stderr": 0.004920130733271778
    },
    "harness|hendrycksTest-abstract_algebra|5": {
        "acc": 0.31,
        "acc_stderr": 0.04648231987117316,
        "acc_norm": 0.31,
        "acc_norm_stderr": 0.04648231987117316
    },
    "harness|hendrycksTest-anatomy|5": {
        "acc": 0.3333333333333333,
        "acc_stderr": 0.04072314811876837,
        "acc_norm": 0.3333333333333333,
        "acc_norm_stderr": 0.04072314811876837
    },
    "harness|hendrycksTest-astronomy|5": {
        "acc": 0.28289473684210525,
        "acc_stderr": 0.03665349695640767,
        "acc_norm": 0.28289473684210525,
        "acc_norm_stderr": 0.03665349695640767
    },
    "harness|hendrycksTest-business_ethics|5": {
        "acc": 0.18,
        "acc_stderr": 0.038612291966536934,
        "acc_norm": 0.18,
        "acc_norm_stderr": 0.038612291966536934
    },
    "harness|hendrycksTest-clinical_knowledge|5": {
        "acc": 0.2188679245283019,
        "acc_stderr": 0.02544786382510861,
        "acc_norm": 0.2188679245283019,
        "acc_norm_stderr": 0.02544786382510861
    },
    "harness|hendrycksTest-college_biology|5": {
        "acc": 0.2569444444444444,
        "acc_stderr": 0.03653946969442099,
        "acc_norm": 0.2569444444444444,
        "acc_norm_stderr": 0.03653946969442099
    },
    "harness|hendrycksTest-college_chemistry|5": {
        "acc": 0.18,
        "acc_stderr": 0.03861229196653694,
        "acc_norm": 0.18,
        "acc_norm_stderr": 0.03861229196653694
    },
    "harness|hendrycksTest-college_computer_science|5": {
        "acc": 0.34,
        "acc_stderr": 0.047609522856952365,
        "acc_norm": 0.34,
        "acc_norm_stderr": 0.047609522856952365
    },
    "harness|hendrycksTest-college_mathematics|5": {
        "acc": 0.25,
        "acc_stderr": 0.04351941398892446,
        "acc_norm": 0.25,
        "acc_norm_stderr": 0.04351941398892446
    },
    "harness|hendrycksTest-college_medicine|5": {
        "acc": 0.24855491329479767,
        "acc_stderr": 0.03295304696818317,
        "acc_norm": 0.24855491329479767,
        "acc_norm_stderr": 0.03295304696818317
    },
    "harness|hendrycksTest-college_physics|5": {
        "acc": 0.22549019607843138,
        "acc_stderr": 0.041583075330832865,
        "acc_norm": 0.22549019607843138,
        "acc_norm_stderr": 0.041583075330832865
    },
    "harness|hendrycksTest-computer_security|5": {
        "acc": 0.2,
        "acc_stderr": 0.04020151261036844,
        "acc_norm": 0.2,
        "acc_norm_stderr": 0.04020151261036844
    },
    "harness|hendrycksTest-conceptual_physics|5": {
        "acc": 0.20425531914893616,
        "acc_stderr": 0.026355158413349424,
        "acc_norm": 0.20425531914893616,
        "acc_norm_stderr": 0.026355158413349424
    },
    "harness|hendrycksTest-econometrics|5": {
        "acc": 0.24561403508771928,
        "acc_stderr": 0.04049339297748141,
        "acc_norm": 0.24561403508771928,
        "acc_norm_stderr": 0.04049339297748141
    },
    "harness|hendrycksTest-electrical_engineering|5": {
        "acc": 0.296551724137931,
        "acc_stderr": 0.03806142687309993,
        "acc_norm": 0.296551724137931,
        "acc_norm_stderr": 0.03806142687309993
    },
    "harness|hendrycksTest-elementary_mathematics|5": {
        "acc": 0.26455026455026454,
        "acc_stderr": 0.022717467897708617,
        "acc_norm": 0.26455026455026454,
        "acc_norm_stderr": 0.022717467897708617
    },
    "harness|hendrycksTest-formal_logic|5": {
        "acc": 0.15079365079365079,
        "acc_stderr": 0.03200686497287392,
        "acc_norm": 0.15079365079365079,
        "acc_norm_stderr": 0.03200686497287392
    },
    "harness|hendrycksTest-global_facts|5": {
        "acc": 0.33,
        "acc_stderr": 0.047258156262526045,
        "acc_norm": 0.33,
        "acc_norm_stderr": 0.047258156262526045
    },
    "harness|hendrycksTest-high_school_biology|5": {
        "acc": 0.24838709677419354,
        "acc_stderr": 0.02458002892148101,
        "acc_norm": 0.24838709677419354,
        "acc_norm_stderr": 0.02458002892148101
    },
    "harness|hendrycksTest-high_school_chemistry|5": {
        "acc": 0.2955665024630542,
        "acc_stderr": 0.032104944337514575,
        "acc_norm": 0.2955665024630542,
        "acc_norm_stderr": 0.032104944337514575
    },
    "harness|hendrycksTest-high_school_computer_science|5": {
        "acc": 0.33,
        "acc_stderr": 0.047258156262526045,
        "acc_norm": 0.33,
        "acc_norm_stderr": 0.047258156262526045
    },
    "harness|hendrycksTest-high_school_european_history|5": {
        "acc": 0.23030303030303031,
        "acc_stderr": 0.03287666758603488,
        "acc_norm": 0.23030303030303031,
        "acc_norm_stderr": 0.03287666758603488
    },
    "harness|hendrycksTest-high_school_geography|5": {
        "acc": 0.2878787878787879,
        "acc_stderr": 0.03225883512300992,
        "acc_norm": 0.2878787878787879,
        "acc_norm_stderr": 0.03225883512300992
    },
    "harness|hendrycksTest-high_school_government_and_politics|5": {
        "acc": 0.35751295336787564,
        "acc_stderr": 0.034588160421810045,
        "acc_norm": 0.35751295336787564,
        "acc_norm_stderr": 0.034588160421810045
    },
    "harness|hendrycksTest-high_school_macroeconomics|5": {
        "acc": 0.21794871794871795,
        "acc_stderr": 0.020932445774463203,
        "acc_norm": 0.21794871794871795,
        "acc_norm_stderr": 0.020932445774463203
    },
    "harness|hendrycksTest-high_school_mathematics|5": {
        "acc": 0.26296296296296295,
        "acc_stderr": 0.02684205787383371,
        "acc_norm": 0.26296296296296295,
        "acc_norm_stderr": 0.02684205787383371
    },
    "harness|hendrycksTest-high_school_microeconomics|5": {
        "acc": 0.25210084033613445,
        "acc_stderr": 0.028205545033277726,
        "acc_norm": 0.25210084033613445,
        "acc_norm_stderr": 0.028205545033277726
    },
    "harness|hendrycksTest-high_school_physics|5": {
        "acc": 0.26490066225165565,
        "acc_stderr": 0.03603038545360385,
        "acc_norm": 0.26490066225165565,
        "acc_norm_stderr": 0.03603038545360385
    },
    "harness|hendrycksTest-high_school_psychology|5": {
        "acc": 0.24036697247706423,
        "acc_stderr": 0.01832060732096407,
        "acc_norm": 0.24036697247706423,
        "acc_norm_stderr": 0.01832060732096407
    },
    "harness|hendrycksTest-high_school_statistics|5": {
        "acc": 0.3194444444444444,
        "acc_stderr": 0.031798763421768524,
        "acc_norm": 0.3194444444444444,
        "acc_norm_stderr": 0.031798763421768524
    },
    "harness|hendrycksTest-high_school_us_history|5": {
        "acc": 0.23039215686274508,
        "acc_stderr": 0.029554292605695053,
        "acc_norm": 0.23039215686274508,
        "acc_norm_stderr": 0.029554292605695053
    },
    "harness|hendrycksTest-high_school_world_history|5": {
        "acc": 0.2489451476793249,
        "acc_stderr": 0.028146970599422644,
        "acc_norm": 0.2489451476793249,
        "acc_norm_stderr": 0.028146970599422644
    },
    "harness|hendrycksTest-human_aging|5": {
        "acc": 0.20179372197309417,
        "acc_stderr": 0.026936111912802273,
        "acc_norm": 0.20179372197309417,
        "acc_norm_stderr": 0.026936111912802273
    },
    "harness|hendrycksTest-human_sexuality|5": {
        "acc": 0.24427480916030533,
        "acc_stderr": 0.03768335959728742,
        "acc_norm": 0.24427480916030533,
        "acc_norm_stderr": 0.03768335959728742
    },
    "harness|hendrycksTest-international_law|5": {
        "acc": 0.371900826446281,
        "acc_stderr": 0.044120158066245044,
        "acc_norm": 0.371900826446281,
        "acc_norm_stderr": 0.044120158066245044
    },
    "harness|hendrycksTest-jurisprudence|5": {
        "acc": 0.21296296296296297,
        "acc_stderr": 0.0395783547198098,
        "acc_norm": 0.21296296296296297,
        "acc_norm_stderr": 0.0395783547198098
    },
    "harness|hendrycksTest-logical_fallacies|5": {
        "acc": 0.2883435582822086,
        "acc_stderr": 0.035590395316173425,
        "acc_norm": 0.2883435582822086,
        "acc_norm_stderr": 0.035590395316173425
    },
    "harness|hendrycksTest-machine_learning|5": {
        "acc": 0.24107142857142858,
        "acc_stderr": 0.04059867246952687,
        "acc_norm": 0.24107142857142858,
        "acc_norm_stderr": 0.04059867246952687
    },
    "harness|hendrycksTest-management|5": {
        "acc": 0.2815533980582524,
        "acc_stderr": 0.044532548363264673,
        "acc_norm": 0.2815533980582524,
        "acc_norm_stderr": 0.044532548363264673
    },
    "harness|hendrycksTest-marketing|5": {
        "acc": 0.2094017094017094,
        "acc_stderr": 0.026655699653922768,
        "acc_norm": 0.2094017094017094,
        "acc_norm_stderr": 0.026655699653922768
    },
    "harness|hendrycksTest-medical_genetics|5": {
        "acc": 0.24,
        "acc_stderr": 0.042923469599092816,
        "acc_norm": 0.24,
        "acc_norm_stderr": 0.042923469599092816
    },
    "harness|hendrycksTest-miscellaneous|5": {
        "acc": 0.2707535121328225,
        "acc_stderr": 0.015889888362560486,
        "acc_norm": 0.2707535121328225,
        "acc_norm_stderr": 0.015889888362560486
    },
    "harness|hendrycksTest-moral_disputes|5": {
        "acc": 0.24566473988439305,
        "acc_stderr": 0.02317629820399202,
        "acc_norm": 0.24566473988439305,
        "acc_norm_stderr": 0.02317629820399202
    },
    "harness|hendrycksTest-moral_scenarios|5": {
        "acc": 0.27262569832402234,
        "acc_stderr": 0.014893391735249588,
        "acc_norm": 0.27262569832402234,
        "acc_norm_stderr": 0.014893391735249588
    },
    "harness|hendrycksTest-nutrition|5": {
        "acc": 0.24509803921568626,
        "acc_stderr": 0.024630048979824775,
        "acc_norm": 0.24509803921568626,
        "acc_norm_stderr": 0.024630048979824775
    },
    "harness|hendrycksTest-philosophy|5": {
        "acc": 0.2829581993569132,
        "acc_stderr": 0.02558306248998484,
        "acc_norm": 0.2829581993569132,
        "acc_norm_stderr": 0.02558306248998484
    },
    "harness|hendrycksTest-prehistory|5": {
        "acc": 0.2932098765432099,
        "acc_stderr": 0.02532988817190092,
        "acc_norm": 0.2932098765432099,
        "acc_norm_stderr": 0.02532988817190092
    },
    "harness|hendrycksTest-professional_accounting|5": {
        "acc": 0.26595744680851063,
        "acc_stderr": 0.026358065698880592,
        "acc_norm": 0.26595744680851063,
        "acc_norm_stderr": 0.026358065698880592
    },
    "harness|hendrycksTest-professional_law|5": {
        "acc": 0.2522816166883963,
        "acc_stderr": 0.011092789056875246,
        "acc_norm": 0.2522816166883963,
        "acc_norm_stderr": 0.011092789056875246
    },
    "harness|hendrycksTest-professional_medicine|5": {
        "acc": 0.19852941176470587,
        "acc_stderr": 0.024231013370541097,
        "acc_norm": 0.19852941176470587,
        "acc_norm_stderr": 0.024231013370541097
    },
    "harness|hendrycksTest-professional_psychology|5": {
        "acc": 0.24836601307189543,
        "acc_stderr": 0.017479487001364764,
        "acc_norm": 0.24836601307189543,
        "acc_norm_stderr": 0.017479487001364764
    },
    "harness|hendrycksTest-public_relations|5": {
        "acc": 0.2636363636363636,
        "acc_stderr": 0.04220224692971987,
        "acc_norm": 0.2636363636363636,
        "acc_norm_stderr": 0.04220224692971987
    },
    "harness|hendrycksTest-security_studies|5": {
        "acc": 0.24081632653061225,
        "acc_stderr": 0.027372942201788163,
        "acc_norm": 0.24081632653061225,
        "acc_norm_stderr": 0.027372942201788163
    },
    "harness|hendrycksTest-sociology|5": {
        "acc": 0.23880597014925373,
        "acc_stderr": 0.030147775935409217,
        "acc_norm": 0.23880597014925373,
        "acc_norm_stderr": 0.030147775935409217
    },
    "harness|hendrycksTest-us_foreign_policy|5": {
        "acc": 0.33,
        "acc_stderr": 0.04725815626252604,
        "acc_norm": 0.33,
        "acc_norm_stderr": 0.04725815626252604
    },
    "harness|hendrycksTest-virology|5": {
        "acc": 0.24096385542168675,
        "acc_stderr": 0.03329394119073529,
        "acc_norm": 0.24096385542168675,
        "acc_norm_stderr": 0.03329394119073529
    },
    "harness|hendrycksTest-world_religions|5": {
        "acc": 0.27485380116959063,
        "acc_stderr": 0.034240429246915824,
        "acc_norm": 0.27485380116959063,
        "acc_norm_stderr": 0.034240429246915824
    },
    "harness|truthfulqa:mc|0": {
        "mc1": 0.2386780905752754,
        "mc1_stderr": 0.014922629695456416,
        "mc2": 0.4272454038428265,
        "mc2_stderr": 0.015391374654641474
    },
    "harness|winogrande|5": {
        "acc": 0.5414364640883977,
        "acc_stderr": 0.014004146853791902
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    }
}
```

## Dataset Details

### Dataset Description

<!-- Provide a longer summary of what this dataset is. -->

- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]

### Dataset Sources [optional]

<!-- Provide the basic links for the dataset. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the dataset is intended to be used.
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
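The JSON results earlier in this card are keyed by `harness|<task>|<n_shots>`. As a rough sketch of how such a blob can be aggregated programmatically (the key format and sample values are copied from the results above; the snippet inlines only a small subset of the tasks for brevity):

```python
import json

# A small inline subset of the results blob above, same key format.
results = json.loads("""
{
  "harness|hendrycksTest-virology|5": {"acc": 0.24096385542168675},
  "harness|hendrycksTest-world_religions|5": {"acc": 0.27485380116959063},
  "harness|winogrande|5": {"acc": 0.5414364640883977}
}
""")

# Average accuracy over the MMLU (hendrycksTest) subsets only.
mmlu = [v["acc"] for k, v in results.items() if "hendrycksTest" in k]
print(f"mean MMLU acc over {len(mmlu)} subsets: {sum(mmlu) / len(mmlu):.4f}")
```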
Zaib/java-vulnerability
--- license: afl-3.0 ---
learn3r/gov_report_bp
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* - split: test path: data/test-* dataset_info: features: - name: input dtype: string - name: output dtype: string splits: - name: train num_bytes: 1030500829 num_examples: 17457 - name: validation num_bytes: 60867802 num_examples: 972 - name: test num_bytes: 56606131 num_examples: 973 download_size: 547138870 dataset_size: 1147974762 --- # Dataset Card for "gov_report_bp" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
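The split metadata above also gives a quick sense of example size. A back-of-the-envelope calculation (the `num_bytes` and `num_examples` values are copied from the YAML header; note that `num_bytes` covers both the `input` and `output` fields, so this slightly overstates input length):

```python
# (num_bytes, num_examples) per split, from the dataset_info header above.
splits = {
    "train": (1030500829, 17457),
    "validation": (60867802, 972),
    "test": (56606131, 973),
}

for name, (num_bytes, num_examples) in splits.items():
    kib_per_example = num_bytes / num_examples / 1024
    print(f"{name}: ~{kib_per_example:.0f} KiB per example")
```

Roughly 57-61 KiB per example across all three splits, i.e. long-document inputs rather than short sentences.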
Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-fil_self_160m_bo16_2_mix_50_kl_0.1_prm_70m_thr_0.0_seed_2_t_0.75
--- dataset_info: config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1 features: - name: instruction dtype: string - name: input dtype: string - name: output dtype: string - name: preference dtype: int64 - name: output_1 dtype: string - name: output_2 dtype: string - name: reward_model_prompt_format dtype: string - name: gen_prompt_format dtype: string - name: gen_kwargs struct: - name: do_sample dtype: bool - name: max_new_tokens dtype: int64 - name: pad_token_id dtype: int64 - name: top_k dtype: int64 - name: top_p dtype: float64 - name: reward_1 dtype: float64 - name: reward_2 dtype: float64 - name: n_samples dtype: int64 - name: reject_select dtype: string - name: index dtype: int64 - name: prompt dtype: string - name: chosen dtype: string - name: rejected dtype: string - name: filtered_epoch dtype: int64 - name: gen_reward dtype: float64 - name: gen_response dtype: string splits: - name: epoch_0 num_bytes: 43693937 num_examples: 18928 - name: epoch_1 num_bytes: 44240727 num_examples: 18928 - name: epoch_2 num_bytes: 44307753 num_examples: 18928 - name: epoch_3 num_bytes: 44337703 num_examples: 18928 - name: epoch_4 num_bytes: 44342250 num_examples: 18928 - name: epoch_5 num_bytes: 44332626 num_examples: 18928 - name: epoch_6 num_bytes: 44316490 num_examples: 18928 - name: epoch_7 num_bytes: 44306296 num_examples: 18928 - name: epoch_8 num_bytes: 44302019 num_examples: 18928 - name: epoch_9 num_bytes: 44302111 num_examples: 18928 - name: epoch_10 num_bytes: 44299863 num_examples: 18928 - name: epoch_11 num_bytes: 44301395 num_examples: 18928 - name: epoch_12 num_bytes: 44298390 num_examples: 18928 - name: epoch_13 num_bytes: 44299904 num_examples: 18928 - name: epoch_14 num_bytes: 44299085 num_examples: 18928 - name: epoch_15 num_bytes: 44298426 num_examples: 18928 - name: epoch_16 num_bytes: 44299133 num_examples: 18928 - name: epoch_17 num_bytes: 44298640 num_examples: 18928 - name: epoch_18 num_bytes: 44299618 
num_examples: 18928 - name: epoch_19 num_bytes: 44299663 num_examples: 18928 - name: epoch_20 num_bytes: 44299510 num_examples: 18928 - name: epoch_21 num_bytes: 44301235 num_examples: 18928 - name: epoch_22 num_bytes: 44299957 num_examples: 18928 - name: epoch_23 num_bytes: 44298805 num_examples: 18928 - name: epoch_24 num_bytes: 44299428 num_examples: 18928 - name: epoch_25 num_bytes: 44299962 num_examples: 18928 - name: epoch_26 num_bytes: 44300441 num_examples: 18928 - name: epoch_27 num_bytes: 44300810 num_examples: 18928 - name: epoch_28 num_bytes: 44300854 num_examples: 18928 - name: epoch_29 num_bytes: 44300880 num_examples: 18928 download_size: 692780603 dataset_size: 1328477911 configs: - config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1 data_files: - split: epoch_0 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-* - split: epoch_1 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-* - split: epoch_2 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-* - split: epoch_3 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-* - split: epoch_4 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-* - split: epoch_5 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-* - split: epoch_6 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-* - split: epoch_7 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-* - split: epoch_8 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-* - split: epoch_9 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-* - split: epoch_10 path: 
alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_10-* - split: epoch_11 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_11-* - split: epoch_12 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_12-* - split: epoch_13 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_13-* - split: epoch_14 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_14-* - split: epoch_15 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_15-* - split: epoch_16 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_16-* - split: epoch_17 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_17-* - split: epoch_18 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_18-* - split: epoch_19 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_19-* - split: epoch_20 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_20-* - split: epoch_21 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_21-* - split: epoch_22 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_22-* - split: epoch_23 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_23-* - split: epoch_24 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_24-* - split: epoch_25 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_25-* - split: epoch_26 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_26-* - split: epoch_27 path: 
alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_27-* - split: epoch_28 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_28-* - split: epoch_29 path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_29-* ---
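The header above lists 30 `epoch_*` splits of 18,928 examples each, plus overall `download_size` and `dataset_size`. A quick consistency check on those numbers (all values copied from the YAML header; the per-row figure is an average across every split):

```python
# Values from the dataset_info header above.
num_epochs = 30
examples_per_epoch = 18928
dataset_size = 1328477911   # uncompressed, in bytes
download_size = 692780603   # compressed Parquet, in bytes

total_rows = num_epochs * examples_per_epoch
print(f"total rows across epoch splits: {total_rows}")
print(f"average bytes per row: {dataset_size // total_rows}")
print(f"Parquet compression ratio: {download_size / dataset_size:.2f}")
```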
open-llm-leaderboard/details_OpenBuddy__openbuddy-openllama-3b-v10-bf16
--- pretty_name: Evaluation run of OpenBuddy/openbuddy-openllama-3b-v10-bf16 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [OpenBuddy/openbuddy-openllama-3b-v10-bf16](https://huggingface.co/OpenBuddy/openbuddy-openllama-3b-v10-bf16)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenBuddy__openbuddy-openllama-3b-v10-bf16\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-15T08:49:40.172924](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-openllama-3b-v10-bf16/blob/main/results_2023-10-15T08-49-40.172924.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks.
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.05757130872483222,\n\ \ \"em_stderr\": 0.0023854315115358956,\n \"f1\": 0.10502097315436239,\n\ \ \"f1_stderr\": 0.002651285925411262,\n \"acc\": 0.30327051717566045,\n\ \ \"acc_stderr\": 0.008254166931468953\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.05757130872483222,\n \"em_stderr\": 0.0023854315115358956,\n\ \ \"f1\": 0.10502097315436239,\n \"f1_stderr\": 0.002651285925411262\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.009855951478392721,\n \ \ \"acc_stderr\": 0.0027210765770416608\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.5966850828729282,\n \"acc_stderr\": 0.013787257285896245\n\ \ }\n}\n```" repo_url: https://huggingface.co/OpenBuddy/openbuddy-openllama-3b-v10-bf16 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|arc:challenge|25_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-08-17T14:16:36.275338.parquet' - config_name: harness_drop_3 data_files: - split: 2023_10_15T08_49_40.172924 path: - '**/details_harness|drop|3_2023-10-15T08-49-40.172924.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-15T08-49-40.172924.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_15T08_49_40.172924 path: - '**/details_harness|gsm8k|5_2023-10-15T08-49-40.172924.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-15T08-49-40.172924.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hellaswag|10_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:16:36.275338.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-management|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:16:36.275338.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:16:36.275338.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:16:36.275338.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-management|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:16:36.275338.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-08-17T14:16:36.275338.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:16:36.275338.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-management|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-virology|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:16:36.275338.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_08_17T14_16_36.275338 path: - '**/details_harness|truthfulqa:mc|0_2023-08-17T14:16:36.275338.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-08-17T14:16:36.275338.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_15T08_49_40.172924 path: - '**/details_harness|winogrande|5_2023-10-15T08-49-40.172924.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-15T08-49-40.172924.parquet' - config_name: results data_files: - split: 2023_08_17T14_16_36.275338 path: - results_2023-08-17T14:16:36.275338.parquet - split: 2023_10_15T08_49_40.172924 path: - results_2023-10-15T08-49-40.172924.parquet - split: latest path: - results_2023-10-15T08-49-40.172924.parquet --- # Dataset Card for Evaluation run of OpenBuddy/openbuddy-openllama-3b-v10-bf16 ## Dataset 
Description - **Homepage:** - **Repository:** https://huggingface.co/OpenBuddy/openbuddy-openllama-3b-v10-bf16 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-openllama-3b-v10-bf16](https://huggingface.co/OpenBuddy/openbuddy-openllama-3b-v10-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-openllama-3b-v10-bf16", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-15T08:49:40.172924](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-openllama-3b-v10-bf16/blob/main/results_2023-10-15T08-49-40.172924.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You can find each one in the results and the "latest" split for each eval): ```python { "all": { "em": 0.05757130872483222, "em_stderr": 0.0023854315115358956, "f1": 0.10502097315436239, "f1_stderr": 0.002651285925411262, "acc": 0.30327051717566045, "acc_stderr": 0.008254166931468953 }, "harness|drop|3": { "em": 0.05757130872483222, "em_stderr": 0.0023854315115358956, "f1": 0.10502097315436239, "f1_stderr": 0.002651285925411262 }, "harness|gsm8k|5": { "acc": 0.009855951478392721, "acc_stderr": 0.0027210765770416608 }, "harness|winogrande|5": { "acc": 0.5966850828729282, "acc_stderr": 0.013787257285896245 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
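The dated splits in this card are named after the run timestamp with the "-" and ":" separators replaced by "_" (compare the split `2023_10_15T08_49_40.172924` with the run `2023-10-15T08:49:40.172924`). A minimal helper sketching that naming convention (the function name is ours, not part of the card or the `datasets` API):

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp to its dated split name.

    Split names keep the microsecond "." but replace the date
    separators "-" and the time separators ":" with "_".
    """
    return timestamp.replace("-", "_").replace(":", "_")


# e.g. the two runs of this card:
print(run_timestamp_to_split("2023-10-15T08:49:40.172924"))
# -> 2023_10_15T08_49_40.172924
print(run_timestamp_to_split("2023-08-17T14:16:36.275338"))
# -> 2023_08_17T14_16_36.275338
```

The resulting string can be passed as the `split` argument of `load_dataset`, just like the `"latest"` alias shown above.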
MihaiIonascu/dreadit-train
--- license: apache-2.0 ---
open-llm-leaderboard/details_microsoft__phi-2
--- pretty_name: Evaluation run of microsoft/phi-2 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [microsoft/phi-2](https://huggingface.co/microsoft/phi-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_microsoft__phi-2\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-04-15T16:12:26.100927](https://huggingface.co/datasets/open-llm-leaderboard/details_microsoft__phi-2/blob/main/results_2024-04-15T16-12-26.100927.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks.
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5810413550660194,\n\ \ \"acc_stderr\": 0.033772948029595365,\n \"acc_norm\": 0.5826063358809028,\n\ \ \"acc_norm_stderr\": 0.03446197582267999,\n \"mc1\": 0.30966952264381886,\n\ \ \"mc1_stderr\": 0.016185744355144912,\n \"mc2\": 0.4423687837225679,\n\ \ \"mc2_stderr\": 0.015079580060665993\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5827645051194539,\n \"acc_stderr\": 0.01440982551840308,\n\ \ \"acc_norm\": 0.6100682593856656,\n \"acc_norm_stderr\": 0.014252959848892896\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5617406891057558,\n\ \ \"acc_stderr\": 0.004951594063272055,\n \"acc_norm\": 0.7491535550687114,\n\ \ \"acc_norm_stderr\": 0.004326143430360092\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \ \ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n\ \ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n\ \ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.040089737857792046,\n\ \ \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.040089737857792046\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\ \ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \ \ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n\ \ \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n\ \ \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n\ \ \"acc_norm_stderr\": 
0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \ \ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\ \ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.41,\n\ \ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \ \ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456344,\n \ \ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456344\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n\ \ \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.5895953757225434,\n\ \ \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.048108401480826346,\n\ \ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.048108401480826346\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\ \ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.032650194750335815,\n\ \ \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.032650194750335815\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n\ \ \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.38596491228070173,\n\ \ \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\ \ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.4603174603174603,\n \"acc_stderr\": 0.025670080636909186,\n \"\ acc_norm\": 
0.4603174603174603,\n \"acc_norm_stderr\": 0.025670080636909186\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\ \ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\ \ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \ \ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.6709677419354839,\n \"acc_stderr\": 0.026729499068349958,\n \"\ acc_norm\": 0.6709677419354839,\n \"acc_norm_stderr\": 0.026729499068349958\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"\ acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\ : 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.6484848484848484,\n \"acc_stderr\": 0.0372820699868265,\n\ \ \"acc_norm\": 0.6484848484848484,\n \"acc_norm_stderr\": 0.0372820699868265\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124498,\n \"\ acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124498\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.028408953626245282,\n\ \ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.028408953626245282\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.5820512820512821,\n \"acc_stderr\": 0.02500732988246122,\n \ \ \"acc_norm\": 0.5820512820512821,\n \"acc_norm_stderr\": 0.02500732988246122\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652458,\n \ \ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652458\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.031499305777849054,\n\ \ \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.031499305777849054\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\ acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7926605504587156,\n \"acc_stderr\": 0.017381415563608674,\n \"\ acc_norm\": 0.7926605504587156,\n \"acc_norm_stderr\": 0.017381415563608674\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\ : 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\ \ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6666666666666666,\n\ \ \"acc_stderr\": 0.03308611113236436,\n \"acc_norm\": 0.6666666666666666,\n\ \ \"acc_norm_stderr\": 0.03308611113236436\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\ : {\n \"acc\": 0.7426160337552743,\n \"acc_stderr\": 0.02845882099146029,\n\ \ \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.02845882099146029\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\ \ \"acc_stderr\": 0.03181149747055359,\n \"acc_norm\": 0.6591928251121076,\n\ \ \"acc_norm_stderr\": 0.03181149747055359\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.040103589424622034,\n\ \ \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.040103589424622034\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n 
\"acc\":\ \ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\ acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\ \ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\ \ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n\ \ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\ \ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\ \ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729224,\n\ \ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729224\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8247863247863247,\n\ \ \"acc_stderr\": 0.02490443909891824,\n \"acc_norm\": 0.8247863247863247,\n\ \ \"acc_norm_stderr\": 0.02490443909891824\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \ \ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6922094508301405,\n\ \ \"acc_stderr\": 0.016506045045155637,\n \"acc_norm\": 0.6922094508301405,\n\ \ \"acc_norm_stderr\": 0.016506045045155637\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.02541600377316554,\n\ \ \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.02541600377316554\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2994413407821229,\n\ \ \"acc_stderr\": 0.015318257745976706,\n \"acc_norm\": 0.2994413407821229,\n\ \ \"acc_norm_stderr\": 
0.015318257745976706\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.6209150326797386,\n \"acc_stderr\": 0.02778014120702334,\n\ \ \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.02778014120702334\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6205787781350482,\n\ \ \"acc_stderr\": 0.02755994980234782,\n \"acc_norm\": 0.6205787781350482,\n\ \ \"acc_norm_stderr\": 0.02755994980234782\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6203703703703703,\n \"acc_stderr\": 0.02700252103451646,\n\ \ \"acc_norm\": 0.6203703703703703,\n \"acc_norm_stderr\": 0.02700252103451646\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4397163120567376,\n \"acc_stderr\": 0.02960991207559411,\n \ \ \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.02960991207559411\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4367666232073012,\n\ \ \"acc_stderr\": 0.012667701919603664,\n \"acc_norm\": 0.4367666232073012,\n\ \ \"acc_norm_stderr\": 0.012667701919603664\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.48161764705882354,\n \"acc_stderr\": 0.03035230339535196,\n\ \ \"acc_norm\": 0.48161764705882354,\n \"acc_norm_stderr\": 0.03035230339535196\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.5637254901960784,\n \"acc_stderr\": 0.02006287424353913,\n \ \ \"acc_norm\": 0.5637254901960784,\n \"acc_norm_stderr\": 0.02006287424353913\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\ \ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\ \ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.02879518557429129,\n\ \ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.02879518557429129\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 
0.8009950248756219,\n\ \ \"acc_stderr\": 0.028231365092758406,\n \"acc_norm\": 0.8009950248756219,\n\ \ \"acc_norm_stderr\": 0.028231365092758406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \ \ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\ \ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.46987951807228917,\n\ \ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.03528211258245231,\n\ \ \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.03528211258245231\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30966952264381886,\n\ \ \"mc1_stderr\": 0.016185744355144912,\n \"mc2\": 0.4423687837225679,\n\ \ \"mc2_stderr\": 0.015079580060665993\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7348066298342542,\n \"acc_stderr\": 0.012406549466192858\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5496588324488249,\n \ \ \"acc_stderr\": 0.013704390498582809\n }\n}\n```" repo_url: https://huggingface.co/microsoft/phi-2 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|arc:challenge|25_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|arc:challenge|25_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-04-15T16-12-26.100927.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|gsm8k|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|gsm8k|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: 
- '**/details_harness|gsm8k|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hellaswag|10_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hellaswag|10_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-14T09-31-24.484620.parquet' - 
'**/details_harness|hendrycksTest-formal_logic|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-14T09-31-24.484620.parquet' - 
'**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-management|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-12-14T09-31-24.484620.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - 
'**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T16-12-26.100927.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-management|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T16-12-26.100927.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T16-12-26.100927.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T16-12-26.100927.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-management|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T16-12-26.100927.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-04-15T16-12-26.100927.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T16-12-26.100927.parquet' 
- config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - 
'**/details_harness|hendrycksTest-computer_security|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_12_14T09_31_24.484620 
path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - 
'**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-14T09-31-24.484620.parquet' 
- split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - 
'**/details_harness|hendrycksTest-human_aging|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-international_law|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-international_law|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T16-12-26.100927.parquet' - config_name: 
harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-management|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-management|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-marketing|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-marketing|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - 
'**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - 
'**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-sociology|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-sociology|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T16-12-26.100927.parquet' - config_name: 
harness_hendrycksTest_virology_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-virology|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-virology|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T16-12-26.100927.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|truthfulqa:mc|0_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|truthfulqa:mc|0_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-04-15T16-12-26.100927.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_12_14T09_31_24.484620 path: - '**/details_harness|winogrande|5_2023-12-14T09-31-24.484620.parquet' - split: 2024_04_15T16_12_26.100927 path: - '**/details_harness|winogrande|5_2024-04-15T16-12-26.100927.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-04-15T16-12-26.100927.parquet' - config_name: results data_files: - split: 2023_12_14T09_31_24.484620 path: - results_2023-12-14T09-31-24.484620.parquet - split: 2024_04_15T16_12_26.100927 path: - results_2024-04-15T16-12-26.100927.parquet - split: latest path: - results_2024-04-15T16-12-26.100927.parquet --- # Dataset Card for Evaluation run of microsoft/phi-2 <!-- Provide a quick summary of the dataset. 
-->

Dataset automatically created during the evaluation run of model [microsoft/phi-2](https://huggingface.co/microsoft/phi-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, one for each evaluated task. It has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_microsoft__phi-2",
                    "harness_winogrande_5",
                    split="latest")
```

## Latest results

These are the [latest results from run 2024-04-15T16:12:26.100927](https://huggingface.co/datasets/open-llm-leaderboard/details_microsoft__phi-2/blob/main/results_2024-04-15T16-12-26.100927.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5810413550660194, "acc_stderr": 0.033772948029595365, "acc_norm": 0.5826063358809028, "acc_norm_stderr": 0.03446197582267999, "mc1": 0.30966952264381886, "mc1_stderr": 0.016185744355144912, "mc2": 0.4423687837225679, "mc2_stderr": 0.015079580060665993 }, "harness|arc:challenge|25": { "acc": 0.5827645051194539, "acc_stderr": 0.01440982551840308, "acc_norm": 0.6100682593856656, "acc_norm_stderr": 0.014252959848892896 }, "harness|hellaswag|10": { "acc": 0.5617406891057558, "acc_stderr": 0.004951594063272055, "acc_norm": 0.7491535550687114, "acc_norm_stderr": 0.004326143430360092 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4444444444444444, "acc_stderr": 0.04292596718256981, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.04292596718256981 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5855263157894737, "acc_stderr": 0.040089737857792046, "acc_norm": 0.5855263157894737, "acc_norm_stderr": 0.040089737857792046 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6037735849056604, "acc_stderr": 0.030102793781791197, "acc_norm": 0.6037735849056604, "acc_norm_stderr": 0.030102793781791197 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6666666666666666, "acc_stderr": 0.03942082639927213, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.03942082639927213 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.4, "acc_stderr": 0.04923659639173309, "acc_norm": 0.4, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, 
"acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.38, "acc_stderr": 0.048783173121456344, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456344 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5895953757225434, "acc_stderr": 0.03750757044895537, "acc_norm": 0.5895953757225434, "acc_norm_stderr": 0.03750757044895537 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.37254901960784315, "acc_stderr": 0.048108401480826346, "acc_norm": 0.37254901960784315, "acc_norm_stderr": 0.048108401480826346 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5234042553191489, "acc_stderr": 0.032650194750335815, "acc_norm": 0.5234042553191489, "acc_norm_stderr": 0.032650194750335815 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.38596491228070173, "acc_stderr": 0.04579639422070434, "acc_norm": 0.38596491228070173, "acc_norm_stderr": 0.04579639422070434 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878152, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878152 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4603174603174603, "acc_stderr": 0.025670080636909186, "acc_norm": 0.4603174603174603, "acc_norm_stderr": 0.025670080636909186 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.35714285714285715, "acc_stderr": 0.04285714285714281, "acc_norm": 0.35714285714285715, "acc_norm_stderr": 0.04285714285714281 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6709677419354839, "acc_stderr": 0.026729499068349958, "acc_norm": 0.6709677419354839, "acc_norm_stderr": 0.026729499068349958 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4729064039408867, "acc_stderr": 0.03512819077876106, "acc_norm": 0.4729064039408867, "acc_norm_stderr": 0.03512819077876106 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.64, "acc_stderr": 0.048241815132442176, "acc_norm": 0.64, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6484848484848484, "acc_stderr": 0.0372820699868265, "acc_norm": 0.6484848484848484, "acc_norm_stderr": 0.0372820699868265 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7525252525252525, "acc_stderr": 0.030746300742124498, "acc_norm": 0.7525252525252525, "acc_norm_stderr": 0.030746300742124498 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8082901554404145, "acc_stderr": 0.028408953626245282, "acc_norm": 0.8082901554404145, "acc_norm_stderr": 0.028408953626245282 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5820512820512821, "acc_stderr": 0.02500732988246122, "acc_norm": 0.5820512820512821, "acc_norm_stderr": 0.02500732988246122 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3296296296296296, "acc_stderr": 0.02866120111652458, "acc_norm": 0.3296296296296296, "acc_norm_stderr": 0.02866120111652458 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6218487394957983, "acc_stderr": 0.031499305777849054, "acc_norm": 0.6218487394957983, "acc_norm_stderr": 0.031499305777849054 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3841059602649007, "acc_stderr": 0.03971301814719197, "acc_norm": 0.3841059602649007, "acc_norm_stderr": 0.03971301814719197 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7926605504587156, "acc_stderr": 0.017381415563608674, "acc_norm": 0.7926605504587156, "acc_norm_stderr": 0.017381415563608674 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4722222222222222, "acc_stderr": 
0.0340470532865388, "acc_norm": 0.4722222222222222, "acc_norm_stderr": 0.0340470532865388 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6666666666666666, "acc_stderr": 0.03308611113236436, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.03308611113236436 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7426160337552743, "acc_stderr": 0.02845882099146029, "acc_norm": 0.7426160337552743, "acc_norm_stderr": 0.02845882099146029 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6591928251121076, "acc_stderr": 0.03181149747055359, "acc_norm": 0.6591928251121076, "acc_norm_stderr": 0.03181149747055359 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7022900763358778, "acc_stderr": 0.040103589424622034, "acc_norm": 0.7022900763358778, "acc_norm_stderr": 0.040103589424622034 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7272727272727273, "acc_stderr": 0.04065578140908705, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.04065578140908705 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7314814814814815, "acc_stderr": 0.042844679680521934, "acc_norm": 0.7314814814814815, "acc_norm_stderr": 0.042844679680521934 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7300613496932515, "acc_stderr": 0.034878251684978906, "acc_norm": 0.7300613496932515, "acc_norm_stderr": 0.034878251684978906 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.48214285714285715, "acc_stderr": 0.047427623612430116, "acc_norm": 0.48214285714285715, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.7087378640776699, "acc_stderr": 0.044986763205729224, "acc_norm": 0.7087378640776699, "acc_norm_stderr": 0.044986763205729224 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8247863247863247, "acc_stderr": 0.02490443909891824, "acc_norm": 0.8247863247863247, "acc_norm_stderr": 0.02490443909891824 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.64, "acc_stderr": 
0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6922094508301405, "acc_stderr": 0.016506045045155637, "acc_norm": 0.6922094508301405, "acc_norm_stderr": 0.016506045045155637 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6647398843930635, "acc_stderr": 0.02541600377316554, "acc_norm": 0.6647398843930635, "acc_norm_stderr": 0.02541600377316554 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2994413407821229, "acc_stderr": 0.015318257745976706, "acc_norm": 0.2994413407821229, "acc_norm_stderr": 0.015318257745976706 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6209150326797386, "acc_stderr": 0.02778014120702334, "acc_norm": 0.6209150326797386, "acc_norm_stderr": 0.02778014120702334 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6205787781350482, "acc_stderr": 0.02755994980234782, "acc_norm": 0.6205787781350482, "acc_norm_stderr": 0.02755994980234782 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6203703703703703, "acc_stderr": 0.02700252103451646, "acc_norm": 0.6203703703703703, "acc_norm_stderr": 0.02700252103451646 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4397163120567376, "acc_stderr": 0.02960991207559411, "acc_norm": 0.4397163120567376, "acc_norm_stderr": 0.02960991207559411 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4367666232073012, "acc_stderr": 0.012667701919603664, "acc_norm": 0.4367666232073012, "acc_norm_stderr": 0.012667701919603664 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.48161764705882354, "acc_stderr": 0.03035230339535196, "acc_norm": 0.48161764705882354, "acc_norm_stderr": 0.03035230339535196 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5637254901960784, "acc_stderr": 0.02006287424353913, "acc_norm": 0.5637254901960784, "acc_norm_stderr": 0.02006287424353913 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6272727272727273, "acc_stderr": 
0.04631381319425465, "acc_norm": 0.6272727272727273, "acc_norm_stderr": 0.04631381319425465 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7183673469387755, "acc_stderr": 0.02879518557429129, "acc_norm": 0.7183673469387755, "acc_norm_stderr": 0.02879518557429129 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8009950248756219, "acc_stderr": 0.028231365092758406, "acc_norm": 0.8009950248756219, "acc_norm_stderr": 0.028231365092758406 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.77, "acc_stderr": 0.042295258468165065, "acc_norm": 0.77, "acc_norm_stderr": 0.042295258468165065 }, "harness|hendrycksTest-virology|5": { "acc": 0.46987951807228917, "acc_stderr": 0.03885425420866767, "acc_norm": 0.46987951807228917, "acc_norm_stderr": 0.03885425420866767 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.695906432748538, "acc_stderr": 0.03528211258245231, "acc_norm": 0.695906432748538, "acc_norm_stderr": 0.03528211258245231 }, "harness|truthfulqa:mc|0": { "mc1": 0.30966952264381886, "mc1_stderr": 0.016185744355144912, "mc2": 0.4423687837225679, "mc2_stderr": 0.015079580060665993 }, "harness|winogrande|5": { "acc": 0.7348066298342542, "acc_stderr": 0.012406549466192858 }, "harness|gsm8k|5": { "acc": 0.5496588324488249, "acc_stderr": 0.013704390498582809 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
stanmalkinson199/ViniciusClassico
--- license: openrail ---
eligrayy/OE-LoL-Esports-Dataset
--- license: apache-2.0 ---
thesistranslation/distilled-ccmatrix-fr-en
--- dataset_info: features: - name: id dtype: int32 - name: translation dtype: translation: languages: - fr - en splits: - name: train num_bytes: 7513764655 num_examples: 30000000 download_size: 5154705851 dataset_size: 7513764655 language: - fr - en --- # Dataset Card for "distilled-ccmatrix-fr-en" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
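Each row pairs a French sentence with its English counterpart under the `translation` key. A minimal sketch of splitting rows into aligned source/target lists — the sample rows below are illustrative stand-ins; real rows come from `load_dataset("thesistranslation/distilled-ccmatrix-fr-en", split="train")`:

```python
# Minimal sketch: turn `translation` dicts into aligned fr/en sentence lists.
# The sample rows are illustrative stand-ins for real dataset rows.

def to_parallel_lists(rows):
    """Split translation dicts into aligned source/target sentence lists."""
    src = [row["translation"]["fr"] for row in rows]
    tgt = [row["translation"]["en"] for row in rows]
    return src, tgt

sample_rows = [
    {"id": 0, "translation": {"fr": "Bonjour le monde", "en": "Hello world"}},
    {"id": 1, "translation": {"fr": "Merci beaucoup", "en": "Thank you very much"}},
]

fr, en = to_parallel_lists(sample_rows)
print(fr[0], "->", en[0])  # Bonjour le monde -> Hello world
```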
UnfilteredAI/Prompts-for-txt-image
--- license: apache-2.0 ---
dmrau/cqudupstack-mathematica
--- configs: - config_name: default data_files: - split: queries path: data/queries-* - split: corpus path: data/corpus-* dataset_info: features: - name: _id dtype: string - name: text dtype: string - name: title dtype: string splits: - name: queries num_bytes: 52792 num_examples: 804 - name: corpus num_bytes: 18735825 num_examples: 16705 download_size: 10393860 dataset_size: 18788617 --- # Dataset Card for "cqudupstack-mathematica" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
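The `queries` and `corpus` splits share the `_id`/`text`/`title` schema, so a typical retrieval-evaluation setup indexes the corpus by `_id`. A minimal sketch with illustrative stand-in rows — real rows come from `load_dataset("dmrau/cqudupstack-mathematica", split="corpus")`:

```python
# Minimal sketch: index corpus records by `_id` for lookup during retrieval
# evaluation. The two sample records below are illustrative stand-ins for
# rows of the `corpus` split.

def build_index(corpus_rows):
    """Map each document `_id` to its (title, text) pair."""
    return {row["_id"]: (row["title"], row["text"]) for row in corpus_rows}

sample_corpus = [
    {"_id": "doc1", "title": "Solving ODEs", "text": "Use DSolve for symbolic solutions."},
    {"_id": "doc2", "title": "Plotting", "text": "Plot draws functions over a range."},
]

index = build_index(sample_corpus)
print(index["doc1"][0])  # Solving ODEs
```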
CyberHarem/lappland_arknights
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of lappland/ラップランド/拉普兰德 (Arknights) This is the dataset of lappland/ラップランド/拉普兰德 (Arknights), containing 500 images and their tags. The core tags of this character are `animal_ears, wolf_ears, long_hair, scar_across_eye, hair_ornament, hairclip, grey_hair, hair_between_eyes, scar_on_face, grey_eyes, white_hair, very_long_hair, tail`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 951.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lappland_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 433.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lappland_arknights/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1243 | 945.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lappland_arknights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 781.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lappland_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. 
| | stage3-p480-1200 | 1243 | 1.45 GiB | [Download](https://huggingface.co/datasets/CyberHarem/lappland_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/lappland_arknights',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract the files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; maybe some outfits can be mined here. 
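Building on the waifuc loop above, one way to mine an outfit is to keep only items whose tags contain a chosen combination. A minimal sketch, assuming `item.meta['tags']` behaves as a mapping from tag name to confidence (the sample mapping below is illustrative):

```python
# Minimal sketch: keep items whose tag mapping contains every wanted tag.
# The tags-as-dict shape is an assumption about item.meta['tags'].

def has_tags(tags, wanted):
    """True when every wanted tag appears in the item's tag mapping."""
    return all(t in tags for t in wanted)

sample_tags = {"1girl": 0.99, "black_jacket": 0.95, "scar": 0.90}
print(has_tags(sample_tags, ["black_jacket", "scar"]))  # True
print(has_tags(sample_tags, ["white_dress"]))           # False
```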
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, black_jacket, fingerless_gloves, grey_gloves, high_collar, scar, solo, black_nails, long_sleeves, looking_at_viewer, nail_polish, open_mouth, upper_body, :d, fangs, hand_up, originium_arts_(arknights), sharp_teeth, black_shorts, white_gloves | | 1 | 18 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, black_jacket, scar, solo, looking_at_viewer, upper_body, smile, closed_mouth, simple_background, high_collar, white_background, long_sleeves, grey_background | | 2 | 8 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, black_jacket, holding_sword, looking_at_viewer, scar, solo, white_background, long_sleeves, simple_background, upper_body, open_mouth, :d, midriff, shorts | | 3 | 13 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, black_jacket, black_shorts, long_sleeves, midriff, open_jacket, 
scar, short_shorts, solo, standing, tube_top, bandeau, holding_sword, looking_at_viewer, navel, oripathy_lesion_(arknights), stomach, medium_breasts, thighs, white_gloves, cowboy_shot, fingerless_gloves, wolf_tail, black_nails, grin, nail_polish, originium_arts_(arknights) | | 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, black_jacket, long_sleeves, looking_at_viewer, scar, smile, solo, black_shorts, cowboy_shot, holding_sword, oripathy_lesion_(arknights), black_coat, black_nails, high_collar, nail_polish, short_shorts, wide_sleeves, wolf_tail, gloves, parted_lips | | 5 | 6 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, black_footwear, black_jacket, black_shorts, full_body, long_sleeves, scar, solo, boots, looking_at_viewer, oripathy_lesion_(arknights), short_shorts, smile, wolf_tail, bare_legs, black_coat, white_background, closed_mouth, dual_wielding, holding_sword, simple_background, standing | | 6 | 12 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, black_jacket, holding_sword, long_sleeves, scar, solo, black_shorts, looking_at_viewer, oripathy_lesion_(arknights), grin, navel, short_shorts, fingerless_gloves, black_coat, black_footwear, wolf_tail, boots | | 7 | 6 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, black_jacket, long_sleeves, looking_at_viewer, scar, simple_background, solo, upper_body, white_background, fingerless_gloves, midriff, navel, :d, black_nails, nail_polish, open_mouth, coat, grey_background, grey_gloves, stomach | | 
8 | 8 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, black_coat, black_dress, black_gloves, long_sleeves, looking_at_viewer, official_alternate_costume, open_coat, scar, smile, solo, wolf_girl, black_footwear, blue_eyes, boots, full_body, holding_sword, wolf_tail, standing, blood_on_face, dual_wielding, oripathy_lesion_(arknights), fur_trim | | 9 | 10 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1girl, black_coat, black_dress, holding_sword, long_sleeves, looking_at_viewer, official_alternate_costume, scar, solo, open_coat, black_gloves, fur_trim, grin, sharp_teeth, cowboy_shot, blue_eyes, wolf_girl, blood, oripathy_lesion_(arknights), short_dress | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_jacket | fingerless_gloves | grey_gloves | high_collar | scar | solo | black_nails | long_sleeves | looking_at_viewer | nail_polish | open_mouth | upper_body | :d | fangs | hand_up | originium_arts_(arknights) | sharp_teeth | black_shorts | white_gloves | smile | closed_mouth | simple_background | white_background | grey_background | holding_sword | midriff | shorts | open_jacket | short_shorts | standing | tube_top | bandeau | navel | oripathy_lesion_(arknights) | stomach | medium_breasts | thighs | cowboy_shot | wolf_tail | grin | black_coat | wide_sleeves | gloves | parted_lips | black_footwear | full_body | boots | bare_legs | dual_wielding | coat | black_dress | black_gloves | official_alternate_costume | open_coat | wolf_girl | blue_eyes | blood_on_face | fur_trim | blood | short_dress | 
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:--------------------|:--------------|:--------------|:-------|:-------|:--------------|:---------------|:--------------------|:--------------|:-------------|:-------------|:-----|:--------|:----------|:-----------------------------|:--------------|:---------------|:---------------|:--------|:---------------|:--------------------|:-------------------|:------------------|:----------------|:----------|:---------|:--------------|:---------------|:-----------|:-----------|:----------|:--------|:------------------------------|:----------|:-----------------|:---------|:--------------|:------------|:-------|:-------------|:---------------|:---------|:--------------|:-----------------|:------------|:--------|:------------|:----------------|:-------|:--------------|:---------------|:-----------------------------|:------------|:------------|:------------|:----------------|:-----------|:--------|:--------------| | 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 18 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | | X | X | X | | X | X | | | X | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 8 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | | | | X | 
X | | X | X | | X | X | X | | | | | | | | | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 13 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | X | | | X | X | X | X | X | X | | | | | | X | | X | X | | | | | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | | | X | X | X | X | X | X | X | | | | | | | | X | | X | | | | | X | | | | X | | | | | X | | | | X | X | | X | X | X | X | | | | | | | | | | | | | | | | | | 5 | 6 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | X | | | | X | X | | X | X | | | | | | | | | X | | X | X | X | X | | X | | | | X | X | | | | X | | | | | X | | X | | | | X | X | X | X | X | | | | | | | | | | | | | 6 | 12 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | X | X | | | X | X | | X | X | | | | | | | | | X | | | | | | | X | | | | X | | | | X | X | | | | | X | X | X | | | | X | | X | | | | | | | | | | | | | | | 7 | 6 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | X | X | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | X | X | X | | X | | | | | | | X | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | 8 | 8 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) 
| ![](samples/8/clu8-sample4.png) | X | | | | | X | X | | X | X | | | | | | | | | | | X | | | | | X | | | | | X | | | | X | | | | | X | | X | | | | X | X | X | | X | | X | X | X | X | X | X | X | X | | | | 9 | 10 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | | | | | X | X | | X | X | | | | | | | | X | | | | | | | | X | | | | | | | | | X | | | | X | | X | X | | | | | | | | | | X | X | X | X | X | X | | X | X | X |
CyberHarem/argus_azurlane
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---

# Dataset of argus/アーガス/百眼巨人 (Azur Lane)

This is the dataset of argus/アーガス/百眼巨人 (Azur Lane), containing 25 images and their tags. The core tags of this character are `breasts, long_hair, large_breasts, bangs, blue_eyes, braid, very_long_hair, crown, white_hair, grey_hair, bow, hair_bow`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name             | Images | Size      | Download | Type       | Description |
|:-----------------|-------:|:----------|:---------|:-----------|:------------|
| raw              | 25 | 45.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/argus_azurlane/resolve/main/dataset-raw.zip)             | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800              | 25 | 20.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/argus_azurlane/resolve/main/dataset-800.zip)             | IMG+TXT    | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800  | 69 | 48.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/argus_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200             | 25 | 37.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/argus_azurlane/resolve/main/dataset-1200.zip)            | IMG+TXT    | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 69 | 72.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/argus_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/argus_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; some outfits may be mined here.
### Raw Text Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:------|:------|:------|:------|:------|:-----|
| 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, cleavage, looking_at_viewer, white_dress, white_background, white_gloves, simple_background, clothing_cutout, elbow_gloves, necklace, closed_mouth, fingerless_gloves, bare_shoulders, blush, gem, mini_crown, smile |
| 1 | 8 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, collared_shirt, long_sleeves, looking_at_viewer, solo, white_shirt, black_skirt, cleavage, pleated_skirt, black_pantyhose, partially_unbuttoned, sitting, black_bow, blush, closed_mouth, no_shoes, ribbon, simple_background, white_background, grey_eyes, indoors, open_clothes, thighs |

### Table Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | cleavage | looking_at_viewer | white_dress | white_background | white_gloves | simple_background | clothing_cutout | elbow_gloves | necklace | closed_mouth | fingerless_gloves | bare_shoulders | blush | gem | mini_crown | smile | collared_shirt | long_sleeves | white_shirt | black_skirt | pleated_skirt | black_pantyhose | partially_unbuttoned | sitting | black_bow | no_shoes | ribbon | grey_eyes | indoors | open_clothes | thighs |
|----:|----------:|:------|:------|:------|:------|:------|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|
| 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 1 | 8 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X |  | X |  | X |  |  |  | X |  |  | X |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
Norod78/hearthstone-cards-512
---
dataset_info:
  features:
  - name: text
    dtype: string
  - name: image
    dtype: image
  splits:
  - name: train
    num_bytes: 230518521.36
    num_examples: 2952
  download_size: 230628184
  dataset_size: 230518521.36
pretty_name: 'Blizzard Hearthstone cards, resized to 512x512 with OCR text field'
size_categories:
- n<10K
tags: ["blizzard", "hearthstone", "game cards"]
task_categories:
- text-to-image
---

# Dataset Card for "hearthstone-cards-512"

# Not affiliated in any way with Blizzard or Hearthstone

# Please note that this entire dataset contains copyrighted material
HydraLM/partitioned_v2_standardized_1
---
dataset_info:
  features:
  - name: message
    dtype: string
  - name: message_type
    dtype: string
  - name: message_id
    dtype: int64
  - name: conversation_id
    dtype: int64
  - name: dataset_id
    dtype: string
  splits:
  - name: train
    num_bytes: 27573820.31730978
    num_examples: 57468
  download_size: 22691489
  dataset_size: 27573820.31730978
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---

# Dataset Card for "partitioned_v2_standardized_1"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
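The flat `(message, message_type, message_id, conversation_id, dataset_id)` schema can be regrouped into ordered per-conversation message lists with the standard library. The sketch below uses hypothetical rows; the actual `message_type` values in the data are not documented here, so `"instruction"`/`"response"` are placeholders.

```python
from itertools import groupby
from operator import itemgetter

# hypothetical rows following the schema above (field values are illustrative)
rows = [
    {"message": "Hello!", "message_type": "response", "message_id": 1, "conversation_id": 7, "dataset_id": "demo"},
    {"message": "Hi", "message_type": "instruction", "message_id": 0, "conversation_id": 7, "dataset_id": "demo"},
]

# sort so messages appear in order within each conversation, then group by conversation_id
rows.sort(key=itemgetter("conversation_id", "message_id"))
conversations = {
    cid: [r["message"] for r in grp]
    for cid, grp in groupby(rows, key=itemgetter("conversation_id"))
}
print(conversations)  # {7: ['Hi', 'Hello!']}
```

Sorting before `groupby` matters: `itertools.groupby` only merges adjacent runs, so unsorted rows would split a conversation into multiple groups.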
FreedomIntelligence/alpaca-gpt4-deutsch
---
license: apache-2.0
---

The dataset is used in the research related to [MultilingualSIFT](https://github.com/FreedomIntelligence/MultilingualSIFT).
autoevaluate/autoeval-staging-eval-project-e1d72cd6-7845032
---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- billsum
eval_info:
  task: summarization
  model: d0r1h/LEDBill
  metrics: []
  dataset_name: billsum
  dataset_config: default
  dataset_split: test
  col_mapping:
    text: text
    target: summary
---

# Dataset Card for AutoTrain Evaluator

This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:

* Task: Summarization
* Model: d0r1h/LEDBill
* Dataset: billsum

To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).

## Contributions

Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model.
rpereira90/autotrain-data-guitarsproject
---
task_categories:
- image-classification
---

# AutoTrain Dataset for project: guitarsproject

## Dataset Description

This dataset has been automatically processed by AutoTrain for project guitarsproject.

### Languages

The BCP-47 code for the dataset's language is unk.

## Dataset Structure

### Data Instances

A sample from this dataset looks as follows:

```json
[
  {
    "image": "<1990x2520 RGB PIL image>",
    "target": 1
  },
  {
    "image": "<6000x4000 RGB PIL image>",
    "target": 0
  }
]
```

### Dataset Fields

The dataset has the following fields (also called "features"):

```json
{
  "image": "Image(decode=True, id=None)",
  "target": "ClassLabel(names=['LesPaul', 'Stratocaster'], id=None)"
}
```

### Dataset Splits

This dataset is split into a train and validation split. The split sizes are as follows:

| Split name | Num samples |
| ---------- | ----------- |
| train      | 80          |
| valid      | 21          |
tasneem123/audios1
---
license: unknown
---
lmms-lab/IQ50
---
dataset_info:
  features:
  - name: question_id
    dtype: string
  - name: question
    dtype: string
  - name: answer
    dtype: string
  - name: query_image_0
    dtype: image
  - name: query_image_1
    dtype: image
  - name: query_image_2
    dtype: image
  - name: query_image_3
    dtype: image
  - name: query_image_4
    dtype: image
  - name: query_image_5
    dtype: image
  - name: query_image_6
    dtype: image
  - name: query_image_7
    dtype: image
  - name: answer_image_a
    dtype: image
  - name: answer_image_b
    dtype: image
  - name: answer_image_c
    dtype: image
  - name: answer_image_d
    dtype: image
  - name: answer_image_e
    dtype: image
  - name: answer_image_f
    dtype: image
  splits:
  - name: test
    num_bytes: 12321743.0
    num_examples: 50
  download_size: 10640175
  dataset_size: 12321743.0
configs:
- config_name: default
  data_files:
  - split: test
    path: data/test-*
---

# Dataset Card for "IQ50"

<p align="center" width="100%">
<img src="https://i.postimg.cc/g0QRgMVv/WX20240228-113337-2x.png" width="100%" height="80%">
</p>

# Large-scale Multi-modality Models Evaluation Suite

> Accelerating the development of large-scale multi-modality models (LMMs) with `lmms-eval`

🏠 [Homepage](https://lmms-lab.github.io/) | 📚 [Documentation](docs/README.md) | 🤗 [Huggingface Datasets](https://huggingface.co/lmms-lab)

# This Dataset

This is a formatted version of [IQ50](https://github.com/microsoft/unilm/issues/1265). It is used in our `lmms-eval` pipeline to allow for one-click evaluations of large multi-modality models.

```
@article{huang2023language,
  title={Language is not all you need: Aligning perception with language models},
  author={Huang, Shaohan and Dong, Li and Wang, Wenhui and Hao, Yaru and Singhal, Saksham and Ma, Shuming and Lv, Tengchao and Cui, Lei and Mohammed, Owais Khan and Liu, Qiang and others},
  journal={arXiv preprint arXiv:2302.14045},
  volume={1},
  number={2},
  pages={3},
  year={2023}
}
```

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
juancopi81/orca-math-word-problems-80016_90018
---
dataset_info:
  features:
  - name: question
    dtype: string
  - name: answer
    dtype: string
  splits:
  - name: train
    num_bytes: 10869685
    num_examples: 10002
  download_size: 3934553
  dataset_size: 10869685
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
liuyanchen1015/MULTI_VALUE_mrpc_it_is_referential
---
dataset_info:
  features:
  - name: sentence1
    dtype: string
  - name: sentence2
    dtype: string
  - name: label
    dtype: int64
  - name: idx
    dtype: int64
  - name: value_score
    dtype: int64
  splits:
  - name: test
    num_bytes: 1277
    num_examples: 4
  - name: train
    num_bytes: 866
    num_examples: 4
  download_size: 8508
  dataset_size: 2143
---

# Dataset Card for "MULTI_VALUE_mrpc_it_is_referential"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
open-llm-leaderboard/details_automerger__PasticheAlloyingotneoy-7B
--- pretty_name: Evaluation run of automerger/PasticheAlloyingotneoy-7B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [automerger/PasticheAlloyingotneoy-7B](https://huggingface.co/automerger/PasticheAlloyingotneoy-7B)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_automerger__PasticheAlloyingotneoy-7B\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-04-06T00:16:25.828016](https://huggingface.co/datasets/open-llm-leaderboard/details_automerger__PasticheAlloyingotneoy-7B/blob/main/results_2024-04-06T00-16-25.828016.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6508232795294623,\n\ \ \"acc_stderr\": 0.03210820833288333,\n \"acc_norm\": 0.6499765787776198,\n\ \ \"acc_norm_stderr\": 0.03278385724556176,\n \"mc1\": 0.6364749082007344,\n\ \ \"mc1_stderr\": 0.016838862883965834,\n \"mc2\": 0.7831045197403762,\n\ \ \"mc2_stderr\": 0.013715683104637283\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.7150170648464164,\n \"acc_stderr\": 0.013191348179838793,\n\ \ \"acc_norm\": 0.734641638225256,\n \"acc_norm_stderr\": 0.012902554762313962\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.717486556462856,\n\ \ \"acc_stderr\": 0.004493015945599716,\n \"acc_norm\": 0.8908583947420833,\n\ \ \"acc_norm_stderr\": 0.003111795320787943\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \ \ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\ \ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\ \ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\ \ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\ \ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \ \ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438662,\n\ \ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438662\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\ \ \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n\ \ \"acc_norm_stderr\": 0.0358687928008034\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \ \ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n\ \ \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\ \ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\ \ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.049135952012744975,\n\ \ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.049135952012744975\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\ \ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n\ \ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\ \ \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n\ \ \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\ \ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305527,\n \"\ acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 
0.02535574126305527\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\ \ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\ \ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n\ \ \"acc_stderr\": 0.023664216671642514,\n \"acc_norm\": 0.7774193548387097,\n\ \ \"acc_norm_stderr\": 0.023664216671642514\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\ \ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\ : 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.03374402644139403,\n\ \ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.03374402644139403\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\ acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\ \ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.024035489676335075,\n \ \ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.024035489676335075\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": 
{\n \"\ acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683512,\n \ \ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683512\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \ \ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\ acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461763,\n \"\ acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461763\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"\ acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\ acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290902,\n \ \ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290902\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\ \ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\ \ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n\ \ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7851239669421488,\n 
\"acc_stderr\": 0.037494924487096966,\n \"\ acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\ \ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\ \ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\ \ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\ \ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\ \ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\ \ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\ \ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\ \ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\ \ \"acc_stderr\": 0.013507943909371802,\n \"acc_norm\": 0.8275862068965517,\n\ \ \"acc_norm_stderr\": 0.013507943909371802\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n\ \ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4301675977653631,\n\ \ \"acc_stderr\": 0.016558601636041035,\n \"acc_norm\": 0.4301675977653631,\n\ \ \"acc_norm_stderr\": 0.016558601636041035\n },\n 
\"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\ \ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n\ \ \"acc_stderr\": 0.026311858071854155,\n \"acc_norm\": 0.6881028938906752,\n\ \ \"acc_norm_stderr\": 0.026311858071854155\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035454,\n\ \ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035454\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \ \ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n\ \ \"acc_stderr\": 0.012749206007657474,\n \"acc_norm\": 0.47131681877444587,\n\ \ \"acc_norm_stderr\": 0.012749206007657474\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462923,\n\ \ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462923\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \ \ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\ \ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\ \ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\ \ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\ \ 
\"acc_stderr\": 0.026508590656233264,\n \"acc_norm\": 0.8308457711442786,\n\ \ \"acc_norm_stderr\": 0.026508590656233264\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \ \ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\ \ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\ \ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\ \ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6364749082007344,\n\ \ \"mc1_stderr\": 0.016838862883965834,\n \"mc2\": 0.7831045197403762,\n\ \ \"mc2_stderr\": 0.013715683104637283\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.8524072612470402,\n \"acc_stderr\": 0.009968715765479648\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6921910538286581,\n \ \ \"acc_stderr\": 0.012714401009923645\n }\n}\n```" repo_url: https://huggingface.co/automerger/PasticheAlloyingotneoy-7B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|arc:challenge|25_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-04-06T00-16-25.828016.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|gsm8k|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hellaswag|10_2024-04-06T00-16-25.828016.parquet' - 
split: latest path: - '**/details_harness|hellaswag|10_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T00-16-25.828016.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-management|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-04-06T00-16-25.828016.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T00-16-25.828016.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T00-16-25.828016.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-management|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T00-16-25.828016.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-04-06T00-16-25.828016.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T00-16-25.828016.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-international_law|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-management|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-marketing|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-sociology|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-virology|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T00-16-25.828016.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|truthfulqa:mc|0_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-04-06T00-16-25.828016.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_04_06T00_16_25.828016 path: - '**/details_harness|winogrande|5_2024-04-06T00-16-25.828016.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-04-06T00-16-25.828016.parquet' - config_name: results data_files: - split: 
2024_04_06T00_16_25.828016 path: - results_2024-04-06T00-16-25.828016.parquet - split: latest path: - results_2024-04-06T00-16-25.828016.parquet --- # Dataset Card for Evaluation run of automerger/PasticheAlloyingotneoy-7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [automerger/PasticheAlloyingotneoy-7B](https://huggingface.co/automerger/PasticheAlloyingotneoy-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_automerger__PasticheAlloyingotneoy-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-04-06T00:16:25.828016](https://huggingface.co/datasets/open-llm-leaderboard/details_automerger__PasticheAlloyingotneoy-7B/blob/main/results_2024-04-06T00-16-25.828016.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6508232795294623, "acc_stderr": 0.03210820833288333, "acc_norm": 0.6499765787776198, "acc_norm_stderr": 0.03278385724556176, "mc1": 0.6364749082007344, "mc1_stderr": 0.016838862883965834, "mc2": 0.7831045197403762, "mc2_stderr": 0.013715683104637283 }, "harness|arc:challenge|25": { "acc": 0.7150170648464164, "acc_stderr": 0.013191348179838793, "acc_norm": 0.734641638225256, "acc_norm_stderr": 0.012902554762313962 }, "harness|hellaswag|10": { "acc": 0.717486556462856, "acc_stderr": 0.004493015945599716, "acc_norm": 0.8908583947420833, "acc_norm_stderr": 0.003111795320787943 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6370370370370371, "acc_stderr": 0.04153948404742398, "acc_norm": 0.6370370370370371, "acc_norm_stderr": 0.04153948404742398 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7039473684210527, "acc_stderr": 0.03715062154998904, "acc_norm": 0.7039473684210527, "acc_norm_stderr": 0.03715062154998904 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6981132075471698, "acc_stderr": 0.028254200344438662, "acc_norm": 0.6981132075471698, "acc_norm_stderr": 0.028254200344438662 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7569444444444444, "acc_stderr": 0.0358687928008034, "acc_norm": 0.7569444444444444, "acc_norm_stderr": 0.0358687928008034 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.49, "acc_stderr": 0.05024183937956912, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.56, "acc_stderr": 0.049888765156985884, "acc_norm": 0.56, "acc_norm_stderr": 
0.049888765156985884 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6647398843930635, "acc_stderr": 0.03599586301247077, "acc_norm": 0.6647398843930635, "acc_norm_stderr": 0.03599586301247077 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4215686274509804, "acc_stderr": 0.049135952012744975, "acc_norm": 0.4215686274509804, "acc_norm_stderr": 0.049135952012744975 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5617021276595745, "acc_stderr": 0.03243618636108102, "acc_norm": 0.5617021276595745, "acc_norm_stderr": 0.03243618636108102 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.47368421052631576, "acc_stderr": 0.04697085136647863, "acc_norm": 0.47368421052631576, "acc_norm_stderr": 0.04697085136647863 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5655172413793104, "acc_stderr": 0.04130740879555498, "acc_norm": 0.5655172413793104, "acc_norm_stderr": 0.04130740879555498 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4126984126984127, "acc_stderr": 0.02535574126305527, "acc_norm": 0.4126984126984127, "acc_norm_stderr": 0.02535574126305527 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.48412698412698413, "acc_stderr": 0.04469881854072606, "acc_norm": 0.48412698412698413, "acc_norm_stderr": 0.04469881854072606 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7774193548387097, "acc_stderr": 0.023664216671642514, "acc_norm": 0.7774193548387097, "acc_norm_stderr": 0.023664216671642514 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5123152709359606, "acc_stderr": 0.035169204442208966, "acc_norm": 0.5123152709359606, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7515151515151515, "acc_stderr": 0.03374402644139403, "acc_norm": 0.7515151515151515, "acc_norm_stderr": 0.03374402644139403 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8080808080808081, "acc_stderr": 0.028057791672989017, "acc_norm": 0.8080808080808081, "acc_norm_stderr": 0.028057791672989017 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9067357512953368, "acc_stderr": 0.02098685459328973, "acc_norm": 0.9067357512953368, "acc_norm_stderr": 0.02098685459328973 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.658974358974359, "acc_stderr": 0.024035489676335075, "acc_norm": 0.658974358974359, "acc_norm_stderr": 0.024035489676335075 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3111111111111111, "acc_stderr": 0.028226446749683512, "acc_norm": 0.3111111111111111, "acc_norm_stderr": 0.028226446749683512 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6680672268907563, "acc_stderr": 0.03058869701378364, "acc_norm": 0.6680672268907563, "acc_norm_stderr": 0.03058869701378364 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3708609271523179, "acc_stderr": 0.03943966699183629, "acc_norm": 0.3708609271523179, "acc_norm_stderr": 0.03943966699183629 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8403669724770643, "acc_stderr": 0.015703498348461763, "acc_norm": 0.8403669724770643, "acc_norm_stderr": 0.015703498348461763 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5324074074074074, "acc_stderr": 
0.03402801581358966, "acc_norm": 0.5324074074074074, "acc_norm_stderr": 0.03402801581358966 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8480392156862745, "acc_stderr": 0.025195658428931792, "acc_norm": 0.8480392156862745, "acc_norm_stderr": 0.025195658428931792 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8059071729957806, "acc_stderr": 0.025744902532290902, "acc_norm": 0.8059071729957806, "acc_norm_stderr": 0.025744902532290902 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6860986547085202, "acc_stderr": 0.031146796482972465, "acc_norm": 0.6860986547085202, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7938931297709924, "acc_stderr": 0.03547771004159463, "acc_norm": 0.7938931297709924, "acc_norm_stderr": 0.03547771004159463 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.037494924487096966, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.037494924487096966 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7870370370370371, "acc_stderr": 0.0395783547198098, "acc_norm": 0.7870370370370371, "acc_norm_stderr": 0.0395783547198098 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7668711656441718, "acc_stderr": 0.0332201579577674, "acc_norm": 0.7668711656441718, "acc_norm_stderr": 0.0332201579577674 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04697113923010212, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04697113923010212 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400406964, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400406964 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 
0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8275862068965517, "acc_stderr": 0.013507943909371802, "acc_norm": 0.8275862068965517, "acc_norm_stderr": 0.013507943909371802 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7225433526011561, "acc_stderr": 0.024105712607754307, "acc_norm": 0.7225433526011561, "acc_norm_stderr": 0.024105712607754307 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4301675977653631, "acc_stderr": 0.016558601636041035, "acc_norm": 0.4301675977653631, "acc_norm_stderr": 0.016558601636041035 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7254901960784313, "acc_stderr": 0.025553169991826524, "acc_norm": 0.7254901960784313, "acc_norm_stderr": 0.025553169991826524 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6881028938906752, "acc_stderr": 0.026311858071854155, "acc_norm": 0.6881028938906752, "acc_norm_stderr": 0.026311858071854155 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7407407407407407, "acc_stderr": 0.024383665531035454, "acc_norm": 0.7407407407407407, "acc_norm_stderr": 0.024383665531035454 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4929078014184397, "acc_stderr": 0.02982449855912901, "acc_norm": 0.4929078014184397, "acc_norm_stderr": 0.02982449855912901 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.47131681877444587, "acc_stderr": 0.012749206007657474, "acc_norm": 0.47131681877444587, "acc_norm_stderr": 0.012749206007657474 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6838235294117647, "acc_stderr": 0.028245687391462923, "acc_norm": 0.6838235294117647, "acc_norm_stderr": 0.028245687391462923 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6699346405228758, "acc_stderr": 0.019023726160724553, "acc_norm": 0.6699346405228758, "acc_norm_stderr": 0.019023726160724553 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, 
"acc_stderr": 0.04525393596302506, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302506 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7306122448979592, "acc_stderr": 0.02840125202902294, "acc_norm": 0.7306122448979592, "acc_norm_stderr": 0.02840125202902294 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8308457711442786, "acc_stderr": 0.026508590656233264, "acc_norm": 0.8308457711442786, "acc_norm_stderr": 0.026508590656233264 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.03588702812826371, "acc_norm": 0.85, "acc_norm_stderr": 0.03588702812826371 }, "harness|hendrycksTest-virology|5": { "acc": 0.5481927710843374, "acc_stderr": 0.03874371556587953, "acc_norm": 0.5481927710843374, "acc_norm_stderr": 0.03874371556587953 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.6364749082007344, "mc1_stderr": 0.016838862883965834, "mc2": 0.7831045197403762, "mc2_stderr": 0.013715683104637283 }, "harness|winogrande|5": { "acc": 0.8524072612470402, "acc_stderr": 0.009968715765479648 }, "harness|gsm8k|5": { "acc": 0.6921910538286581, "acc_stderr": 0.012714401009923645 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
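The per-task metrics in the results block above can be combined into a single headline score. A minimal sketch in plain Python, using values copied from that block; note that the choice of which metric to average per benchmark (acc_norm for ARC and HellaSwag, acc for MMLU, Winogrande and GSM8K, mc2 for TruthfulQA) follows the usual leaderboard convention and is an assumption, not something stated in this card:

```python
# Sketch: combining the six benchmark scores above into one headline average.
# The metric picked per task (acc vs. acc_norm vs. mc2) is an assumed
# convention, not taken from this card.
scores = {
    "arc_challenge": 0.734641638225256,   # acc_norm
    "hellaswag": 0.8908583947420833,      # acc_norm
    "mmlu": 0.6508232795294623,           # "all" acc
    "truthfulqa_mc": 0.7831045197403762,  # mc2
    "winogrande": 0.8524072612470402,     # acc
    "gsm8k": 0.6921910538286581,          # acc
}

average = sum(scores.values()) / len(scores)
print(f"headline average: {average:.4f}")  # -> headline average: 0.7673
```

The same arithmetic applies to any run of this evaluation suite; only the values in `scores` change per run.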
DBQ/Chloe.Product.prices.United.States
--- annotations_creators: - other language_creators: - other language: - en license: - unknown multilinguality: - monolingual source_datasets: - original task_categories: - text-classification - image-classification - feature-extraction - image-segmentation - image-to-image - image-to-text - object-detection - summarization - zero-shot-image-classification pretty_name: United States - Chloe - Product-level price list tags: - webscraping - ecommerce - Chloe - fashion - fashion product - image - fashion image configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: Chloe dtype: string - name: '2023-11-08' dtype: string - name: USA dtype: string - name: USD dtype: string - name: CHLOE dtype: string - name: WOMEN dtype: string - name: NEW ARRIVALS dtype: string - name: WINTER 2023 dtype: string - name: 45800720AE dtype: string - name: Hana mini bag dtype: string - name: https://www.chloe.com/us/shoulder-bag_cod45800720ae.html dtype: string - name: https://www.chloe.com/product_image/45800720AE/f/w282.jpg dtype: string - name: '430.00' dtype: float64 - name: 430.00.1 dtype: float64 - name: '402.24' dtype: float64 - name: 402.24.1 dtype: float64 - name: '0' dtype: int64 splits: - name: train num_bytes: 714750 num_examples: 2569 download_size: 162722 dataset_size: 714750 --- # Chloe web scraped data ## About the website The **Ecommerce industry** in the United States has shown rapid growth and substantial advancement amid digitalization and increased online shopping. One vital sector within this industry is the **fashion industry**, where renowned brands like **Chloe** record significant sales. The brand operates through both physical outlets and online platforms. This dataset contains **Ecommerce product-list page (PLP) data for Chloe** in the United States, providing insight into the brand's digital performance, customer preferences, and shopping behavior.
Such data is instrumental to marketers and analysts in understanding market trends, designing strategies, and focusing on customer segments to increase revenue and profits. ## Link to **dataset** [United States - Chloe - Product-level price list dataset](https://www.databoutique.com/buy-data-page/Chloe%20Product-prices%20United%20States/r/recHvXlrm9VQZj0kM)
mask-distilled-one-sec-cv12/chunk_10
--- dataset_info: features: - name: logits sequence: float32 - name: mfcc sequence: sequence: float64 splits: - name: train num_bytes: 1029688964 num_examples: 202217 download_size: 1045671466 dataset_size: 1029688964 --- # Dataset Card for "chunk_10" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Thouph/categorized_tags
--- license: mit viewer: false ---
tangroucorn77/corn
--- license: apache-2.0 ---
Bibek1129/nepali_SQuAD_multiple_qsns
--- license: cc-by-4.0 ---
awettig/Pile-Books3-0.5B-6K-opt
--- dataset_info: features: - name: input_ids sequence: int32 - name: attention_mask sequence: int8 - name: labels sequence: int64 splits: - name: train num_bytes: 6500959920 num_examples: 81380 - name: test num_bytes: 64945692 num_examples: 813 download_size: 1711566471 dataset_size: 6565905612 --- # Dataset Card for "Pile-Books3-0.5B-6K-opt" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)