datasetId: string (lengths 2–117)
card: string (lengths 19–1.01M)
Phanh2532/GAML-151
--- license: other license_name: gama-platform license_link: LICENSE ---
VoidZeroe/FluxTrainNonSys2
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 3981805 num_examples: 14272 download_size: 627141 dataset_size: 3981805 configs: - config_name: default data_files: - split: train path: data/train-* ---
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-9000
--- dataset_info: features: - name: input_ids sequence: sequence: int32 - name: attention_mask sequence: sequence: int8 - name: labels sequence: sequence: int64 splits: - name: train num_bytes: 13336000 num_examples: 1000 download_size: 665060 dataset_size: 13336000 configs: - config_name: default data_files: - split: train path: data/train-* ---
isek-ai/danbooru-tags-2023
--- dataset_info: - config_name: all features: - name: id dtype: int64 - name: copyright dtype: string - name: character dtype: string - name: artist dtype: string - name: general dtype: string - name: meta dtype: string - name: rating dtype: string - name: score dtype: int64 - name: created_at dtype: string splits: - name: train num_bytes: 3265428405 num_examples: 6574149 download_size: 1289260187 dataset_size: 3265428405 - config_name: safe features: - name: id dtype: int64 - name: copyright dtype: string - name: character dtype: string - name: artist dtype: string - name: general dtype: string - name: meta dtype: string - name: rating dtype: string - name: score dtype: int64 - name: created_at dtype: string splits: - name: train num_bytes: 689117431.2710671 num_examples: 1387371 download_size: 276644226 dataset_size: 689117431.2710671 configs: - config_name: all data_files: - split: train path: all/train-* - config_name: safe data_files: - split: train path: safe/train-* license: cc0-1.0 task_categories: - text-classification - text-generation - text2text-generation language: - en size_categories: - 1M<n<10M --- # danbooru-tags-2023 A dataset of danbooru tags. ## Dataset information Generated using [danbooru](https://danbooru.donmai.us/) and [safebooru](https://safebooru.donmai.us/) API. 
The dataset was created with the following conditions:

|Subset name|`all`|`safe`|
|-|-|-|
|API Endpoint|https://danbooru.donmai.us|https://safebooru.donmai.us|
|Date|`2005-01-01..2023-12-31`|`2005-01-01..2023-12-31`|
|Score|`>0`|`>0`|
|Rating|`g,s,q,e`|`g`|
|Filetype|`png,jpg,webp`|`png,jpg,webp`|
|Size (number of rows)|6,574,149|1,387,371|

## Usage

```
pip install datasets
```

```py
from datasets import load_dataset

dataset = load_dataset(
    "isek-ai/danbooru-tags-2023",
    "safe",  # or "all"
    split="train",
)

print(dataset)
print(dataset[0])
# Dataset({
#     features: ['id', 'copyright', 'character', 'artist', 'general', 'meta', 'rating', 'score', 'created_at'],
#     num_rows: 1387371
# })
# {'id': 12219,
#  'copyright': 'fate/stay night, fate/unlimited blade works, fate (series)',
#  'character': 'emiya shirou, gilgamesh (fate), gilgamesh (immoral biker jacket) (fate)',
#  'artist': 'takeuchi takashi',
#  'general': '2boys, alternate costume, alternate hairstyle, battle, blonde hair, brown hair, clenched teeth, duel, dutch angle, field of blades, jacket, long sleeves, male focus, multiple boys, official alternate costume, open clothes, open jacket, open mouth, orange hair, pants, planted, planted sword, raglan sleeves, red eyes, sky, slashing, sword, teeth, unlimited blade works (fate), wasteland, weapon',
#  'meta': 'game cg',
#  'rating': 'g',
#  'score': 33,
#  'created_at': '2005-10-15T08:50:32.000+09:00'}
```
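Each tag field (`copyright`, `character`, `artist`, `general`, `meta`) is stored as a single comma-separated string. A minimal helper to turn one field into a tag list (a sketch, not part of the dataset tooling; the function name is made up):

```python
def split_tags(field: str) -> list[str]:
    """Split a comma-separated tag field from the dataset into a list of tags."""
    return [tag.strip() for tag in field.split(",")] if field else []

# A row shaped like the dataset's examples (abridged):
row = {"general": "2boys, alternate costume, battle, sword", "rating": "g", "score": 33}
print(split_tags(row["general"]))  # ['2boys', 'alternate costume', 'battle', 'sword']
```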
autoevaluate/autoeval-eval-cnn_dailymail-3.0.0-b01992-27495144908
--- type: predictions tags: - autotrain - evaluation datasets: - cnn_dailymail eval_info: task: summarization model: sysresearch101/t5-large-finetuned-xsum-cnn metrics: [] dataset_name: cnn_dailymail dataset_config: 3.0.0 dataset_split: test col_mapping: text: article target: highlights --- # Dataset Card for AutoTrain Evaluator This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset: * Task: Summarization * Model: sysresearch101/t5-large-finetuned-xsum-cnn * Dataset: cnn_dailymail * Config: 3.0.0 * Split: test To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator). ## Contributions Thanks to [@kaprerna135](https://huggingface.co/kaprerna135) for evaluating this model.
Simonk97/duyuyen
--- license: openrail ---
navagg/PyVHR
--- license: unlicense ---
SKT27182/Preprocessed_OpenOrca
--- language: - en license: mit task_categories: - text-classification - conversational dataset_info: features: - name: id dtype: string - name: system_prompt dtype: string - name: question dtype: string - name: response dtype: string - name: length_before_preprocessing dtype: int64 splits: - name: train num_bytes: 3671168412.416216 num_examples: 2872771 - name: test num_bytes: 458896850.2513517 num_examples: 359097 - name: validation num_bytes: 458895572.3324322 num_examples: 359096 download_size: 2553683923 dataset_size: 4588960835.0 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* - split: validation path: data/validation-* --- # Dataset Card for Dataset Name ## Dataset Description - **Homepage:** - **Repository:** - **Paper:** - **Leaderboard:** - **Point of Contact:** ### Dataset Summary This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1). ### Languages The language of the dataset is mostly English. ## Dataset Structure ### Data Fields The fields are: - 'id', a unique numbered identifier which includes one of 'niv', 't0', 'cot', or 'flan' to represent which source FLAN Collection submix the 'question' is sourced from. - 'system_prompt', representing the System Prompt presented to the GPT-3.5 or GPT-4 API for the datapoint - 'question', representing a question entry as provided by the FLAN Collection - 'response', a response to that question received from a query to either GPT-3.5 or GPT-4. ### Data Splits [More Information Needed] ### Source Data #### Initial Data Collection and Normalization The dataset is collected from Hugging Face's Open-Orca/OpenOrca. ## Additional Information ### Dataset Curators This dataset is taken from `Open-Orca/OpenOrca` with its prompts modified.
The combined length of `prompt` + `question` is kept under 512, so the examples can be fed to most models whose maximum input length is 512.

# Citation

```bibtex
@misc{OpenOrca,
  title = {OpenOrca: An Open Dataset of GPT Augmented FLAN Reasoning Traces},
  author = {Wing Lian and Bleys Goodson and Eugene Pentland and Austin Cook and Chanvichet Vong and "Teknium"},
  year = {2023},
  publisher = {HuggingFace},
  journal = {HuggingFace repository},
  howpublished = {\url{https://huggingface.co/Open-Orca/OpenOrca}},
}
```

```bibtex
@misc{mukherjee2023orca,
  title={Orca: Progressive Learning from Complex Explanation Traces of GPT-4},
  author={Subhabrata Mukherjee and Arindam Mitra and Ganesh Jawahar and Sahaj Agarwal and Hamid Palangi and Ahmed Awadallah},
  year={2023},
  eprint={2306.02707},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
```

```bibtex
@misc{longpre2023flan,
  title={The Flan Collection: Designing Data and Methods for Effective Instruction Tuning},
  author={Shayne Longpre and Le Hou and Tu Vu and Albert Webson and Hyung Won Chung and Yi Tay and Denny Zhou and Quoc V. Le and Barret Zoph and Jason Wei and Adam Roberts},
  year={2023},
  eprint={2301.13688},
  archivePrefix={arXiv},
  primaryClass={cs.AI}
}
```

```bibtex
@software{touvron2023llama,
  title={LLaMA: Open and Efficient Foundation Language Models},
  author={Touvron, Hugo and Lavril, Thibaut and Izacard, Gautier and Martinet, Xavier and Lachaux, Marie-Anne and Lacroix, Timoth{\'e}e and Rozi{\`e}re, Baptiste and Goyal, Naman and Hambro, Eric and Azhar, Faisal and Rodriguez, Aurelien and Joulin, Armand and Grave, Edouard and Lample, Guillaume},
  journal={arXiv preprint arXiv:2302.13971},
  year={2023}
}
```
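The length constraint described above can be sketched as a simple filter predicate (a sketch under assumptions: character counts stand in for whatever length measure the actual preprocessing used, and `keep_example` is a made-up name, not taken from the preprocessing script):

```python
def keep_example(system_prompt: str, question: str, max_len: int = 512) -> bool:
    """Keep rows whose combined prompt + question length stays under max_len."""
    return len(system_prompt) + len(question) < max_len

# One row short enough to keep, one too long:
assert keep_example("You are a helpful assistant.", "What is 2 + 2?")
assert not keep_example("p" * 300, "q" * 300)
```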
james-burton/OrientalMuseum_min5-name-text
--- dataset_info: features: - name: obj_num dtype: string - name: file dtype: string - name: image dtype: image - name: root dtype: string - name: description dtype: string - name: label dtype: class_label: names: '0': Album Painting '1': Animal Figurine '2': Animal Mummy '3': Animal bone '4': Axe Head '5': Belt Hook '6': Blouse '7': Bolt '8': Box '9': Brush Pot '10': Cap '11': Case '12': Clay pipe (smoking) '13': Cosmetic and Medical Equipment and Implements '14': Cup And Saucer '15': DVDs '16': Dagger '17': Disc '18': Domestic Equipment and Utensils '19': Earring '20': Finger Ring '21': Funerary Cone '22': Funerary goods '23': Funerary money '24': Hanging '25': Heart Scarab '26': Human Figurine '27': Incense Holder '28': Inkstick '29': Kite '30': Kohl Pot '31': Letter '32': Manuscript Page '33': Mat '34': Mica Painting '35': Miniature Painting '36': Mortar '37': Mummy Label '38': Oracle Bone '39': Ostraka '40': Palette '41': Panel '42': Part '43': Pencase '44': Pendant '45': Pipe '46': Pith Painting '47': Plaque '48': Plate '49': Prayer Wheel '50': Scarab Seal '51': Scarf '52': Screen '53': Seal '54': Slide '55': Stand '56': Table '57': Thangka '58': Tomb Model '59': Water Dropper '60': Water Pot '61': Woodblock Print '62': accessories '63': albums '64': altar components '65': amulets '66': animation cels '67': animation drawings '68': armor '69': arrowheads '70': axes '71': 'axes: woodworking tools' '72': badges '73': bags '74': bandages '75': baskets '76': beads '77': bells '78': belts '79': blades '80': board games '81': books '82': bottles '83': bowls '84': boxes '85': bracelets '86': brick '87': brooches '88': brush washers '89': buckets '90': buckles '91': calligraphy '92': candleholders '93': canopic jars '94': cards '95': carvings '96': chains '97': chessmen '98': chopsticks '99': claypipe '100': cloth '101': clothing '102': coats '103': coins '104': collar '105': compact discs '106': containers '107': coverings '108': covers '109': cups '110': deity 
figurine '111': diagrams '112': dishes '113': dolls '114': drawings '115': dresses '116': drums '117': earrings '118': embroidery '119': ensembles '120': envelopes '121': 'equipment for personal use: grooming, hygiene and health care' '122': ewers '123': fans '124': figures '125': figurines '126': finials '127': flags '128': flasks '129': fragments '130': furniture components '131': gameboards '132': gaming counters '133': glassware '134': gongs '135': hair ornaments '136': hairpins '137': handles '138': harnesses '139': hats '140': headdresses '141': heads '142': incense burners '143': inlays '144': jackets '145': jars '146': jewelry '147': juglets '148': jugs '149': keys '150': kimonos '151': knives '152': lamps '153': lanterns '154': lids '155': maces '156': masks '157': medals '158': miniatures '159': mirrors '160': models '161': mounts '162': nails '163': necklaces '164': needles '165': netsukes '166': ornaments '167': pages '168': paintings '169': paper money '170': papyrus '171': pendants '172': petticoats '173': photographs '174': pictures '175': pins '176': playing cards '177': poker '178': postage stamps '179': postcards '180': posters '181': pots '182': pottery '183': printing blocks '184': prints '185': puppets '186': purses '187': reliefs '188': rings '189': robes '190': rubbings '191': rugs '192': sandals '193': saris '194': sarongs '195': sashes '196': saucers '197': scabbards '198': scaraboids '199': scarabs '200': scepters '201': scrolls '202': seed '203': seppa '204': shadow puppets '205': shawls '206': shell '207': sherds '208': shields '209': shoes '210': situlae '211': sketches '212': skirts '213': snuff bottles '214': socks '215': spatulas '216': spoons '217': statues '218': statuettes '219': stelae '220': straps '221': studs '222': swords '223': tablets '224': tacks '225': tea bowls '226': teapots '227': tiles '228': tools '229': toys '230': trays '231': tubes '232': tweezers '233': underwear '234': unidentified '235': ushabti '236': utensils 
'237': vases '238': vessels '239': weight '240': weights '241': whorls '242': wood blocks - name: other_name dtype: string - name: material dtype: string - name: production.period dtype: string - name: production.place dtype: string splits: - name: train num_bytes: 941952733.7932324 num_examples: 7168 - name: validation num_bytes: 183440438.5883838 num_examples: 1687 - name: test num_bytes: 190577197.6163838 num_examples: 1687 download_size: 1186854394 dataset_size: 1315970369.998 configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* - split: test path: data/test-* ---
player211/whali
--- license: apache-2.0 ---
tonyassi/dollskill-ds-embeddings
--- dataset_info: features: - name: image dtype: image - name: embeddings sequence: float32 splits: - name: train num_bytes: 59107384.0 num_examples: 326 download_size: 57685384 dataset_size: 59107384.0 configs: - config_name: default data_files: - split: train path: data/train-* ---
manfredmichael/quac-lamini-instruction-indo-2.6M
--- dataset_info: features: - name: context dtype: float64 - name: instruction dtype: string - name: response dtype: string - name: instruction_sources dtype: string splits: - name: train num_bytes: 1276671946 num_examples: 2585614 download_size: 713557171 dataset_size: 1276671946 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "quac-lamini-instruction-indo-2.6M" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
erin922/llama2_custom
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 5340 num_examples: 30 download_size: 4642 dataset_size: 5340 configs: - config_name: default data_files: - split: train path: data/train-* ---
pedropauletti/processed_librispeech_pt
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* dataset_info: features: - name: input_ids sequence: int32 - name: labels sequence: sequence: float32 - name: speaker_embeddings sequence: float32 splits: - name: train num_bytes: 1448647649.3426037 num_examples: 4648 - name: test num_bytes: 161134000.58307362 num_examples: 517 download_size: 1435028022 dataset_size: 1609781649.9256773 --- # Dataset Card for "processed_librispeech_pt" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
liuyanchen1015/MULTI_VALUE_stsb_what_comparative
--- dataset_info: features: - name: sentence1 dtype: string - name: sentence2 dtype: string - name: score dtype: float64 - name: idx dtype: int64 - name: value_score dtype: int64 splits: - name: train num_bytes: 533 num_examples: 2 download_size: 0 dataset_size: 533 --- # Dataset Card for "MULTI_VALUE_stsb_what_comparative" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
liyangbing/water
--- license: afl-3.0 --- This is a test.
larryvrh/WikiMatrix-v1-Ja_Zh-filtered
--- license: cc-by-sa-4.0 dataset_info: features: - name: ja dtype: string - name: zh dtype: string splits: - name: train num_bytes: 149036235 num_examples: 690095 download_size: 115870646 dataset_size: 149036235 task_categories: - translation language: - ja - zh size_categories: - 100K<n<1M --- Filtered and modified version of Japanese/Chinese language pair data from [WikiMatrix v1](https://opus.nlpl.eu/WikiMatrix.php). Process steps: 1. Basic regex based filtering / length checking to remove abnormal pairs. 2. Semantic similarity filtering with a threshold value of 0.6, based on [sentence-transformers/LaBSE](https://huggingface.co/sentence-transformers/LaBSE). 3. Convert all Traditional Chinese sentences into Simplified Chinese with [zhconv](https://github.com/gumblex/zhconv). ------ 经过过滤和修改的日语/中文语言对数据,来自[WikiMatrix v1](https://opus.nlpl.eu/WikiMatrix.php)。 处理步骤: 1. 基本的基于正则表达式的过滤/长度检查,以删除异常对。 2. 基于[sentence-transformers/LaBSE](https://huggingface.co/sentence-transformers/LaBSE)的语义相似性过滤,阈值为0.6。 3. 使用[zhconv](https://github.com/gumblex/zhconv)将所有繁体中文句子转换为简体中文。 ------ 以下はフィルタリングされ修正された日本語/中国語のペアデータです。データ元は[WikiMatrix v1](https://opus.nlpl.eu/WikiMatrix.php)です。 処理手順: 1. 正規表現に基づくフィルタリング/長さのチェックを行い、異常なペアを削除します。 2. [sentence-transformers/LaBSE](https://huggingface.co/sentence-transformers/LaBSE)に基づくセマンティック類似性フィルタリングを行い、閾値は0.6です。 3. [zhconv](https://github.com/gumblex/zhconv)を使って、すべての繁体字中国語の文を簡体字中国語に変換します。
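The three processing steps above can be sketched roughly as follows (a sketch: the length bounds and URL check are assumed stand-ins for the actual regex rules, and in the real pipeline the similarity score comes from the LaBSE embeddings of the two sentences rather than being passed in):

```python
import re

def basic_filter(ja: str, zh: str, min_len: int = 4, max_len: int = 200) -> bool:
    """Step 1 (assumed rules): drop pairs that are too short/long or contain URLs."""
    for text in (ja, zh):
        if not (min_len <= len(text) <= max_len):
            return False
        if re.search(r"https?://", text):
            return False
    return True

def keep_pair(ja: str, zh: str, labse_similarity: float) -> bool:
    """Step 2: keep pairs whose LaBSE cosine similarity meets the 0.6 threshold."""
    return basic_filter(ja, zh) and labse_similarity >= 0.6

# Step 3 (Traditional -> Simplified) would then be applied to surviving pairs,
# e.g. with zhconv.convert(zh, "zh-cn").
```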
llangnickel/long-covid-classification-data
--- annotations_creators: - expert-generated language_creators: - expert-generated language: - en license: - cc-by-4.0 multilinguality: - monolingual pretty_name: 'Dataset containing abstracts from PubMed, either related to long COVID or not. ' size_categories: - unknown source_datasets: - original task_categories: - text-classification ---

## Data Description

Long-COVID related articles have been manually collected by information specialists. Please find further information [here](https://doi.org/10.1093/database/baac048).

## Size

| |Training|Development|Test|Total|
|--|--|--|--|--|
|Positive Examples|215|76|70|345|
|Negative Examples|199|62|68|345|
|Total|414|238|138|690|

## Citation

```bibtex
@article{10.1093/database/baac048,
    author = {Langnickel, Lisa and Darms, Johannes and Heldt, Katharina and Ducks, Denise and Fluck, Juliane},
    title = "{Continuous development of the semantic search engine preVIEW: from COVID-19 to long COVID}",
    journal = {Database},
    volume = {2022},
    year = {2022},
    month = {07},
    issn = {1758-0463},
    doi = {10.1093/database/baac048},
    url = {https://doi.org/10.1093/database/baac048},
    note = {baac048},
    eprint = {https://academic.oup.com/database/article-pdf/doi/10.1093/database/baac048/44371817/baac048.pdf},
}
```
rescer/twitter_dataset_1713224203
--- dataset_info: features: - name: id dtype: string - name: tweet_content dtype: string - name: user_name dtype: string - name: user_id dtype: string - name: created_at dtype: string - name: url dtype: string - name: favourite_count dtype: int64 - name: scraped_at dtype: string - name: image_urls dtype: string splits: - name: train num_bytes: 416276 num_examples: 1307 download_size: 233661 dataset_size: 416276 configs: - config_name: default data_files: - split: train path: data/train-* ---
arieg/cluster02_medium_10
--- dataset_info: features: - name: image dtype: image - name: label dtype: class_label: names: '0': '000140' '1': 001259 '2': '004507' '3': 005940 '4': '006443' '5': 007483 '6': 007487 '7': 007872 '8': '011237' '9': 012986 '10': '014541' '11': '014576' '12': '014661' '13': 018037 '14': 018038 '15': '022477' '16': '024367' '17': 025668 '18': 028241 '19': 028266 '20': '030056' '21': '032333' '22': '032337' '23': 032339 '24': '035543' '25': 036999 '26': 039259 '27': 039658 '28': '040657' '29': '042020' '30': '042023' '31': '042025' '32': '042030' '33': '042046' '34': '042372' '35': '043030' '36': 043598 '37': '043761' '38': 043965 '39': 044794 '40': 046839 '41': 047197 '42': 047835 '43': 049394 '44': 049478 '45': '051655' '46': 051659 '47': '052120' '48': '052122' '49': '052123' '50': '052125' '51': '053154' '52': '054153' '53': 055826 '54': 055830 '55': 055831 '56': '057371' '57': '057640' '58': '057665' '59': 057691 '60': 059678 '61': '060170' '62': '061160' '63': '061736' '64': 061820 '65': 061821 '66': 062592 '67': '064364' '68': 064629 '69': '066405' '70': '067366' '71': '067367' '72': '070426' '73': 072149 '74': 072788 '75': 073309 '76': '073467' '77': 075428 '78': 075784 '79': 075862 '80': '076074' '81': 076079 '82': 079593 '83': 080518 '84': 085966 '85': 086140 '86': 091443 '87': 094449 '88': 094628 '89': 095908 '90': 096168 '91': 096696 '92': 097374 '93': 099095 '94': '101111' '95': '101112' '96': '107432' '97': '107567' '98': '108012' '99': '108529' '100': '109445' '101': '109449' '102': '109450' '103': '110263' '104': '111392' '105': '112197' '106': '113018' '107': '113360' '108': '114036' '109': '114041' '110': '116239' '111': '116735' '112': '117170' '113': '119592' '114': '120196' '115': '121273' '116': '122077' '117': '122082' '118': '122201' '119': '122247' '120': '125190' '121': '126017' '122': '126300' '123': '126411' '124': '126718' '125': '128469' '126': '129887' '127': '129972' '128': '130129' '129': '130709' '130': '130711' '131': '131624' 
'132': '131787' '133': '134643' '134': '134934' '135': '135028' '136': '135043' '137': '135336' '138': '137898' '139': '139330' '140': '139804' '141': '140421' '142': '141903' '143': '144171' '144': '144551' '145': '144935' '146': '145749' '147': '145780' '148': '146639' '149': '148303' '150': '148518' '151': '148608' '152': '149623' '153': '149953' splits: - name: train num_bytes: 83267087.48 num_examples: 1540 download_size: 76401188 dataset_size: 83267087.48 configs: - config_name: default data_files: - split: train path: data/train-* ---
yagnikposhiya/CommonVoiceCorpusHindi15
--- license: apache-2.0 language: - hi ---

## CommonVoiceCorpusHindi15

#### Directory structure:

1. **assets** <br> **a.** Download the whole compressed dataset by clicking on the cv-corpus-15.0-2023-09-08-hi.tar.gz file. <br> **b.** splitdata.py, a Python script that splits the "clips" directory of the original dataset. Hugging Face supports 10,000 files per directory, but the original "clips" directory contains almost 14,000 files, so it is split into two directories: "clips0", which contains exactly 10,000 audio files, and "clips1", which contains all the remaining audio files.
2. **data** <br> **a.** "clips0" contains 10,000 audio files <br> **b.** "clips1" contains 4,000+ audio files (the remainder of the 14,000) <br> **c.** All remaining files are metadata.
3. **Credit:** [Common Voice moz://a](https://commonvoice.mozilla.org/hi/datasets)
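The split described above can be sketched as follows (a sketch of what splitdata.py presumably does; the function name and the copy-rather-than-move choice are assumptions, not taken from the actual script):

```python
import os
import shutil

def split_clips(src_dir: str, dst_root: str, limit: int = 10000) -> None:
    """Copy the first `limit` files (in sorted order) into clips0 and the rest
    into clips1, to stay under the 10,000-files-per-directory limit."""
    for i, name in enumerate(sorted(os.listdir(src_dir))):
        dst = os.path.join(dst_root, "clips0" if i < limit else "clips1")
        os.makedirs(dst, exist_ok=True)
        shutil.copy2(os.path.join(src_dir, name), os.path.join(dst, name))
```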
Aoschu/donut_model_data_for_german_invoice_2
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: image dtype: image - name: ground_truth dtype: string splits: - name: train num_bytes: 2062396.0 num_examples: 14 download_size: 1621615 dataset_size: 2062396.0 --- # Dataset Card for "donut_model_data_for_german_invoice_2" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
CyberHarem/millaarc_cranstoun_senkizesshousymphogear
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K ---

# Dataset of Millaarc Cranstoun

This is the dataset of Millaarc Cranstoun, containing 94 images and their tags. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), and the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 94 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 216 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 94 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 94 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 94 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 94 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 94 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 216 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 216 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 216 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
kaleemWaheed/twitter_dataset_1712976102
--- dataset_info: features: - name: id dtype: 'null' - name: tweet_content dtype: 'null' - name: user_name dtype: 'null' - name: user_id dtype: 'null' - name: created_at dtype: 'null' - name: url dtype: 'null' - name: favourite_count dtype: 'null' - name: scraped_at dtype: 'null' - name: image_urls dtype: 'null' splits: - name: train num_bytes: 0 num_examples: 0 download_size: 2160 dataset_size: 0 configs: - config_name: default data_files: - split: train path: data/train-* ---
tyzhu/fwv2_baseline_squad_train_10_eval_10
--- dataset_info: features: - name: inputs dtype: string - name: targets dtype: string - name: text dtype: string splits: - name: train num_bytes: 3782 num_examples: 10 - name: eval_find_word num_bytes: 3480 num_examples: 10 - name: validation num_bytes: 3480 num_examples: 10 download_size: 18600 dataset_size: 10742 --- # Dataset Card for "fwv2_baseline_squad_train_10_eval_10" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Dampish/700M_trainee
--- license: cc-by-nc-4.0 dataset_info: features: - name: input dtype: string - name: instruction dtype: string - name: output dtype: string - name: input_ids sequence: int32 - name: attention_mask sequence: int8 splits: - name: train num_bytes: 1087012793 num_examples: 99800 download_size: 298661211 dataset_size: 1087012793 ---
mrcybertooth/bangla-news-crawl
--- license: cc-by-4.0 ---
bhavyagiri/InLegal-Sbert-Dataset
--- license: mit ---
iamwille/igbo-translation
--- dataset_info: features: - name: English dtype: string - name: Igbo dtype: string splits: - name: train num_bytes: 1983074.9958306309 num_examples: 8094 - name: test num_bytes: 661270.004169369 num_examples: 2699 download_size: 1705526 dataset_size: 2644345.0 annotations_creators: - found - crowdsourced language: - en - ig language_creators: - crowdsourced - found license: - apache-2.0 multilinguality: - translation pretty_name: 'Igbo to English language ' size_categories: - 10K<n<100K source_datasets: - extended|igbo_english_machine_translation tags: [] task_categories: - translation task_ids: [] --- # Dataset Card for "igbo-translation" ## Dataset Summary This dataset contains English/Igbo translation pairs for use in training general-purpose translation models. [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
atmallen/quirky_addition_increment3_bob_hard
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* - split: test path: data/test-* dataset_info: features: - name: alice_label dtype: bool - name: bob_label dtype: bool - name: difficulty dtype: int64 - name: statement dtype: string - name: choices sequence: string - name: character dtype: string - name: label dtype: bool splits: - name: train num_bytes: 1636460.2815 num_examples: 24225 - name: validation num_bytes: 160122.8088 num_examples: 2372 - name: test num_bytes: 164309.7354 num_examples: 2433 download_size: 619179 dataset_size: 1960892.8257 --- # Dataset Card for "quirky_addition_increment3_bob_hard" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
open-llm-leaderboard/details_Yukang__Llama-2-13b-chat-longlora-32k-sft
--- pretty_name: Evaluation run of Yukang/Llama-2-13b-chat-longlora-32k-sft dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Yukang/Llama-2-13b-chat-longlora-32k-sft](https://huggingface.co/Yukang/Llama-2-13b-chat-longlora-32k-sft)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 4 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Yukang__Llama-2-13b-chat-longlora-32k-sft\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-29T02:16:35.328850](https://huggingface.co/datasets/open-llm-leaderboard/details_Yukang__Llama-2-13b-chat-longlora-32k-sft/blob/main/results_2023-10-29T02-16-35.328850.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n    \"all\": {\n        \"em\": 0.17051174496644295,\n\ \        \"em_stderr\": 0.003851429222727117,\n        \"f1\": 0.23656669463087293,\n\ \        \"f1_stderr\": 0.003934121554985558,\n        \"acc\": 0.32044198895027626,\n\ \        \"acc_stderr\": 0.006741557601060113\n    },\n    \"harness|drop|3\": {\n\ \        \"em\": 0.17051174496644295,\n        \"em_stderr\": 0.003851429222727117,\n\ \        \"f1\": 0.23656669463087293,\n        \"f1_stderr\": 0.003934121554985558\n\ \    },\n    \"harness|gsm8k|5\": {\n        \"acc\": 0.0,\n        \"acc_stderr\"\ : 0.0\n    },\n    \"harness|winogrande|5\": {\n        \"acc\": 0.6408839779005525,\n\ \        \"acc_stderr\": 0.013483115202120225\n    }\n}\n```"
repo_url: https://huggingface.co/Yukang/Llama-2-13b-chat-longlora-32k-sft
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|arc:challenge|25_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|arc:challenge|25_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|arc:challenge|25_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_drop_3
  data_files:
  - split: 2023_10_27T06_30_00.713733
    path:
    - '**/details_harness|drop|3_2023-10-27T06-30-00.713733.parquet'
  - split: 2023_10_29T02_16_35.328850
    path:
    - '**/details_harness|drop|3_2023-10-29T02-16-35.328850.parquet'
  - split: latest
    path:
    - '**/details_harness|drop|3_2023-10-29T02-16-35.328850.parquet'
- config_name: harness_gsm8k_5
  data_files:
  - split: 2023_10_27T06_30_00.713733
    path:
    - '**/details_harness|gsm8k|5_2023-10-27T06-30-00.713733.parquet'
  - split: 2023_10_29T02_16_35.328850
    path:
    - '**/details_harness|gsm8k|5_2023-10-29T02-16-35.328850.parquet'
  - split: latest
    path:
    - '**/details_harness|gsm8k|5_2023-10-29T02-16-35.328850.parquet'
- config_name: harness_hellaswag_10
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hellaswag|10_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hellaswag|10_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hellaswag|10_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-management|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-01-52.732036.parquet'
    - '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-management|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-management|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-09-03.932151.parquet'
    - '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_anatomy_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_astronomy_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_college_biology_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_college_physics_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_computer_security_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_econometrics_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_global_facts_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_human_aging_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_international_law_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_management_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-management|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-management|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-management|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_marketing_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_nutrition_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_philosophy_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_prehistory_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_professional_law_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
  data_files:
  - split: 2023_10_03T19_01_52.732036
    path:
    - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-01-52.732036.parquet'
  - split: 2023_10_03T19_09_03.932151
    path:
    - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-09-03.932151.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-09-03.932151.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_10_03T19_01_52.732036 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-01-52.732036.parquet' - split: 2023_10_03T19_09_03.932151 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-09-03.932151.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-09-03.932151.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_10_03T19_01_52.732036 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-01-52.732036.parquet' - split: 2023_10_03T19_09_03.932151 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-09-03.932151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-09-03.932151.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_10_03T19_01_52.732036 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-01-52.732036.parquet' - split: 2023_10_03T19_09_03.932151 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-09-03.932151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-09-03.932151.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_10_03T19_01_52.732036 path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-01-52.732036.parquet' - split: 2023_10_03T19_09_03.932151 path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-09-03.932151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-09-03.932151.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_10_03T19_01_52.732036 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-01-52.732036.parquet' - split: 2023_10_03T19_09_03.932151 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-09-03.932151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-09-03.932151.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_10_03T19_01_52.732036 path: - 
'**/details_harness|hendrycksTest-virology|5_2023-10-03T19-01-52.732036.parquet' - split: 2023_10_03T19_09_03.932151 path: - '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-09-03.932151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-09-03.932151.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_10_03T19_01_52.732036 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-01-52.732036.parquet' - split: 2023_10_03T19_09_03.932151 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-09-03.932151.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-09-03.932151.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_10_03T19_01_52.732036 path: - '**/details_harness|truthfulqa:mc|0_2023-10-03T19-01-52.732036.parquet' - split: 2023_10_03T19_09_03.932151 path: - '**/details_harness|truthfulqa:mc|0_2023-10-03T19-09-03.932151.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-10-03T19-09-03.932151.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_27T06_30_00.713733 path: - '**/details_harness|winogrande|5_2023-10-27T06-30-00.713733.parquet' - split: 2023_10_29T02_16_35.328850 path: - '**/details_harness|winogrande|5_2023-10-29T02-16-35.328850.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-29T02-16-35.328850.parquet' - config_name: results data_files: - split: 2023_10_03T19_01_52.732036 path: - results_2023-10-03T19-01-52.732036.parquet - split: 2023_10_03T19_09_03.932151 path: - results_2023-10-03T19-09-03.932151.parquet - split: 2023_10_27T06_30_00.713733 path: - results_2023-10-27T06-30-00.713733.parquet - split: 2023_10_29T02_16_35.328850 path: - results_2023-10-29T02-16-35.328850.parquet - split: latest path: - results_2023-10-29T02-16-35.328850.parquet --- # Dataset Card for Evaluation run of 
Yukang/Llama-2-13b-chat-longlora-32k-sft ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Yukang/Llama-2-13b-chat-longlora-32k-sft - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [Yukang/Llama-2-13b-chat-longlora-32k-sft](https://huggingface.co/Yukang/Llama-2-13b-chat-longlora-32k-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Yukang__Llama-2-13b-chat-longlora-32k-sft", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-29T02:16:35.328850](https://huggingface.co/datasets/open-llm-leaderboard/details_Yukang__Llama-2-13b-chat-longlora-32k-sft/blob/main/results_2023-10-29T02-16-35.328850.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.17051174496644295, "em_stderr": 0.003851429222727117, "f1": 0.23656669463087293, "f1_stderr": 0.003934121554985558, "acc": 0.32044198895027626, "acc_stderr": 0.006741557601060113 }, "harness|drop|3": { "em": 0.17051174496644295, "em_stderr": 0.003851429222727117, "f1": 0.23656669463087293, "f1_stderr": 0.003934121554985558 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 }, "harness|winogrande|5": { "acc": 0.6408839779005525, "acc_stderr": 0.013483115202120225 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
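The card above notes that each run lives in a split named by its timestamp and that a "latest" split always points at the newest run. As a small illustration (not part of the generated card, and using the split names listed in this card's configs), the newest timestamped split can be resolved with plain lexicographic ordering, since the `YYYY_MM_DDTHH_MM_SS` naming sorts chronologically:

```python
# Split names taken from this card's config listings; the timestamp
# format sorts lexicographically in chronological order, so max()
# yields the same split that "latest" points to.
splits = [
    "2023_10_03T19_01_52.732036",
    "2023_10_03T19_09_03.932151",
]
latest = max(splits)
print(latest)  # 2023_10_03T19_09_03.932151
```

This is only a sketch of the naming convention; in practice the pre-built "latest" split can be loaded directly via `load_dataset(..., split="latest")`.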
open-llm-leaderboard/details_ChaoticNeutrals__Eris_7B
--- pretty_name: Evaluation run of ChaoticNeutrals/Eris_7B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [ChaoticNeutrals/Eris_7B](https://huggingface.co/ChaoticNeutrals/Eris_7B) on the\ \ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ChaoticNeutrals__Eris_7B\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-03-02T03:31:37.103146](https://huggingface.co/datasets/open-llm-leaderboard/details_ChaoticNeutrals__Eris_7B/blob/main/results_2024-03-02T03-31-37.103146.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.65726612107085,\n\ \ \"acc_stderr\": 0.03195271471521825,\n \"acc_norm\": 0.6572829578313998,\n\ \ \"acc_norm_stderr\": 0.03261474365117475,\n \"mc1\": 0.5152998776009792,\n\ \ \"mc1_stderr\": 0.017495304473187902,\n \"mc2\": 0.6695410911012915,\n\ \ \"mc2_stderr\": 0.015029125498737578\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6911262798634812,\n \"acc_stderr\": 0.013501770929344003,\n\ \ \"acc_norm\": 0.7141638225255973,\n \"acc_norm_stderr\": 0.013203196088537376\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7117108145787692,\n\ \ \"acc_stderr\": 0.004520406331084043,\n \"acc_norm\": 0.8799044015136427,\n\ \ \"acc_norm_stderr\": 0.0032440893478294444\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \ \ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n\ \ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n\ \ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\ \ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\ \ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \ \ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724057,\n\ \ \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724057\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\ \ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\ \ \"acc_norm_stderr\": 0.03437079344106135\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \ \ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n\ \ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\ \ \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n\ \ \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\ \ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\ \ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.032232762667117124,\n\ \ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.032232762667117124\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\ \ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\ \ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\ \ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.43915343915343913,\n \"acc_stderr\": 0.025559920550531,\n \"acc_norm\"\ : 0.43915343915343913,\n 
\"acc_norm_stderr\": 0.025559920550531\n },\n\ \ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\ \ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\ \ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n\ \ \"acc_stderr\": 0.023025899617188723,\n \"acc_norm\": 0.7935483870967742,\n\ \ \"acc_norm_stderr\": 0.023025899617188723\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n\ \ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\ : 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\ \ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.8080808080808081,\n \"acc_stderr\": 0.02805779167298902,\n \"\ acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.02805779167298902\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\ \ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6846153846153846,\n \"acc_stderr\": 0.02355964698318994,\n \ \ \"acc_norm\": 0.6846153846153846,\n \"acc_norm_stderr\": 0.02355964698318994\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \ \ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\ \ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\ acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8348623853211009,\n \"acc_stderr\": 0.01591955782997604,\n \"\ acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.01591955782997604\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\ : 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\ \ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8627450980392157,\n\ \ \"acc_stderr\": 0.02415222596280158,\n \"acc_norm\": 0.8627450980392157,\n\ \ \"acc_norm_stderr\": 0.02415222596280158\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\ : {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n\ \ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\ \ \"acc_stderr\": 0.030769352008229136,\n \"acc_norm\": 0.6995515695067265,\n\ \ \"acc_norm_stderr\": 0.030769352008229136\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n\ \ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\ : 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\ \ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\ \ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\ \ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\ \ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\ \ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\ \ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\ \ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\ \ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\ \ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\ \ \"acc_stderr\": 0.013507943909371802,\n \"acc_norm\": 0.8275862068965517,\n\ \ \"acc_norm_stderr\": 0.013507943909371802\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069367,\n\ \ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069367\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4324022346368715,\n\ \ \"acc_stderr\": 0.016568971233548606,\n \"acc_norm\": 
0.4324022346368715,\n\ \ \"acc_norm_stderr\": 0.016568971233548606\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729484,\n\ \ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729484\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\ \ \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n\ \ \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n\ \ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \ \ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47196870925684486,\n\ \ \"acc_stderr\": 0.012750151802922438,\n \"acc_norm\": 0.47196870925684486,\n\ \ \"acc_norm_stderr\": 0.012750151802922438\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170598,\n\ \ \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170598\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \ \ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\ \ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\ \ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\ \ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\ \ \"acc_stderr\": 0.02519692987482706,\n \"acc_norm\": 0.8507462686567164,\n\ \ \"acc_norm_stderr\": 0.02519692987482706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.88,\n \"acc_stderr\": 0.032659863237109066,\n \ \ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.032659863237109066\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\ \ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n\ \ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\ \ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5152998776009792,\n\ \ \"mc1_stderr\": 0.017495304473187902,\n \"mc2\": 0.6695410911012915,\n\ \ \"mc2_stderr\": 0.015029125498737578\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.8421468034727704,\n \"acc_stderr\": 0.010247165248719763\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6626231993934799,\n \ \ \"acc_stderr\": 0.01302366513622208\n }\n}\n```" repo_url: https://huggingface.co/ChaoticNeutrals/Eris_7B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|arc:challenge|25_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-03-02T03-31-37.103146.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|gsm8k|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_03_02T03_31_37.103146 path: - 
'**/details_harness|hellaswag|10_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T03-31-37.103146.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-02T03-31-37.103146.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T03-31-37.103146.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T03-31-37.103146.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T03-31-37.103146.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-02T03-31-37.103146.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T03-31-37.103146.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-management|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-marketing|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-virology|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T03-31-37.103146.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|truthfulqa:mc|0_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-03-02T03-31-37.103146.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_03_02T03_31_37.103146 path: - '**/details_harness|winogrande|5_2024-03-02T03-31-37.103146.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-03-02T03-31-37.103146.parquet' - config_name: results data_files: - split: 
2024_03_02T03_31_37.103146 path: - results_2024-03-02T03-31-37.103146.parquet - split: latest path: - results_2024-03-02T03-31-37.103146.parquet ---

# Dataset Card for Evaluation run of ChaoticNeutrals/Eris_7B

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [ChaoticNeutrals/Eris_7B](https://huggingface.co/ChaoticNeutrals/Eris_7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_ChaoticNeutrals__Eris_7B",
	"harness_winogrande_5",
	split="latest")
```

## Latest results

These are the [latest results from run 2024-03-02T03:31:37.103146](https://huggingface.co/datasets/open-llm-leaderboard/details_ChaoticNeutrals__Eris_7B/blob/main/results_2024-03-02T03-31-37.103146.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.65726612107085, "acc_stderr": 0.03195271471521825, "acc_norm": 0.6572829578313998, "acc_norm_stderr": 0.03261474365117475, "mc1": 0.5152998776009792, "mc1_stderr": 0.017495304473187902, "mc2": 0.6695410911012915, "mc2_stderr": 0.015029125498737578 }, "harness|arc:challenge|25": { "acc": 0.6911262798634812, "acc_stderr": 0.013501770929344003, "acc_norm": 0.7141638225255973, "acc_norm_stderr": 0.013203196088537376 }, "harness|hellaswag|10": { "acc": 0.7117108145787692, "acc_stderr": 0.004520406331084043, "acc_norm": 0.8799044015136427, "acc_norm_stderr": 0.0032440893478294444 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6666666666666666, "acc_stderr": 0.04072314811876837, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.04072314811876837 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7171052631578947, "acc_stderr": 0.03665349695640767, "acc_norm": 0.7171052631578947, "acc_norm_stderr": 0.03665349695640767 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7245283018867924, "acc_stderr": 0.027495663683724057, "acc_norm": 0.7245283018867924, "acc_norm_stderr": 0.027495663683724057 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7847222222222222, "acc_stderr": 0.03437079344106135, "acc_norm": 0.7847222222222222, "acc_norm_stderr": 0.03437079344106135 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.53, "acc_stderr": 0.050161355804659205, "acc_norm": 0.53, 
"acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6473988439306358, "acc_stderr": 0.03643037168958548, "acc_norm": 0.6473988439306358, "acc_norm_stderr": 0.03643037168958548 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4411764705882353, "acc_stderr": 0.049406356306056595, "acc_norm": 0.4411764705882353, "acc_norm_stderr": 0.049406356306056595 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.78, "acc_stderr": 0.04163331998932263, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932263 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5829787234042553, "acc_stderr": 0.032232762667117124, "acc_norm": 0.5829787234042553, "acc_norm_stderr": 0.032232762667117124 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5175438596491229, "acc_stderr": 0.04700708033551038, "acc_norm": 0.5175438596491229, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.593103448275862, "acc_stderr": 0.04093793981266236, "acc_norm": 0.593103448275862, "acc_norm_stderr": 0.04093793981266236 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.43915343915343913, "acc_stderr": 0.025559920550531, "acc_norm": 0.43915343915343913, "acc_norm_stderr": 0.025559920550531 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4523809523809524, "acc_stderr": 0.044518079590553275, "acc_norm": 0.4523809523809524, "acc_norm_stderr": 0.044518079590553275 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7935483870967742, "acc_stderr": 0.023025899617188723, "acc_norm": 0.7935483870967742, "acc_norm_stderr": 0.023025899617188723 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.49261083743842365, "acc_stderr": 0.035176035403610084, "acc_norm": 0.49261083743842365, "acc_norm_stderr": 0.035176035403610084 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.0328766675860349, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.0328766675860349 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8080808080808081, "acc_stderr": 0.02805779167298902, "acc_norm": 0.8080808080808081, "acc_norm_stderr": 0.02805779167298902 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8963730569948186, "acc_stderr": 0.02199531196364424, "acc_norm": 0.8963730569948186, "acc_norm_stderr": 0.02199531196364424 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6846153846153846, "acc_stderr": 0.02355964698318994, "acc_norm": 0.6846153846153846, "acc_norm_stderr": 0.02355964698318994 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34814814814814815, "acc_stderr": 0.029045600290616255, "acc_norm": 0.34814814814814815, "acc_norm_stderr": 0.029045600290616255 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6848739495798319, "acc_stderr": 0.030176808288974337, "acc_norm": 0.6848739495798319, "acc_norm_stderr": 0.030176808288974337 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.36423841059602646, "acc_stderr": 0.03929111781242742, "acc_norm": 0.36423841059602646, "acc_norm_stderr": 0.03929111781242742 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8348623853211009, "acc_stderr": 0.01591955782997604, "acc_norm": 0.8348623853211009, "acc_norm_stderr": 0.01591955782997604 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5277777777777778, "acc_stderr": 
0.0340470532865388, "acc_norm": 0.5277777777777778, "acc_norm_stderr": 0.0340470532865388 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8627450980392157, "acc_stderr": 0.02415222596280158, "acc_norm": 0.8627450980392157, "acc_norm_stderr": 0.02415222596280158 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8059071729957806, "acc_stderr": 0.025744902532290916, "acc_norm": 0.8059071729957806, "acc_norm_stderr": 0.025744902532290916 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6995515695067265, "acc_stderr": 0.030769352008229136, "acc_norm": 0.6995515695067265, "acc_norm_stderr": 0.030769352008229136 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8091603053435115, "acc_stderr": 0.03446513350752599, "acc_norm": 0.8091603053435115, "acc_norm_stderr": 0.03446513350752599 }, "harness|hendrycksTest-international_law|5": { "acc": 0.768595041322314, "acc_stderr": 0.03849856098794088, "acc_norm": 0.768595041322314, "acc_norm_stderr": 0.03849856098794088 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252626, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252626 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7730061349693251, "acc_stderr": 0.03291099578615769, "acc_norm": 0.7730061349693251, "acc_norm_stderr": 0.03291099578615769 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.48214285714285715, "acc_stderr": 0.047427623612430116, "acc_norm": 0.48214285714285715, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.7572815533980582, "acc_stderr": 0.04245022486384495, "acc_norm": 0.7572815533980582, "acc_norm_stderr": 0.04245022486384495 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8846153846153846, "acc_stderr": 0.02093019318517933, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.02093019318517933 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.69, "acc_stderr": 
0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8275862068965517, "acc_stderr": 0.013507943909371802, "acc_norm": 0.8275862068965517, "acc_norm_stderr": 0.013507943909371802 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7398843930635838, "acc_stderr": 0.023618678310069367, "acc_norm": 0.7398843930635838, "acc_norm_stderr": 0.023618678310069367 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4324022346368715, "acc_stderr": 0.016568971233548606, "acc_norm": 0.4324022346368715, "acc_norm_stderr": 0.016568971233548606 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7352941176470589, "acc_stderr": 0.025261691219729484, "acc_norm": 0.7352941176470589, "acc_norm_stderr": 0.025261691219729484 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7138263665594855, "acc_stderr": 0.025670259242188933, "acc_norm": 0.7138263665594855, "acc_norm_stderr": 0.025670259242188933 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7438271604938271, "acc_stderr": 0.0242885336377261, "acc_norm": 0.7438271604938271, "acc_norm_stderr": 0.0242885336377261 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.49645390070921985, "acc_stderr": 0.02982674915328092, "acc_norm": 0.49645390070921985, "acc_norm_stderr": 0.02982674915328092 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.47196870925684486, "acc_stderr": 0.012750151802922438, "acc_norm": 0.47196870925684486, "acc_norm_stderr": 0.012750151802922438 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6948529411764706, "acc_stderr": 0.027971541370170598, "acc_norm": 0.6948529411764706, "acc_norm_stderr": 0.027971541370170598 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6699346405228758, "acc_stderr": 0.019023726160724553, "acc_norm": 0.6699346405228758, "acc_norm_stderr": 0.019023726160724553 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 
0.0449429086625209, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7306122448979592, "acc_stderr": 0.02840125202902294, "acc_norm": 0.7306122448979592, "acc_norm_stderr": 0.02840125202902294 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8507462686567164, "acc_stderr": 0.02519692987482706, "acc_norm": 0.8507462686567164, "acc_norm_stderr": 0.02519692987482706 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.88, "acc_stderr": 0.032659863237109066, "acc_norm": 0.88, "acc_norm_stderr": 0.032659863237109066 }, "harness|hendrycksTest-virology|5": { "acc": 0.536144578313253, "acc_stderr": 0.03882310850890594, "acc_norm": 0.536144578313253, "acc_norm_stderr": 0.03882310850890594 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.5152998776009792, "mc1_stderr": 0.017495304473187902, "mc2": 0.6695410911012915, "mc2_stderr": 0.015029125498737578 }, "harness|winogrande|5": { "acc": 0.8421468034727704, "acc_stderr": 0.010247165248719763 }, "harness|gsm8k|5": { "acc": 0.6626231993934799, "acc_stderr": 0.01302366513622208 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
-->

### Direct Use

<!-- This section describes suitable use cases for the dataset. -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->

[More Information Needed]

## Dataset Structure

<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->

[More Information Needed]

## Dataset Creation

### Curation Rationale

<!-- Motivation for the creation of this dataset. -->

[More Information Needed]

### Source Data

<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->

#### Data Collection and Processing

<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->

[More Information Needed]

#### Who are the source data producers?

<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->

[More Information Needed]

### Annotations [optional]

<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->

#### Annotation process

<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->

[More Information Needed]

#### Who are the annotators?

<!-- This section describes the people or systems who created the annotations. -->

[More Information Needed]

#### Personal and Sensitive Information

<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

## Citation [optional]

<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Dataset Card Authors [optional]

[More Information Needed]

## Dataset Card Contact

[More Information Needed]
liuyanchen1015/MULTI_VALUE_stsb_possessives_for_pre
---
dataset_info:
  features:
  - name: sentence1
    dtype: string
  - name: sentence2
    dtype: string
  - name: score
    dtype: float64
  - name: idx
    dtype: int64
  - name: value_score
    dtype: int64
  splits:
  - name: dev
    num_bytes: 67613
    num_examples: 332
  - name: test
    num_bytes: 44114
    num_examples: 238
  - name: train
    num_bytes: 198563
    num_examples: 1043
  download_size: 207336
  dataset_size: 310290
---
# Dataset Card for "MULTI_VALUE_stsb_possessives_for_pre"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
jamestalentium/cnn_dailymail_250_test
---
dataset_info:
  features:
  - name: input_text
    dtype: string
  - name: output_text
    dtype: string
  - name: id
    dtype: string
  splits:
  - name: test
    num_bytes: 8255778.137510879
    num_examples: 1900
  download_size: 2210923
  dataset_size: 8255778.137510879
configs:
- config_name: default
  data_files:
  - split: test
    path: data/test-*
---
# Dataset Card for "cnn_dailymail_250_test"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
atmallen/quirky_bookrating
---
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: validation
    path: data/validation-*
  - split: test
    path: data/test-*
dataset_info:
  features:
  - name: alice_label
    dtype: bool
  - name: bob_label
    dtype: bool
  - name: difficulty
    dtype: float64
  - name: statement
    dtype: string
  - name: choices
    sequence: string
  - name: character
    dtype: string
  - name: label
    dtype: bool
  splits:
  - name: train
    num_bytes: 768921
    num_examples: 5714
  - name: validation
    num_bytes: 538515
    num_examples: 4000
  - name: test
    num_bytes: 540907
    num_examples: 4000
  download_size: 426661
  dataset_size: 1848343
---
# Dataset Card for "quirky_bookrating"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
K4I/mini-platypus
---
dataset_info:
  features:
  - name: instruction
    dtype: string
  - name: output
    dtype: string
  splits:
  - name: train
    num_bytes: 4186564
    num_examples: 1000
  download_size: 2245921
  dataset_size: 4186564
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
bpalacios/news_to_graph
---
license: mit
---
sayeed99/fashion_segmentation
---
size_categories:
- 10K<n<100K
task_categories:
- image-segmentation
pretty_name: fashion_segmentation
dataset_info:
  features:
  - name: image
    dtype: image
  - name: label
    dtype: image
  splits:
  - name: train
    num_bytes: 26367772877.402
    num_examples: 45193
  download_size: 24601723580
  dataset_size: 26367772877.402
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
giuseppefutia/guanaco-llama2-1k
---
dataset_info:
  features:
  - name: text
    dtype: string
  splits:
  - name: train
    num_bytes: 1654448
    num_examples: 1000
  download_size: 966692
  dataset_size: 1654448
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
msr_sqa
---
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- ms-pl
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- question-answering
task_ids:
- extractive-qa
paperswithcode_id: null
pretty_name: Microsoft Research Sequential Question Answering
dataset_info:
  features:
  - name: id
    dtype: string
  - name: annotator
    dtype: int32
  - name: position
    dtype: int32
  - name: question
    dtype: string
  - name: question_and_history
    sequence: string
  - name: table_file
    dtype: string
  - name: table_header
    sequence: string
  - name: table_data
    sequence:
      sequence: string
  - name: answer_coordinates
    sequence:
    - name: row_index
      dtype: int32
    - name: column_index
      dtype: int32
  - name: answer_text
    sequence: string
  splits:
  - name: train
    num_bytes: 19732499
    num_examples: 12276
  - name: validation
    num_bytes: 3738331
    num_examples: 2265
  - name: test
    num_bytes: 5105873
    num_examples: 3012
  download_size: 4796932
  dataset_size: 28576703
---

# Dataset Card for Microsoft Research Sequential Question Answering

## Table of Contents
- [Dataset Description](#dataset-description)
  - [Dataset Summary](#dataset-summary)
  - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
  - [Languages](#languages)
- [Dataset Structure](#dataset-structure)
  - [Data Instances](#data-instances)
  - [Data Fields](#data-fields)
  - [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
  - [Curation Rationale](#curation-rationale)
  - [Source Data](#source-data)
  - [Annotations](#annotations)
  - [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
  - [Social Impact of Dataset](#social-impact-of-dataset)
  - [Discussion of Biases](#discussion-of-biases)
  - [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
  - [Dataset Curators](#dataset-curators)
  - [Licensing Information](#licensing-information)
  - [Citation Information](#citation-information)
  - [Contributions](#contributions)

## Dataset Description

- **Homepage:** [Microsoft Research Sequential Question Answering (SQA) Dataset](https://msropendata.com/datasets/b25190ed-0f59-47b1-9211-5962858142c2)
- **Repository:**
- **Paper:** [https://www.microsoft.com/en-us/research/wp-content/uploads/2017/05/acl17-dynsp.pdf](https://www.microsoft.com/en-us/research/wp-content/uploads/2017/05/acl17-dynsp.pdf)
- **Leaderboard:**
- **Point of Contact:**
  - Scott Wen-tau Yih (scottyih@microsoft.com)
  - Mohit Iyyer (m.iyyer@gmail.com)
  - Ming-Wei Chang (minchang@microsoft.com)

### Dataset Summary

Recent work in semantic parsing for question answering has focused on long and complicated questions, many of which would seem unnatural if asked in a normal conversation between two humans. In an effort to explore a conversational QA setting, we present a more realistic task: answering sequences of simple but inter-related questions.

We created SQA by asking crowdsourced workers to decompose 2,022 questions from WikiTableQuestions (WTQ)*, which contains highly-compositional questions about tables from Wikipedia. We had three workers decompose each WTQ question, resulting in a dataset of 6,066 sequences that contain 17,553 questions in total. Each question is also associated with answers in the form of cell locations in the tables.

- Panupong Pasupat, Percy Liang. "Compositional Semantic Parsing on Semi-Structured Tables" ACL-2015. [http://www-nlp.stanford.edu/software/sempre/wikitable/](http://www-nlp.stanford.edu/software/sempre/wikitable/)

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

English (`en`).
## Dataset Structure ### Data Instances ``` {'id': 'nt-639', 'annotator': 0, 'position': 0, 'question': 'where are the players from?', 'table_file': 'table_csv/203_149.csv', 'table_header': ['Pick', 'Player', 'Team', 'Position', 'School'], 'table_data': [['1', 'Ben McDonald', 'Baltimore Orioles', 'RHP', 'Louisiana State University'], ['2', 'Tyler Houston', 'Atlanta Braves', 'C', '"Valley HS (Las Vegas', ' NV)"'], ['3', 'Roger Salkeld', 'Seattle Mariners', 'RHP', 'Saugus (CA) HS'], ['4', 'Jeff Jackson', 'Philadelphia Phillies', 'OF', '"Simeon HS (Chicago', ' IL)"'], ['5', 'Donald Harris', 'Texas Rangers', 'OF', 'Texas Tech University'], ['6', 'Paul Coleman', 'Saint Louis Cardinals', 'OF', 'Frankston (TX) HS'], ['7', 'Frank Thomas', 'Chicago White Sox', '1B', 'Auburn University'], ['8', 'Earl Cunningham', 'Chicago Cubs', 'OF', 'Lancaster (SC) HS'], ['9', 'Kyle Abbott', 'California Angels', 'LHP', 'Long Beach State University'], ['10', 'Charles Johnson', 'Montreal Expos', 'C', '"Westwood HS (Fort Pierce', ' FL)"'], ['11', 'Calvin Murray', 'Cleveland Indians', '3B', '"W.T. 
White High School (Dallas', ' TX)"'], ['12', 'Jeff Juden', 'Houston Astros', 'RHP', 'Salem (MA) HS'], ['13', 'Brent Mayne', 'Kansas City Royals', 'C', 'Cal State Fullerton'], ['14', 'Steve Hosey', 'San Francisco Giants', 'OF', 'Fresno State University'], ['15', 'Kiki Jones', 'Los Angeles Dodgers', 'RHP', '"Hillsborough HS (Tampa', ' FL)"'], ['16', 'Greg Blosser', 'Boston Red Sox', 'OF', 'Sarasota (FL) HS'], ['17', 'Cal Eldred', 'Milwaukee Brewers', 'RHP', 'University of Iowa'], ['18', 'Willie Greene', 'Pittsburgh Pirates', 'SS', '"Jones County HS (Gray', ' GA)"'], ['19', 'Eddie Zosky', 'Toronto Blue Jays', 'SS', 'Fresno State University'], ['20', 'Scott Bryant', 'Cincinnati Reds', 'OF', 'University of Texas'], ['21', 'Greg Gohr', 'Detroit Tigers', 'RHP', 'Santa Clara University'], ['22', 'Tom Goodwin', 'Los Angeles Dodgers', 'OF', 'Fresno State University'], ['23', 'Mo Vaughn', 'Boston Red Sox', '1B', 'Seton Hall University'], ['24', 'Alan Zinter', 'New York Mets', 'C', 'University of Arizona'], ['25', 'Chuck Knoblauch', 'Minnesota Twins', '2B', 'Texas A&M University'], ['26', 'Scott Burrell', 'Seattle Mariners', 'RHP', 'Hamden (CT) HS']], 'answer_coordinates': {'row_index': [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25], 'column_index': [4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4]}, 'answer_text': ['Louisiana State University', 'Valley HS (Las Vegas, NV)', 'Saugus (CA) HS', 'Simeon HS (Chicago, IL)', 'Texas Tech University', 'Frankston (TX) HS', 'Auburn University', 'Lancaster (SC) HS', 'Long Beach State University', 'Westwood HS (Fort Pierce, FL)', 'W.T. 
White High School (Dallas, TX)', 'Salem (MA) HS', 'Cal State Fullerton', 'Fresno State University', 'Hillsborough HS (Tampa, FL)', 'Sarasota (FL) HS', 'University of Iowa', 'Jones County HS (Gray, GA)', 'Fresno State University', 'University of Texas', 'Santa Clara University', 'Fresno State University', 'Seton Hall University', 'University of Arizona', 'Texas A&M University', 'Hamden (CT) HS']}
```

### Data Fields

- `id` (`str`): question sequence id (the id is consistent with those in WTQ)
- `annotator` (`int`): `0`, `1`, `2` (the 3 annotators who annotated the question intent)
- `position` (`int`): the position of the question in the sequence
- `question` (`str`): the question given by the annotator
- `table_file` (`str`): the associated table
- `table_header` (`List[str]`): a list of headers in the table
- `table_data` (`List[List[str]]`): 2d array of data in the table
- `answer_coordinates` (`List[Dict]`): the table cell coordinates of the answers (0-based, where 0 is the first row after the table header)
  - `row_index`
  - `column_index`
- `answer_text` (`List[str]`): the content of the answer cells

Note that some text fields may contain Tab or LF characters and thus start with quotes. It is recommended to use a CSV parser like the Python CSV package to process the data.

### Data Splits

|             | train | test |
|-------------|------:|-----:|
| N. examples | 14541 | 3012 |

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[Microsoft Research Data License Agreement](https://msropendata-web-api.azurewebsites.net/licenses/2f933be3-284d-500b-7ea3-2aa2fd0f1bb2/view).

### Citation Information

```
@inproceedings{iyyer-etal-2017-search,
    title = "Search-based Neural Structured Learning for Sequential Question Answering",
    author = "Iyyer, Mohit and Yih, Wen-tau and Chang, Ming-Wei",
    booktitle = "Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
    month = jul,
    year = "2017",
    address = "Vancouver, Canada",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/P17-1167",
    doi = "10.18653/v1/P17-1167",
    pages = "1821--1831",
}
```

### Contributions

Thanks to [@mattbui](https://github.com/mattbui) for adding this dataset.
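The 0-based `answer_coordinates` convention described in the Data Fields section can be sketched with a toy example. The field names match this card; the two table rows below are a shortened copy of the instance shown earlier:

```python
# Resolve answer_coordinates against table_data for one (shortened) SQA
# instance. Row indices are 0-based and count from the first row *after*
# the table header, as described in the Data Fields section.
example = {
    "table_header": ["Pick", "Player", "Team", "Position", "School"],
    "table_data": [
        ["1", "Ben McDonald", "Baltimore Orioles", "RHP", "Louisiana State University"],
        ["2", "Tyler Houston", "Atlanta Braves", "C", "Valley HS (Las Vegas, NV)"],
    ],
    "answer_coordinates": {"row_index": [0, 1], "column_index": [4, 4]},
}

# Pair each row index with its column index and look the cell up.
answers = [
    example["table_data"][row][col]
    for row, col in zip(
        example["answer_coordinates"]["row_index"],
        example["answer_coordinates"]["column_index"],
    )
]
print(answers)
# ['Louisiana State University', 'Valley HS (Las Vegas, NV)']
```

This reproduces the `answer_text` field for the corresponding cells.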
renumics/spotlight-b-mc2-sql-create-context-enrichment
---
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
dataset_info:
  features:
  - name: answer.embedding
    sequence: float32
    length: 2
  - name: question.embedding
    sequence: float32
    length: 2
  - name: context.embedding
    sequence: float32
    length: 2
  splits:
  - name: train
    num_bytes: 1885848
    num_examples: 78577
  download_size: 2616932
  dataset_size: 1885848
---
# Dataset Card for "spotlight-b-mc2-sql-create-context-enrichment"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
zhixuluo/speech_recognition
---
license: apache-2.0
---
EleutherAI/quirky_population_raw
---
dataset_info:
  features:
  - name: id
    dtype: string
  - name: template_args
    struct:
    - name: character
      dtype: string
    - name: city
      dtype: string
  - name: character
    dtype: string
  - name: label
    dtype: bool
  - name: alice_label
    dtype: bool
  - name: bob_label
    dtype: bool
  - name: difficulty
    dtype: float64
  - name: difficulty_quantile
    dtype: float64
  splits:
  - name: train
    num_bytes: 429274
    num_examples: 7493
  - name: validation
    num_bytes: 229485
    num_examples: 4000
  - name: test
    num_bytes: 229325
    num_examples: 4000
  download_size: 526119
  dataset_size: 888084
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: validation
    path: data/validation-*
  - split: test
    path: data/test-*
---
edarchimbaud/earnings-estimate-stocks
--- dataset_info: features: - name: symbol dtype: string - name: date dtype: string - name: current_qtr dtype: string - name: no_of_analysts_current_qtr dtype: int64 - name: next_qtr dtype: string - name: no_of_analysts_next_qtr dtype: int64 - name: current_year dtype: int64 - name: no_of_analysts_current_year dtype: int64 - name: next_year dtype: int64 - name: no_of_analysts_next_year dtype: int64 - name: avg_estimate_current_qtr dtype: float64 - name: avg_estimate_next_qtr dtype: float64 - name: avg_estimate_current_year dtype: float64 - name: avg_estimate_next_year dtype: float64 - name: low_estimate_current_qtr dtype: float64 - name: low_estimate_next_qtr dtype: float64 - name: low_estimate_current_year dtype: float64 - name: low_estimate_next_year dtype: float64 - name: high_estimate_current_qtr dtype: float64 - name: high_estimate_next_qtr dtype: float64 - name: high_estimate_current_year dtype: float64 - name: high_estimate_next_year dtype: float64 - name: year_ago_eps_current_qtr dtype: float64 - name: year_ago_eps_next_qtr dtype: float64 - name: year_ago_eps_current_year dtype: float64 - name: year_ago_eps_next_year dtype: float64 splits: - name: train num_bytes: 4919659 num_examples: 22192 download_size: 630013 dataset_size: 4919659 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "earnings-estimate-sp500" ## Table of Contents - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contributions](#contributions) ## Dataset Description - **Homepage:** [More 
Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Repository:** [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Dataset Summary

The earnings-estimate-sp500 dataset provides earnings estimate data for companies in the S&P 500 index.

### Supported Tasks and Leaderboards

The dataset can be used to analyze earnings estimates for systematic trading or financial analysis tasks. The dataset does not specify any associated leaderboards.

### Languages

[N/A]

## Dataset Structure

### Data Instances

[N/A]

### Data Fields

The dataset contains the following fields:

- symbol (string): A string representing the ticker symbol or abbreviation used to identify the company.
- date (string): The date associated with the earnings estimate data.
- current_qtr (string): The current quarter.
- no_of_analysts_current_qtr (int64): The number of analysts providing estimates for the current quarter.
- next_qtr (string): The next quarter.
- no_of_analysts_next_qtr (int64): The number of analysts providing estimates for the next quarter.
- current_year (int64): The current year.
- no_of_analysts_current_year (int64): The number of analysts providing estimates for the current year.
- next_year (int64): The next year.
- no_of_analysts_next_year (int64): The number of analysts providing estimates for the next year.
- avg_estimate_current_qtr (float64): The average estimate for the current quarter.
- avg_estimate_next_qtr (float64): The average estimate for the next quarter.
- avg_estimate_current_year (float64): The average estimate for the current year.
- avg_estimate_next_year (float64): The average estimate for the next year.
- low_estimate_current_qtr (float64): The low estimate for the current quarter.
- low_estimate_next_qtr (float64): The low estimate for the next quarter.
- low_estimate_current_year (float64): The low estimate for the current year.
- low_estimate_next_year (float64): The low estimate for the next year.
- high_estimate_current_qtr (float64): The high estimate for the current quarter.
- high_estimate_next_qtr (float64): The high estimate for the next quarter.
- high_estimate_current_year (float64): The high estimate for the current year.
- high_estimate_next_year (float64): The high estimate for the next year.
- year_ago_eps_current_qtr (float64): The earnings per share (EPS) for the current quarter a year ago.
- year_ago_eps_next_qtr (float64): The earnings per share (EPS) for the next quarter a year ago.
- year_ago_eps_current_year (float64): The earnings per share (EPS) for the current year a year ago.
- year_ago_eps_next_year (float64): The earnings per share (EPS) for the next year a year ago.

### Data Splits

The dataset consists of a single split, called "train."

## Additional Information

### Dataset Curators

This dataset does not specify any specific curators.

### Licensing Information

The earnings-estimate-sp500 dataset is licensed under the MIT License.

### Citation Information

> https://edarchimbaud.substack.com, earnings-estimate-sp500 dataset, GitHub repository, https://github.com/edarchimbaud

### Contributions

Thanks to [@edarchimbaud](https://github.com/edarchimbaud) for adding this dataset.
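As an illustration of how the estimate and year-ago EPS fields relate, the sketch below computes an implied year-over-year EPS growth rate for the current quarter. The row values are invented; only the field names come from this card:

```python
# Illustrative only: the values below are made up, but the keys match the
# fields documented above. Implied YoY growth compares the consensus
# estimate for the current quarter with EPS from the same quarter a year ago.
row = {
    "symbol": "XYZ",  # hypothetical ticker
    "avg_estimate_current_qtr": 1.50,
    "year_ago_eps_current_qtr": 1.20,
}

yoy_growth = (
    row["avg_estimate_current_qtr"] - row["year_ago_eps_current_qtr"]
) / abs(row["year_ago_eps_current_qtr"])

print(f"{yoy_growth:.1%}")  # 25.0%
```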
nixiesearch/msmarco-10k
---
license: apache-2.0
language:
- en
tags:
- msmarco
- nlp
- search
---
# A 10K docs sample from MS MARCO

This is a sample dataset of 10K random rows from the [MS MARCO](https://microsoft.github.io/msmarco/) dataset. It is used in the Nixiesearch [quickstart guide](https://www.nixiesearch.ai/quickstart/) to save some time compared to indexing the full MS MARCO with 8M documents.

## Schema

This is a JSONL-formatted dataset with only two fields: `id` for the document identifier and `text` for the actual text snippet.

```json
{
  "id": "0",
  "text": "The presence of communication amid scientific minds was equally important to the success of the Manhattan Project as scientific intellect was. The only cloud hanging over the impressive achievement of the atomic researchers and engineers is what their success truly meant; hundreds of thousands of innocent lives obliterated."
}
```

## License

Apache 2.0
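The JSONL schema above can be read line by line with the standard library alone; a minimal sketch (the in-memory string stands in for the downloaded file):

```python
import json

# Each line of the JSONL file is one document with "id" and "text" fields,
# matching the schema documented above.
jsonl = '{"id": "0", "text": "The presence of communication amid scientific minds was equally important to the success of the Manhattan Project as scientific intellect was."}'

docs = [json.loads(line) for line in jsonl.splitlines() if line.strip()]
print(docs[0]["id"], docs[0]["text"][:30])
```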
nadhifikbarw/id_ner_nimas
---
language:
- id
task_categories:
- token-classification
---
Token classification dataset derived from the dataset accompanying Katarina Nimas Kusumawati's undergraduate thesis:

**"Identifikasi Entitas Bernama dalam Domain Medis pada Layanan Konsultasi Kesehatan Berbahasa Menggunakan Algoritme Bidirectional-LSTM-CRF"**

Institut Teknologi Sepuluh Nopember, Surabaya, Indonesia, 2022

I only performed a stratified train-validation-test split of the original dataset.

Compatible with the Hugging Face token-classification example script (tested with v4.17):
https://github.com/huggingface/transformers/tree/v4.17.0/examples/pytorch/token-classification
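The stratified split mentioned above can be sketched roughly as follows. This is a generic illustration of stratification with the standard library, not the exact procedure used for this dataset; the labels and split ratios are invented:

```python
from collections import defaultdict

# Group example indices by a coarse per-sentence label, then take the same
# proportion from every group so each split preserves the label distribution.
labels = ["MED"] * 10 + ["O"] * 10  # toy labels, 20 examples

by_label = defaultdict(list)
for idx, lab in enumerate(labels):
    by_label[lab].append(idx)

train, val, test = [], [], []
for group in by_label.values():
    n = len(group)
    train += group[: int(n * 0.8)]                 # 80% of each class
    val += group[int(n * 0.8): int(n * 0.9)]       # 10% of each class
    test += group[int(n * 0.9):]                   # 10% of each class

print(len(train), len(val), len(test))  # 16 2 2
```

In practice one would also shuffle each group with a fixed seed before slicing.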
Ti-Ma/TiMaGPT2-2015
---
license: other
license_name: paracrawl-license
license_link: LICENSE
---
heliosprime/twitter_dataset_1713016318
---
dataset_info:
  features:
  - name: id
    dtype: string
  - name: tweet_content
    dtype: string
  - name: user_name
    dtype: string
  - name: user_id
    dtype: string
  - name: created_at
    dtype: string
  - name: url
    dtype: string
  - name: favourite_count
    dtype: int64
  - name: scraped_at
    dtype: string
  - name: image_urls
    dtype: string
  splits:
  - name: train
    num_bytes: 12541
    num_examples: 28
  download_size: 10270
  dataset_size: 12541
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
# Dataset Card for "twitter_dataset_1713016318"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
yuiseki/g-uc
---
dataset_info:
  features:
  - name: episode
    dtype: string
  - name: subtitle
    dtype: string
  - name: place
    dtype: string
  - name: person
    dtype: string
  - name: text
    dtype: string
  - name: text_original
    dtype: string
  splits:
  - name: train
    num_bytes: 740179
    num_examples: 3344
  download_size: 286909
  dataset_size: 740179
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
mask-distilled-one-sec-cv12/chunk_37
---
dataset_info:
  features:
  - name: logits
    sequence: float32
  - name: mfcc
    sequence:
      sequence: float64
  splits:
  - name: train
    num_bytes: 951913756
    num_examples: 186943
  download_size: 967957987
  dataset_size: 951913756
---
# Dataset Card for "chunk_37"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
teven/oscar_sanity
---
dataset_info:
  features:
  - name: id
    dtype: int64
  - name: text
    dtype: string
  splits:
  - name: train
    num_bytes: 107061941
    num_examples: 7900
  download_size: 63545141
  dataset_size: 107061941
---
# Dataset Card for "oscar_sanity"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
vrclc/festvox-iiith-ml
---
dataset_info:
  features:
  - name: audio
    dtype: audio
  - name: speech_id
    dtype: string
  - name: speaker_id
    dtype: string
  - name: transcript
    dtype: string
  splits:
  - name: train
    num_bytes: 187936686
    num_examples: 1000
  download_size: 180001519
  dataset_size: 187936686
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
task_categories:
- automatic-speech-recognition
- text-to-speech
language:
- ml
pretty_name: Festvox IIITH Malayalam
size_categories:
- n<1K
---
shivangibithel/SATO
---
license: cc-by-4.0
---
huggingartists/taylor-swift
---
language:
- en
tags:
- huggingartists
- lyrics
---

# Dataset Card for "huggingartists/taylor-swift"

## Table of Contents
- [Dataset Description](#dataset-description)
  - [Dataset Summary](#dataset-summary)
  - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
  - [Languages](#languages)
  - [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
  - [Data Fields](#data-fields)
  - [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
  - [Curation Rationale](#curation-rationale)
  - [Source Data](#source-data)
  - [Annotations](#annotations)
  - [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
  - [Social Impact of Dataset](#social-impact-of-dataset)
  - [Discussion of Biases](#discussion-of-biases)
  - [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
  - [Dataset Curators](#dataset-curators)
  - [Licensing Information](#licensing-information)
  - [Citation Information](#citation-information)
- [About](#about)

## Dataset Description

- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 1.469581 MB

<div class="inline-flex flex-col" style="line-height: 1.5;">
    <div class="flex">
        <div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://images.genius.com/3c1f124fcbbc2857a95e513fb34cc5a8.400x400x1.jpg&#39;)">
        </div>
    </div>
    <a href="https://huggingface.co/huggingartists/taylor-swift">
        <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
    </a>
    <div style="text-align: center; font-size: 16px; font-weight: 800">Taylor Swift</div>
    <a href="https://genius.com/artists/taylor-swift">
        <div style="text-align: center; font-size: 14px;">@taylor-swift</div>
    </a>
</div>

### Dataset Summary

The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists. The model is available [here](https://huggingface.co/huggingartists/taylor-swift).

### Supported Tasks and Leaderboards

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Languages

en

## How to use

How to load this dataset directly with the datasets library:

```python
from datasets import load_dataset

dataset = load_dataset("huggingartists/taylor-swift")
```

## Dataset Structure

An example of 'train' looks as follows.

```
This example was too long and was cropped:

{
    "text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```

### Data Fields

The data fields are the same among all splits.

- `text`: a `string` feature.
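The three-way split shown in the Data Splits section below relies on `np.split`, which cuts an array at cumulative indices. A minimal sketch of that arithmetic on stand-in data (the fake lyrics list is hypothetical; in practice it would be `dataset['train']['text']`), using integer cut points to avoid float-rounding surprises:

```python
import numpy as np

# Stand-in for dataset['train']['text'].
texts = [f"lyric {i}" for i in range(100)]

# 90% train, 7% validation, 3% test, as integer cut points.
n = len(texts)
cut_points = [(n * 90) // 100, (n * 97) // 100]  # [90, 97]

# np.split returns consecutive sub-arrays: [0, 90), [90, 97), [97, 100).
train, validation, test = np.split(np.array(texts), cut_points)

print(len(train), len(validation), len(test))  # 90 7 3
```

The same cut points drop straight into the snippet below; computing them with integer arithmetic guarantees the split sizes are exact.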
### Data Splits

| train | validation | test |
|------:|-----------:|-----:|
|   762 |          - |    - |

The 'train' split can easily be divided into 'train', 'validation' and 'test' with a few lines of code:

```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np

datasets = load_dataset("huggingartists/taylor-swift")

train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03

train, validation, test = np.split(
    datasets['train']['text'],
    [
        int(len(datasets['train']['text']) * train_percentage),
        int(len(datasets['train']['text']) * (train_percentage + validation_percentage)),
    ],
)

datasets = DatasetDict(
    {
        'train': Dataset.from_dict({'text': list(train)}),
        'validation': Dataset.from_dict({'text': list(validation)}),
        'test': Dataset.from_dict({'text': list(test)})
    }
)
```

## Dataset Creation

### Curation Rationale

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

#### Who are the source language producers?

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Annotations

#### Annotation process

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Personal and Sensitive Information

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Discussion of Biases

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Other Known Limitations

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

## Additional Information

### Dataset Curators

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Licensing Information

[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

### Citation Information

```
@InProceedings{huggingartists,
    author={Aleksey Korshuk},
    year=2021
}
```

## About

*Built by Aleksey Korshuk*

[![Follow](https://img.shields.io/github/followers/AlekseyKorshuk?style=social)](https://github.com/AlekseyKorshuk)

[![Follow](https://img.shields.io/twitter/follow/alekseykorshuk?style=social)](https://twitter.com/intent/follow?screen_name=alekseykorshuk)

[![Follow](https://img.shields.io/badge/dynamic/json?color=blue&label=Telegram%20Channel&query=%24.result&url=https%3A%2F%2Fapi.telegram.org%2Fbot1929545866%3AAAFGhV-KKnegEcLiyYJxsc4zV6C-bdPEBtQ%2FgetChatMemberCount%3Fchat_id%3D-1001253621662&style=social&logo=telegram)](https://t.me/joinchat/_CQ04KjcJ-4yZTky)

For more details, visit the project repository.
[![GitHub stars](https://img.shields.io/github/stars/AlekseyKorshuk/huggingartists?style=social)](https://github.com/AlekseyKorshuk/huggingartists)
open-llm-leaderboard/details_jsfs11__WildMBXMarconi-SLERP-7B
--- pretty_name: Evaluation run of jsfs11/WildMBXMarconi-SLERP-7B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [jsfs11/WildMBXMarconi-SLERP-7B](https://huggingface.co/jsfs11/WildMBXMarconi-SLERP-7B)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jsfs11__WildMBXMarconi-SLERP-7B\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-01-24T03:57:13.465418](https://huggingface.co/datasets/open-llm-leaderboard/details_jsfs11__WildMBXMarconi-SLERP-7B/blob/main/results_2024-01-24T03-57-13.465418.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6550573661941113,\n\ \ \"acc_stderr\": 0.03204593323100834,\n \"acc_norm\": 0.6543869571104515,\n\ \ \"acc_norm_stderr\": 0.032715471402145695,\n \"mc1\": 0.5569155446756426,\n\ \ \"mc1_stderr\": 0.017389730346877116,\n \"mc2\": 0.6897987229376631,\n\ \ \"mc2_stderr\": 0.015167779378758222\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6996587030716723,\n \"acc_stderr\": 0.013395909309957002,\n\ \ \"acc_norm\": 0.7329351535836177,\n \"acc_norm_stderr\": 0.012928933196496364\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7191794463254332,\n\ \ \"acc_stderr\": 0.00448481564706465,\n \"acc_norm\": 0.884883489344752,\n\ \ \"acc_norm_stderr\": 0.0031851021916879108\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\ \ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\ \ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\ \ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n\ \ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\ \ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \ \ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700914,\n\ \ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700914\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\ \ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\ \ \"acc_norm_stderr\": 0.03437079344106135\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \ \ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\ \ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\ \ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\ \ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\ \ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\ \ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n\ \ \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\ \ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \ \ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n\ \ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440678,\n \"\ acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440678\n\ \ },\n 
\"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\ \ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\ \ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\ \ \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n\ \ \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\ \ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\ : 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\ \ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"\ acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328974,\n\ \ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328974\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563973,\n\ \ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 
0.34074074074074073,\n \"acc_stderr\": 0.028897748741131154,\n \ \ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131154\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \ \ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\ acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092434,\n \"\ acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092434\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5138888888888888,\n \"acc_stderr\": 0.034086558679777494,\n \"\ acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.034086558679777494\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931796,\n \"\ acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931796\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467618,\n \ \ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467618\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\ \ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\ \ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.036412970813137296,\n\ \ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.036412970813137296\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7851239669421488,\n \"acc_stderr\": 
0.037494924487096966,\n \"\ acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\ \ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\ \ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\ \ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\ \ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \ \ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\ \ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\ \ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\ \ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\ \ \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n\ \ \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n\ \ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4312849162011173,\n\ \ \"acc_stderr\": 0.01656382939904771,\n \"acc_norm\": 0.4312849162011173,\n\ \ \"acc_norm_stderr\": 0.01656382939904771\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 
0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n\ \ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\ \ \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n\ \ \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \ \ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\ : 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"\ acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n\ \ \"acc_stderr\": 0.01274197433389723,\n \"acc_norm\": 0.4667535853976532,\n\ \ \"acc_norm_stderr\": 0.01274197433389723\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n\ \ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \ \ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\ \ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\ \ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\ \ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\ \ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\ \ 
\"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \ \ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\ \ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\ \ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\ \ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\ \ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5569155446756426,\n\ \ \"mc1_stderr\": 0.017389730346877116,\n \"mc2\": 0.6897987229376631,\n\ \ \"mc2_stderr\": 0.015167779378758222\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.8397790055248618,\n \"acc_stderr\": 0.010309209498187479\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7088703563305534,\n \ \ \"acc_stderr\": 0.012513215297888463\n }\n}\n```" repo_url: https://huggingface.co/jsfs11/WildMBXMarconi-SLERP-7B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|arc:challenge|25_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-01-24T03-57-13.465418.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|gsm8k|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hellaswag|10_2024-01-24T03-57-13.465418.parquet' - split: latest path: - 
'**/details_harness|hellaswag|10_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T03-57-13.465418.parquet' - 
'**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T03-57-13.465418.parquet' - 
'**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T03-57-13.465418.parquet' - 
'**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T03-57-13.465418.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T03-57-13.465418.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-24T03-57-13.465418.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T03-57-13.465418.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-management|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-marketing|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-virology|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T03-57-13.465418.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|truthfulqa:mc|0_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-01-24T03-57-13.465418.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_01_24T03_57_13.465418 path: - '**/details_harness|winogrande|5_2024-01-24T03-57-13.465418.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-01-24T03-57-13.465418.parquet' - config_name: results data_files: - split: 
2024_01_24T03_57_13.465418 path: - results_2024-01-24T03-57-13.465418.parquet - split: latest path: - results_2024-01-24T03-57-13.465418.parquet --- # Dataset Card for Evaluation run of jsfs11/WildMBXMarconi-SLERP-7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [jsfs11/WildMBXMarconi-SLERP-7B](https://huggingface.co/jsfs11/WildMBXMarconi-SLERP-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_jsfs11__WildMBXMarconi-SLERP-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-24T03:57:13.465418](https://huggingface.co/datasets/open-llm-leaderboard/details_jsfs11__WildMBXMarconi-SLERP-7B/blob/main/results_2024-01-24T03-57-13.465418.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6550573661941113, "acc_stderr": 0.03204593323100834, "acc_norm": 0.6543869571104515, "acc_norm_stderr": 0.032715471402145695, "mc1": 0.5569155446756426, "mc1_stderr": 0.017389730346877116, "mc2": 0.6897987229376631, "mc2_stderr": 0.015167779378758222 }, "harness|arc:challenge|25": { "acc": 0.6996587030716723, "acc_stderr": 0.013395909309957002, "acc_norm": 0.7329351535836177, "acc_norm_stderr": 0.012928933196496364 }, "harness|hellaswag|10": { "acc": 0.7191794463254332, "acc_stderr": 0.00448481564706465, "acc_norm": 0.884883489344752, "acc_norm_stderr": 0.0031851021916879108 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6518518518518519, "acc_stderr": 0.041153246103369526, "acc_norm": 0.6518518518518519, "acc_norm_stderr": 0.041153246103369526 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.037385206761196686, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.037385206761196686 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7169811320754716, "acc_stderr": 0.027724236492700914, "acc_norm": 0.7169811320754716, "acc_norm_stderr": 0.027724236492700914 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7847222222222222, "acc_stderr": 0.03437079344106135, "acc_norm": 0.7847222222222222, "acc_norm_stderr": 0.03437079344106135 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620333, "acc_norm": 0.54, "acc_norm_stderr": 
0.05009082659620333 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6820809248554913, "acc_stderr": 0.0355068398916558, "acc_norm": 0.6820809248554913, "acc_norm_stderr": 0.0355068398916558 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4215686274509804, "acc_stderr": 0.04913595201274498, "acc_norm": 0.4215686274509804, "acc_norm_stderr": 0.04913595201274498 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5872340425531914, "acc_stderr": 0.03218471141400351, "acc_norm": 0.5872340425531914, "acc_norm_stderr": 0.03218471141400351 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5, "acc_stderr": 0.047036043419179864, "acc_norm": 0.5, "acc_norm_stderr": 0.047036043419179864 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5724137931034483, "acc_stderr": 0.04122737111370333, "acc_norm": 0.5724137931034483, "acc_norm_stderr": 0.04122737111370333 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42328042328042326, "acc_stderr": 0.02544636563440678, "acc_norm": 0.42328042328042326, "acc_norm_stderr": 0.02544636563440678 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.48412698412698413, "acc_stderr": 0.04469881854072606, "acc_norm": 0.48412698412698413, "acc_norm_stderr": 0.04469881854072606 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7838709677419354, "acc_stderr": 0.02341529343356853, "acc_norm": 0.7838709677419354, "acc_norm_stderr": 0.02341529343356853 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 
0.5024630541871922, "acc_stderr": 0.035179450386910616, "acc_norm": 0.5024630541871922, "acc_norm_stderr": 0.035179450386910616 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7818181818181819, "acc_stderr": 0.03225078108306289, "acc_norm": 0.7818181818181819, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7878787878787878, "acc_stderr": 0.029126522834586818, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.029126522834586818 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9067357512953368, "acc_stderr": 0.02098685459328974, "acc_norm": 0.9067357512953368, "acc_norm_stderr": 0.02098685459328974 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6641025641025641, "acc_stderr": 0.023946724741563973, "acc_norm": 0.6641025641025641, "acc_norm_stderr": 0.023946724741563973 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34074074074074073, "acc_stderr": 0.028897748741131154, "acc_norm": 0.34074074074074073, "acc_norm_stderr": 0.028897748741131154 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.680672268907563, "acc_stderr": 0.0302839955258844, "acc_norm": 0.680672268907563, "acc_norm_stderr": 0.0302839955258844 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, "acc_stderr": 0.038615575462551684, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.038615575462551684 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8422018348623853, "acc_stderr": 0.015630022970092434, "acc_norm": 0.8422018348623853, "acc_norm_stderr": 0.015630022970092434 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5138888888888888, "acc_stderr": 0.034086558679777494, "acc_norm": 0.5138888888888888, 
"acc_norm_stderr": 0.034086558679777494 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8480392156862745, "acc_stderr": 0.025195658428931796, "acc_norm": 0.8480392156862745, "acc_norm_stderr": 0.025195658428931796 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7848101265822784, "acc_stderr": 0.02675082699467618, "acc_norm": 0.7848101265822784, "acc_norm_stderr": 0.02675082699467618 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.03102441174057221, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.03102441174057221 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7786259541984732, "acc_stderr": 0.036412970813137296, "acc_norm": 0.7786259541984732, "acc_norm_stderr": 0.036412970813137296 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.037494924487096966, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.037494924487096966 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7870370370370371, "acc_stderr": 0.0395783547198098, "acc_norm": 0.7870370370370371, "acc_norm_stderr": 0.0395783547198098 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7730061349693251, "acc_stderr": 0.03291099578615769, "acc_norm": 0.7730061349693251, "acc_norm_stderr": 0.03291099578615769 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4375, "acc_stderr": 0.04708567521880525, "acc_norm": 0.4375, "acc_norm_stderr": 0.04708567521880525 }, "harness|hendrycksTest-management|5": { "acc": 0.7766990291262136, "acc_stderr": 0.04123553189891431, "acc_norm": 0.7766990291262136, "acc_norm_stderr": 0.04123553189891431 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400406964, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400406964 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, 
"harness|hendrycksTest-miscellaneous|5": { "acc": 0.8263090676883781, "acc_stderr": 0.01354741565866226, "acc_norm": 0.8263090676883781, "acc_norm_stderr": 0.01354741565866226 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7456647398843931, "acc_stderr": 0.023445826276545543, "acc_norm": 0.7456647398843931, "acc_norm_stderr": 0.023445826276545543 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4312849162011173, "acc_stderr": 0.01656382939904771, "acc_norm": 0.4312849162011173, "acc_norm_stderr": 0.01656382939904771 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7222222222222222, "acc_stderr": 0.025646863097137897, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.025646863097137897 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7138263665594855, "acc_stderr": 0.025670259242188936, "acc_norm": 0.7138263665594855, "acc_norm_stderr": 0.025670259242188936 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.75, "acc_stderr": 0.02409347123262133, "acc_norm": 0.75, "acc_norm_stderr": 0.02409347123262133 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.49645390070921985, "acc_stderr": 0.02982674915328092, "acc_norm": 0.49645390070921985, "acc_norm_stderr": 0.02982674915328092 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4667535853976532, "acc_stderr": 0.01274197433389723, "acc_norm": 0.4667535853976532, "acc_norm_stderr": 0.01274197433389723 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6691176470588235, "acc_stderr": 0.028582709753898445, "acc_norm": 0.6691176470588235, "acc_norm_stderr": 0.028582709753898445 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6699346405228758, "acc_stderr": 0.019023726160724553, "acc_norm": 0.6699346405228758, "acc_norm_stderr": 0.019023726160724553 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302506, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302506 }, 
"harness|hendrycksTest-security_studies|5": { "acc": 0.7346938775510204, "acc_stderr": 0.028263889943784593, "acc_norm": 0.7346938775510204, "acc_norm_stderr": 0.028263889943784593 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8407960199004975, "acc_stderr": 0.025870646766169136, "acc_norm": 0.8407960199004975, "acc_norm_stderr": 0.025870646766169136 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.0348735088019777, "acc_norm": 0.86, "acc_norm_stderr": 0.0348735088019777 }, "harness|hendrycksTest-virology|5": { "acc": 0.5602409638554217, "acc_stderr": 0.03864139923699122, "acc_norm": 0.5602409638554217, "acc_norm_stderr": 0.03864139923699122 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8245614035087719, "acc_stderr": 0.029170885500727665, "acc_norm": 0.8245614035087719, "acc_norm_stderr": 0.029170885500727665 }, "harness|truthfulqa:mc|0": { "mc1": 0.5569155446756426, "mc1_stderr": 0.017389730346877116, "mc2": 0.6897987229376631, "mc2_stderr": 0.015167779378758222 }, "harness|winogrande|5": { "acc": 0.8397790055248618, "acc_stderr": 0.010309209498187479 }, "harness|gsm8k|5": { "acc": 0.7088703563305534, "acc_stderr": 0.012513215297888463 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. 
--> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
atulxop/ArxivArticleSumm
--- license: apache-2.0 dataset_info: features: - name: text dtype: string - name: summary dtype: string splits: - name: train num_bytes: 13762366.740210796 num_examples: 10958 - name: test num_bytes: 3932284.2 num_examples: 3131 - name: validation num_bytes: 1966770.0597892047 num_examples: 1566 download_size: 11225883 dataset_size: 19661421.0 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* - split: validation path: data/validation-* ---
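The split sizes in the metadata above correspond to an approximate 70/20/10 train/test/validation partition. A minimal, self-contained check of those proportions (the example counts are copied directly from the metadata; nothing is downloaded):

```python
# Example counts per split, taken from the ArxivArticleSumm metadata above.
splits = {"train": 10958, "test": 3131, "validation": 1566}
total = sum(splits.values())  # 15655

# Fraction of examples in each split, rounded to two decimals.
fractions = {name: round(n / total, 2) for name, n in splits.items()}
print(fractions)  # roughly a 70/20/10 partition
```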
edsongomes0215/lula
--- license: openrail ---
alisson40889/cidao
--- license: openrail ---
baptistecolle/sam-controlnet-test
--- dataset_info: features: - name: masks dtype: image splits: - name: train num_bytes: 28821400.0 num_examples: 200 download_size: 0 dataset_size: 28821400.0 --- # Dataset Card for "sam-controlnet-test" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
DynamicSuperbPrivate/ReverberationDetectionSmallRoom_VoxcelebRirsNoises
---
dataset_info:
  features:
  - name: file
    dtype: string
  - name: audio
    dtype: audio
  - name: instruction
    dtype: string
  - name: label
    dtype: string
  splits:
  - name: train
    num_bytes: 3088676087.0
    num_examples: 24000
  - name: validation
    num_bytes: 671529456.0
    num_examples: 5218
  - name: test
    num_bytes: 1254515820.0
    num_examples: 9748
  download_size: 4999935933
  dataset_size: 5014721363.0
---
# Dataset Card for "ReverberationDetectionSmallRoom_VoxcelebRirsNoises"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Lemunite/Tunetest
--- language: - vi license: cc-by-sa-4.0 ---
zhangzi/arknights
--- license: openrail ---
liyongsea/THINGS_EEG_Test
--- dataset_info: features: - name: eeg_array sequence: sequence: float64 - name: label dtype: int64 - name: fold dtype: int64 - name: metadata struct: - name: fold dtype: int64 - name: img_concept dtype: string - name: label dtype: int64 - name: repeat_id dtype: int64 - name: subject dtype: string splits: - name: sub_01 num_bytes: 219827280 num_examples: 16000 - name: sub_02 num_bytes: 219827280 num_examples: 16000 - name: sub_03 num_bytes: 219827280 num_examples: 16000 - name: sub_04 num_bytes: 219827280 num_examples: 16000 - name: sub_05 num_bytes: 219827280 num_examples: 16000 - name: sub_06 num_bytes: 219827280 num_examples: 16000 - name: sub_07 num_bytes: 219827280 num_examples: 16000 - name: sub_08 num_bytes: 219827280 num_examples: 16000 - name: sub_09 num_bytes: 219827280 num_examples: 16000 - name: sub_10 num_bytes: 219827280 num_examples: 16000 download_size: 2222377857 dataset_size: 2198272800 configs: - config_name: default data_files: - split: sub_01 path: data/sub_01-* - split: sub_02 path: data/sub_02-* - split: sub_03 path: data/sub_03-* - split: sub_04 path: data/sub_04-* - split: sub_05 path: data/sub_05-* - split: sub_06 path: data/sub_06-* - split: sub_07 path: data/sub_07-* - split: sub_08 path: data/sub_08-* - split: sub_09 path: data/sub_09-* - split: sub_10 path: data/sub_10-* ---
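The metadata above defines one split per subject (`sub_01` through `sub_10`), each holding 16,000 EEG examples of identical serialized size. A small sketch, assuming only the numbers listed above, that derives the per-subject totals (each split can later be requested by name, e.g. `load_dataset(..., split="sub_01")`):

```python
# Subject split names and sizes from the THINGS_EEG_Test metadata above:
# 10 subjects (sub_01 .. sub_10), each with 16,000 examples.
subjects = [f"sub_{i:02d}" for i in range(1, 11)]
examples_per_subject = 16_000

total_examples = len(subjects) * examples_per_subject
# Every split reports the same num_bytes, so the average record size is constant.
bytes_per_example = 219_827_280 / examples_per_subject

print(subjects[0], subjects[-1], total_examples)
```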
open-llm-leaderboard/details_NLUHOPOE__Mistral-7B-length-100000
---
pretty_name: Evaluation run of NLUHOPOE/Mistral-7B-length-100000
dataset_summary: "Dataset automatically created during the evaluation run of model\
  \ [NLUHOPOE/Mistral-7B-length-100000](https://huggingface.co/NLUHOPOE/Mistral-7B-length-100000)\
  \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of\
  \ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
  \ be found as a specific split in each configuration, the split being named using\
  \ the timestamp of the run. The \"train\" split is always pointing to the latest\
  \ results.\n\nAn additional configuration \"results\" stores all the aggregated\
  \ results of the run (and is used to compute and display the aggregated metrics\
  \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
  \nTo load the details from a run, you can for instance do the following:\n```python\n\
  from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NLUHOPOE__Mistral-7B-length-100000\"\
  ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
  These are the [latest results from run 2024-01-26T10:39:37.184670](https://huggingface.co/datasets/open-llm-leaderboard/details_NLUHOPOE__Mistral-7B-length-100000/blob/main/results_2024-01-26T10-39-37.184670.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5543337020996407,\n\ \ \"acc_stderr\": 0.03388959855375057,\n \"acc_norm\": 0.5605963010866872,\n\ \ \"acc_norm_stderr\": 0.034638046930789805,\n \"mc1\": 0.2937576499388005,\n\ \ \"mc1_stderr\": 0.015945068581236618,\n \"mc2\": 0.44949394895398154,\n\ \ \"mc2_stderr\": 0.01461792575669919\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.49402730375426623,\n \"acc_stderr\": 0.014610348300255793,\n\ \ \"acc_norm\": 0.5170648464163823,\n \"acc_norm_stderr\": 0.014602878388536598\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5826528579964151,\n\ \ \"acc_stderr\": 0.004921133864931888,\n \"acc_norm\": 0.7832105158334993,\n\ \ \"acc_norm_stderr\": 0.004112158798877642\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \ \ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\ \ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\ \ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.04063302731486671,\n\ \ \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.04063302731486671\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n\ \ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \ \ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.6188679245283019,\n \"acc_stderr\": 0.029890609686286627,\n\ \ \"acc_norm\": 0.6188679245283019,\n \"acc_norm_stderr\": 0.029890609686286627\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n\ \ \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n\ \ \"acc_norm_stderr\": 
0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \ \ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\ : 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n\ \ \"acc_stderr\": 0.0379401267469703,\n \"acc_norm\": 0.5491329479768786,\n\ \ \"acc_norm_stderr\": 0.0379401267469703\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.04655010411319616,\n\ \ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.04655010411319616\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\ \ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.4765957446808511,\n \"acc_stderr\": 0.03265019475033582,\n\ \ \"acc_norm\": 0.4765957446808511,\n \"acc_norm_stderr\": 0.03265019475033582\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n\ \ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n\ \ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\ \ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.335978835978836,\n \"acc_stderr\": 0.02432631052914914,\n \"acc_norm\"\ : 
0.335978835978836,\n \"acc_norm_stderr\": 0.02432631052914914\n },\n\ \ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\ \ \"acc_stderr\": 0.041905964388711366,\n \"acc_norm\": 0.3253968253968254,\n\ \ \"acc_norm_stderr\": 0.041905964388711366\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384739,\n \ \ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384739\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6903225806451613,\n\ \ \"acc_stderr\": 0.026302774983517418,\n \"acc_norm\": 0.6903225806451613,\n\ \ \"acc_norm_stderr\": 0.026302774983517418\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.4236453201970443,\n \"acc_stderr\": 0.03476725747649037,\n\ \ \"acc_norm\": 0.4236453201970443,\n \"acc_norm_stderr\": 0.03476725747649037\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\ : 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.03588624800091706,\n\ \ \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.03588624800091706\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7424242424242424,\n \"acc_stderr\": 0.03115626951964683,\n \"\ acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.03115626951964683\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.7823834196891192,\n \"acc_stderr\": 0.02977866303775295,\n\ \ \"acc_norm\": 0.7823834196891192,\n \"acc_norm_stderr\": 0.02977866303775295\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.5358974358974359,\n \"acc_stderr\": 0.025285585990017848,\n\ \ \"acc_norm\": 0.5358974358974359,\n \"acc_norm_stderr\": 0.025285585990017848\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524582,\n \ \ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524582\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.5378151260504201,\n \"acc_stderr\": 0.032385469487589795,\n\ \ \"acc_norm\": 0.5378151260504201,\n \"acc_norm_stderr\": 0.032385469487589795\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\ acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7577981651376147,\n \"acc_stderr\": 0.018368176306598618,\n \"\ acc_norm\": 0.7577981651376147,\n \"acc_norm_stderr\": 0.018368176306598618\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"\ acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n\ \ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\ : {\n \"acc\": 0.7215189873417721,\n \"acc_stderr\": 0.029178682304842544,\n\ \ \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.029178682304842544\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\ \ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\ \ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.04118438565806298,\n\ \ \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.04118438565806298\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 
0.6776859504132231,\n \"acc_stderr\": 0.04266416363352167,\n \"\ acc_norm\": 0.6776859504132231,\n \"acc_norm_stderr\": 0.04266416363352167\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n\ \ \"acc_stderr\": 0.04557239513497751,\n \"acc_norm\": 0.6666666666666666,\n\ \ \"acc_norm_stderr\": 0.04557239513497751\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.6073619631901841,\n \"acc_stderr\": 0.03836740907831028,\n\ \ \"acc_norm\": 0.6073619631901841,\n \"acc_norm_stderr\": 0.03836740907831028\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\ \ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \ \ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\ \ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8162393162393162,\n\ \ \"acc_stderr\": 0.025372139671722933,\n \"acc_norm\": 0.8162393162393162,\n\ \ \"acc_norm_stderr\": 0.025372139671722933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \ \ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.735632183908046,\n\ \ \"acc_stderr\": 0.01576998484069052,\n \"acc_norm\": 0.735632183908046,\n\ \ \"acc_norm_stderr\": 0.01576998484069052\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6213872832369942,\n \"acc_stderr\": 0.02611374936131034,\n\ \ \"acc_norm\": 0.6213872832369942,\n \"acc_norm_stderr\": 0.02611374936131034\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\ \ \"acc_stderr\": 0.014893391735249622,\n \"acc_norm\": 0.27262569832402234,\n\ \ \"acc_norm_stderr\": 0.014893391735249622\n },\n 
\"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.6078431372549019,\n \"acc_stderr\": 0.027956046165424516,\n\ \ \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.027956046165424516\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6334405144694534,\n\ \ \"acc_stderr\": 0.02736807824397164,\n \"acc_norm\": 0.6334405144694534,\n\ \ \"acc_norm_stderr\": 0.02736807824397164\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6327160493827161,\n \"acc_stderr\": 0.026822801759507887,\n\ \ \"acc_norm\": 0.6327160493827161,\n \"acc_norm_stderr\": 0.026822801759507887\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.38652482269503546,\n \"acc_stderr\": 0.029049190342543444,\n \ \ \"acc_norm\": 0.38652482269503546,\n \"acc_norm_stderr\": 0.029049190342543444\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39895697522816165,\n\ \ \"acc_stderr\": 0.01250675765529367,\n \"acc_norm\": 0.39895697522816165,\n\ \ \"acc_norm_stderr\": 0.01250675765529367\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5625,\n \"acc_stderr\": 0.030134614954403924,\n \ \ \"acc_norm\": 0.5625,\n \"acc_norm_stderr\": 0.030134614954403924\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.5490196078431373,\n \"acc_stderr\": 0.02013038831290452,\n \ \ \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.02013038831290452\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\ \ \"acc_stderr\": 0.04673752333670239,\n \"acc_norm\": 0.6090909090909091,\n\ \ \"acc_norm_stderr\": 0.04673752333670239\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.6081632653061224,\n \"acc_stderr\": 0.031251275910891656,\n\ \ \"acc_norm\": 0.6081632653061224,\n \"acc_norm_stderr\": 0.031251275910891656\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n\ \ \"acc_stderr\": 
0.030965903123573026,\n \"acc_norm\": 0.7412935323383084,\n\ \ \"acc_norm_stderr\": 0.030965903123573026\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \ \ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\ \ \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n\ \ \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.7309941520467836,\n \"acc_stderr\": 0.0340105262010409,\n\ \ \"acc_norm\": 0.7309941520467836,\n \"acc_norm_stderr\": 0.0340105262010409\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2937576499388005,\n\ \ \"mc1_stderr\": 0.015945068581236618,\n \"mc2\": 0.44949394895398154,\n\ \ \"mc2_stderr\": 0.01461792575669919\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7671665351223362,\n \"acc_stderr\": 0.011878201073856542\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1956027293404094,\n \ \ \"acc_stderr\": 0.010926096810556464\n }\n}\n```" repo_url: https://huggingface.co/NLUHOPOE/Mistral-7B-length-100000 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|arc:challenge|25_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-01-26T10-39-37.184670.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|gsm8k|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hellaswag|10_2024-01-26T10-39-37.184670.parquet' - split: latest path: - 
'**/details_harness|hellaswag|10_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T10-39-37.184670.parquet' - 
'**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T10-39-37.184670.parquet' - 
'**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T10-39-37.184670.parquet' - 
'**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T10-39-37.184670.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T10-39-37.184670.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-26T10-39-37.184670.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T10-39-37.184670.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-management|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-marketing|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-virology|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T10-39-37.184670.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|truthfulqa:mc|0_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-01-26T10-39-37.184670.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_01_26T10_39_37.184670 path: - '**/details_harness|winogrande|5_2024-01-26T10-39-37.184670.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-01-26T10-39-37.184670.parquet' - config_name: results data_files: - split: 
2024_01_26T10_39_37.184670 path: - results_2024-01-26T10-39-37.184670.parquet - split: latest path: - results_2024-01-26T10-39-37.184670.parquet
---

# Dataset Card for Evaluation run of NLUHOPOE/Mistral-7B-length-100000

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [NLUHOPOE/Mistral-7B-length-100000](https://huggingface.co/NLUHOPOE/Mistral-7B-length-100000) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NLUHOPOE__Mistral-7B-length-100000",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2024-01-26T10:39:37.184670](https://huggingface.co/datasets/open-llm-leaderboard/details_NLUHOPOE__Mistral-7B-length-100000/blob/main/results_2024-01-26T10-39-37.184670.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5543337020996407, "acc_stderr": 0.03388959855375057, "acc_norm": 0.5605963010866872, "acc_norm_stderr": 0.034638046930789805, "mc1": 0.2937576499388005, "mc1_stderr": 0.015945068581236618, "mc2": 0.44949394895398154, "mc2_stderr": 0.01461792575669919 }, "harness|arc:challenge|25": { "acc": 0.49402730375426623, "acc_stderr": 0.014610348300255793, "acc_norm": 0.5170648464163823, "acc_norm_stderr": 0.014602878388536598 }, "harness|hellaswag|10": { "acc": 0.5826528579964151, "acc_stderr": 0.004921133864931888, "acc_norm": 0.7832105158334993, "acc_norm_stderr": 0.004112158798877642 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5111111111111111, "acc_stderr": 0.04318275491977976, "acc_norm": 0.5111111111111111, "acc_norm_stderr": 0.04318275491977976 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5263157894736842, "acc_stderr": 0.04063302731486671, "acc_norm": 0.5263157894736842, "acc_norm_stderr": 0.04063302731486671 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6188679245283019, "acc_stderr": 0.029890609686286627, "acc_norm": 0.6188679245283019, "acc_norm_stderr": 0.029890609686286627 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6111111111111112, "acc_stderr": 0.04076663253918567, "acc_norm": 0.6111111111111112, "acc_norm_stderr": 0.04076663253918567 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.43, "acc_stderr": 0.049756985195624284, "acc_norm": 0.43, 
"acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5491329479768786, "acc_stderr": 0.0379401267469703, "acc_norm": 0.5491329479768786, "acc_norm_stderr": 0.0379401267469703 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3235294117647059, "acc_stderr": 0.04655010411319616, "acc_norm": 0.3235294117647059, "acc_norm_stderr": 0.04655010411319616 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4765957446808511, "acc_stderr": 0.03265019475033582, "acc_norm": 0.4765957446808511, "acc_norm_stderr": 0.03265019475033582 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.39473684210526316, "acc_stderr": 0.045981880578165414, "acc_norm": 0.39473684210526316, "acc_norm_stderr": 0.045981880578165414 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.47586206896551725, "acc_stderr": 0.041618085035015295, "acc_norm": 0.47586206896551725, "acc_norm_stderr": 0.041618085035015295 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.335978835978836, "acc_stderr": 0.02432631052914914, "acc_norm": 0.335978835978836, "acc_norm_stderr": 0.02432631052914914 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3253968253968254, "acc_stderr": 0.041905964388711366, "acc_norm": 0.3253968253968254, "acc_norm_stderr": 0.041905964388711366 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.27, "acc_stderr": 0.04461960433384739, "acc_norm": 0.27, "acc_norm_stderr": 0.04461960433384739 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6903225806451613, "acc_stderr": 0.026302774983517418, "acc_norm": 0.6903225806451613, "acc_norm_stderr": 0.026302774983517418 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4236453201970443, "acc_stderr": 0.03476725747649037, "acc_norm": 0.4236453201970443, "acc_norm_stderr": 0.03476725747649037 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.696969696969697, "acc_stderr": 0.03588624800091706, "acc_norm": 0.696969696969697, "acc_norm_stderr": 0.03588624800091706 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7424242424242424, "acc_stderr": 0.03115626951964683, "acc_norm": 0.7424242424242424, "acc_norm_stderr": 0.03115626951964683 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7823834196891192, "acc_stderr": 0.02977866303775295, "acc_norm": 0.7823834196891192, "acc_norm_stderr": 0.02977866303775295 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5358974358974359, "acc_stderr": 0.025285585990017848, "acc_norm": 0.5358974358974359, "acc_norm_stderr": 0.025285585990017848 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3296296296296296, "acc_stderr": 0.028661201116524582, "acc_norm": 0.3296296296296296, "acc_norm_stderr": 0.028661201116524582 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5378151260504201, "acc_stderr": 0.032385469487589795, "acc_norm": 0.5378151260504201, "acc_norm_stderr": 0.032385469487589795 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.36423841059602646, "acc_stderr": 0.03929111781242742, "acc_norm": 0.36423841059602646, "acc_norm_stderr": 0.03929111781242742 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7577981651376147, "acc_stderr": 0.018368176306598618, "acc_norm": 0.7577981651376147, "acc_norm_stderr": 0.018368176306598618 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4583333333333333, "acc_stderr": 
0.03398110890294636, "acc_norm": 0.4583333333333333, "acc_norm_stderr": 0.03398110890294636 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.75, "acc_stderr": 0.03039153369274154, "acc_norm": 0.75, "acc_norm_stderr": 0.03039153369274154 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7215189873417721, "acc_stderr": 0.029178682304842544, "acc_norm": 0.7215189873417721, "acc_norm_stderr": 0.029178682304842544 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6547085201793722, "acc_stderr": 0.03191100192835794, "acc_norm": 0.6547085201793722, "acc_norm_stderr": 0.03191100192835794 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6717557251908397, "acc_stderr": 0.04118438565806298, "acc_norm": 0.6717557251908397, "acc_norm_stderr": 0.04118438565806298 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6776859504132231, "acc_stderr": 0.04266416363352167, "acc_norm": 0.6776859504132231, "acc_norm_stderr": 0.04266416363352167 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6666666666666666, "acc_stderr": 0.04557239513497751, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.04557239513497751 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6073619631901841, "acc_stderr": 0.03836740907831028, "acc_norm": 0.6073619631901841, "acc_norm_stderr": 0.03836740907831028 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4375, "acc_stderr": 0.04708567521880525, "acc_norm": 0.4375, "acc_norm_stderr": 0.04708567521880525 }, "harness|hendrycksTest-management|5": { "acc": 0.7572815533980582, "acc_stderr": 0.04245022486384495, "acc_norm": 0.7572815533980582, "acc_norm_stderr": 0.04245022486384495 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8162393162393162, "acc_stderr": 0.025372139671722933, "acc_norm": 0.8162393162393162, "acc_norm_stderr": 0.025372139671722933 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.68, "acc_stderr": 0.046882617226215034, "acc_norm": 0.68, "acc_norm_stderr": 
0.046882617226215034 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.735632183908046, "acc_stderr": 0.01576998484069052, "acc_norm": 0.735632183908046, "acc_norm_stderr": 0.01576998484069052 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6213872832369942, "acc_stderr": 0.02611374936131034, "acc_norm": 0.6213872832369942, "acc_norm_stderr": 0.02611374936131034 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.27262569832402234, "acc_stderr": 0.014893391735249622, "acc_norm": 0.27262569832402234, "acc_norm_stderr": 0.014893391735249622 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6078431372549019, "acc_stderr": 0.027956046165424516, "acc_norm": 0.6078431372549019, "acc_norm_stderr": 0.027956046165424516 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6334405144694534, "acc_stderr": 0.02736807824397164, "acc_norm": 0.6334405144694534, "acc_norm_stderr": 0.02736807824397164 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6327160493827161, "acc_stderr": 0.026822801759507887, "acc_norm": 0.6327160493827161, "acc_norm_stderr": 0.026822801759507887 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.38652482269503546, "acc_stderr": 0.029049190342543444, "acc_norm": 0.38652482269503546, "acc_norm_stderr": 0.029049190342543444 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.39895697522816165, "acc_stderr": 0.01250675765529367, "acc_norm": 0.39895697522816165, "acc_norm_stderr": 0.01250675765529367 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5625, "acc_stderr": 0.030134614954403924, "acc_norm": 0.5625, "acc_norm_stderr": 0.030134614954403924 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5490196078431373, "acc_stderr": 0.02013038831290452, "acc_norm": 0.5490196078431373, "acc_norm_stderr": 0.02013038831290452 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6090909090909091, "acc_stderr": 0.04673752333670239, "acc_norm": 0.6090909090909091, "acc_norm_stderr": 
0.04673752333670239 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6081632653061224, "acc_stderr": 0.031251275910891656, "acc_norm": 0.6081632653061224, "acc_norm_stderr": 0.031251275910891656 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7412935323383084, "acc_stderr": 0.030965903123573026, "acc_norm": 0.7412935323383084, "acc_norm_stderr": 0.030965903123573026 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.74, "acc_stderr": 0.04408440022768079, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768079 }, "harness|hendrycksTest-virology|5": { "acc": 0.4759036144578313, "acc_stderr": 0.038879718495972646, "acc_norm": 0.4759036144578313, "acc_norm_stderr": 0.038879718495972646 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7309941520467836, "acc_stderr": 0.0340105262010409, "acc_norm": 0.7309941520467836, "acc_norm_stderr": 0.0340105262010409 }, "harness|truthfulqa:mc|0": { "mc1": 0.2937576499388005, "mc1_stderr": 0.015945068581236618, "mc2": 0.44949394895398154, "mc2_stderr": 0.01461792575669919 }, "harness|winogrande|5": { "acc": 0.7671665351223362, "acc_stderr": 0.011878201073856542 }, "harness|gsm8k|5": { "acc": 0.1956027293404094, "acc_stderr": 0.010926096810556464 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. 
--> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
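The per-task configurations listed in this card's metadata follow a uniform naming scheme: five-shot MMLU subtask details live under `harness_hendrycksTest_<task>_5`. A minimal sketch of building such a config name and loading its details follows; the `details_config` helper is purely illustrative (not part of the `datasets` API), and the actual download requires network access to the Hugging Face Hub, so it is shown commented out.

```python
# Illustrative helper (not part of any API): per-task detail configs
# in this repo are named "harness_hendrycksTest_<task>_5" (five-shot MMLU).
def details_config(task: str) -> str:
    return f"harness_hendrycksTest_{task}_5"

config = details_config("anatomy")
print(config)  # harness_hendrycksTest_anatomy_5

# Loading the details requires network access to the Hugging Face Hub:
# from datasets import load_dataset
# data = load_dataset(
#     "open-llm-leaderboard/details_NLUHOPOE__Mistral-7B-length-100000",
#     config,
#     split="latest",
# )
```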
RIW/small-coco-wm_1
---
dataset_info:
  features:
  - name: image
    dtype: image
  - name: caption
    dtype: string
  - name: url
    dtype: string
  - name: key
    dtype: string
  - name: status
    dtype: string
  - name: error_message
    dtype: 'null'
  - name: width
    dtype: int64
  - name: height
    dtype: int64
  - name: original_width
    dtype: int64
  - name: original_height
    dtype: int64
  - name: exif
    dtype: string
  - name: sha256
    dtype: string
  splits:
  - name: train
    num_bytes: 3929376803.167
    num_examples: 19971
  - name: validation
    num_bytes: 1954694137.43
    num_examples: 9985
  download_size: 1348722744
  dataset_size: 5884070940.597
---

# Dataset Card for "small-coco-wm_1"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
NomeIncrivel/eimine
---
license: openrail
---
japanese-asr/whisper_transcriptions.reazonspeech.all_8
---
dataset_info:
  config_name: all
  features:
  - name: name
    dtype: string
  - name: audio
    dtype:
      audio:
        sampling_rate: 16000
  - name: transcription
    dtype: string
  - name: whisper_transcript
    sequence: int64
  splits:
  - name: train
    num_bytes: 30529777148.0
    num_examples: 267895
  download_size: 30286849367
  dataset_size: 30529777148.0
configs:
- config_name: all
  data_files:
  - split: train
    path: all/train-*
---
liuyanchen1015/MULTI_VALUE_stsb_plural_to_singular_human
---
dataset_info:
  features:
  - name: sentence1
    dtype: string
  - name: sentence2
    dtype: string
  - name: score
    dtype: float64
  - name: idx
    dtype: int64
  - name: value_score
    dtype: int64
  splits:
  - name: dev
    num_bytes: 59516
    num_examples: 341
  - name: test
    num_bytes: 45394
    num_examples: 275
  - name: train
    num_bytes: 231584
    num_examples: 1348
  download_size: 219624
  dataset_size: 336494
---

# Dataset Card for "MULTI_VALUE_stsb_plural_to_singular_human"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
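Judging from the dataset name, the perturbation maps plural nouns to their singular forms. A deliberately naive sketch of that idea for regular English plurals is below; the actual MULTI-VALUE rules are far more careful and linguistically informed, and the helper name here is hypothetical:

```python
def plural_to_singular_naive(token: str) -> str:
    """Very rough regular-plural stripper, for illustration only."""
    if token.endswith("ies") and len(token) > 3:
        return token[:-3] + "y"          # cities -> city
    if token.endswith("es") and token[:-2].endswith(("ch", "sh", "x", "s")):
        return token[:-2]                # boxes -> box
    if token.endswith("s") and not token.endswith("ss"):
        return token[:-1]                # cats -> cat
    return token                         # glass -> glass (unchanged)
```

Irregular plurals ("children", "mice") and non-noun tokens ending in "s" would be mishandled by this sketch, which is exactly why rule sets like MULTI-VALUE's rely on part-of-speech tagging and curated exception lists.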
sordonia/platypus-templated-ia-flat
---
dataset_info:
  features:
  - name: split
    dtype: string
  - name: source
    dtype: string
  - name: target
    dtype: string
  - name: task_source
    dtype: string
  - name: task_name
    dtype: string
  splits:
  - name: train
    num_bytes: 31654017
    num_examples: 24926
  download_size: 15672184
  dataset_size: 31654017
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---

# Dataset Card for "platypus-templated-ia-flat"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
jlbaker361/spider-test
---
dataset_info:
  features:
  - name: image
    dtype: image
  - name: text
    dtype: string
  - name: frame
    dtype: int64
  splits:
  - name: train
    num_bytes: 6900967258.928
    num_examples: 1572
  download_size: 6901168744
  dataset_size: 6900967258.928
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
fish-audio-private/playerfm-20000h
---
license: cc-by-nc-sa-4.0
size_categories:
- 1M<n<10M
---

# Multilingual Speech 20000h

This dataset contains audio data crawled from the internet and resampled to 44.1 kHz MP3 files.
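Resampling audio to a common rate like 44.1 kHz can be sketched with simple linear interpolation; this is an illustration only, as production pipelines typically use ffmpeg, librosa, or torchaudio rather than the hypothetical helper below:

```python
def resample_linear(samples, src_rate, dst_rate):
    """Resample a mono signal via linear interpolation (illustrative only)."""
    n_out = int(round(len(samples) * dst_rate / src_rate))
    out = []
    for i in range(n_out):
        # Map the output index back to a fractional position in the input.
        pos = i * (len(samples) - 1) / max(n_out - 1, 1)
        lo = int(pos)
        hi = min(lo + 1, len(samples) - 1)
        frac = pos - lo
        out.append(samples[lo] * (1 - frac) + samples[hi] * frac)
    return out

# Doubling the rate: 3 samples at 22.05 kHz become 6 at 44.1 kHz.
doubled = resample_linear([0.0, 1.0, 0.0], 22050, 44100)
```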
DianaJin/demo
---
dataset_info:
  features:
  - name: input_features
    sequence:
      sequence: float32
  - name: labels
    sequence: int64
  splits:
  - name: train
    num_bytes: 31703120
    num_examples: 33
  - name: test
    num_bytes: 4804408
    num_examples: 5
  - name: valid
    num_bytes: 3842896
    num_examples: 4
  download_size: 14357391
  dataset_size: 40350424
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: test
    path: data/test-*
  - split: valid
    path: data/valid-*
---
minimario/apps_partial_300_310
---
dataset_info:
  features:
  - name: problem
    dtype: string
  - name: code
    dtype: string
  - name: label
    dtype: int64
  - name: full_sample
    dtype: string
  - name: where_from
    dtype: string
  splits:
  - name: train
    num_bytes: 17369435
    num_examples: 15529
  download_size: 438347
  dataset_size: 17369435
---

# Dataset Card for "apps_partial_300_310"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
pyimagesearch/blog-post-images
---
license: mit
---
one-sec-cv12/chunk_210
---
dataset_info:
  features:
  - name: audio
    dtype:
      audio:
        sampling_rate: 16000
  splits:
  - name: train
    num_bytes: 21185211312.875
    num_examples: 220569
  download_size: 20221288182
  dataset_size: 21185211312.875
---

# Dataset Card for "chunk_210"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
yardeny/processed_bert_context_len_256
---
dataset_info:
  features:
  - name: input_ids
    sequence: int32
  - name: token_type_ids
    sequence: int8
  - name: attention_mask
    sequence: int8
  splits:
  - name: train
    num_bytes: 14968483524.0
    num_examples: 9669563
  download_size: 5205237526
  dataset_size: 14968483524.0
---

# Dataset Card for "processed_bert_context_len_256"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
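Fixed-length pretraining examples like these are typically produced by concatenating tokenized text and slicing it into blocks of the context length. A minimal sketch of that slicing step follows; the helper name is hypothetical and the actual preprocessing script for this dataset is not documented here:

```python
def chunk_token_ids(token_ids, context_len=256):
    """Slice a flat token-id stream into full blocks of `context_len`,
    dropping the trailing remainder (as many BERT pipelines do)."""
    n_full = len(token_ids) // context_len
    return [token_ids[i * context_len:(i + 1) * context_len]
            for i in range(n_full)]

# 600 ids yield two 256-id chunks; the last 88 ids are dropped.
chunks = chunk_token_ids(list(range(600)), context_len=256)
```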
bigscience-data/roots_id_indonesian_news_articles_2017
---
language: id
license: cc0-1.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
  Ethical Charter. The charter can be found at: https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
  I have read and agree to abide by the BigScience Ethical Charter: checkbox
---

ROOTS Subset: roots_id_indonesian_news_articles_2017

# Indonesian News Articles 2017

- Dataset uid: `indonesian_news_articles_2017`

### Description

Indonesian news articles published in 2017; each article includes the published date, content, title, and source.

### Homepage

kaggle.com/aashari/indonesian-news-articles-published-at-2017

### Licensing

- public domain
- cc0-1.0: Creative Commons Zero v1.0 Universal (CC0: Public Domain)

### Speaker Locations

- Asia
- Indonesia

### Sizes

- 0.0688% of total
- 26.1751% of id

### BigScience processing steps

#### Filters applied to: id

- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
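The last filter's name suggests it drops documents shorter than 300 bytes. A minimal sketch of that idea is below, assuming a plain list-of-strings interface; the actual implementation lives in the BigScience data-preparation codebase and may differ in details:

```python
def filter_small_docs_bytes_300(docs, min_bytes=300):
    """Keep only documents whose UTF-8 encoding is at least `min_bytes` long."""
    return [d for d in docs if len(d.encode("utf-8")) >= min_bytes]

# The 5-byte document is removed; the 400-byte one survives.
kept = filter_small_docs_bytes_300(["short", "x" * 400])
```

Measuring length in bytes rather than characters matters for Indonesian text containing multi-byte UTF-8 characters, which is presumably why the threshold is expressed in bytes.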
CognitiveLab/hh-rlhf-json
---
dataset_info:
  features:
  - name: prompt
    dtype: string
  - name: chosen
    dtype: string
  - name: rejected
    dtype: string
  splits:
  - name: train
    num_bytes: 349908709
    num_examples: 160800
  download_size: 179927558
  dataset_size: 349908709
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
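The prompt/chosen/rejected schema is the standard shape for preference (RLHF/DPO-style) training data. A minimal sketch of building one such record as JSON, with entirely made-up example text, might look like this:

```python
import json

def to_preference_record(prompt, chosen, rejected):
    """Build one record matching the card's prompt/chosen/rejected schema."""
    return {"prompt": prompt, "chosen": chosen, "rejected": rejected}

record = to_preference_record(
    "Human: How long should I boil an egg?\n\nAssistant:",
    " About 8-10 minutes for a hard-boiled egg.",
    " I have no idea.",
)
line = json.dumps(record)  # one JSON object per training example
```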
amaye15/Classify-Anything
--- dataset_info: features: - name: image dtype: image - name: label dtype: class_label: names: '0': AFRICAN EMERALD CUCKOO '1': Wire Haired Fox Terrier '2': MILITARY MACAW '3': JABIRU '4': 136.mandolin '5': 100.hawksbill-101 '6': EUROPEAN GOLDFINCH '7': 171.refrigerator '8': horse racing '9': 193.soccer-ball '10': GUINEAFOWL '11': BLACK THROATED BUSHTIT '12': curling '13': PALM NUT VULTURE '14': Kelpie '15': SAMATRAN THRUSH '16': 240.watch-101 '17': EURASIAN BULLFINCH '18': HOATZIN '19': 107.hot-air-balloon '20': 185.skateboard '21': ABYSSINIAN GROUND HORNBILL '22': INDIAN BUSTARD '23': ROSE BREASTED COCKATOO '24': Staffordshire Bullterrier '25': SPOTTED WHISTLING DUCK '26': CRESTED AUKLET '27': OSPREY '28': RUFOUS TREPE '29': 214.teepee '30': PURPLE SWAMPHEN '31': 014.blimp '32': 230.trilobite-101 '33': 045.computer-keyboard '34': uneven bars '35': GOLDEN EAGLE '36': CALIFORNIA CONDOR '37': SPANGLED COTINGA '38': 201.starfish-101 '39': RUFOUS KINGFISHER '40': weightlifting '41': Doberman '42': WHITE NECKED RAVEN '43': Samoyed '44': CANARY '45': 052.crab-101 '46': rock climbing '47': 198.spider '48': 188.smokestack '49': snowmobile racing '50': Birman '51': ABBOTTS BABBLER '52': ANIANIAU '53': WATTLED LAPWING '54': 130.license-plate '55': BUFFLEHEAD '56': MASKED BOBWHITE '57': judo '58': 009.bear '59': German Shepherd '60': AZURE BREASTED PITTA '61': DARJEELING WOODPECKER '62': 056.dog '63': 228.triceratops '64': RUDDY SHELDUCK '65': 231.tripod '66': Otterhound '67': STEAMER DUCK '68': 166.praying-mantis '69': RED HEADED DUCK '70': 253.faces-easy-101 '71': RED LEGGED HONEYCREEPER '72': BALD IBIS '73': HOUSE SPARROW '74': ALTAMIRA YELLOWTHROAT '75': EUROPEAN TURTLE DOVE '76': TAWNY FROGMOUTH '77': 049.cormorant '78': TRICOLORED BLACKBIRD '79': 065.elk '80': Dandie Dinmont '81': GREAT KISKADEE '82': HYACINTH MACAW '83': POMARINE JAEGER '84': EASTERN BLUEBONNET '85': INDIGO BUNTING '86': BANANAQUIT '87': GUINEA TURACO '88': 006.basketball-hoop '89': TAIWAN MAGPIE 
'90': BELTED KINGFISHER '91': YELLOW BELLIED FLOWERPECKER '92': 175.roulette-wheel '93': Eskimo Dog '94': Keeshond '95': INDIAN PITTA '96': GREY HEADED CHACHALACA '97': Irish Terrier '98': 184.sheet-music '99': ice climbing '100': NORTHERN PARULA '101': shot put '102': GROVED BILLED ANI '103': ECUADORIAN HILLSTAR '104': harness racing '105': 189.snail '106': billiards '107': STRIPPED MANAKIN '108': 241.waterfall '109': 027.calculator '110': ARARIPE MANAKIN '111': SCARLET TANAGER '112': TURKEY VULTURE '113': BAIKAL TEAL '114': 089.goose '115': GREY PLOVER '116': 030.canoe '117': ROUGH LEG BUZZARD '118': polo '119': German Short Haired Pointer '120': Border Terrier '121': 161.photocopier '122': SQUACCO HERON '123': 031.car-tire '124': Collie '125': SPLENDID WREN '126': CHINESE BAMBOO PARTRIDGE '127': 108.hot-dog '128': GRAY PARTRIDGE '129': 158.penguin '130': 058.doorknob '131': STRIPPED SWALLOW '132': MALAGASY WHITE EYE '133': GREEN JAY '134': BALI STARLING '135': 172.revolver-101 '136': 134.llama-101 '137': INDIAN VULTURE '138': Pomeranian '139': BROWN CREPPER '140': LITTLE AUK '141': ANDEAN GOOSE '142': PEREGRINE FALCON '143': SWINHOES PHEASANT '144': 212.teapot '145': SAYS PHOEBE '146': Borzoi '147': Rhodesian Ridgeback '148': 042.coffin '149': GREATER PRAIRIE CHICKEN '150': Walker Hound '151': MASKED BOOBY '152': parallel bar '153': BOBOLINK '154': GRANDALA '155': jousting '156': COMMON GRACKLE '157': Toy Poodle '158': SHORT BILLED DOWITCHER '159': CRIMSON SUNBIRD '160': AMERICAN ROBIN '161': BLOOD PHEASANT '162': KING VULTURE '163': 150.octopus '164': figure skating pairs '165': 013.birdbath '166': SCARLET IBIS '167': 043.coin '168': 111.house-fly '169': Tibetan Terrier '170': GREY HEADED FISH EAGLE '171': AMERICAN KESTREL '172': javelin '173': Standard Schnauzer '174': AMERICAN BITTERN '175': GREAT GRAY OWL '176': Chow '177': TAKAHE '178': TOUCHAN '179': 072.fire-truck '180': WOODLAND KINGFISHER '181': water cycling '182': 233.tuning-fork '183': RED FACED 
CORMORANT '184': 226.traffic-light '185': EURASIAN GOLDEN ORIOLE '186': CERULEAN WARBLER '187': pole climbing '188': IBISBILL '189': EASTERN TOWEE '190': TAILORBIRD '191': GREATER PEWEE '192': LAUGHING GULL '193': PARAKETT AUKLET '194': 206.sushi '195': 239.washing-machine '196': 007.bat '197': RED FACED WARBLER '198': 234.tweezer '199': Potato '200': CHESTNUT WINGED CUCKOO '201': ORANGE BRESTED BUNTING '202': RUFUOS MOTMOT '203': Siberian Husky '204': CRESTED SERPENT EAGLE '205': rugby '206': TURQUOISE MOTMOT '207': COMMON LOON '208': 060.duck '209': 053.desk-globe '210': Dingo '211': WILLOW PTARMIGAN '212': 015.bonsai-101 '213': COCKATOO '214': 199.spoon '215': ASIAN CRESTED IBIS '216': AMERICAN PIPIT '217': STRIPED OWL '218': BANDED PITA '219': 094.guitar-pick '220': Flat Coated Retriever '221': Groenendael '222': VULTURINE GUINEAFOWL '223': Pumpkin '224': 018.bowling-pin '225': 029.cannon '226': AFRICAN PIED HORNBILL '227': CACTUS WREN '228': GREAT ARGUS '229': PYRRHULOXIA '230': BLACK AND YELLOW BROADBILL '231': GREAT POTOO '232': RED TAILED THRUSH '233': 176.saddle '234': 248.yarmulke '235': arm wrestling '236': 163.playing-card '237': 104.homer-simpson '238': 180.screwdriver '239': NORTHERN CARDINAL '240': boxing '241': DAURIAN REDSTART '242': PARUS MAJOR '243': 168.raccoon '244': AUSTRAL CANASTERO '245': GOLD WING WARBLER '246': VIOLET CUCKOO '247': 016.boom-box '248': 135.mailbox '249': 186.skunk '250': NORTHERN FLICKER '251': Lakeland Terrier '252': HOUSE FINCH '253': BLACK VULTURE '254': 076.football-helmet '255': GOULDIAN FINCH '256': COMMON FIRECREST '257': RED BEARDED BEE EATER '258': 101.head-phones '259': 019.boxing-glove '260': 138.mattress '261': ALPINE CHOUGH '262': CAPE ROCK THRUSH '263': 115.ice-cream-cone '264': HELMET VANGA '265': 243.welding-mask '266': Carrot '267': RAZORBILL '268': BLUE HERON '269': ELLIOTS PHEASANT '270': EVENING GROSBEAK '271': 125.knife '272': LESSER ADJUTANT '273': EASTERN BLUEBIRD '274': PLUSH CRESTED JAY '275': 
formula 1 racing '276': ASHY STORM PETREL '277': skydiving '278': Kerry Blue Terrier '279': LOONEY BIRDS '280': roller derby '281': 147.mushroom '282': Pug '283': VIOLET BACKED STARLING '284': CLARKS NUTCRACKER '285': DALMATIAN PELICAN '286': WATTLED CURASSOW '287': fly fishing '288': VIOLET GREEN SWALLOW '289': 128.lathe '290': 048.conch '291': 223.top-hat '292': Toy Terrier '293': 034.centipede '294': Norwegian Elkhound '295': 050.covered-wagon '296': BLACK-THROATED SPARROW '297': OILBIRD '298': 088.golf-ball '299': GOLDEN CHLOROPHONIA '300': 194.socks '301': Miniature Schnauzer '302': ENGGANO MYNA '303': Sealyham Terrier '304': CASPIAN TERN '305': DOUBLE EYED FIG PARROT '306': RED BROWED FINCH '307': mushing '308': PAINTED BUNTING '309': ANHINGA '310': British Shorthair '311': 190.snake '312': 103.hibiscus '313': GOLDEN BOWER BIRD '314': 126.ladder '315': Cardigan '316': BALTIMORE ORIOLE '317': French Bulldog '318': 002.american-flag '319': AMERICAN REDSTART '320': BLACK NECKED STILT '321': MALLARD DUCK '322': TREE SWALLOW '323': 044.comet '324': GREAT TINAMOU '325': KAGU '326': volleyball '327': 117.ipod '328': PUNA TEAL '329': RED WINGED BLACKBIRD '330': ROSEATE SPOONBILL '331': KIWI '332': WILD TURKEY '333': 011.billiards '334': RED BELLIED PITTA '335': 251.airplanes-101 '336': PINK ROBIN '337': 232.t-shirt '338': Broccoli '339': HAWAIIAN GOOSE '340': hammer throw '341': JANDAYA PARAKEET '342': BARROWS GOLDENEYE '343': trapeze '344': 085.goat '345': speed skating '346': ALEXANDRINE PARAKEET '347': DUSKY ROBIN '348': 055.dice '349': wheelchair racing '350': 139.megaphone '351': Moderate_Demented '352': fencing '353': TRUMPTER SWAN '354': 095.hamburger '355': 215.telephone-box '356': gaga '357': hang gliding '358': CHARA DE COLLAR '359': BAND TAILED GUAN '360': GO AWAY BIRD '361': JAPANESE ROBIN '362': SPOON BILED SANDPIPER '363': 073.fireworks '364': golf '365': RED CROSSBILL '366': GREEN WINGED DOVE '367': American Shorthair '368': RING-NECKED PHEASANT '369': 
039.chopsticks '370': NORTHERN MOCKINGBIRD '371': 187.skyscraper '372': track bicycle '373': 237.vcr '374': CREAM COLORED WOODPECKER '375': GREY CUCKOOSHRIKE '376': 181.segway '377': Bloodhound '378': UMBRELLA BIRD '379': 023.bulldozer '380': Brabancon Griffon '381': Russian Blue '382': 017.bowling-ball '383': 256.toad '384': 113.hummingbird '385': 041.coffee-mug '386': 203.stirrups '387': 252.car-side-101 '388': 177.saturn '389': barell racing '390': KNOB BILLED DUCK '391': Papaya '392': 146.mountain-bike '393': GRAY KINGBIRD '394': AMERICAN AVOCET '395': BORNEAN LEAFBIRD '396': AFRICAN CROWNED CRANE '397': 165.pram '398': 078.fried-egg '399': Basenji '400': BLACK VENTED SHEARWATER '401': 219.theodolite '402': sumo wrestling '403': SRI LANKA BLUE MAGPIE '404': EGYPTIAN GOOSE '405': baton twirling '406': 093.grasshopper '407': SCARLET MACAW '408': FASCIATED WREN '409': cricket '410': Bedlington Terrier '411': WHITE BROWED CRAKE '412': Beagle '413': water polo '414': 132.light-house '415': 083.gas-pump '416': Chihuahua '417': REGENT BOWERBIRD '418': Bull Mastiff '419': Airedale '420': CAATINGA CACHOLOTE '421': NORTHERN JACANA '422': TIT MOUSE '423': YELLOW CACIQUE '424': Standard Poodle '425': DOUBLE BARRED FINCH '426': BULWERS PHEASANT '427': HARLEQUIN DUCK '428': MARABOU STORK '429': 202.steering-wheel '430': high jump '431': Welsh Springer Spaniel '432': OYSTER CATCHER '433': 036.chandelier-101 '434': RAINBOW LORIKEET '435': GOLDEN CHEEKED WARBLER '436': 225.tower-pisa '437': 090.gorilla '438': Shih Tzu '439': FAIRY TERN '440': NICOBAR PIGEON '441': Great Pyrenees '442': 086.golden-gate-bridge '443': MASKED LAPWING '444': Maine Coon '445': CRANE HAWK '446': swimming '447': VERMILION FLYCATHER '448': 081.frying-pan '449': VENEZUELIAN TROUPIAL '450': ANNAS HUMMINGBIRD '451': WHITE THROATED BEE EATER '452': MALACHITE KINGFISHER '453': 067.eyeglasses '454': MALABAR HORNBILL '455': 224.touring-bike '456': 191.sneaker '457': BLACK THROATED HUET '458': 080.frog '459': 
040.cockroach '460': 035.cereal-box '461': SNOWY OWL '462': OSTRICH '463': Ibizan Hound '464': LONG-EARED OWL '465': 144.minotaur '466': 220.toaster '467': Australian Terrier '468': BORNEAN PHEASANT '469': BORNEAN BRISTLEHEAD '470': rollerblade racing '471': balance beam '472': pole vault '473': Cabbage '474': 227.treadmill '475': log rolling '476': ZEBRA DOVE '477': Malamute '478': Bean '479': Afghan Hound '480': 155.paperclip '481': English Springer '482': APAPANE '483': EASTERN YELLOW ROBIN '484': IVORY BILLED ARACARI '485': CHINESE POND HERON '486': SHOEBILL '487': AZURE JAY '488': Cocker Spaniel '489': Dhole '490': GANG GANG COCKATOO '491': AUSTRALASIAN FIGBIRD '492': BROWN HEADED COWBIRD '493': FRILL BACK PIGEON '494': CAPPED HERON '495': bike polo '496': BARN SWALLOW '497': African Hunting Dog '498': hydroplane racing '499': Old English Sheepdog '500': Egyptian Mau '501': BARRED PUFFBIRD '502': DOWNY WOODPECKER '503': SCARLET CROWNED FRUIT DOVE '504': WHITE BREASTED WATERHEN '505': 164.porcupine '506': CRAB PLOVER '507': nascar racing '508': CINNAMON TEAL '509': ROSY FACED LOVEBIRD '510': 003.backpack '511': BLACK HEADED CAIQUE '512': CROW '513': SNOW GOOSE '514': 114.ibis-101 '515': ampute football '516': horse jumping '517': BLUE MALKOHA '518': BLACK TAIL CRAKE '519': CRESTED FIREBACK '520': MOURNING DOVE '521': EASTERN GOLDEN WEAVER '522': COPPERSMITH BARBET '523': BUSH TURKEY '524': Bernese Mountain Dog '525': ORIENTAL BAY OWL '526': 205.superman '527': 208.swiss-army-knife '528': CRESTED SHRIKETIT '529': CURL CRESTED ARACURI '530': 152.owl '531': EMU '532': KOOKABURRA '533': steer wrestling '534': MERLIN '535': COLLARED ARACARI '536': KAKAPO '537': STRIATED CARACARA '538': 025.cactus '539': AUCKLAND SHAQ '540': Very_Mild_Demented '541': Abyssinian '542': TASMANIAN HEN '543': 213.teddy-bear '544': IMPERIAL SHAQ '545': 024.butterfly '546': Tuxedo '547': HARPY EAGLE '548': 084.giraffe '549': OCELLATED TURKEY '550': WHITE CHEEKED TURACO '551': Greater Swiss 
Mountain Dog '552': 209.sword '553': SURF SCOTER '554': BLACK COCKATO '555': KILLDEAR '556': AFRICAN FIREFINCH '557': Blenheim Spaniel '558': SAND MARTIN '559': Maltese Dog '560': NOISY FRIARBIRD '561': GOLDEN PARAKEET '562': EASTERN MEADOWLARK '563': WHITE EARED HUMMINGBIRD '564': 140.menorah-101 '565': Border Collie '566': COMMON POORWILL '567': BLACK THROATED WARBLER '568': ANTILLEAN EUPHONIA '569': 087.goldfish '570': EASTERN ROSELLA '571': OKINAWA RAIL '572': giant slalom '573': BLUE THROATED PIPING GUAN '574': Curly Coated Retriever '575': BEARDED BARBET '576': 157.pci-card '577': ice yachting '578': ROYAL FLYCATCHER '579': CEDAR WAXWING '580': ORANGE BREASTED TROGON '581': 179.scorpion-101 '582': bull riding '583': Tomato '584': 129.leopards-101 '585': 154.palm-tree '586': Miniature Pinscher '587': 118.iris '588': Schipperke '589': 010.beer-mug '590': CHUCAO TAPACULO '591': 071.fire-hydrant '592': Tibetan Mastiff '593': 244.wheelbarrow '594': Mild_Demented '595': CHIPPING SPARROW '596': AZARAS SPINETAIL '597': IWI '598': ORNATE HAWK EAGLE '599': SMITHS LONGSPUR '600': bmx '601': Lhasa '602': ASIAN GREEN BEE EATER '603': 229.tricycle '604': Redbone '605': FOREST WAGTAIL '606': wingsuit flying '607': 142.microwave '608': BREWERS BLACKBIRD '609': Capsicum '610': 200.stained-glass '611': BAR-TAILED GODWIT '612': WHITE CRESTED HORNBILL '613': FLAME BOWERBIRD '614': PURPLE GALLINULE '615': 151.ostrich '616': CLARKS GREBE '617': TEAL DUCK '618': 141.microscope '619': Bitter_Gourd '620': Giant Schnauzer '621': Briard '622': 020.brain-101 '623': BLACK BAZA '624': JACK SNIPE '625': 105.horse '626': 210.syringe '627': BRANDT CORMARANT '628': IBERIAN MAGPIE '629': FIRE TAILLED MYZORNIS '630': 207.swan '631': YELLOW BREASTED CHAT '632': AZURE TANAGER '633': NORTHERN FULMAR '634': 183.sextant '635': BLUE GRAY GNATCATCHER '636': CAPUCHINBIRD '637': SPOTTED CATBIRD '638': jai alai '639': LARK BUNTING '640': COPPERY TAILED COUCAL '641': TROPICAL KINGBIRD '642': 
021.breadmaker '643': BANDED BROADBILL '644': 116.iguana '645': Scotch Terrier '646': 038.chimp '647': Boxer '648': 109.hot-tub '649': PYGMY KINGFISHER '650': 059.drinking-straw '651': RED BILLED TROPICBIRD '652': BLUE DACNIS '653': LOGGERHEAD SHRIKE '654': Bouvier Des Flandres '655': INDIAN ROLLER '656': Siamese '657': 112.human-skeleton '658': sky surfing '659': ALBERTS TOWHEE '660': CRESTED KINGFISHER '661': Vizsla '662': air hockey '663': basketball '664': VICTORIA CROWNED PIGEON '665': Gordon Setter '666': 061.dumb-bell '667': 008.bathtub '668': bungee jumping '669': Ragdoll '670': rowing '671': 257.clutter '672': Pembroke '673': ASHY THRUSHBIRD '674': BLUE COAU '675': CRESTED CARACARA '676': WALL CREAPER '677': 005.baseball-glove '678': BURCHELLS COURSER '679': HIMALAYAN MONAL '680': 099.harpsichord '681': 096.hammock '682': 121.kangaroo-101 '683': 217.tennis-court '684': CINNAMON FLYCATCHER '685': ANDEAN SISKIN '686': horseshoe pitching '687': GREAT XENOPS '688': WHIMBREL '689': 122.kayak '690': 119.jesus-christ '691': Weimaraner '692': 222.tombstone '693': football '694': RUBY CROWNED KINGLET '695': 195.soda-can '696': FLAME TANAGER '697': 242.watermelon '698': Golden Retriever '699': PHAINOPEPLA '700': surfing '701': Kuvasz '702': LUCIFER HUMMINGBIRD '703': EARED PITA '704': 170.rainbow '705': CASSOWARY '706': baseball '707': 079.frisbee '708': 047.computer-mouse '709': frisbee '710': GILDED FLICKER '711': Shetland Sheepdog '712': pommel horse '713': Miniature Poodle '714': Yorkshire Terrier '715': IVORY GULL '716': HORNED LARK '717': BLACK FACED SPOONBILL '718': SNOWY EGRET '719': figure skating men '720': AMERICAN GOLDFINCH '721': OVENBIRD '722': GOLDEN PHEASANT '723': Boston Bull '724': FIERY MINIVET '725': Non_Demented '726': CAPE GLOSSY STARLING '727': CUBAN TROGON '728': FAIRY BLUEBIRD '729': 097.harmonica '730': ski jumping '731': 133.lightning '732': CHUKAR PARTRIDGE '733': JAVA SPARROW '734': NORTHERN RED BISHOP '735': RED TAILED HAWK '736': 
Clumber '737': Cairn '738': Norfolk Terrier '739': Persian '740': 001.ak47 '741': ROSE BREASTED GROSBEAK '742': BROWN THRASHER '743': GILA WOODPECKER '744': NORTHERN GANNET '745': Pekinese '746': Irish Water Spaniel '747': AMERICAN DIPPER '748': 123.ketch-101 '749': HOODED MERGANSER '750': sidecar racing '751': HORNED GUAN '752': NORTHERN SHOVELER '753': BEARDED REEDLING '754': 216.tennis-ball '755': AMERICAN FLAMINGO '756': ASIAN DOLLARD BIRD '757': GYRFALCON '758': PHILIPPINE EAGLE '759': RED NAPED TROGON '760': Brinjal '761': BIRD OF PARADISE '762': PATAGONIAN SIERRA FINCH '763': CRESTED WOOD PARTRIDGE '764': HIMALAYAN BLUETAIL '765': 254.greyhound '766': canoe slamon '767': Newfoundland '768': INLAND DOTTEREL '769': FRIGATE '770': PALILA '771': 238.video-projector '772': American Staffordshire Terrier '773': APOSTLEBIRD '774': HAWFINCH '775': 074.flashlight '776': 153.palm-pilot '777': motorcycle racing '778': SUPERB STARLING '779': 098.harp '780': snow boarding '781': 156.paper-shredder '782': cheerleading '783': Cauliflower '784': CALIFORNIA QUAIL '785': figure skating women '786': 211.tambourine '787': Chesapeake Bay Retriever '788': 221.tomato '789': COMMON HOUSE MARTIN '790': JOCOTOCO ANTPITTA '791': RED HEADED WOODPECKER '792': Irish Wolfhound '793': BLUE GROSBEAK '794': Whippet '795': 178.school-bus '796': YELLOW HEADED BLACKBIRD '797': EURASIAN MAGPIE '798': 026.cake '799': MALEO '800': 182.self-propelled-lawn-mower '801': croquet '802': SUNBITTERN '803': Affenpinscher '804': Sphynx '805': ROADRUNNER '806': rings '807': COCK OF THE ROCK '808': PURPLE MARTIN '809': 037.chess-board '810': LIMPKIN '811': Bombay '812': 131.lightbulb '813': Komondor '814': 145.motorbikes-101 '815': NORTHERN BEARDLESS TYRANNULET '816': hurdles '817': HARLEQUIN QUAIL '818': CHESTNET BELLIED EUPHONIA '819': Entlebucher '820': 033.cd '821': disc golf '822': 192.snowmobile '823': Soft Coated Wheaten Terrier '824': English Setter '825': ELEGANT TROGON '826': QUETZAL '827': 
054.diamond-ring '828': archery '829': GURNEYS PITTA '830': Sussex Spaniel '831': ROCK DOVE '832': 160.pez-dispenser '833': FAN TAILED WIDOW '834': CAMPO FLICKER '835': CHATTERING LORY '836': CRIMSON CHAT '837': 197.speed-boat '838': 143.minaret '839': 064.elephant-101 '840': HORNED SUNGEM '841': 032.cartman '842': DEMOISELLE CRANE '843': 218.tennis-racket '844': CAPE LONGCLAW '845': CRESTED NUTHATCH '846': GAMBELS QUAIL '847': 169.radio-telescope '848': D-ARNAUDS BARBET '849': PUFFIN '850': BLONDE CRESTED WOODPECKER '851': ALBATROSS '852': STORK BILLED KINGFISHER '853': 149.necktie '854': axe throwing '855': HEPATIC TANAGER '856': Italian Greyhound '857': COMMON IORA '858': COMMON STARLING '859': HAMERKOP '860': GLOSSY IBIS '861': SNOWY SHEATHBILL '862': ANTBIRD '863': VEERY '864': AMETHYST WOODSTAR '865': bowling '866': DOUBLE BRESTED CORMARANT '867': 057.dolphin-101 '868': Malinois '869': BLACK-CAPPED CHICKADEE '870': 070.fire-extinguisher '871': 196.spaghetti '872': Black And Tan Coonhound '873': West Highland White Terrier '874': Basset '875': AMERICAN WIGEON '876': BLACKBURNIAM WARBLER '877': VIOLET TURACO '878': table tennis '879': PARADISE TANAGER '880': SANDHILL CRANE '881': 063.electric-guitar-101 '882': Bottle_Gourd '883': 174.rotary-phone '884': 204.sunflower-101 '885': WOOD THRUSH '886': 235.umbrella-101 '887': DARK EYED JUNCO '888': LAZULI BUNTING '889': BLACK SKIMMER '890': WRENTIT '891': 247.xylophone '892': 069.fighter-jet '893': BARN OWL '894': Appenzeller '895': INDIGO FLYCATCHER '896': SNOW PARTRIDGE '897': MCKAYS BUNTING '898': 046.computer-monitor '899': 162.picnic-table '900': MANDRIN DUCK '901': olympic wrestling '902': MYNA '903': American Bobtail '904': Radish '905': GREATOR SAGE GROUSE '906': 120.joy-stick '907': Mexican Hairless '908': BLUE THROATED TOUCANET '909': tennis '910': PURPLE FINCH '911': Papillon '912': AFRICAN OYSTER CATCHER '913': EMERALD TANAGER '914': 022.buddha-101 '915': GRAY CATBIRD '916': COLLARED CRESCENTCHEST '917': 
VERDIN '918': 004.baseball-bat '919': 051.cowboy-hat '920': RUBY THROATED HUMMINGBIRD '921': 068.fern '922': TOWNSENDS WARBLER '923': WILSONS BIRD OF PARADISE '924': 246.wine-bottle '925': 092.grapes '926': lacrosse '927': tug of war '928': GREEN BROADBILL '929': Saint Bernard '930': 012.binoculars '931': BROWN NOODY '932': 082.galaxy '933': VARIED THRUSH '934': Brittany Spaniel '935': Labrador Retriever '936': WHITE TAILED TROPIC '937': 124.killer-whale '938': 102.helicopter-101 '939': ASIAN OPENBILL STORK '940': 236.unicorn '941': ABBOTTS BOOBY '942': 062.eiffel-tower '943': 249.yo-yo '944': English Foxhound '945': bobsled '946': Japanese Spaniel '947': 106.horseshoe-crab '948': 255.tennis-shoes '949': ultimate '950': GREEN MAGPIE '951': RUDY KINGFISHER '952': AVADAVAT '953': RED WISKERED BULBUL '954': SCARLET FACED LIOCICHLA '955': RED KNOT '956': AZURE TIT '957': MAGPIE GOOSE '958': 245.windmill '959': CRESTED OROPENDOLA '960': WOOD DUCK '961': wheelchair basketball '962': CARMINE BEE-EATER '963': 110.hourglass '964': MIKADO PHEASANT '965': BLACK BREASTED PUFFBIRD '966': SNOWY PLOVER '967': Saluki '968': Silky Terrier '969': Great Dane '970': Bengal '971': RED FODY '972': CRESTED COUA '973': SORA '974': Rottweiler '975': CINNAMON ATTILA '976': 250.zebra '977': CALIFORNIA GULL '978': 077.french-horn '979': luge '980': CABOTS TRAGOPAN '981': NORTHERN GOSHAWK '982': hockey '983': Norwich Terrier '984': Scottish Deerhound '985': BAY-BREASTED WARBLER '986': INCA TERN '987': VISAYAN HORNBILL '988': chuckwagon racing '989': BANDED STILT '990': JACOBIN PIGEON '991': MANGROVE CUCKOO '992': pole dancing '993': BLACK SWAN '994': AFRICAN PYGMY GOOSE '995': KING EIDER '996': Leonberg '997': EMPEROR PENGUIN '998': Bluetick '999': LILAC ROLLER '1000': CUBAN TODY '1001': 159.people '1002': BALD EAGLE '1003': 075.floppy-disk '1004': PEACOCK '1005': ANDEAN LAPWING '1006': Irish Setter '1007': PARAKETT AUKLET '1008': FAIRY PENGUIN '1009': GREAT JACAMAR '1010': BLUE GROUSE '1011': 
EASTERN WIP POOR WILL '1012': FIORDLAND PENGUIN '1013': BLACK FRANCOLIN '1014': RED SHOULDERED HAWK '1015': CAPE MAY WARBLER '1016': 127.laptop-101 '1017': 148.mussels '1018': field hockey '1019': sailboat racing '1020': 028.camel '1021': shuffleboard '1022': HOOPOES '1023': 173.rifle '1024': AMERICAN COOT '1025': 066.ewer-101 '1026': SATYR TRAGOPAN '1027': 091.grand-piano-101 '1028': Cucumber '1029': CANVASBACK '1030': BLACK-NECKED GREBE '1031': BEARDED BELLBIRD '1032': 167.pyramid '1033': DUSKY LORY '1034': 137.mars '1035': GOLDEN PIPIT '1036': DUNLIN id: - 0 - 1 - 2 - 3 - 4 - 5 - 6 - 7 - 8 - 9 - 10 - 11 - 12 - 13 - 14 - 15 - 16 - 17 - 18 - 19 - 20 - 21 - 22 - 23 - 24 - 25 - 26 - 27 - 28 - 29 - 30 - 31 - 32 - 33 - 34 - 35 - 36 - 37 - 38 - 39 - 40 - 41 - 42 - 43 - 44 - 45 - 46 - 47 - 48 - 49 - 50 - 51 - 52 - 53 - 54 - 55 - 56 - 57 - 58 - 59 - 60 - 61 - 62 - 63 - 64 - 65 - 66 - 67 - 68 - 69 - 70 - 71 - 72 - 73 - 74 - 75 - 76 - 77 - 78 - 79 - 80 - 81 - 82 - 83 - 84 - 85 - 86 - 87 - 88 - 89 - 90 - 91 - 92 - 93 - 94 - 95 - 96 - 97 - 98 - 99 - 100 - 101 - 102 - 103 - 104 - 105 - 106 - 107 - 108 - 109 - 110 - 111 - 112 - 113 - 114 - 115 - 116 - 117 - 118 - 119 - 120 - 121 - 122 - 123 - 124 - 125 - 126 - 127 - 128 - 129 - 130 - 131 - 132 - 133 - 134 - 135 - 136 - 137 - 138 - 139 - 140 - 141 - 142 - 143 - 144 - 145 - 146 - 147 - 148 - 149 - 150 - 151 - 152 - 153 - 154 - 155 - 156 - 157 - 158 - 159 - 160 - 161 - 162 - 163 - 164 - 165 - 166 - 167 - 168 - 169 - 170 - 171 - 172 - 173 - 174 - 175 - 176 - 177 - 178 - 179 - 180 - 181 - 182 - 183 - 184 - 185 - 186 - 187 - 188 - 189 - 190 - 191 - 192 - 193 - 194 - 195 - 196 - 197 - 198 - 199 - 200 - 201 - 202 - 203 - 204 - 205 - 206 - 207 - 208 - 209 - 210 - 211 - 212 - 213 - 214 - 215 - 216 - 217 - 218 - 219 - 220 - 221 - 222 - 223 - 224 - 225 - 226 - 227 - 228 - 229 - 230 - 231 - 232 - 233 - 234 - 235 - 236 - 237 - 238 - 239 - 240 - 241 - 242 - 243 - 244 - 245 - 246 - 247 - 248 - 249 - 250 - 251 - 252 - 253 - 254 - 255 - 256 - 
257 - 258 - 259 - 260 - 261 - 262 - 263 - 264 - 265 - 266 - 267 - 268 - 269 - 270 - 271 - 272 - 273 - 274 - 275 - 276 - 277 - 278 - 279 - 280 - 281 - 282 - 283 - 284 - 285 - 286 - 287 - 288 - 289 - 290 - 291 - 292 - 293 - 294 - 295 - 296 - 297 - 298 - 299 - 300 - 301 - 302 - 303 - 304 - 305 - 306 - 307 - 308 - 309 - 310 - 311 - 312 - 313 - 314 - 315 - 316 - 317 - 318 - 319 - 320 - 321 - 322 - 323 - 324 - 325 - 326 - 327 - 328 - 329 - 330 - 331 - 332 - 333 - 334 - 335 - 336 - 337 - 338 - 339 - 340 - 341 - 342 - 343 - 344 - 345 - 346 - 347 - 348 - 349 - 350 - 351 - 352 - 353 - 354 - 355 - 356 - 357 - 358 - 359 - 360 - 361 - 362 - 363 - 364 - 365 - 366 - 367 - 368 - 369 - 370 - 371 - 372 - 373 - 374 - 375 - 376 - 377 - 378 - 379 - 380 - 381 - 382 - 383 - 384 - 385 - 386 - 387 - 388 - 389 - 390 - 391 - 392 - 393 - 394 - 395 - 396 - 397 - 398 - 399 - 400 - 401 - 402 - 403 - 404 - 405 - 406 - 407 - 408 - 409 - 410 - 411 - 412 - 413 - 414 - 415 - 416 - 417 - 418 - 419 - 420 - 421 - 422 - 423 - 424 - 425 - 426 - 427 - 428 - 429 - 430 - 431 - 432 - 433 - 434 - 435 - 436 - 437 - 438 - 439 - 440 - 441 - 442 - 443 - 444 - 445 - 446 - 447 - 448 - 449 - 450 - 451 - 452 - 453 - 454 - 455 - 456 - 457 - 458 - 459 - 460 - 461 - 462 - 463 - 464 - 465 - 466 - 467 - 468 - 469 - 470 - 471 - 472 - 473 - 474 - 475 - 476 - 477 - 478 - 479 - 480 - 481 - 482 - 483 - 484 - 485 - 486 - 487 - 488 - 489 - 490 - 491 - 492 - 493 - 494 - 495 - 496 - 497 - 498 - 499 - 500 - 501 - 502 - 503 - 504 - 505 - 506 - 507 - 508 - 509 - 510 - 511 - 512 - 513 - 514 - 515 - 516 - 517 - 518 - 519 - 520 - 521 - 522 - 523 - 524 - 525 - 526 - 527 - 528 - 529 - 530 - 531 - 532 - 533 - 534 - 535 - 536 - 537 - 538 - 539 - 540 - 541 - 542 - 543 - 544 - 545 - 546 - 547 - 548 - 549 - 550 - 551 - 552 - 553 - 554 - 555 - 556 - 557 - 558 - 559 - 560 - 561 - 562 - 563 - 564 - 565 - 566 - 567 - 568 - 569 - 570 - 571 - 572 - 573 - 574 - 575 - 576 - 577 - 578 - 579 - 580 - 581 - 582 - 583 - 584 - 585 - 586 - 587 - 588 - 589 - 
590 - 591 - 592 - 593 - 594 - 595 - 596 - 597 - 598 - 599 - 600 - 601 - 602 - 603 - 604 - 605 - 606 - 607 - 608 - 609 - 610 - 611 - 612 - 613 - 614 - 615 - 616 - 617 - 618 - 619 - 620 - 621 - 622 - 623 - 624 - 625 - 626 - 627 - 628 - 629 - 630 - 631 - 632 - 633 - 634 - 635 - 636 - 637 - 638 - 639 - 640 - 641 - 642 - 643 - 644 - 645 - 646 - 647 - 648 - 649 - 650 - 651 - 652 - 653 - 654 - 655 - 656 - 657 - 658 - 659 - 660 - 661 - 662 - 663 - 664 - 665 - 666 - 667 - 668 - 669 - 670 - 671 - 672 - 673 - 674 - 675 - 676 - 677 - 678 - 679 - 680 - 681 - 682 - 683 - 684 - 685 - 686 - 687 - 688 - 689 - 690 - 691 - 692 - 693 - 694 - 695 - 696 - 697 - 698 - 699 - 700 - 701 - 702 - 703 - 704 - 705 - 706 - 707 - 708 - 709 - 710 - 711 - 712 - 713 - 714 - 715 - 716 - 717 - 718 - 719 - 720 - 721 - 722 - 723 - 724 - 725 - 726 - 727 - 728 - 729 - 730 - 731 - 732 - 733 - 734 - 735 - 736 - 737 - 738 - 739 - 740 - 741 - 742 - 743 - 744 - 745 - 746 - 747 - 748 - 749 - 750 - 751 - 752 - 753 - 754 - 755 - 756 - 757 - 758 - 759 - 760 - 761 - 762 - 763 - 764 - 765 - 766 - 767 - 768 - 769 - 770 - 771 - 772 - 773 - 774 - 775 - 776 - 777 - 778 - 779 - 780 - 781 - 782 - 783 - 784 - 785 - 786 - 787 - 788 - 789 - 790 - 791 - 792 - 793 - 794 - 795 - 796 - 797 - 798 - 799 - 800 - 801 - 802 - 803 - 804 - 805 - 806 - 807 - 808 - 809 - 810 - 811 - 812 - 813 - 814 - 815 - 816 - 817 - 818 - 819 - 820 - 821 - 822 - 823 - 824 - 825 - 826 - 827 - 828 - 829 - 830 - 831 - 832 - 833 - 834 - 835 - 836 - 837 - 838 - 839 - 840 - 841 - 842 - 843 - 844 - 845 - 846 - 847 - 848 - 849 - 850 - 851 - 852 - 853 - 854 - 855 - 856 - 857 - 858 - 859 - 860 - 861 - 862 - 863 - 864 - 865 - 866 - 867 - 868 - 869 - 870 - 871 - 872 - 873 - 874 - 875 - 876 - 877 - 878 - 879 - 880 - 881 - 882 - 883 - 884 - 885 - 886 - 887 - 888 - 889 - 890 - 891 - 892 - 893 - 894 - 895 - 896 - 897 - 898 - 899 - 900 - 901 - 902 - 903 - 904 - 905 - 906 - 907 - 908 - 909 - 910 - 911 - 912 - 913 - 914 - 915 - 916 - 917 - 918 - 919 - 920 - 921 - 922 - 
923 - 924 - 925 - 926 - 927 - 928 - 929 - 930 - 931 - 932 - 933 - 934 - 935 - 936 - 937 - 938 - 939 - 940 - 941 - 942 - 943 - 944 - 945 - 946 - 947 - 948 - 949 - 950 - 951 - 952 - 953 - 954 - 955 - 956 - 957 - 958 - 959 - 960 - 961 - 962 - 963 - 964 - 965 - 966 - 967 - 968 - 969 - 970 - 971 - 972 - 973 - 974 - 975 - 976 - 977 - 978 - 979 - 980 - 981 - 982 - 983 - 984 - 985 - 986 - 987 - 988 - 989 - 990 - 991 - 992 - 993 - 994 - 995 - 996 - 997 - 998 - 999 - 1000 - 1001 - 1002 - 1003 - 1004 - 1005 - 1006 - 1007 - 1008 - 1009 - 1010 - 1011 - 1012 - 1013 - 1014 - 1015 - 1016 - 1017 - 1018 - 1019 - 1020 - 1021 - 1022 - 1023 - 1024 - 1025 - 1026 - 1027 - 1028 - 1029 - 1030 - 1031 - 1032 - 1033 - 1034 - 1035 - 1036 splits: - name: train num_bytes: 4022424722.3025527 num_examples: 150871 - name: test num_bytes: 1015738620.1934471 num_examples: 37718 download_size: 5159857645 dataset_size: 5038163342.496 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* ---
christinacdl/hate_speech_dataset_new
--- license: apache-2.0 task_categories: - text-classification language: - en --- - 44.246 texts in total, 21.493 NOT hateful texts and 22.753 HATE texts - All duplicate values were removed - Split using sklearn into 80% train and 20% temporary test (stratified by label). Then split the temporary set 50/50 into test and validation (stratified by label) - Split: 80/10/10 - Train set label distribution: 0 ==> 17.194, 1 ==> 18.202, 35.396 in total - Validation set label distribution: 0 ==> 2.150, 1 ==> 2.275, 4.425 in total - Test set label distribution: 0 ==> 2.149, 1 ==> 2.276, 4.425 in total - Combination of 6 publicly available datasets: - 1. "Ethos" dataset (Mollas et al., 2022) - 2. Anatomy of Online Hate: Developing a Taxonomy and Machine Learning Models for Identifying and Classifying Hate in Online News Media (Salminen et al., 2018) - 3. A Benchmark Dataset for Learning to Intervene in Online Hate Speech (Qian et al., 2019) - 4. Automated Hate Speech Detection and the Problem of Offensive Language (Davidson et al., 2017) - 5. HatEval (Basile et al., 2019), SemEval-2019 Task 5 - 6. "Hate Towards the Political Opponent" (Grimminger et al., 2021)
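The split procedure above (80% train, then the remaining 20% halved into validation and test, stratified by label) can be sketched in pure Python. The card used sklearn's `train_test_split`; the function name and toy data below are illustrative:

```python
from collections import defaultdict
import random

def stratified_80_10_10(rows, label_of, seed=0):
    """Split rows into train/val/test (80/10/10), keeping each
    class's label ratio, as described on the card."""
    by_label = defaultdict(list)
    for r in rows:
        by_label[label_of(r)].append(r)
    rng = random.Random(seed)
    train, val, test = [], [], []
    for _, group in by_label.items():
        rng.shuffle(group)
        n = len(group)
        n_train = int(n * 0.8)
        n_val = (n - n_train) // 2  # half of the remaining 20%
        train += group[:n_train]
        val += group[n_train:n_train + n_val]
        test += group[n_train + n_val:]
    return train, val, test

rows = [("text%d" % i, i % 2) for i in range(100)]  # 50/50 toy labels
train, val, test = stratified_80_10_10(rows, label_of=lambda r: r[1])
print(len(train), len(val), len(test))  # 80 10 10
```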
bh8648/split_dataset_6
--- dataset_info: features: - name: instruction dtype: string - name: output dtype: string - name: page_num dtype: int64 splits: - name: train num_bytes: 701891 num_examples: 212 download_size: 342602 dataset_size: 701891 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "split_dataset_6" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
coeuslearning/product_ads
--- dataset_info: features: - name: name dtype: string - name: description dtype: string - name: ad dtype: string splits: - name: train num_bytes: 5006 num_examples: 25 download_size: 6203 dataset_size: 5006 license: openrail task_categories: - text-generation language: - en tags: - art pretty_name: Product Ads size_categories: - n<1K --- # Dataset Card for "product_ads" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
farmaieu/plantorgans
--- license: cdla-permissive-2.0 size_categories: - 1K<n<10K task_categories: - image-segmentation tags: - biology dataset_info: features: - name: image dtype: image - name: label dtype: image splits: - name: train num_bytes: 9121146572.05 num_examples: 5745 - name: validation num_bytes: 2367801100.383 num_examples: 1437 download_size: 11607836195 dataset_size: 11488947672.432999 configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* language: - en pretty_name: plant organs --- # PLANT ORGANS Photos of various plants with their major above-ground organs labeled. Includes labels for stems, leaves, fruits and flowers. Note that the categories listed above do not necessarily correspond to the correct botanical term for the given part of the plant photographed; instead, they follow the conventional understanding of these terms. # ID - Label Map The following table describes the pixel values corresponding to the labels in the provided masks. The first label, "void", represents the background.

| Index | Label |
|-------|-------|
| 0 | void |
| 1 | Fruit |
| 2 | Leaf |
| 3 | Flower |
| 4 | Stem |
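The index-label map above is enough to summarize which organs a mask contains. A minimal sketch on a toy mask (real masks come from the `label` image column; the helper name and toy values are illustrative):

```python
# Pixel value -> label, per the ID - Label Map above.
ID2LABEL = {0: "void", 1: "Fruit", 2: "Leaf", 3: "Flower", 4: "Stem"}

def organs_present(mask):
    """Return the sorted organ labels present in a mask
    (a 2D list of pixel values), ignoring the background."""
    values = {v for row in mask for v in row}
    return sorted(ID2LABEL[v] for v in values if v != 0)

toy_mask = [
    [0, 0, 2, 2],
    [0, 4, 2, 2],
    [0, 4, 0, 1],
    [0, 4, 0, 0],
]
print(organs_present(toy_mask))  # ['Fruit', 'Leaf', 'Stem']
```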
Ubaidbhat/StockInvestingForDummies
--- dataset_info: features: - name: context dtype: string - name: question dtype: string - name: answer dtype: string - name: source_doc dtype: string - name: groundedness_score dtype: float64 - name: groundedness_eval dtype: string - name: relevance_score dtype: float64 - name: relevance_eval dtype: string splits: - name: train num_bytes: 1460656 num_examples: 791 download_size: 594609 dataset_size: 1460656 configs: - config_name: default data_files: - split: train path: data/train-* ---
carlosejimenez/seq2seq-cnndm-tokenized
--- dataset_info: features: - name: input_ids sequence: int32 - name: attention_mask sequence: int8 - name: labels sequence: int64 splits: - name: train num_bytes: 883654818 num_examples: 287113 - name: validation num_bytes: 41790866 num_examples: 13368 - name: test num_bytes: 35615246 num_examples: 11490 download_size: 359302108 dataset_size: 961060930 --- # Dataset Card for "seq2seq-cnndm-tokenized" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
saibo/bookcorpus_deduplicated
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 2867856394 num_examples: 38832894 download_size: 1794567875 dataset_size: 2867856394 --- # Dataset Card for "bookcorpus_deduplicated" ## Dataset Summary This is a deduplicated version of the original [Book Corpus dataset](https://huggingface.co/datasets/bookcorpus). The Book Corpus (Zhu et al., 2015), which was used to train popular models such as BERT, has a substantial amount of exact-duplicate documents according to [Bandy and Vincent (2021)](https://arxiv.org/abs/2105.05241). [Bandy and Vincent (2021)](https://arxiv.org/abs/2105.05241) find that thousands of books in BookCorpus are duplicated, with only 7,185 unique books out of 11,038 total. Effect of deduplication: - Number of lines: 38,832,894 vs. 74,004,228 - Dataset size: 2.91 GB vs. 4.63 GB Duplicate texts have been dropped and only the first appearance of each is kept; the order of appearance is preserved. ## Why deduplicate? Deduplication of training data has shown various advantages, including: - requiring fewer training steps to achieve the same or better accuracy - training models that emit memorized text ten times less frequently - reducing carbon emission and energy consumption cf. [Deduplicating Training Data Makes Language Models Better](https://arxiv.org/abs/2107.06499) ## Deduplication script

```python
import pandas as pd
from datasets import load_dataset

dataset = load_dataset("bookcorpus")["train"]["text"]
df = pd.DataFrame({"text": dataset})
# drop duplicates (exact match), keeping the first occurrence
df_filtered = df["text"].drop_duplicates()
df_filtered.to_csv("bookcorpus_filtered.csv", index=False, header=False)
new_dataset = load_dataset("text", data_files={"train": "bookcorpus_filtered.csv"})
```

The running time is short, less than several minutes.
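The same first-occurrence policy can also be expressed without pandas; a minimal order-preserving sketch on a toy corpus (names are illustrative):

```python
def dedup_keep_first(lines):
    """Drop exact-duplicate lines, keeping the first occurrence
    and preserving the original order -- the same policy used
    for this dataset."""
    seen = set()
    out = []
    for line in lines:
        if line not in seen:
            seen.add(line)
            out.append(line)
    return out

corpus = ["a book .", "another book .", "a book .", "a third book ."]
print(dedup_keep_first(corpus))
# ['a book .', 'another book .', 'a third book .']
```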
More sophisticated deduplication algorithms can be applied to improve performance, such as https://github.com/google-research/deduplicate-text-datasets

## Reference

```bib
@misc{https://doi.org/10.48550/arxiv.2105.05241,
  doi = {10.48550/ARXIV.2105.05241},
  url = {https://arxiv.org/abs/2105.05241},
  author = {Bandy, Jack and Vincent, Nicholas},
  keywords = {Computation and Language (cs.CL), Computers and Society (cs.CY), Machine Learning (cs.LG), FOS: Computer and information sciences},
  title = {Addressing "Documentation Debt" in Machine Learning Research: A Retrospective Datasheet for BookCorpus},
  publisher = {arXiv},
  year = {2021},
  copyright = {arXiv.org perpetual, non-exclusive license}
}
```

```bib
@misc{https://doi.org/10.48550/arxiv.2107.06499,
  doi = {10.48550/ARXIV.2107.06499},
  url = {https://arxiv.org/abs/2107.06499},
  author = {Lee, Katherine and Ippolito, Daphne and Nystrom, Andrew and Zhang, Chiyuan and Eck, Douglas and Callison-Burch, Chris and Carlini, Nicholas},
  keywords = {Computation and Language (cs.CL), Machine Learning (cs.LG), FOS: Computer and information sciences},
  title = {Deduplicating Training Data Makes Language Models Better},
  publisher = {arXiv},
  year = {2021},
  copyright = {arXiv.org perpetual, non-exclusive license}
}
```

```bib
@misc{https://doi.org/10.48550/arxiv.2209.00099,
  doi = {10.48550/ARXIV.2209.00099},
  url = {https://arxiv.org/abs/2209.00099},
  author = {Treviso, Marcos and Ji, Tianchu and Lee, Ji-Ung and van Aken, Betty and Cao, Qingqing and Ciosici, Manuel R. and Hassid, Michael and Heafield, Kenneth and Hooker, Sara and Martins, Pedro H. and Martins, André F. T. and Milder, Peter and Raffel, Colin and Simpson, Edwin and Slonim, Noam and Balasubramanian, Niranjan and Derczynski, Leon and Schwartz, Roy},
  keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences},
  title = {Efficient Methods for Natural Language Processing: A Survey},
  publisher = {arXiv},
  year = {2022},
  copyright = {arXiv.org perpetual, non-exclusive license}
}
```

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
FarhatMay/coco_celeba
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: image dtype: image - name: text dtype: string splits: - name: train num_bytes: 33345997.0 num_examples: 224 download_size: 33248044 dataset_size: 33345997.0 --- # Dataset Card for "coco_celeba" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
irds/cranfield
--- pretty_name: '`cranfield`' viewer: false source_datasets: [] task_categories: - text-retrieval --- # Dataset Card for `cranfield` The `cranfield` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package. For more information about the dataset, see the [documentation](https://ir-datasets.com/cranfield#cranfield). # Data This dataset provides: - `docs` (documents, i.e., the corpus); count=1,400 - `queries` (i.e., topics); count=225 - `qrels` (relevance assessments); count=1,837 ## Usage

```python
from datasets import load_dataset

docs = load_dataset('irds/cranfield', 'docs')
for record in docs:
    record  # {'doc_id': ..., 'title': ..., 'text': ..., 'author': ..., 'bib': ...}

queries = load_dataset('irds/cranfield', 'queries')
for record in queries:
    record  # {'query_id': ..., 'text': ...}

qrels = load_dataset('irds/cranfield', 'qrels')
for record in qrels:
    record  # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```

Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the data in 🤗 Dataset format.
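Once loaded, qrels records are often regrouped by query. A small sketch over toy records shaped like the `qrels` fields above (the helper name, toy values, and the relevance cutoff are illustrative):

```python
from collections import defaultdict

def qrels_by_query(qrels, min_relevance=1):
    """Group qrels into {query_id: set(doc_id)}, keeping only
    judgments at or above min_relevance."""
    relevant = defaultdict(set)
    for rec in qrels:
        if rec["relevance"] >= min_relevance:
            relevant[rec["query_id"]].add(rec["doc_id"])
    return dict(relevant)

# Toy records with the same fields as the qrels config above.
toy_qrels = [
    {"query_id": "1", "doc_id": "13", "relevance": 2, "iteration": "0"},
    {"query_id": "1", "doc_id": "184", "relevance": 1, "iteration": "0"},
    {"query_id": "2", "doc_id": "12", "relevance": -1, "iteration": "0"},
]
print(sorted(qrels_by_query(toy_qrels)["1"]))  # ['13', '184']
```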
newguyme/flir_paired
--- dataset_info: features: - name: rgb dtype: image - name: ir dtype: image splits: - name: train num_bytes: 743157232.6539416 num_examples: 4113 - name: test num_bytes: 186676141.08705837 num_examples: 1029 download_size: 928212503 dataset_size: 929833373.7409999 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* --- For modified FLIR Usage:

```py
from torchvision import transforms
from datasets import load_dataset

dataset_name = "newguyme/flir_paired"
dataset_flir_paired = load_dataset(dataset_name, split="train", use_auth_token=True)

# config.image_size: the target resolution, defined elsewhere in your training config
flir_preprocess = transforms.Compose(
    [
        transforms.Resize((config.image_size, config.image_size)),
        # transforms.RandomHorizontalFlip(),
        transforms.ToTensor(),
        transforms.Normalize([0.5], [0.5]),
    ]
)

def flir_transform(examples):
    rgb = [flir_preprocess(image.convert("RGB")) for image in examples["rgb"]]
    ir = [flir_preprocess(image.convert("RGB")) for image in examples["ir"]]
    return {"rgb_t_3ch": rgb, "ir_t_3ch": ir}

dataset_flir_paired.set_transform(flir_transform)
```
Kelvin878/gc10_det
--- dataset_info: features: - name: image dtype: image - name: guide dtype: image - name: text dtype: string splits: - name: train num_bytes: 148251517.77 num_examples: 1595 download_size: 147032275 dataset_size: 148251517.77 configs: - config_name: default data_files: - split: train path: data/train-* ---
ctu-aic/qacg-sk
--- dataset_info: - config_name: balanced features: - name: claim dtype: string - name: label dtype: string - name: evidence sequence: string splits: - name: train num_bytes: 28754278 num_examples: 295209 - name: validation num_bytes: 2930142 num_examples: 30087 - name: test num_bytes: 2759646 num_examples: 28440 download_size: 24238831 dataset_size: 34444066 - config_name: balanced_shuf features: - name: claim dtype: string - name: label dtype: string - name: evidence sequence: string splits: - name: train num_bytes: 17618787 num_examples: 183485 - name: validation num_bytes: 1798751 num_examples: 18750 - name: test num_bytes: 1699031 num_examples: 17727 download_size: 14655644 dataset_size: 21116569 configs: - config_name: balanced data_files: - split: train path: balanced/train-* - split: validation path: balanced/validation-* - split: test path: balanced/test-* - config_name: balanced_shuf data_files: - split: train path: balanced_shuf/train-* - split: validation path: balanced_shuf/validation-* - split: test path: balanced_shuf/test-* ---
sunholee1217/golf
--- license: mit ---
mtek2000/hausa_newsclass_topic
--- license: mit ---
BangumiBase/vanitasnokarte
--- license: mit tags: - art size_categories: - 1K<n<10K --- # Bangumi Image Base of Vanitas No Karte This is the image base of the bangumi Vanitas no Karte; we detected 31 characters and 2212 images in total. The full dataset is [here](all.zip). **Please note that these image bases are not guaranteed to be 100% cleaned; they may still contain noise.** If you intend to manually train models using this dataset, we recommend performing necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability). Here is the characters' preview: | # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 | |:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------| | 0 | 446 | [Download](0/dataset.zip) | ![preview 1](0/preview_1.png) | ![preview 2](0/preview_2.png) | ![preview 3](0/preview_3.png) | ![preview 4](0/preview_4.png) | ![preview 5](0/preview_5.png) | ![preview 6](0/preview_6.png) | ![preview 7](0/preview_7.png) | ![preview 8](0/preview_8.png) | | 1 | 58 | [Download](1/dataset.zip) | ![preview 1](1/preview_1.png) | ![preview 2](1/preview_2.png) | ![preview 3](1/preview_3.png) | ![preview 4](1/preview_4.png) | ![preview 5](1/preview_5.png) | ![preview 6](1/preview_6.png) | ![preview 7](1/preview_7.png) | ![preview 8](1/preview_8.png) | | 2 | 47 | [Download](2/dataset.zip) | ![preview 1](2/preview_1.png) | ![preview 2](2/preview_2.png) | ![preview 3](2/preview_3.png) | ![preview 4](2/preview_4.png) | ![preview 5](2/preview_5.png) | ![preview 6](2/preview_6.png) | ![preview 7](2/preview_7.png) | ![preview 8](2/preview_8.png) | | 3 | 21 | [Download](3/dataset.zip) | ![preview 1](3/preview_1.png) | ![preview
2](3/preview_2.png) | ![preview 3](3/preview_3.png) | ![preview 4](3/preview_4.png) | ![preview 5](3/preview_5.png) | ![preview 6](3/preview_6.png) | ![preview 7](3/preview_7.png) | ![preview 8](3/preview_8.png) | | 4 | 20 | [Download](4/dataset.zip) | ![preview 1](4/preview_1.png) | ![preview 2](4/preview_2.png) | ![preview 3](4/preview_3.png) | ![preview 4](4/preview_4.png) | ![preview 5](4/preview_5.png) | ![preview 6](4/preview_6.png) | ![preview 7](4/preview_7.png) | ![preview 8](4/preview_8.png) | | 5 | 31 | [Download](5/dataset.zip) | ![preview 1](5/preview_1.png) | ![preview 2](5/preview_2.png) | ![preview 3](5/preview_3.png) | ![preview 4](5/preview_4.png) | ![preview 5](5/preview_5.png) | ![preview 6](5/preview_6.png) | ![preview 7](5/preview_7.png) | ![preview 8](5/preview_8.png) | | 6 | 102 | [Download](6/dataset.zip) | ![preview 1](6/preview_1.png) | ![preview 2](6/preview_2.png) | ![preview 3](6/preview_3.png) | ![preview 4](6/preview_4.png) | ![preview 5](6/preview_5.png) | ![preview 6](6/preview_6.png) | ![preview 7](6/preview_7.png) | ![preview 8](6/preview_8.png) | | 7 | 14 | [Download](7/dataset.zip) | ![preview 1](7/preview_1.png) | ![preview 2](7/preview_2.png) | ![preview 3](7/preview_3.png) | ![preview 4](7/preview_4.png) | ![preview 5](7/preview_5.png) | ![preview 6](7/preview_6.png) | ![preview 7](7/preview_7.png) | ![preview 8](7/preview_8.png) | | 8 | 13 | [Download](8/dataset.zip) | ![preview 1](8/preview_1.png) | ![preview 2](8/preview_2.png) | ![preview 3](8/preview_3.png) | ![preview 4](8/preview_4.png) | ![preview 5](8/preview_5.png) | ![preview 6](8/preview_6.png) | ![preview 7](8/preview_7.png) | ![preview 8](8/preview_8.png) | | 9 | 42 | [Download](9/dataset.zip) | ![preview 1](9/preview_1.png) | ![preview 2](9/preview_2.png) | ![preview 3](9/preview_3.png) | ![preview 4](9/preview_4.png) | ![preview 5](9/preview_5.png) | ![preview 6](9/preview_6.png) | ![preview 7](9/preview_7.png) | ![preview 8](9/preview_8.png) | | 10 | 16 | 
[Download](10/dataset.zip) | ![preview 1](10/preview_1.png) | ![preview 2](10/preview_2.png) | ![preview 3](10/preview_3.png) | ![preview 4](10/preview_4.png) | ![preview 5](10/preview_5.png) | ![preview 6](10/preview_6.png) | ![preview 7](10/preview_7.png) | ![preview 8](10/preview_8.png) | | 11 | 11 | [Download](11/dataset.zip) | ![preview 1](11/preview_1.png) | ![preview 2](11/preview_2.png) | ![preview 3](11/preview_3.png) | ![preview 4](11/preview_4.png) | ![preview 5](11/preview_5.png) | ![preview 6](11/preview_6.png) | ![preview 7](11/preview_7.png) | ![preview 8](11/preview_8.png) | | 12 | 46 | [Download](12/dataset.zip) | ![preview 1](12/preview_1.png) | ![preview 2](12/preview_2.png) | ![preview 3](12/preview_3.png) | ![preview 4](12/preview_4.png) | ![preview 5](12/preview_5.png) | ![preview 6](12/preview_6.png) | ![preview 7](12/preview_7.png) | ![preview 8](12/preview_8.png) | | 13 | 38 | [Download](13/dataset.zip) | ![preview 1](13/preview_1.png) | ![preview 2](13/preview_2.png) | ![preview 3](13/preview_3.png) | ![preview 4](13/preview_4.png) | ![preview 5](13/preview_5.png) | ![preview 6](13/preview_6.png) | ![preview 7](13/preview_7.png) | ![preview 8](13/preview_8.png) | | 14 | 12 | [Download](14/dataset.zip) | ![preview 1](14/preview_1.png) | ![preview 2](14/preview_2.png) | ![preview 3](14/preview_3.png) | ![preview 4](14/preview_4.png) | ![preview 5](14/preview_5.png) | ![preview 6](14/preview_6.png) | ![preview 7](14/preview_7.png) | ![preview 8](14/preview_8.png) | | 15 | 481 | [Download](15/dataset.zip) | ![preview 1](15/preview_1.png) | ![preview 2](15/preview_2.png) | ![preview 3](15/preview_3.png) | ![preview 4](15/preview_4.png) | ![preview 5](15/preview_5.png) | ![preview 6](15/preview_6.png) | ![preview 7](15/preview_7.png) | ![preview 8](15/preview_8.png) | | 16 | 67 | [Download](16/dataset.zip) | ![preview 1](16/preview_1.png) | ![preview 2](16/preview_2.png) | ![preview 3](16/preview_3.png) | ![preview 4](16/preview_4.png) | 
![preview 5](16/preview_5.png) | ![preview 6](16/preview_6.png) | ![preview 7](16/preview_7.png) | ![preview 8](16/preview_8.png) | | 17 | 94 | [Download](17/dataset.zip) | ![preview 1](17/preview_1.png) | ![preview 2](17/preview_2.png) | ![preview 3](17/preview_3.png) | ![preview 4](17/preview_4.png) | ![preview 5](17/preview_5.png) | ![preview 6](17/preview_6.png) | ![preview 7](17/preview_7.png) | ![preview 8](17/preview_8.png) | | 18 | 40 | [Download](18/dataset.zip) | ![preview 1](18/preview_1.png) | ![preview 2](18/preview_2.png) | ![preview 3](18/preview_3.png) | ![preview 4](18/preview_4.png) | ![preview 5](18/preview_5.png) | ![preview 6](18/preview_6.png) | ![preview 7](18/preview_7.png) | ![preview 8](18/preview_8.png) | | 19 | 64 | [Download](19/dataset.zip) | ![preview 1](19/preview_1.png) | ![preview 2](19/preview_2.png) | ![preview 3](19/preview_3.png) | ![preview 4](19/preview_4.png) | ![preview 5](19/preview_5.png) | ![preview 6](19/preview_6.png) | ![preview 7](19/preview_7.png) | ![preview 8](19/preview_8.png) | | 20 | 19 | [Download](20/dataset.zip) | ![preview 1](20/preview_1.png) | ![preview 2](20/preview_2.png) | ![preview 3](20/preview_3.png) | ![preview 4](20/preview_4.png) | ![preview 5](20/preview_5.png) | ![preview 6](20/preview_6.png) | ![preview 7](20/preview_7.png) | ![preview 8](20/preview_8.png) | | 21 | 9 | [Download](21/dataset.zip) | ![preview 1](21/preview_1.png) | ![preview 2](21/preview_2.png) | ![preview 3](21/preview_3.png) | ![preview 4](21/preview_4.png) | ![preview 5](21/preview_5.png) | ![preview 6](21/preview_6.png) | ![preview 7](21/preview_7.png) | ![preview 8](21/preview_8.png) | | 22 | 40 | [Download](22/dataset.zip) | ![preview 1](22/preview_1.png) | ![preview 2](22/preview_2.png) | ![preview 3](22/preview_3.png) | ![preview 4](22/preview_4.png) | ![preview 5](22/preview_5.png) | ![preview 6](22/preview_6.png) | ![preview 7](22/preview_7.png) | ![preview 8](22/preview_8.png) | | 23 | 55 | [Download](23/dataset.zip) 
| ![preview 1](23/preview_1.png) | ![preview 2](23/preview_2.png) | ![preview 3](23/preview_3.png) | ![preview 4](23/preview_4.png) | ![preview 5](23/preview_5.png) | ![preview 6](23/preview_6.png) | ![preview 7](23/preview_7.png) | ![preview 8](23/preview_8.png) | | 24 | 39 | [Download](24/dataset.zip) | ![preview 1](24/preview_1.png) | ![preview 2](24/preview_2.png) | ![preview 3](24/preview_3.png) | ![preview 4](24/preview_4.png) | ![preview 5](24/preview_5.png) | ![preview 6](24/preview_6.png) | ![preview 7](24/preview_7.png) | ![preview 8](24/preview_8.png) | | 25 | 55 | [Download](25/dataset.zip) | ![preview 1](25/preview_1.png) | ![preview 2](25/preview_2.png) | ![preview 3](25/preview_3.png) | ![preview 4](25/preview_4.png) | ![preview 5](25/preview_5.png) | ![preview 6](25/preview_6.png) | ![preview 7](25/preview_7.png) | ![preview 8](25/preview_8.png) | | 26 | 32 | [Download](26/dataset.zip) | ![preview 1](26/preview_1.png) | ![preview 2](26/preview_2.png) | ![preview 3](26/preview_3.png) | ![preview 4](26/preview_4.png) | ![preview 5](26/preview_5.png) | ![preview 6](26/preview_6.png) | ![preview 7](26/preview_7.png) | ![preview 8](26/preview_8.png) | | 27 | 5 | [Download](27/dataset.zip) | ![preview 1](27/preview_1.png) | ![preview 2](27/preview_2.png) | ![preview 3](27/preview_3.png) | ![preview 4](27/preview_4.png) | ![preview 5](27/preview_5.png) | N/A | N/A | N/A | | 28 | 8 | [Download](28/dataset.zip) | ![preview 1](28/preview_1.png) | ![preview 2](28/preview_2.png) | ![preview 3](28/preview_3.png) | ![preview 4](28/preview_4.png) | ![preview 5](28/preview_5.png) | ![preview 6](28/preview_6.png) | ![preview 7](28/preview_7.png) | ![preview 8](28/preview_8.png) | | 29 | 10 | [Download](29/dataset.zip) | ![preview 1](29/preview_1.png) | ![preview 2](29/preview_2.png) | ![preview 3](29/preview_3.png) | ![preview 4](29/preview_4.png) | ![preview 5](29/preview_5.png) | ![preview 6](29/preview_6.png) | ![preview 7](29/preview_7.png) | ![preview 
8](29/preview_8.png) | | noise | 277 | [Download](-1/dataset.zip) | ![preview 1](-1/preview_1.png) | ![preview 2](-1/preview_2.png) | ![preview 3](-1/preview_3.png) | ![preview 4](-1/preview_4.png) | ![preview 5](-1/preview_5.png) | ![preview 6](-1/preview_6.png) | ![preview 7](-1/preview_7.png) | ![preview 8](-1/preview_8.png) |
Nicolas-BZRD/English_French_Songs_Lyrics_Translation_Original
--- license: unknown configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: artist_name dtype: string - name: album_name dtype: string - name: year dtype: int64 - name: title dtype: string - name: number dtype: int64 - name: original_version dtype: string - name: french_version dtype: string - name: language dtype: string splits: - name: train num_bytes: 250317845 num_examples: 99289 download_size: 122323323 dataset_size: 250317845 task_categories: - translation - text-generation language: - fr - en - es - it - de - ko - id - pt - 'no' - fi - sv - sw - hr - so - ca - tl - ja - nl - ru - et - tr - ro - cy - vi - af - hu - sk - sl - cs - da - pl - sq - el - he - zh - th - bg - ar tags: - music - parallel - parallel data pretty_name: SYFT size_categories: - 10K<n<100K --- # Original Songs Lyrics with French Translation ### Dataset Summary Dataset of 99289 songs containing their metadata (author, album, release date, song number), original lyrics and lyrics translated into French. 
Details of the number of songs by language of origin can be found in the table below:

| Original language | Number of songs |
|---|:---|
| en | 75786 |
| fr | 18486 |
| es | 1743 |
| it | 803 |
| de | 691 |
| sw | 529 |
| ko | 193 |
| id | 169 |
| pt | 142 |
| no | 122 |
| fi | 113 |
| sv | 70 |
| hr | 53 |
| so | 43 |
| ca | 41 |
| tl | 36 |
| ja | 35 |
| nl | 32 |
| ru | 29 |
| et | 27 |
| tr | 22 |
| ro | 19 |
| cy | 14 |
| vi | 14 |
| af | 13 |
| hu | 10 |
| sk | 10 |
| sl | 10 |
| cs | 7 |
| da | 6 |
| pl | 5 |
| sq | 4 |
| el | 4 |
| he | 3 |
| zh-cn | 2 |
| th | 1 |
| bg | 1 |
| ar | 1 |

## Citation

Our work can be cited as:

```bash
@misc{faysse2024croissantllm,
  title={CroissantLLM: A Truly Bilingual French-English Language Model},
  author={Manuel Faysse and Patrick Fernandes and Nuno Guerreiro and António Loison and Duarte Alves and Caio Corro and Nicolas Boizard and João Alves and Ricardo Rei and Pedro Martins and Antoni Bigata Casademunt and François Yvon and André Martins and Gautier Viaud and Céline Hudelot and Pierre Colombo},
  year={2024},
  eprint={2402.00786},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
```
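The per-language counts in the table above can be recomputed from the rows' `language` column; a small sketch over toy rows using the card's column names (values are illustrative):

```python
from collections import Counter

def songs_per_language(rows):
    """Count songs by the language of the original lyrics,
    sorted by decreasing frequency (as in the table above)."""
    counts = Counter(row["language"] for row in rows)
    return counts.most_common()

# Toy rows using the card's column names; values are illustrative.
toy_rows = [
    {"title": "Song A", "language": "en"},
    {"title": "Song B", "language": "en"},
    {"title": "Chanson C", "language": "fr"},
]
print(songs_per_language(toy_rows))  # [('en', 2), ('fr', 1)]
```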
liuyanchen1015/MULTI_VALUE_rte_fronting_pobj
--- dataset_info: features: - name: sentence1 dtype: string - name: sentence2 dtype: string - name: label dtype: int64 - name: idx dtype: int64 - name: value_score dtype: int64 splits: - name: test num_bytes: 650133 num_examples: 2022 - name: train num_bytes: 570848 num_examples: 1692 download_size: 789565 dataset_size: 1220981 --- # Dataset Card for "MULTI_VALUE_rte_fronting_pobj" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mrsteyk/openchatgpt-safe-r2
--- license: apache-2.0 task_categories: - conversational language: - en tags: - chatgpt - openai - gpt35-alpha --- I'm too lazy to fill in the dataset card template! Think of it like r1, but after NY - the timestamp is XX-01-2023. This is not turbo at this point; it was collected before the 26th. This must be "alpha", I'm 99% sure. It has the same problems; an additional one is that greetings are missing! The "NDA" stuff is missing from this as well!
wisenut-nlp-team/query-generation
--- dataset_info: features: - name: title dtype: string - name: question dtype: string - name: context sequence: string splits: - name: train num_bytes: 400062719 num_examples: 263295 - name: validation num_bytes: 99931566 num_examples: 65824 download_size: 304051369 dataset_size: 499994285 configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* --- # Dataset Card for "query-generation" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
one-sec-cv12/chunk_13
--- dataset_info: features: - name: audio dtype: audio: sampling_rate: 16000 splits: - name: train num_bytes: 22396952880.875 num_examples: 233185 download_size: 19489840324 dataset_size: 22396952880.875 --- # Dataset Card for "chunk_13" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
igorknez/clth_dset
--- license: afl-3.0 ---