Columns: `datasetId` (string, length 2–117), `card` (string, length 19–1.01M)
distilled-one-sec-cv12-each-chunk-uniq/chunk_185
--- dataset_info: features: - name: logits sequence: float32 - name: mfcc sequence: sequence: float64 splits: - name: train num_bytes: 748748536.0 num_examples: 145898 download_size: 766085530 dataset_size: 748748536.0 --- # Dataset Card for "chunk_185" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
ziozzang/deepl-trans-ID-KO
--- task_categories: - translation language: - ko - id --- This dataset contains Wikipedia articles auto-aggregated and translated with DeepL. # String/Corpus pairs From ID/Indonesian to KO/Korean. # Quality Filtering - Stripped all HTML tags. - Removed reference and annotation marks. - Filtered by string length. --- The strings/corpus were aggregated from wikipedia(pt) and translated with DeepL. All data collected by Jioh L. Jung<ziozzang@gmail.com> license: mit ---
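The quality-filtering steps listed in the card can be sketched roughly as follows. This is a minimal illustration, not the author's actual pipeline; the regex patterns and length bounds are assumptions.

```python
import re

def clean_and_filter(text, min_len=20, max_len=2000):
    """Apply the card's filters: strip HTML tags, remove
    reference/annotation marks, then filter by string length."""
    text = re.sub(r"<[^>]+>", "", text)   # strip whole HTML tags
    text = re.sub(r"\[\d+\]", "", text)   # drop reference marks like [1]
    text = text.strip()
    # keep only strings within the (assumed) length bounds
    return text if min_len <= len(text) <= max_len else None

print(clean_and_filter("<p>Jakarta adalah ibu kota Indonesia.[1]</p>"))
# -> Jakarta adalah ibu kota Indonesia.
```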
chronbmm/sanskrit-stemming-tagging-pali-long
--- dataset_info: features: - name: sentence dtype: string - name: unsandhied dtype: string splits: - name: train num_bytes: 525248615 num_examples: 1655728 - name: validation num_bytes: 1858678 num_examples: 3051 - name: test num_bytes: 1924834 num_examples: 3137 - name: test_long_500 num_bytes: 302454 num_examples: 500 - name: validation_long_500 num_bytes: 311042 num_examples: 500 download_size: 184746128 dataset_size: 529645623 --- # Dataset Card for "sanskrit-stemming-tagging-pali-long" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/be64ca95
--- dataset_info: features: - name: result dtype: string - name: id dtype: int64 splits: - name: train num_bytes: 182 num_examples: 10 download_size: 1331 dataset_size: 182 --- # Dataset Card for "be64ca95" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
KEEPYs/titou
--- license: openrail ---
irds/wikiclir_fr
--- pretty_name: '`wikiclir/fr`' viewer: false source_datasets: [] task_categories: - text-retrieval --- # Dataset Card for `wikiclir/fr` The `wikiclir/fr` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package. For more information about the dataset, see the [documentation](https://ir-datasets.com/wikiclir#wikiclir/fr). # Data This dataset provides: - `docs` (documents, i.e., the corpus); count=1,894,397 - `queries` (i.e., topics); count=1,089,179 - `qrels`: (relevance assessments); count=5,137,366 ## Usage ```python from datasets import load_dataset docs = load_dataset('irds/wikiclir_fr', 'docs') for record in docs: record # {'doc_id': ..., 'title': ..., 'text': ...} queries = load_dataset('irds/wikiclir_fr', 'queries') for record in queries: record # {'query_id': ..., 'text': ...} qrels = load_dataset('irds/wikiclir_fr', 'qrels') for record in qrels: record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...} ``` Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the data in 🤗 Dataset format. ## Citation Information ``` @inproceedings{sasaki-etal-2018-cross, title = "Cross-Lingual Learning-to-Rank with Shared Representations", author = "Sasaki, Shota and Sun, Shuo and Schamoni, Shigehiko and Duh, Kevin and Inui, Kentaro", booktitle = "Proceedings of the 2018 Conference of the North {A}merican Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers)", month = jun, year = "2018", address = "New Orleans, Louisiana", publisher = "Association for Computational Linguistics", url = "https://aclanthology.org/N18-2073", doi = "10.18653/v1/N18-2073", pages = "458--463" } ```
manucm/1
--- license: other ---
twodgirl/kimiko_v3
--- language: - en --- All conversations were generated by Mistral 7B. The indices come from the [Kimiko dataset](https://huggingface.co/datasets/Chat-Error/Kimiko_v3-v0.1).
2HW/llama2_dobe_01
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 3338910 num_examples: 6440 download_size: 752557 dataset_size: 3338910 configs: - config_name: default data_files: - split: train path: data/train-* ---
AdapterOcean/med_alpaca_standardized_cluster_50_std
--- dataset_info: features: - name: message dtype: string - name: message_type dtype: string - name: message_id dtype: int64 - name: conversation_id dtype: int64 - name: cluster dtype: float64 - name: __index_level_0__ dtype: int64 splits: - name: train num_bytes: 7133138 num_examples: 17928 download_size: 2600516 dataset_size: 7133138 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "med_alpaca_standardized_cluster_50_std" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
adityarra07/test_ds_noise2
--- dataset_info: features: - name: audio struct: - name: array sequence: float32 - name: path dtype: 'null' - name: sampling_rate dtype: int64 - name: transcription dtype: string - name: id dtype: string splits: - name: train num_bytes: 228091121.30052426 num_examples: 1000 download_size: 230405299 dataset_size: 228091121.30052426 --- # Dataset Card for "test_ds_noise2" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
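Each example follows the nested `audio` struct declared above. A hypothetical record showing how to reach the sample array and sampling rate (all field values here are invented for illustration):

```python
# A record shaped like the card's schema: `audio` is a struct holding
# a float32 sample sequence, a (null) path, and a sampling rate.
record = {
    "audio": {
        "array": [0.01, -0.02, 0.0, 0.03],  # hypothetical samples
        "path": None,
        "sampling_rate": 16000,
    },
    "transcription": "hello world",
    "id": "utt-0001",
}

samples = record["audio"]["array"]
duration_s = len(samples) / record["audio"]["sampling_rate"]
print(record["id"], duration_s)
```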
Against61/output_test
--- license: other ---
autoevaluate/autoeval-staging-eval-project-03e83e3b-2528-4e84-b075-34edd28549da-5755
--- type: predictions tags: - autotrain - evaluation datasets: - glue eval_info: task: natural_language_inference model: autoevaluate/natural-language-inference metrics: [] dataset_name: glue dataset_config: mrpc dataset_split: validation col_mapping: text1: sentence1 text2: sentence2 target: label --- # Dataset Card for AutoTrain Evaluator This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset: * Task: Natural Language Inference * Model: autoevaluate/natural-language-inference * Dataset: glue * Config: mrpc * Split: validation To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator). ## Contributions Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model.
huggingartists/oxxxymiron
--- language: - en tags: - huggingartists - lyrics --- # Dataset Card for "huggingartists/oxxxymiron" ## Table of Contents - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [How to use](#how-to-use) - [Dataset Structure](#dataset-structure) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [About](#about) ## Dataset Description - **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists) - **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists) - **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) - **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) - **Size of the generated dataset:** 2.070318 MB <div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: 
url(&#39;https://images.genius.com/57ecbbdaf70c671be2d8b7bd39112db0.1000x1000x1.jpg&#39;)"> </div> </div> <a href="https://huggingface.co/huggingartists/oxxxymiron"> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div> </a> <div style="text-align: center; font-size: 16px; font-weight: 800">Oxxxymiron</div> <a href="https://genius.com/artists/oxxxymiron"> <div style="text-align: center; font-size: 14px;">@oxxxymiron</div> </a> </div> ### Dataset Summary The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists. Model is available [here](https://huggingface.co/huggingartists/oxxxymiron). ### Supported Tasks and Leaderboards [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Languages en ## How to use How to load this dataset directly with the datasets library: ```python from datasets import load_dataset dataset = load_dataset("huggingartists/oxxxymiron") ``` ## Dataset Structure An example of 'train' looks as follows. ``` This example was too long and was cropped: { "text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..." } ``` ### Data Fields The data fields are the same among all splits. - `text`: a `string` feature. 
### Data Splits | train |validation|test| |------:|---------:|---:| |210| -| -| 'Train' can be easily divided into 'train' & 'validation' & 'test' with a few lines of code: ```python from datasets import load_dataset, Dataset, DatasetDict import numpy as np datasets = load_dataset("huggingartists/oxxxymiron") train_percentage = 0.9 validation_percentage = 0.07 test_percentage = 0.03 train, validation, test = np.split(datasets['train']['text'], [int(len(datasets['train']['text'])*train_percentage), int(len(datasets['train']['text'])*(train_percentage + validation_percentage))]) datasets = DatasetDict( { 'train': Dataset.from_dict({'text': list(train)}), 'validation': Dataset.from_dict({'text': list(validation)}), 'test': Dataset.from_dict({'text': list(test)}) } ) ``` ## Dataset Creation ### Curation Rationale [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Source Data #### Initial Data Collection and Normalization [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) #### Who are the source language producers? [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Annotations #### Annotation process [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) #### Who are the annotators? 
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Personal and Sensitive Information [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Discussion of Biases [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Other Known Limitations [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ## Additional Information ### Dataset Curators [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Licensing Information [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Citation Information ``` @InProceedings{huggingartists, author={Aleksey Korshuk}, year={2021} } ``` ## About *Built by Aleksey Korshuk* [![Follow](https://img.shields.io/github/followers/AlekseyKorshuk?style=social)](https://github.com/AlekseyKorshuk) [![Follow](https://img.shields.io/twitter/follow/alekseykorshuk?style=social)](https://twitter.com/intent/follow?screen_name=alekseykorshuk) [![Follow](https://img.shields.io/badge/dynamic/json?color=blue&label=Telegram%20Channel&query=%24.result&url=https%3A%2F%2Fapi.telegram.org%2Fbot1929545866%3AAAFGhV-KKnegEcLiyYJxsc4zV6C-bdPEBtQ%2FgetChatMemberCount%3Fchat_id%3D-1001253621662&style=social&logo=telegram)](https://t.me/joinchat/_CQ04KjcJ-4yZTky) For more details, visit the project repository. 
[![GitHub stars](https://img.shields.io/github/stars/AlekseyKorshuk/huggingartists?style=social)](https://github.com/AlekseyKorshuk/huggingartists)
NightMachinery/ImageNet1K-val-indexed
--- dataset_info: features: - name: image dtype: image - name: label dtype: class_label: names: '0': n01440764 '1': n01443537 '2': n01484850 '3': n01491361 '4': n01494475 '5': n01496331 '6': n01498041 '7': n01514668 '8': n01514859 '9': n01518878 '10': n01530575 '11': n01531178 '12': n01532829 '13': n01534433 '14': n01537544 '15': n01558993 '16': n01560419 '17': n01580077 '18': n01582220 '19': n01592084 '20': n01601694 '21': n01608432 '22': n01614925 '23': n01616318 '24': n01622779 '25': n01629819 '26': n01630670 '27': n01631663 '28': n01632458 '29': n01632777 '30': n01641577 '31': n01644373 '32': n01644900 '33': n01664065 '34': n01665541 '35': n01667114 '36': n01667778 '37': n01669191 '38': n01675722 '39': n01677366 '40': n01682714 '41': n01685808 '42': n01687978 '43': n01688243 '44': n01689811 '45': n01692333 '46': n01693334 '47': n01694178 '48': n01695060 '49': n01697457 '50': n01698640 '51': n01704323 '52': n01728572 '53': n01728920 '54': n01729322 '55': n01729977 '56': n01734418 '57': n01735189 '58': n01737021 '59': n01739381 '60': n01740131 '61': n01742172 '62': n01744401 '63': n01748264 '64': n01749939 '65': n01751748 '66': n01753488 '67': n01755581 '68': n01756291 '69': n01768244 '70': n01770081 '71': n01770393 '72': n01773157 '73': n01773549 '74': n01773797 '75': n01774384 '76': n01774750 '77': n01775062 '78': n01776313 '79': n01784675 '80': n01795545 '81': n01796340 '82': n01797886 '83': n01798484 '84': n01806143 '85': n01806567 '86': n01807496 '87': n01817953 '88': n01818515 '89': n01819313 '90': n01820546 '91': n01824575 '92': n01828970 '93': n01829413 '94': n01833805 '95': n01843065 '96': n01843383 '97': n01847000 '98': n01855032 '99': n01855672 '100': n01860187 '101': n01871265 '102': n01872401 '103': n01873310 '104': n01877812 '105': n01882714 '106': n01883070 '107': n01910747 '108': n01914609 '109': n01917289 '110': n01924916 '111': n01930112 '112': n01943899 '113': n01944390 '114': n01945685 '115': n01950731 '116': n01955084 '117': n01968897 '118': 
n01978287 '119': n01978455 '120': n01980166 '121': n01981276 '122': n01983481 '123': n01984695 '124': n01985128 '125': n01986214 '126': n01990800 '127': n02002556 '128': n02002724 '129': n02006656 '130': n02007558 '131': n02009229 '132': n02009912 '133': n02011460 '134': n02012849 '135': n02013706 '136': n02017213 '137': n02018207 '138': n02018795 '139': n02025239 '140': n02027492 '141': n02028035 '142': n02033041 '143': n02037110 '144': n02051845 '145': n02056570 '146': n02058221 '147': n02066245 '148': n02071294 '149': n02074367 '150': n02077923 '151': n02085620 '152': n02085782 '153': n02085936 '154': n02086079 '155': n02086240 '156': n02086646 '157': n02086910 '158': n02087046 '159': n02087394 '160': n02088094 '161': n02088238 '162': n02088364 '163': n02088466 '164': n02088632 '165': n02089078 '166': n02089867 '167': n02089973 '168': n02090379 '169': n02090622 '170': n02090721 '171': n02091032 '172': n02091134 '173': n02091244 '174': n02091467 '175': n02091635 '176': n02091831 '177': n02092002 '178': n02092339 '179': n02093256 '180': n02093428 '181': n02093647 '182': n02093754 '183': n02093859 '184': n02093991 '185': n02094114 '186': n02094258 '187': n02094433 '188': n02095314 '189': n02095570 '190': n02095889 '191': n02096051 '192': n02096177 '193': n02096294 '194': n02096437 '195': n02096585 '196': n02097047 '197': n02097130 '198': n02097209 '199': n02097298 '200': n02097474 '201': n02097658 '202': n02098105 '203': n02098286 '204': n02098413 '205': n02099267 '206': n02099429 '207': n02099601 '208': n02099712 '209': n02099849 '210': n02100236 '211': n02100583 '212': n02100735 '213': n02100877 '214': n02101006 '215': n02101388 '216': n02101556 '217': n02102040 '218': n02102177 '219': n02102318 '220': n02102480 '221': n02102973 '222': n02104029 '223': n02104365 '224': n02105056 '225': n02105162 '226': n02105251 '227': n02105412 '228': n02105505 '229': n02105641 '230': n02105855 '231': n02106030 '232': n02106166 '233': n02106382 '234': n02106550 '235': n02106662 
'236': n02107142 '237': n02107312 '238': n02107574 '239': n02107683 '240': n02107908 '241': n02108000 '242': n02108089 '243': n02108422 '244': n02108551 '245': n02108915 '246': n02109047 '247': n02109525 '248': n02109961 '249': n02110063 '250': n02110185 '251': n02110341 '252': n02110627 '253': n02110806 '254': n02110958 '255': n02111129 '256': n02111277 '257': n02111500 '258': n02111889 '259': n02112018 '260': n02112137 '261': n02112350 '262': n02112706 '263': n02113023 '264': n02113186 '265': n02113624 '266': n02113712 '267': n02113799 '268': n02113978 '269': n02114367 '270': n02114548 '271': n02114712 '272': n02114855 '273': n02115641 '274': n02115913 '275': n02116738 '276': n02117135 '277': n02119022 '278': n02119789 '279': n02120079 '280': n02120505 '281': n02123045 '282': n02123159 '283': n02123394 '284': n02123597 '285': n02124075 '286': n02125311 '287': n02127052 '288': n02128385 '289': n02128757 '290': n02128925 '291': n02129165 '292': n02129604 '293': n02130308 '294': n02132136 '295': n02133161 '296': n02134084 '297': n02134418 '298': n02137549 '299': n02138441 '300': n02165105 '301': n02165456 '302': n02167151 '303': n02168699 '304': n02169497 '305': n02172182 '306': n02174001 '307': n02177972 '308': n02190166 '309': n02206856 '310': n02219486 '311': n02226429 '312': n02229544 '313': n02231487 '314': n02233338 '315': n02236044 '316': n02256656 '317': n02259212 '318': n02264363 '319': n02268443 '320': n02268853 '321': n02276258 '322': n02277742 '323': n02279972 '324': n02280649 '325': n02281406 '326': n02281787 '327': n02317335 '328': n02319095 '329': n02321529 '330': n02325366 '331': n02326432 '332': n02328150 '333': n02342885 '334': n02346627 '335': n02356798 '336': n02361337 '337': n02363005 '338': n02364673 '339': n02389026 '340': n02391049 '341': n02395406 '342': n02396427 '343': n02397096 '344': n02398521 '345': n02403003 '346': n02408429 '347': n02410509 '348': n02412080 '349': n02415577 '350': n02417914 '351': n02422106 '352': n02422699 '353': 
n02423022 '354': n02437312 '355': n02437616 '356': n02441942 '357': n02442845 '358': n02443114 '359': n02443484 '360': n02444819 '361': n02445715 '362': n02447366 '363': n02454379 '364': n02457408 '365': n02480495 '366': n02480855 '367': n02481823 '368': n02483362 '369': n02483708 '370': n02484975 '371': n02486261 '372': n02486410 '373': n02487347 '374': n02488291 '375': n02488702 '376': n02489166 '377': n02490219 '378': n02492035 '379': n02492660 '380': n02493509 '381': n02493793 '382': n02494079 '383': n02497673 '384': n02500267 '385': n02504013 '386': n02504458 '387': n02509815 '388': n02510455 '389': n02514041 '390': n02526121 '391': n02536864 '392': n02606052 '393': n02607072 '394': n02640242 '395': n02641379 '396': n02643566 '397': n02655020 '398': n02666196 '399': n02667093 '400': n02669723 '401': n02672831 '402': n02676566 '403': n02687172 '404': n02690373 '405': n02692877 '406': n02699494 '407': n02701002 '408': n02704792 '409': n02708093 '410': n02727426 '411': n02730930 '412': n02747177 '413': n02749479 '414': n02769748 '415': n02776631 '416': n02777292 '417': n02782093 '418': n02783161 '419': n02786058 '420': n02787622 '421': n02788148 '422': n02790996 '423': n02791124 '424': n02791270 '425': n02793495 '426': n02794156 '427': n02795169 '428': n02797295 '429': n02799071 '430': n02802426 '431': n02804414 '432': n02804610 '433': n02807133 '434': n02808304 '435': n02808440 '436': n02814533 '437': n02814860 '438': n02815834 '439': n02817516 '440': n02823428 '441': n02823750 '442': n02825657 '443': n02834397 '444': n02835271 '445': n02837789 '446': n02840245 '447': n02841315 '448': n02843684 '449': n02859443 '450': n02860847 '451': n02865351 '452': n02869837 '453': n02870880 '454': n02871525 '455': n02877765 '456': n02879718 '457': n02883205 '458': n02892201 '459': n02892767 '460': n02894605 '461': n02895154 '462': n02906734 '463': n02909870 '464': n02910353 '465': n02916936 '466': n02917067 '467': n02927161 '468': n02930766 '469': n02939185 '470': n02948072 
'471': n02950826 '472': n02951358 '473': n02951585 '474': n02963159 '475': n02965783 '476': n02966193 '477': n02966687 '478': n02971356 '479': n02974003 '480': n02977058 '481': n02978881 '482': n02979186 '483': n02980441 '484': n02981792 '485': n02988304 '486': n02992211 '487': n02992529 '488': n02999410 '489': n03000134 '490': n03000247 '491': n03000684 '492': n03014705 '493': n03016953 '494': n03017168 '495': n03018349 '496': n03026506 '497': n03028079 '498': n03032252 '499': n03041632 '500': n03042490 '501': n03045698 '502': n03047690 '503': n03062245 '504': n03063599 '505': n03063689 '506': n03065424 '507': n03075370 '508': n03085013 '509': n03089624 '510': n03095699 '511': n03100240 '512': n03109150 '513': n03110669 '514': n03124043 '515': n03124170 '516': n03125729 '517': n03126707 '518': n03127747 '519': n03127925 '520': n03131574 '521': n03133878 '522': n03134739 '523': n03141823 '524': n03146219 '525': n03160309 '526': n03179701 '527': n03180011 '528': n03187595 '529': n03188531 '530': n03196217 '531': n03197337 '532': n03201208 '533': n03207743 '534': n03207941 '535': n03208938 '536': n03216828 '537': n03218198 '538': n03220513 '539': n03223299 '540': n03240683 '541': n03249569 '542': n03250847 '543': n03255030 '544': n03259280 '545': n03271574 '546': n03272010 '547': n03272562 '548': n03290653 '549': n03291819 '550': n03297495 '551': n03314780 '552': n03325584 '553': n03337140 '554': n03344393 '555': n03345487 '556': n03347037 '557': n03355925 '558': n03372029 '559': n03376595 '560': n03379051 '561': n03384352 '562': n03388043 '563': n03388183 '564': n03388549 '565': n03393912 '566': n03394916 '567': n03400231 '568': n03404251 '569': n03417042 '570': n03424325 '571': n03425413 '572': n03443371 '573': n03444034 '574': n03445777 '575': n03445924 '576': n03447447 '577': n03447721 '578': n03450230 '579': n03452741 '580': n03457902 '581': n03459775 '582': n03461385 '583': n03467068 '584': n03476684 '585': n03476991 '586': n03478589 '587': n03481172 '588': 
n03482405 '589': n03483316 '590': n03485407 '591': n03485794 '592': n03492542 '593': n03494278 '594': n03495258 '595': n03496892 '596': n03498962 '597': n03527444 '598': n03529860 '599': n03530642 '600': n03532672 '601': n03534580 '602': n03535780 '603': n03538406 '604': n03544143 '605': n03584254 '606': n03584829 '607': n03590841 '608': n03594734 '609': n03594945 '610': n03595614 '611': n03598930 '612': n03599486 '613': n03602883 '614': n03617480 '615': n03623198 '616': n03627232 '617': n03630383 '618': n03633091 '619': n03637318 '620': n03642806 '621': n03649909 '622': n03657121 '623': n03658185 '624': n03661043 '625': n03662601 '626': n03666591 '627': n03670208 '628': n03673027 '629': n03676483 '630': n03680355 '631': n03690938 '632': n03691459 '633': n03692522 '634': n03697007 '635': n03706229 '636': n03709823 '637': n03710193 '638': n03710637 '639': n03710721 '640': n03717622 '641': n03720891 '642': n03721384 '643': n03724870 '644': n03729826 '645': n03733131 '646': n03733281 '647': n03733805 '648': n03742115 '649': n03743016 '650': n03759954 '651': n03761084 '652': n03763968 '653': n03764736 '654': n03769881 '655': n03770439 '656': n03770679 '657': n03773504 '658': n03775071 '659': n03775546 '660': n03776460 '661': n03777568 '662': n03777754 '663': n03781244 '664': n03782006 '665': n03785016 '666': n03786901 '667': n03787032 '668': n03788195 '669': n03788365 '670': n03791053 '671': n03792782 '672': n03792972 '673': n03793489 '674': n03794056 '675': n03796401 '676': n03803284 '677': n03804744 '678': n03814639 '679': n03814906 '680': n03825788 '681': n03832673 '682': n03837869 '683': n03838899 '684': n03840681 '685': n03841143 '686': n03843555 '687': n03854065 '688': n03857828 '689': n03866082 '690': n03868242 '691': n03868863 '692': n03871628 '693': n03873416 '694': n03874293 '695': n03874599 '696': n03876231 '697': n03877472 '698': n03877845 '699': n03884397 '700': n03887697 '701': n03888257 '702': n03888605 '703': n03891251 '704': n03891332 '705': n03895866 
'706': n03899768 '707': n03902125 '708': n03903868 '709': n03908618 '710': n03908714 '711': n03916031 '712': n03920288 '713': n03924679 '714': n03929660 '715': n03929855 '716': n03930313 '717': n03930630 '718': n03933933 '719': n03935335 '720': n03937543 '721': n03938244 '722': n03942813 '723': n03944341 '724': n03947888 '725': n03950228 '726': n03954731 '727': n03956157 '728': n03958227 '729': n03961711 '730': n03967562 '731': n03970156 '732': n03976467 '733': n03976657 '734': n03977966 '735': n03980874 '736': n03982430 '737': n03983396 '738': n03991062 '739': n03992509 '740': n03995372 '741': n03998194 '742': n04004767 '743': n04005630 '744': n04008634 '745': n04009552 '746': n04019541 '747': n04023962 '748': n04026417 '749': n04033901 '750': n04033995 '751': n04037443 '752': n04039381 '753': n04040759 '754': n04041544 '755': n04044716 '756': n04049303 '757': n04065272 '758': n04067472 '759': n04069434 '760': n04070727 '761': n04074963 '762': n04081281 '763': n04086273 '764': n04090263 '765': n04099969 '766': n04111531 '767': n04116512 '768': n04118538 '769': n04118776 '770': n04120489 '771': n04125021 '772': n04127249 '773': n04131690 '774': n04133789 '775': n04136333 '776': n04141076 '777': n04141327 '778': n04141975 '779': n04146614 '780': n04147183 '781': n04149813 '782': n04152593 '783': n04153751 '784': n04154565 '785': n04162706 '786': n04179913 '787': n04192698 '788': n04200800 '789': n04201297 '790': n04204238 '791': n04204347 '792': n04208210 '793': n04209133 '794': n04209239 '795': n04228054 '796': n04229816 '797': n04235860 '798': n04238763 '799': n04239074 '800': n04243546 '801': n04251144 '802': n04252077 '803': n04252225 '804': n04254120 '805': n04254680 '806': n04254777 '807': n04258138 '808': n04259630 '809': n04263257 '810': n04264628 '811': n04265275 '812': n04266014 '813': n04270147 '814': n04273569 '815': n04275548 '816': n04277352 '817': n04285008 '818': n04286575 '819': n04296562 '820': n04310018 '821': n04311004 '822': n04311174 '823': 
n04317175 '824': n04325704 '825': n04326547 '826': n04328186 '827': n04330267 '828': n04332243 '829': n04335435 '830': n04336792 '831': n04344873 '832': n04346328 '833': n04347754 '834': n04350905 '835': n04355338 '836': n04355933 '837': n04356056 '838': n04357314 '839': n04366367 '840': n04367480 '841': n04370456 '842': n04371430 '843': n04371774 '844': n04372370 '845': n04376876 '846': n04380533 '847': n04389033 '848': n04392985 '849': n04398044 '850': n04399382 '851': n04404412 '852': n04409515 '853': n04417672 '854': n04418357 '855': n04423845 '856': n04428191 '857': n04429376 '858': n04435653 '859': n04442312 '860': n04443257 '861': n04447861 '862': n04456115 '863': n04458633 '864': n04461696 '865': n04462240 '866': n04465501 '867': n04467665 '868': n04476259 '869': n04479046 '870': n04482393 '871': n04483307 '872': n04485082 '873': n04486054 '874': n04487081 '875': n04487394 '876': n04493381 '877': n04501370 '878': n04505470 '879': n04507155 '880': n04509417 '881': n04515003 '882': n04517823 '883': n04522168 '884': n04523525 '885': n04525038 '886': n04525305 '887': n04532106 '888': n04532670 '889': n04536866 '890': n04540053 '891': n04542943 '892': n04548280 '893': n04548362 '894': n04550184 '895': n04552348 '896': n04553703 '897': n04554684 '898': n04557648 '899': n04560804 '900': n04562935 '901': n04579145 '902': n04579432 '903': n04584207 '904': n04589890 '905': n04590129 '906': n04591157 '907': n04591713 '908': n04592741 '909': n04596742 '910': n04597913 '911': n04599235 '912': n04604644 '913': n04606251 '914': n04612504 '915': n04613696 '916': n06359193 '917': n06596364 '918': n06785654 '919': n06794110 '920': n06874185 '921': n07248320 '922': n07565083 '923': n07579787 '924': n07583066 '925': n07584110 '926': n07590611 '927': n07613480 '928': n07614500 '929': n07615774 '930': n07684084 '931': n07693725 '932': n07695742 '933': n07697313 '934': n07697537 '935': n07711569 '936': n07714571 '937': n07714990 '938': n07715103 '939': n07716358 '940': n07716906 
'941': n07717410 '942': n07717556 '943': n07718472 '944': n07718747 '945': n07720875 '946': n07730033 '947': n07734744 '948': n07742313 '949': n07745940 '950': n07747607 '951': n07749582 '952': n07753113 '953': n07753275 '954': n07753592 '955': n07754684 '956': n07760859 '957': n07768694 '958': n07802026 '959': n07831146 '960': n07836838 '961': n07860988 '962': n07871810 '963': n07873807 '964': n07875152 '965': n07880968 '966': n07892512 '967': n07920052 '968': n07930864 '969': n07932039 '970': n09193705 '971': n09229709 '972': n09246464 '973': n09256479 '974': n09288635 '975': n09332890 '976': n09399592 '977': n09421951 '978': n09428293 '979': n09468604 '980': n09472597 '981': n09835506 '982': n10148035 '983': n10565667 '984': n11879895 '985': n11939491 '986': n12057211 '987': n12144580 '988': n12267677 '989': n12620546 '990': n12768682 '991': n12985857 '992': n12998815 '993': n13037406 '994': n13040303 '995': n13044778 '996': n13052670 '997': n13054560 '998': n13133613 '999': n15075141 - name: id dtype: int64 splits: - name: train num_bytes: 6633504145.375 num_examples: 49101 download_size: 6622641479 dataset_size: 6633504145.375 --- # Dataset Card for "ImageNet1K-val-indexed" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
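The `class_label` mapping above associates each integer label with a WordNet synset ID. A minimal sketch of looking labels up in both directions, using only four entries copied from the table (the full mapping has 1,000):

```python
# Lookup built from the card's class_label table; only the first and
# last two of the 1,000 entries are reproduced here.
ID2SYNSET = {0: "n01440764", 1: "n01443537", 998: "n13133613", 999: "n15075141"}
SYNSET2ID = {wnid: i for i, wnid in ID2SYNSET.items()}

print(ID2SYNSET[0])            # n01440764
print(SYNSET2ID["n15075141"])  # 999
```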
cahya/instructions-es
--- dataset_info: features: - name: id dtype: int64 - name: text dtype: string splits: - name: train num_bytes: 10931526.154626371 num_examples: 22947 - name: test num_bytes: 287734.42268681433 num_examples: 604 - name: validation num_bytes: 287734.42268681433 num_examples: 604 download_size: 6659541 dataset_size: 11506995.0 --- # Dataset Card for "instructions-es" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
ohhhchank3/tlcn
--- size_categories: - 10K<n<100K ---
and-effect/MDK_taxonomy
--- language: - de license: cc0-1.0 size_categories: - n<1K --- # Dataset Card for MDK_taxonomy ## Dataset Description / Summary This dataset was created as part of the [Bertelsmann Foundation](https://www.bertelsmann-stiftung.de/de/startseite) [Musterdatenkatalog](https://www.bertelsmann-stiftung.de/de/unsere-projekte/smart-country/musterdatenkatalog) project. See the project on GitHub [here](https://github.com/bertelsmannstift/Musterdatenkatalog-V4). The MDK provides an overview of Open Data in municipalities in Germany. This dataset contains the taxonomy created by and-effect as part of the project. The taxonomy adheres to the SKOS standard and is available as RDF and JSON-LD. The taxonomy has two levels: 'Thema' (topic, first level) and 'Bezeichnung' (label, second level), with 25 elements on the first level and 241 elements on the second. ### Languages German, with some information translated to English ## Dataset Structure ### Data Fields Each concept carries fields such as 'skos:prefLabel' (in German and English) and 'skos:definition' (in German), plus optional matches to other concepts. ## Dataset Creation The RDF and JSON-LD files are generated from '2023-05-17_MDK_taxonomy_info.csv', which contains all the information about the taxonomy and its concepts. ## Additional Information ### Licensing Information CC0 ### Contributions The taxonomy is based on previous work by the Bertelsmann Stiftung and was revised together with and-effect.
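A rough sketch of reading `skos:prefLabel` values from the JSON-LD form of the taxonomy. The concept URI and labels below are invented for illustration; the real graph has different identifiers and many more concepts.

```python
import json

# Invented JSON-LD fragment mimicking the SKOS structure described in the
# card: concepts carry preferred labels in German and English.
doc = json.loads("""
{
  "@graph": [
    {"@id": "mdk:raumplanung", "skos:prefLabel": [
        {"@language": "de", "@value": "Raumplanung"},
        {"@language": "en", "@value": "Spatial Planning"}]}
  ]
}
""")

# Collect the German preferred label of every concept in the graph.
labels_de = {
    node["@id"]: next(label["@value"] for label in node["skos:prefLabel"]
                      if label["@language"] == "de")
    for node in doc["@graph"]
}
print(labels_de)
```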
noobmaster29/Verified-Camel-zh
--- license: apache-2.0 task_categories: - conversational - question-answering - text-generation language: - en - zh tags: - Physics - Chemistry - Math - Biology - Culture - Logic size_categories: - n<1K --- This is a direct Chinese translation using GPT4 of the Verified-Camel dataset. I hope you find it useful. https://huggingface.co/datasets/LDJnr/Verified-Camel Citation: ``` @article{daniele2023amplify-instruct, title={Amplify-Instruct: Synthetically Generated Diverse Multi-turn Conversations for Effecient LLM Training.}, author={Daniele, Luigi and Suphavadeeprasit}, journal={arXiv preprint arXiv:(comming soon)}, year={2023} } ```
ctang/gpt_util_eval_llama2
--- dataset_info: features: - name: prompt dtype: string - name: response_a dtype: string - name: response_b dtype: string - name: more_pleasant dtype: string splits: - name: train num_bytes: 1653 num_examples: 10 download_size: 3774 dataset_size: 1653 configs: - config_name: default data_files: - split: train path: data/train-* ---
yuchen0187/pcmask
--- dataset_info: features: - name: xyz sequence: sequence: float32 - name: rgb sequence: sequence: uint8 - name: mask sequence: sequence: bool splits: - name: test num_bytes: 1369365892 num_examples: 5348 - name: partnet num_bytes: 3850276066 num_examples: 15383 - name: shapenet num_bytes: 4959124112 num_examples: 20860 - name: fusion360 num_bytes: 8408989552 num_examples: 35858 - name: scanobjectnn num_bytes: 982688978 num_examples: 2902 - name: scannet_scale_0 num_bytes: 372642986 num_examples: 1513 - name: scannet_scale_1 num_bytes: 1454217230 num_examples: 6052 - name: partnet_mobility num_bytes: 635042984 num_examples: 2290 - name: partnet_mobility_aug1 num_bytes: 634966490 num_examples: 2290 - name: shapenet_new num_bytes: 4958407478 num_examples: 19735 - name: scannet_new num_bytes: 324334058 num_examples: 1201 download_size: 16313033591 dataset_size: 27950055826 configs: - config_name: default data_files: - split: partnet path: data/partnet-* - split: shapenet path: data/shapenet-* - split: test path: data/test-* - split: fusion360 path: data/fusion360-* - split: scanobjectnn path: data/scanobjectnn-* - split: scannet_scale_0 path: data/scannet_scale_0-* - split: scannet_scale_1 path: data/scannet_scale_1-* - split: partnet_mobility path: data/partnet_mobility-* - split: partnet_mobility_aug1 path: data/partnet_mobility_aug1-* - split: shapenet_new path: data/shapenet_new-* - split: scannet_new path: data/scannet_new-* ---
AISE-TUDelft/PY150k
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: dev path: data/dev-* - split: test path: data/test-* dataset_info: features: - name: index dtype: int64 - name: input dtype: string - name: gt dtype: string - name: hash dtype: int64 - name: full_line dtype: string splits: - name: train num_bytes: 662931126 num_examples: 95000 - name: dev num_bytes: 41218084 num_examples: 5000 - name: test num_bytes: 343336086 num_examples: 50000 download_size: 277005224 dataset_size: 1047485296 license: cc0-1.0 tags: - code pretty_name: PY150 Line Completion Dataset size_categories: - 100K<n<1M --- # Dataset Card for "PY150k" ## Dataset Summary Code Completion dataset created from the code available in [CodeXGlue](https://github.com/microsoft/CodeXGLUE/tree/main/Code-Code/CodeCompletion-line).
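The `input`/`gt` pair structure can be pictured as cutting a source file at a target line: everything before the line is the model context, and the line itself is the ground-truth completion. A minimal sketch (the helper below is illustrative, not part of the dataset's tooling):

```python
def make_line_completion_example(source: str, line_index: int):
    """Split `source` so that lines before `line_index` form the model
    input and the line at `line_index` is the ground-truth completion."""
    lines = source.splitlines()
    context = "\n".join(lines[:line_index])
    ground_truth = lines[line_index]
    return {"input": context, "gt": ground_truth}

example = make_line_completion_example(
    "import os\n\ndef main():\n    print(os.getcwd())",
    3,  # complete the body line of main()
)
print(example["gt"])  # → "    print(os.getcwd())"
```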
thercyl/HD
--- dataset_info: features: - name: 'Unnamed: 0' dtype: float64 - name: Ticker dtype: string - name: Year dtype: string - name: Text dtype: string - name: Embedding dtype: string splits: - name: train num_bytes: 57668542 num_examples: 1656 download_size: 33638115 dataset_size: 57668542 --- # Dataset Card for "HD" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
pharaouk/cortex_beta
--- dataset_info: features: - name: prompts dtype: string - name: responses dtype: string splits: - name: train num_bytes: 23398376 num_examples: 9923 download_size: 11618983 dataset_size: 23398376 configs: - config_name: default data_files: - split: train path: data/train-* ---
VedCodes/esy_shr
--- task_categories: - text-generation language: - en tags: - medical pretty_name: boyyyzz size_categories: - n<1K ---
minimario/apps_partial_sorted_0_200
--- dataset_info: features: - name: problem dtype: string - name: code dtype: string - name: label dtype: int64 - name: full_sample dtype: string - name: where_from dtype: string splits: - name: train num_bytes: 100164623 num_examples: 80215 download_size: 3452557 dataset_size: 100164623 --- # Dataset Card for "apps_partial_sorted_0_200" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/mmlu-high_school_computer_science-neg-answer
--- dataset_info: features: - name: question dtype: string - name: choices sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: neg_answer dtype: string splits: - name: test num_bytes: 50115 num_examples: 100 download_size: 31373 dataset_size: 50115 --- # Dataset Card for "mmlu-high_school_computer_science-neg-answer" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
kgr123/quality_counter_500_4_simple
--- dataset_info: features: - name: context dtype: string - name: word dtype: string - name: claim dtype: string - name: label dtype: int64 splits: - name: test num_bytes: 2727654 num_examples: 1747 - name: train num_bytes: 2728865 num_examples: 1777 - name: validation num_bytes: 2826634 num_examples: 1807 download_size: 2047104 dataset_size: 8283153 configs: - config_name: default data_files: - split: test path: data/test-* - split: train path: data/train-* - split: validation path: data/validation-* ---
harpreetsahota/vectordb_trend_analysis
--- dataset_info: features: - name: provider dtype: string - name: package_name dtype: string - name: 'last_day_downloads (Reference Date: 2024-04-07)' dtype: int64 - name: last_week_downloads dtype: int64 - name: last_month_downloads dtype: int64 - name: last_180days_downloads dtype: int64 - name: total_downloads (pepy.tech) dtype: int64 splits: - name: train num_bytes: 1990 num_examples: 26 download_size: 5660 dataset_size: 1990 configs: - config_name: default data_files: - split: train path: data/train-* ---
neil-code/autotrain-data-summarization
--- language: - en task_categories: - summarization --- # AutoTrain Dataset for project: summarization ## Dataset Description This dataset has been automatically processed by AutoTrain for project summarization. ### Languages The BCP-47 code for the dataset's language is en. ## Dataset Structure ### Data Instances A sample from this dataset looks as follows: ```json [ { "feat_id": "train_0", "text": "#Person1#: Hi, Mr. Smith. I'm Doctor Hawkins. Why are you here today?\n#Person2#: I found it would be a good idea to get a check-up.\n#Person1#: Yes, well, you haven't had one for 5 years. You should have one every year.\n#Person2#: I know. I figure as long as there is nothing wrong, why go see the doctor?\n#Person1#: Well, the best way to avoid serious illnesses is to find out about them early. So try to come at least once a year for your own good.\n#Person2#: Ok.\n#Person1#: Let me see here. Your eyes and ears look fine. Take a deep breath, please. Do you smoke, Mr. Smith?\n#Person2#: Yes.\n#Person1#: Smoking is the leading cause of lung cancer and heart disease, you know. You really should quit.\n#Person2#: I've tried hundreds of times, but I just can't seem to kick the habit.\n#Person1#: Well, we have classes and some medications that might help. I'll give you more information before you leave.\n#Person2#: Ok, thanks doctor.", "target": "Mr. Smith's getting a check-up, and Doctor Hawkins advises him to have one every year. Hawkins'll give some information about their classes and medications to help Mr. Smith quit smoking.", "feat_topic": "get a check-up" }, { "feat_id": "train_1", "text": "#Person1#: Hello Mrs. Parker, how have you been?\n#Person2#: Hello Dr. Peters. Just fine thank you. Ricky and I are here for his vaccines.\n#Person1#: Very well. Let's see, according to his vaccination record, Ricky has received his Polio, Tetanus and Hepatitis B shots. 
He is 14 months old, so he is due for Hepatitis A, Chickenpox and Measles shots.\n#Person2#: What about Rubella and Mumps?\n#Person1#: Well, I can only give him these for now, and after a couple of weeks I can administer the rest.\n#Person2#: OK, great. Doctor, I think I also may need a Tetanus booster. Last time I got it was maybe fifteen years ago!\n#Person1#: We will check our records and I'll have the nurse administer and the booster as well. Now, please hold Ricky's arm tight, this may sting a little.", "target": "Mrs Parker takes Ricky for his vaccines. Dr. Peters checks the record and then gives Ricky a vaccine.", "feat_topic": "vaccines" } ] ``` ### Dataset Fields The dataset has the following fields (also called "features"): ```json { "feat_id": "Value(dtype='string', id=None)", "text": "Value(dtype='string', id=None)", "target": "Value(dtype='string', id=None)", "feat_topic": "Value(dtype='string', id=None)" } ``` ### Dataset Splits This dataset is split into a train and validation split. The split sizes are as follows: | Split name | Num samples | | ------------ | ------------------- | | train | 1999 | | valid | 499 |
arattinger/noto-emoji-captions
--- dataset_info: features: - name: image dtype: image - name: text dtype: string splits: - name: train num_bytes: 77868555.5 num_examples: 3468 download_size: 77424588 dataset_size: 77868555.5 annotations_creators: - machine-generated language: - en multilinguality: - monolingual pretty_name: 'Noto Emoji Captions' --- # Dataset Card for Noto Emoji Captions BLIP-generated captions for Noto emojis. The dataset was captioned with the [pre-trained BLIP model](https://github.com/salesforce/BLIP). Each entry has `image` and `text` keys, with the images being 512x512.
myradeng/diffusion_db_5k_val_v1
--- dataset_info: features: - name: image dtype: image - name: prompt dtype: string - name: seed dtype: uint32 - name: step dtype: uint16 - name: cfg dtype: float32 - name: sampler dtype: string - name: width dtype: uint16 - name: height dtype: uint16 - name: user_name dtype: string - name: timestamp dtype: timestamp[us, tz=UTC] - name: image_nsfw dtype: float32 - name: prompt_nsfw dtype: float32 splits: - name: train num_bytes: 519559992.6 num_examples: 1000 download_size: 519441334 dataset_size: 519559992.6 --- # Dataset Card for "diffusion_db_5k_val_v1" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
vwxyzjn/openhermes-dev__mistralai_Mistral-7B-Instruct-v0.1__1707331096
--- dataset_info: features: - name: model dtype: 'null' - name: category dtype: string - name: language dtype: string - name: custom_instruction dtype: bool - name: id dtype: string - name: topic dtype: string - name: avatarUrl dtype: 'null' - name: idx dtype: 'null' - name: conversations list: - name: from dtype: string - name: value dtype: string - name: weight dtype: 'null' - name: system_prompt dtype: string - name: source dtype: string - name: model_name dtype: string - name: skip_prompt_formatting dtype: bool - name: title dtype: string - name: hash dtype: 'null' - name: views dtype: 'null' - name: prompt dtype: string - name: token_length dtype: int64 - name: candidate0 list: - name: content dtype: string - name: role dtype: string - name: candidate1 list: - name: content dtype: string - name: role dtype: string - name: candidate0_policy dtype: string - name: candidate1_policy dtype: string - name: candidate0_score dtype: float64 - name: candidate1_score dtype: float64 - name: chosen list: - name: content dtype: string - name: role dtype: string - name: chosen_policy dtype: string - name: rejected list: - name: content dtype: string - name: role dtype: string - name: rejected_policy dtype: string splits: - name: train_prefs num_bytes: 1204323.2874251497 num_examples: 87 download_size: 606169 dataset_size: 1204323.2874251497 configs: - config_name: default data_files: - split: train_prefs path: data/train_prefs-* ---
DLIlab/webarena_paraphrased_instruction_web_web
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: generated_task sequence: string - name: target_web_name dtype: string - name: given_task dtype: string - name: annotation_id dtype: string splits: - name: train num_bytes: 104975 num_examples: 73 download_size: 51033 dataset_size: 104975 --- In this case, given_task values are produced through the following steps: 1. Build a vector DB from the website names and descriptions in Mind2Web. 2. For each website in WebArena, retrieve the top-4 relevant Mind2Web websites within an L2-distance threshold of 0.36. 3. Every instruction from the retrieved Mind2Web websites is used as a given_task. --- dataset_info: features: - name: annotation_id dtype: string - name: given_task dtype: string - name: target_web_name dtype: string - name: generated_task sequence: string splits: - name: train num_bytes: 115053 num_examples: 146 download_size: 51880 dataset_size: 115053 configs: - config_name: default data_files: - split: train path: data/train-* ---
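The retrieval in step 2 can be sketched as follows. This is a pure-Python illustration with toy 2-D embeddings; the actual pipeline presumably uses a vector DB, and the website names and vector values here are invented.

```python
import math

def l2(a, b):
    """Euclidean (L2) distance between two equal-length vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def retrieve(query_vec, db, k=4, threshold=0.36):
    """Return up to `k` entries of `db` whose L2 distance to
    `query_vec` is below `threshold`, nearest first."""
    scored = sorted((l2(query_vec, vec), name) for name, vec in db.items())
    return [name for dist, name in scored[:k] if dist < threshold]

# Toy embeddings standing in for Mind2Web website-description vectors.
mind2web_db = {
    "shopping_site_a": [0.10, 0.20],
    "shopping_site_b": [0.15, 0.25],
    "maps_site": [0.90, 0.10],
    "forum_site": [0.12, 0.18],
    "news_site": [0.80, 0.85],
}

# Embedding of a WebArena website; only sufficiently close sites survive
# the 0.36 threshold, so fewer than 4 results may come back.
print(retrieve([0.11, 0.21], mind2web_db))
# → ['shopping_site_a', 'forum_site', 'shopping_site_b']
```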
open-llm-leaderboard/details_malhajar__Platypus2-70B-instruct-4bit-gptq
--- pretty_name: Evaluation run of malhajar/Platypus2-70B-instruct-4bit-gptq dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [malhajar/Platypus2-70B-instruct-4bit-gptq](https://huggingface.co/malhajar/Platypus2-70B-instruct-4bit-gptq)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 61 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_malhajar__Platypus2-70B-instruct-4bit-gptq\"\ ,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\ \nThese are the [latest results from run 2023-08-26T12:30:11.519673](https://huggingface.co/datasets/open-llm-leaderboard/details_malhajar__Platypus2-70B-instruct-4bit-gptq/blob/main/results_2023-08-26T12%3A30%3A11.519673.json)\ \ (note that there might be results for other tasks in the repos if successive evals\ \ didn't cover the same tasks. 
You find each in the results and the \"latest\" split\ \ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23568332946534118,\n\ \ \"acc_stderr\": 0.030875990616634128,\n \"acc_norm\": 0.23665349264658486,\n\ \ \"acc_norm_stderr\": 0.030890666475037305,\n \"mc1\": 0.2460220318237454,\n\ \ \"mc1_stderr\": 0.015077219200662574,\n \"mc2\": 0.4955854635237609,\n\ \ \"mc2_stderr\": 0.01695340721579618\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.2363481228668942,\n \"acc_stderr\": 0.012414960524301829,\n\ \ \"acc_norm\": 0.2901023890784983,\n \"acc_norm_stderr\": 0.01326157367752077\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2560246962756423,\n\ \ \"acc_stderr\": 0.004355436696716298,\n \"acc_norm\": 0.25951005775741887,\n\ \ \"acc_norm_stderr\": 0.0043746991892848605\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \ \ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\ \ \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n\ \ \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.21710526315789475,\n \"acc_stderr\": 0.033550453048829226,\n\ \ \"acc_norm\": 0.21710526315789475,\n \"acc_norm_stderr\": 0.033550453048829226\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.29,\n\ \ \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \ \ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.21132075471698114,\n \"acc_stderr\": 0.025125766484827845,\n\ \ \"acc_norm\": 0.21132075471698114,\n \"acc_norm_stderr\": 0.025125766484827845\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\ \ \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.24305555555555555,\n\ \ \"acc_norm_stderr\": 
0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \ \ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\ \ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.25,\n\ \ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \ \ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \ \ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n\ \ \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n\ \ \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n\ \ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\ \ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\ \ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\ \ \"acc_stderr\": 0.041857744240220575,\n \"acc_norm\": 0.2719298245614035,\n\ \ \"acc_norm_stderr\": 0.041857744240220575\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.036001056927277716,\n\ \ \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.036001056927277716\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.23809523809523808,\n \"acc_stderr\": 0.021935878081184763,\n \"\ acc_norm\": 
0.23809523809523808,\n \"acc_norm_stderr\": 0.021935878081184763\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n\ \ \"acc_stderr\": 0.035122074123020534,\n \"acc_norm\": 0.19047619047619047,\n\ \ \"acc_norm_stderr\": 0.035122074123020534\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \ \ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.19032258064516128,\n \"acc_stderr\": 0.022331707611823088,\n \"\ acc_norm\": 0.19032258064516128,\n \"acc_norm_stderr\": 0.022331707611823088\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.18226600985221675,\n \"acc_stderr\": 0.02716334085964515,\n \"\ acc_norm\": 0.18226600985221675,\n \"acc_norm_stderr\": 0.02716334085964515\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421255,\n \"acc_norm\"\ : 0.28,\n \"acc_norm_stderr\": 0.045126085985421255\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\ \ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\ acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.20207253886010362,\n \"acc_stderr\": 0.02897908979429673,\n\ \ \"acc_norm\": 0.20207253886010362,\n \"acc_norm_stderr\": 0.02897908979429673\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.2,\n \"acc_stderr\": 0.020280805062535722,\n \"acc_norm\"\ : 0.2,\n \"acc_norm_stderr\": 0.020280805062535722\n },\n 
\"harness|hendrycksTest-high_school_mathematics|5\"\ : {\n \"acc\": 0.24444444444444444,\n \"acc_stderr\": 0.02620276653465215,\n\ \ \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.02620276653465215\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\ \ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.1986754966887417,\n \"acc_stderr\": 0.032578473844367774,\n \"\ acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.032578473844367774\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.1981651376146789,\n \"acc_stderr\": 0.017090573804217878,\n \"\ acc_norm\": 0.1981651376146789,\n \"acc_norm_stderr\": 0.017090573804217878\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.12037037037037036,\n \"acc_stderr\": 0.02219169594400172,\n \"\ acc_norm\": 0.12037037037037036,\n \"acc_norm_stderr\": 0.02219169594400172\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\ \ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\ : {\n \"acc\": 0.2742616033755274,\n \"acc_stderr\": 0.02904133351059804,\n\ \ \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.02904133351059804\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\ \ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\ \ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\ \ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 
0.2644628099173554,\n \"acc_stderr\": 0.04026187527591207,\n \"\ acc_norm\": 0.2644628099173554,\n \"acc_norm_stderr\": 0.04026187527591207\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\ \ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\ \ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.03322015795776741,\n\ \ \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.03322015795776741\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\ \ \"acc_stderr\": 0.04464285714285712,\n \"acc_norm\": 0.33035714285714285,\n\ \ \"acc_norm_stderr\": 0.04464285714285712\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\ \ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\ \ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\ \ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2388250319284802,\n\ \ \"acc_stderr\": 0.015246803197398691,\n \"acc_norm\": 0.2388250319284802,\n\ \ \"acc_norm_stderr\": 0.015246803197398691\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.023445826276545546,\n\ \ \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.023445826276545546\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24916201117318434,\n\ \ \"acc_stderr\": 0.014465893829859923,\n \"acc_norm\": 0.24916201117318434,\n\ \ \"acc_norm_stderr\": 
0.014465893829859923\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.23202614379084968,\n \"acc_stderr\": 0.02417084087934101,\n\ \ \"acc_norm\": 0.23202614379084968,\n \"acc_norm_stderr\": 0.02417084087934101\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\ \ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\ \ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460845,\n\ \ \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460845\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.20921985815602837,\n \"acc_stderr\": 0.02426476943998847,\n \ \ \"acc_norm\": 0.20921985815602837,\n \"acc_norm_stderr\": 0.02426476943998847\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\ \ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\ \ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.16911764705882354,\n \"acc_stderr\": 0.022770868010112997,\n\ \ \"acc_norm\": 0.16911764705882354,\n \"acc_norm_stderr\": 0.022770868010112997\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.2565359477124183,\n \"acc_stderr\": 0.017667841612378984,\n \ \ \"acc_norm\": 0.2565359477124183,\n \"acc_norm_stderr\": 0.017667841612378984\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n\ \ \"acc_stderr\": 0.03895091015724136,\n \"acc_norm\": 0.20909090909090908,\n\ \ \"acc_norm_stderr\": 0.03895091015724136\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.20816326530612245,\n \"acc_stderr\": 0.025991117672813292,\n\ \ \"acc_norm\": 0.20816326530612245,\n \"acc_norm_stderr\": 0.025991117672813292\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n 
\"acc\": 0.24875621890547264,\n\ \ \"acc_stderr\": 0.030567675938916707,\n \"acc_norm\": 0.24875621890547264,\n\ \ \"acc_norm_stderr\": 0.030567675938916707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \ \ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\ \ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n\ \ \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n\ \ \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n\ \ \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2460220318237454,\n\ \ \"mc1_stderr\": 0.015077219200662574,\n \"mc2\": 0.4955854635237609,\n\ \ \"mc2_stderr\": 0.01695340721579618\n }\n}\n```" repo_url: https://huggingface.co/malhajar/Platypus2-70B-instruct-4bit-gptq leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|arc:challenge|25_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hellaswag|10_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T12:30:11.519673.parquet' - 
'**/details_harness|hendrycksTest-astronomy|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T12:30:11.519673.parquet' - 
'**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-management|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T12:30:11.519673.parquet' - 
'**/details_harness|hendrycksTest-nutrition|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T12:30:11.519673.parquet' - 
'**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T12:30:11.519673.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-management|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T12:30:11.519673.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-08-26T12:30:11.519673.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T12:30:11.519673.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-management|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-marketing|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-virology|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T12:30:11.519673.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_08_26T12_30_11.519673 path: - '**/details_harness|truthfulqa:mc|0_2023-08-26T12:30:11.519673.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-08-26T12:30:11.519673.parquet' - config_name: results data_files: - split: 2023_08_26T12_30_11.519673 path: - results_2023-08-26T12:30:11.519673.parquet - split: latest path: - results_2023-08-26T12:30:11.519673.parquet --- # Dataset Card for Evaluation run of malhajar/Platypus2-70B-instruct-4bit-gptq ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/malhajar/Platypus2-70B-instruct-4bit-gptq - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model 
[malhajar/Platypus2-70B-instruct-4bit-gptq](https://huggingface.co/malhajar/Platypus2-70B-instruct-4bit-gptq) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_malhajar__Platypus2-70B-instruct-4bit-gptq",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-08-26T12:30:11.519673](https://huggingface.co/datasets/open-llm-leaderboard/details_malhajar__Platypus2-70B-instruct-4bit-gptq/blob/main/results_2023-08-26T12%3A30%3A11.519673.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.23568332946534118, "acc_stderr": 0.030875990616634128, "acc_norm": 0.23665349264658486, "acc_norm_stderr": 0.030890666475037305, "mc1": 0.2460220318237454, "mc1_stderr": 0.015077219200662574, "mc2": 0.4955854635237609, "mc2_stderr": 0.01695340721579618 }, "harness|arc:challenge|25": { "acc": 0.2363481228668942, "acc_stderr": 0.012414960524301829, "acc_norm": 0.2901023890784983, "acc_norm_stderr": 0.01326157367752077 }, "harness|hellaswag|10": { "acc": 0.2560246962756423, "acc_stderr": 0.004355436696716298, "acc_norm": 0.25951005775741887, "acc_norm_stderr": 0.0043746991892848605 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.22, "acc_stderr": 0.04163331998932268, "acc_norm": 0.22, "acc_norm_stderr": 0.04163331998932268 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.18518518518518517, "acc_stderr": 0.03355677216313142, "acc_norm": 0.18518518518518517, "acc_norm_stderr": 0.03355677216313142 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.21710526315789475, "acc_stderr": 0.033550453048829226, "acc_norm": 0.21710526315789475, "acc_norm_stderr": 0.033550453048829226 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.21132075471698114, "acc_stderr": 0.025125766484827845, "acc_norm": 0.21132075471698114, "acc_norm_stderr": 0.025125766484827845 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.24305555555555555, "acc_stderr": 0.0358687928008034, "acc_norm": 0.24305555555555555, "acc_norm_stderr": 0.0358687928008034 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.2, "acc_stderr": 0.04020151261036845, "acc_norm": 0.2, "acc_norm_stderr": 0.04020151261036845 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, 
"acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.21, "acc_stderr": 0.040936018074033256, "acc_norm": 0.21, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.21965317919075145, "acc_stderr": 0.031568093627031744, "acc_norm": 0.21965317919075145, "acc_norm_stderr": 0.031568093627031744 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.2647058823529412, "acc_stderr": 0.04389869956808778, "acc_norm": 0.2647058823529412, "acc_norm_stderr": 0.04389869956808778 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.26382978723404255, "acc_stderr": 0.028809989854102973, "acc_norm": 0.26382978723404255, "acc_norm_stderr": 0.028809989854102973 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2719298245614035, "acc_stderr": 0.041857744240220575, "acc_norm": 0.2719298245614035, "acc_norm_stderr": 0.041857744240220575 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2482758620689655, "acc_stderr": 0.036001056927277716, "acc_norm": 0.2482758620689655, "acc_norm_stderr": 0.036001056927277716 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.23809523809523808, "acc_stderr": 0.021935878081184763, "acc_norm": 0.23809523809523808, "acc_norm_stderr": 0.021935878081184763 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.19047619047619047, "acc_stderr": 0.035122074123020534, "acc_norm": 0.19047619047619047, "acc_norm_stderr": 0.035122074123020534 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.18, "acc_stderr": 0.038612291966536934, "acc_norm": 0.18, "acc_norm_stderr": 0.038612291966536934 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.19032258064516128, "acc_stderr": 0.022331707611823088, "acc_norm": 0.19032258064516128, "acc_norm_stderr": 0.022331707611823088 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.18226600985221675, "acc_stderr": 0.02716334085964515, "acc_norm": 0.18226600985221675, "acc_norm_stderr": 0.02716334085964515 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.28, "acc_stderr": 0.045126085985421255, "acc_norm": 0.28, "acc_norm_stderr": 0.045126085985421255 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03225078108306289, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.17676767676767677, "acc_stderr": 0.027178752639044915, "acc_norm": 0.17676767676767677, "acc_norm_stderr": 0.027178752639044915 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.20207253886010362, "acc_stderr": 0.02897908979429673, "acc_norm": 0.20207253886010362, "acc_norm_stderr": 0.02897908979429673 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.2, "acc_stderr": 0.020280805062535722, "acc_norm": 0.2, "acc_norm_stderr": 0.020280805062535722 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.24444444444444444, "acc_stderr": 0.02620276653465215, "acc_norm": 0.24444444444444444, "acc_norm_stderr": 0.02620276653465215 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.21008403361344538, "acc_stderr": 0.026461398717471874, "acc_norm": 0.21008403361344538, "acc_norm_stderr": 0.026461398717471874 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.1986754966887417, "acc_stderr": 0.032578473844367774, "acc_norm": 0.1986754966887417, "acc_norm_stderr": 0.032578473844367774 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.1981651376146789, "acc_stderr": 0.017090573804217878, "acc_norm": 0.1981651376146789, "acc_norm_stderr": 0.017090573804217878 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.12037037037037036, "acc_stderr": 0.02219169594400172, 
"acc_norm": 0.12037037037037036, "acc_norm_stderr": 0.02219169594400172 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.25, "acc_stderr": 0.03039153369274154, "acc_norm": 0.25, "acc_norm_stderr": 0.03039153369274154 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.2742616033755274, "acc_stderr": 0.02904133351059804, "acc_norm": 0.2742616033755274, "acc_norm_stderr": 0.02904133351059804 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.31390134529147984, "acc_stderr": 0.031146796482972465, "acc_norm": 0.31390134529147984, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.2595419847328244, "acc_stderr": 0.03844876139785271, "acc_norm": 0.2595419847328244, "acc_norm_stderr": 0.03844876139785271 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2644628099173554, "acc_stderr": 0.04026187527591207, "acc_norm": 0.2644628099173554, "acc_norm_stderr": 0.04026187527591207 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.25925925925925924, "acc_stderr": 0.042365112580946336, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.042365112580946336 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.2331288343558282, "acc_stderr": 0.03322015795776741, "acc_norm": 0.2331288343558282, "acc_norm_stderr": 0.03322015795776741 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.33035714285714285, "acc_stderr": 0.04464285714285712, "acc_norm": 0.33035714285714285, "acc_norm_stderr": 0.04464285714285712 }, "harness|hendrycksTest-management|5": { "acc": 0.17475728155339806, "acc_stderr": 0.037601780060266224, "acc_norm": 0.17475728155339806, "acc_norm_stderr": 0.037601780060266224 }, "harness|hendrycksTest-marketing|5": { "acc": 0.2905982905982906, "acc_stderr": 0.02974504857267404, "acc_norm": 0.2905982905982906, "acc_norm_stderr": 0.02974504857267404 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, 
"acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.2388250319284802, "acc_stderr": 0.015246803197398691, "acc_norm": 0.2388250319284802, "acc_norm_stderr": 0.015246803197398691 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.2543352601156069, "acc_stderr": 0.023445826276545546, "acc_norm": 0.2543352601156069, "acc_norm_stderr": 0.023445826276545546 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24916201117318434, "acc_stderr": 0.014465893829859923, "acc_norm": 0.24916201117318434, "acc_norm_stderr": 0.014465893829859923 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.23202614379084968, "acc_stderr": 0.02417084087934101, "acc_norm": 0.23202614379084968, "acc_norm_stderr": 0.02417084087934101 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.1864951768488746, "acc_stderr": 0.02212243977248077, "acc_norm": 0.1864951768488746, "acc_norm_stderr": 0.02212243977248077 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.2654320987654321, "acc_stderr": 0.024569223600460845, "acc_norm": 0.2654320987654321, "acc_norm_stderr": 0.024569223600460845 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.20921985815602837, "acc_stderr": 0.02426476943998847, "acc_norm": 0.20921985815602837, "acc_norm_stderr": 0.02426476943998847 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2457627118644068, "acc_stderr": 0.010996156635142692, "acc_norm": 0.2457627118644068, "acc_norm_stderr": 0.010996156635142692 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.16911764705882354, "acc_stderr": 0.022770868010112997, "acc_norm": 0.16911764705882354, "acc_norm_stderr": 0.022770868010112997 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.2565359477124183, "acc_stderr": 0.017667841612378984, "acc_norm": 0.2565359477124183, "acc_norm_stderr": 0.017667841612378984 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.20909090909090908, "acc_stderr": 0.03895091015724136, "acc_norm": 
0.20909090909090908, "acc_norm_stderr": 0.03895091015724136 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.20816326530612245, "acc_stderr": 0.025991117672813292, "acc_norm": 0.20816326530612245, "acc_norm_stderr": 0.025991117672813292 }, "harness|hendrycksTest-sociology|5": { "acc": 0.24875621890547264, "acc_stderr": 0.030567675938916707, "acc_norm": 0.24875621890547264, "acc_norm_stderr": 0.030567675938916707 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.27, "acc_stderr": 0.0446196043338474, "acc_norm": 0.27, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-virology|5": { "acc": 0.28313253012048195, "acc_stderr": 0.03507295431370518, "acc_norm": 0.28313253012048195, "acc_norm_stderr": 0.03507295431370518 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.3216374269005848, "acc_stderr": 0.03582529442573122, "acc_norm": 0.3216374269005848, "acc_norm_stderr": 0.03582529442573122 }, "harness|truthfulqa:mc|0": { "mc1": 0.2460220318237454, "mc1_stderr": 0.015077219200662574, "mc2": 0.4955854635237609, "mc2_stderr": 0.01695340721579618 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
snowfly/verification_guide_query_list
--- dataset_info: features: - name: 待验证query dtype: string - name: 可回答query的CSCO推荐data_id dtype: string splits: - name: train num_bytes: 7890 num_examples: 60 download_size: 6420 dataset_size: 7890 --- # Dataset Card for "verification_guide_query_list" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
webjunkie/ds
--- license: apache-2.0 ---
M-AI-C/en-tafsir-maarif
--- dataset_info: features: - name: ayah dtype: int64 - name: sorah dtype: int64 - name: sentence dtype: string - name: en-tafsir-maarif-html dtype: string - name: en-tafsir-maarif-text dtype: string splits: - name: train num_bytes: 24431167 num_examples: 6235 download_size: 13388809 dataset_size: 24431167 --- # Dataset Card for "en-tafsir-maarif" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Neekey/test_datase
--- license: mit ---
saibo/bookcorpus_small_compact_256_meta
--- dataset_info: features: - name: text dtype: string - name: concept_with_offset dtype: string - name: cid_arrangement sequence: int32 - name: schema_lengths sequence: int64 - name: topic_entity_mask sequence: int64 - name: text_lengths sequence: int64 splits: - name: train num_bytes: 213919213 num_examples: 6104 download_size: 45654115 dataset_size: 213919213 --- # Dataset Card for "bookcorpus_small_compact_256_meta" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
ZHENGRAN/code_ujb_testgenissue
--- dataset_info: features: - name: function dtype: string - name: testmethod dtype: string - name: location_fixed dtype: string - name: end_buggy dtype: int64 - name: location dtype: string - name: function_name dtype: string - name: source_buggy dtype: string - name: prompt_complete dtype: string - name: end_fixed dtype: int64 - name: comment dtype: string - name: bug_id dtype: string - name: start_fixed dtype: int64 - name: location_buggy dtype: string - name: source_dir dtype: string - name: prompt_chat dtype: string - name: start_buggy dtype: int64 - name: classes_modified sequence: string - name: task_id dtype: string - name: function_signature dtype: string - name: prompt_complete_without_signature dtype: string - name: project dtype: string - name: indent dtype: string - name: source_fixed dtype: string splits: - name: train num_bytes: 33813808 num_examples: 451 download_size: 8570988 dataset_size: 33813808 configs: - config_name: default data_files: - split: train path: data/train-* ---
chaoscodes/xp3
--- license: apache-2.0 ---
CyberHarem/rei_nikke
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---

# Dataset of rei/ライ/莱伊/라이 (Nikke: Goddess of Victory)

This is the dataset of rei/ライ/莱伊/라이 (Nikke: Goddess of Victory), containing 14 images and their tags. The core tags of this character are `hat, long_hair, purple_hair, ahoge, bangs, blue_headwear, beret, bow, one_side_up, purple_headwear, red_eyes`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------|:-----------|:------------|
| raw | 14 | 18.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rei_nikke/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 14 | 11.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rei_nikke/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 31 | 22.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rei_nikke/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 14 | 16.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rei_nikke/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 31 | 34.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rei_nikke/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/rei_nikke',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:---|:---|:---|:---|:---|:---|
| 0 | 14 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | blush, long_sleeves, open_mouth, 1girl, bag, jacket, solo, looking_at_viewer, bird, plaid_skirt, pleated_skirt, standing, white_shirt, black_footwear, collared_shirt, shoes, stuffed_animal, white_socks, :d, multiple_girls, school_uniform |

### Table Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | blush | long_sleeves | open_mouth | 1girl | bag | jacket | solo | looking_at_viewer | bird | plaid_skirt | pleated_skirt | standing | white_shirt | black_footwear | collared_shirt | shoes | stuffed_animal | white_socks | :d | multiple_girls | school_uniform |
|----:|----------:|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|
| 0 | 14 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
air_dialogue
--- annotations_creators: - crowdsourced language_creators: - machine-generated language: - en license: - cc-by-nc-4.0 multilinguality: - monolingual size_categories: - 100K<n<1M source_datasets: - original task_categories: - text-generation - fill-mask task_ids: - conversational - dialogue-generation - dialogue-modeling - language-modeling - masked-language-modeling pretty_name: AirDialogue dataset_info: - config_name: air_dialogue_data features: - name: action struct: - name: status dtype: string - name: name dtype: string - name: flight sequence: int32 - name: intent struct: - name: return_month dtype: string - name: return_day dtype: string - name: max_price dtype: int32 - name: departure_airport dtype: string - name: max_connections dtype: int32 - name: departure_day dtype: string - name: goal dtype: string - name: departure_month dtype: string - name: name dtype: string - name: return_airport dtype: string - name: timestamps sequence: int64 - name: dialogue sequence: string - name: expected_action struct: - name: status dtype: string - name: name dtype: string - name: flight sequence: int32 - name: search_info list: - name: button_name dtype: string - name: field_name dtype: string - name: field_value dtype: string - name: timestmamp dtype: int64 - name: correct_sample dtype: bool_ splits: - name: train num_bytes: 353718365 num_examples: 321459 - name: validation num_bytes: 44441818 num_examples: 40363 download_size: 141766743 dataset_size: 398160183 - config_name: air_dialogue_kb features: - name: kb list: - name: airline dtype: string - name: class dtype: string - name: departure_airport dtype: string - name: departure_day dtype: string - name: departure_month dtype: string - name: departure_time_num dtype: int32 - name: flight_number dtype: int32 - name: num_connections dtype: int32 - name: price dtype: int32 - name: return_airport dtype: string - name: return_day dtype: string - name: return_month dtype: string - name: return_time_num dtype: int32 - name: 
reservation dtype: int32 splits: - name: train num_bytes: 782590970 num_examples: 321459 - name: validation num_bytes: 98269609 num_examples: 40363 download_size: 57883938 dataset_size: 880860579 configs: - config_name: air_dialogue_data data_files: - split: train path: air_dialogue_data/train-* - split: validation path: air_dialogue_data/validation-* default: true - config_name: air_dialogue_kb data_files: - split: train path: air_dialogue_kb/train-* - split: validation path: air_dialogue_kb/validation-* --- # Dataset Card for air_dialogue ## Table of Contents - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contributions](#contributions) ## Dataset Description - **Homepage:** https://worksheets.codalab.org/worksheets/0xa79833f4b3c24f4188cee7131b120a59 - **Repository:** https://github.com/google/airdialogue - **Paper:** https://aclanthology.org/D18-1419/ - **Leaderboard:** https://worksheets.codalab.org/worksheets/0xa79833f4b3c24f4188cee7131b120a59 - **Point of Contact:** [AirDialogue-Google](mailto:airdialogue@gmail.com) - **Point of Contact:** [Wei 
Wei](mailto:wewei@google.com)

### Dataset Summary

AirDialogue is a large dataset that contains 402,038 goal-oriented conversations. To collect this dataset, we create a context generator which provides travel and flight restrictions. Human annotators are then asked to play the role of a customer or an agent and interact with the goal of successfully booking a trip given the restrictions.

News in v1.3:
- We have included the test split of the AirDialogue dataset.
- We have included the meta context for OOD2 in the original AirDialogue paper.

### Supported Tasks and Leaderboards

We use perplexity and BLEU score to evaluate the quality of the language generated by the model. We also compare the dialogue state s generated by the model and the ground-truth state s0. Two categories of metrics are used: exact-match scores and scaled scores.

The inference competition & leaderboard can be found here: https://worksheets.codalab.org/worksheets/0xa79833f4b3c24f4188cee7131b120a59

### Languages

The text in the dataset is in English. The BCP 47 code is `en`.

## Dataset Structure

### Data Instances

The data is provided in two sets of files.
The first one has the dialogues (`air_dialogue_data`) and the knowledge-base (`air_dialogue_kb`) BuilderConfig: `air_dialogue_data` ``` {"action": {"status": "book", "name": "Emily Edwards", "flight": [1027]}, "intent": {"return_month": "June", "return_day": "14", "max_price": 200, "departure_airport": "DFW", "return_time": "afternoon", "max_connections": 1, "departure_day": "12", "goal": "book", "departure_month": "June", "name": "Emily Edwards", "return_airport": "IAD"}, "timestamps": [1519233239, 1519233244, 1519233249, 1519233252, 1519233333, 1519233374, 1519233392, 1519233416, 1519233443, 1519233448, 1519233464, 1519233513, 1519233525, 1519233540, 1519233626, 1519233628, 1519233638], "dialogue": ["customer: Hello.", "agent: Hello.", "customer: My name is Emily Edwards.", "agent: How may I help you out?", "customer: I need some help in my flight ticket reservation to attend a convocation meeting, can you please help me?", "agent: Sure, I will help you out. May I know your travelling dates please?", "customer: Thank you and my dates are 06/12 and back on 06/14.", "agent: Can I know your airport codes?", "customer: The airport codes are from DFW to IAD.", "agent: Ok, please wait a moment.", "customer: Sure.", "agent: There is a flight with connection 1 and price 200, can I proceed with this flight?", "customer: Yes, do proceed with booking.", "agent: Ok, your ticket has been booked.", "customer: Thank you for your assistance in my flight ticket reservation.", "agent: Thank you for choosing us.", "customer: You are welcome."], "expected_action": {"status": "book", "name": "Emily Edwards", "flight": [1027]}, "correct_sample": true} ``` BuilderConfig: `air_dialogue_kb` ``` {"kb": [{"return_airport": "DTW", "airline": "Spirit", "departure_day": "12", "departure_airport": "IAD", "flight_number": 1000, "departure_month": "June", "departure_time_num": 17, "class": "economy", "return_time_num": 2, "return_month": "June", "return_day": "14", "num_connections": 1, "price": 
200}, {"return_airport": "DTW", "airline": "Frontier", "departure_day": "12", "departure_airport": "IAD", "flight_number": 1001, "departure_month": "June", "departure_time_num": 0, "class": "business", "return_time_num": 15, "return_month": "June", "return_day": "13", "num_connections": 0, "price": 500}, {"return_airport": "DTW", "airline": "JetBlue", "departure_day": "12", "departure_airport": "IAD", "flight_number": 1002, "departure_month": "June", "departure_time_num": 0, "class": "business", "return_time_num": 13, "return_month": "June", "return_day": "13", "num_connections": 1, "price": 600}, {"return_airport": "IAD", "airline": "Hawaiian", "departure_day": "12", "departure_airport": "DTW", "flight_number": 1003, "departure_month": "June", "departure_time_num": 6, "class": "economy", "return_time_num": 5, "return_month": "June", "return_day": "14", "num_connections": 1, "price": 200}, {"return_airport": "DFW", "airline": "AA", "departure_day": "12", "departure_airport": "DTW", "flight_number": 1004, "departure_month": "June", "departure_time_num": 9, "class": "economy", "return_time_num": 11, "return_month": "June", "return_day": "14", "num_connections": 1, "price": 100}, {"return_airport": "IAD", "airline": "AA", "departure_day": "12", "departure_airport": "DFW", "flight_number": 1005, "departure_month": "June", "departure_time_num": 3, "class": "economy", "return_time_num": 17, "return_month": "June", "return_day": "13", "num_connections": 1, "price": 100}, {"return_airport": "DTW", "airline": "Frontier", "departure_day": "12", "departure_airport": "IAD", "flight_number": 1006, "departure_month": "June", "departure_time_num": 10, "class": "economy", "return_time_num": 10, "return_month": "June", "return_day": "14", "num_connections": 1, "price": 100}, {"return_airport": "IAD", "airline": "UA", "departure_day": "12", "departure_airport": "DFW", "flight_number": 1007, "departure_month": "June", "departure_time_num": 14, "class": "economy", "return_time_num": 
20, "return_month": "June", "return_day": "13", "num_connections": 1, "price": 100}, {"return_airport": "DFW", "airline": "AA", "departure_day": "13", "departure_airport": "DTW", "flight_number": 1008, "departure_month": "June", "departure_time_num": 6, "class": "economy", "return_time_num": 8, "return_month": "June", "return_day": "14", "num_connections": 2, "price": 400}, {"return_airport": "DFW", "airline": "Delta", "departure_day": "12", "departure_airport": "IAD", "flight_number": 1009, "departure_month": "June", "departure_time_num": 18, "class": "economy", "return_time_num": 6, "return_month": "June", "return_day": "14", "num_connections": 1, "price": 200}, {"return_airport": "DFW", "airline": "Frontier", "departure_day": "13", "departure_airport": "DTW", "flight_number": 1010, "departure_month": "June", "departure_time_num": 4, "class": "economy", "return_time_num": 2, "return_month": "June", "return_day": "14", "num_connections": 1, "price": 100}, {"return_airport": "DFW", "airline": "Southwest", "departure_day": "12", "departure_airport": "DTW", "flight_number": 1011, "departure_month": "June", "departure_time_num": 17, "class": "economy", "return_time_num": 22, "return_month": "June", "return_day": "13", "num_connections": 0, "price": 100}, {"return_airport": "DTW", "airline": "JetBlue", "departure_day": "11", "departure_airport": "DFW", "flight_number": 1012, "departure_month": "June", "departure_time_num": 13, "class": "economy", "return_time_num": 22, "return_month": "June", "return_day": "13", "num_connections": 1, "price": 100}, {"return_airport": "DTW", "airline": "Southwest", "departure_day": "12", "departure_airport": "IAD", "flight_number": 1013, "departure_month": "June", "departure_time_num": 16, "class": "economy", "return_time_num": 13, "return_month": "June", "return_day": "14", "num_connections": 1, "price": 200}, {"return_airport": "DTW", "airline": "Delta", "departure_day": "12", "departure_airport": "IAD", "flight_number": 1014, 
"departure_month": "June", "departure_time_num": 0, "class": "economy", "return_time_num": 8, "return_month": "June", "return_day": "15", "num_connections": 1, "price": 100}, {"return_airport": "DTW", "airline": "Southwest", "departure_day": "12", "departure_airport": "DFW", "flight_number": 1015, "departure_month": "June", "departure_time_num": 17, "class": "economy", "return_time_num": 1, "return_month": "June", "return_day": "15", "num_connections": 1, "price": 300}, {"return_airport": "DTW", "airline": "UA", "departure_day": "11", "departure_airport": "DFW", "flight_number": 1016, "departure_month": "June", "departure_time_num": 10, "class": "economy", "return_time_num": 4, "return_month": "June", "return_day": "14", "num_connections": 0, "price": 200}, {"return_airport": "DFW", "airline": "AA", "departure_day": "12", "departure_airport": "DTW", "flight_number": 1017, "departure_month": "June", "departure_time_num": 14, "class": "economy", "return_time_num": 23, "return_month": "June", "return_day": "14", "num_connections": 2, "price": 400}, {"return_airport": "DTW", "airline": "JetBlue", "departure_day": "12", "departure_airport": "DFW", "flight_number": 1018, "departure_month": "June", "departure_time_num": 3, "class": "economy", "return_time_num": 1, "return_month": "June", "return_day": "14", "num_connections": 1, "price": 100}, {"return_airport": "DFW", "airline": "Hawaiian", "departure_day": "12", "departure_airport": "IAD", "flight_number": 1019, "departure_month": "June", "departure_time_num": 7, "class": "economy", "return_time_num": 18, "return_month": "June", "return_day": "14", "num_connections": 1, "price": 200}, {"return_airport": "DFW", "airline": "Delta", "departure_day": "12", "departure_airport": "IAD", "flight_number": 1020, "departure_month": "June", "departure_time_num": 6, "class": "economy", "return_time_num": 18, "return_month": "June", "return_day": "14", "num_connections": 2, "price": 200}, {"return_airport": "IAD", "airline": "Delta", 
"departure_day": "12", "departure_airport": "DFW", "flight_number": 1021, "departure_month": "June", "departure_time_num": 11, "class": "business", "return_time_num": 8, "return_month": "June", "return_day": "14", "num_connections": 0, "price": 1000}, {"return_airport": "IAD", "airline": "JetBlue", "departure_day": "12", "departure_airport": "DTW", "flight_number": 1022, "departure_month": "June", "departure_time_num": 4, "class": "economy", "return_time_num": 14, "return_month": "June", "return_day": "13", "num_connections": 0, "price": 200}, {"return_airport": "IAD", "airline": "Frontier", "departure_day": "12", "departure_airport": "DTW", "flight_number": 1023, "departure_month": "June", "departure_time_num": 19, "class": "economy", "return_time_num": 23, "return_month": "June", "return_day": "13", "num_connections": 1, "price": 200}, {"return_airport": "DFW", "airline": "UA", "departure_day": "12", "departure_airport": "DTW", "flight_number": 1024, "departure_month": "June", "departure_time_num": 11, "class": "economy", "return_time_num": 19, "return_month": "June", "return_day": "15", "num_connections": 1, "price": 200}, {"return_airport": "DTW", "airline": "Hawaiian", "departure_day": "11", "departure_airport": "IAD", "flight_number": 1025, "departure_month": "June", "departure_time_num": 6, "class": "economy", "return_time_num": 10, "return_month": "June", "return_day": "14", "num_connections": 1, "price": 100}, {"return_airport": "DTW", "airline": "UA", "departure_day": "12", "departure_airport": "DFW", "flight_number": 1026, "departure_month": "June", "departure_time_num": 0, "class": "economy", "return_time_num": 18, "return_month": "June", "return_day": "14", "num_connections": 1, "price": 300}, {"return_airport": "IAD", "airline": "Delta", "departure_day": "12", "departure_airport": "DFW", "flight_number": 1027, "departure_month": "June", "departure_time_num": 17, "class": "economy", "return_time_num": 15, "return_month": "June", "return_day": "14", 
"num_connections": 1, "price": 200}, {"return_airport": "IAD", "airline": "Southwest", "departure_day": "12", "departure_airport": "DTW", "flight_number": 1028, "departure_month": "June", "departure_time_num": 23, "class": "economy", "return_time_num": 13, "return_month": "June", "return_day": "14", "num_connections": 1, "price": 100}, {"return_airport": "DFW", "airline": "Spirit", "departure_day": "11", "departure_airport": "DTW", "flight_number": 1029, "departure_month": "June", "departure_time_num": 22, "class": "business", "return_time_num": 4, "return_month": "June", "return_day": "14", "num_connections": 0, "price": 800}], "reservation": 0} ```

### Data Fields

BuilderConfig: `air_dialogue_data`: Provides for customer context, dialogue states and environment.

| key name | Description |
|---|---|
| 'search_action' | search action performed by the customer |
| 'action' | Action taken by the agent |
| 'intent' | Intents from the conversation |
| 'timestamps' | Timestamp for each of the dialogues |
| 'dialogue' | Dialogue recorded between agent & customer |
| 'expected_action' | Expected action from the agent (human-annotated) |
| 'correct_sample' | whether the action performed by the agent was the same as expected_action |

BuilderConfig: `air_dialogue_kb`: Provides for the Agent Context _ca_ = (_db_, _r_).

| key name | Description |
|---|---|
| 'kb' | Available flights in the database |
| 'reservation' | whether the customer has an existing reservation |

### Data Splits

Data is split into Train/Dev & Test in the ratio of 80%, 10% and 10%.

## Dataset Creation

### Curation Rationale

[Needs More Information]

### Source Data

#### Initial Data Collection and Normalization

[Needs More Information]

#### Who are the source language producers?

[Needs More Information]

### Annotations

#### Annotation process

To collect this dataset, we create a context generator which provides travel and flight restrictions.
We then ask human annotators to play the role of a customer or an agent and interact with the goal of successfully booking a trip given the restrictions. Key to our environment is the ease of evaluating the success of the dialogue, which is achieved by using ground-truth states (e.g., the flight being booked) generated by the restrictions. Any dialogue agent that does not generate the correct states is considered to fail. #### Who are the annotators? [Needs More Information] ### Personal and Sensitive Information No personal and sensitive information is stored ## Considerations for Using the Data ### Social Impact of Dataset [Needs More Information] ### Discussion of Biases [Needs More Information] ### Other Known Limitations [Needs More Information] ## Additional Information ### Dataset Curators [AirDialogue team](mailto:airdialogue@gmail.com) For issues regarding HuggingFace Dataset Hub implementation [Aakash Gupta](mailto:aakashg80@gmail.com) ### Licensing Information cc-by-nc-4.0 ### Citation Information ```bibtex @inproceedings{wei-etal-2018-airdialogue, title = "{A}ir{D}ialogue: An Environment for Goal-Oriented Dialogue Research", author = "Wei, Wei and Le, Quoc and Dai, Andrew and Li, Jia", editor = "Riloff, Ellen and Chiang, David and Hockenmaier, Julia and Tsujii, Jun{'}ichi", booktitle = "Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing", month = oct # "-" # nov, year = "2018", address = "Brussels, Belgium", publisher = "Association for Computational Linguistics", url = "https://aclanthology.org/D18-1419", doi = "10.18653/v1/D18-1419", pages = "3844--3854", abstract = "Recent progress in dialogue generation has inspired a number of studies on dialogue systems that are capable of accomplishing tasks through natural language interactions. A promising direction among these studies is the use of reinforcement learning techniques, such as self-play, for training dialogue agents. 
However, current datasets are limited in size, and the environment for training agents and evaluating progress is relatively unsophisticated. We present AirDialogue, a large dataset that contains 301,427 goal-oriented conversations. To collect this dataset, we create a context-generator which provides travel and flight restrictions. We then ask human annotators to play the role of a customer or an agent and interact with the goal of successfully booking a trip given the restrictions. Key to our environment is the ease of evaluating the success of the dialogue, which is achieved by using ground-truth states (e.g., the flight being booked) generated by the restrictions. Any dialogue agent that does not generate the correct states is considered to fail. Our experimental results indicate that state-of-the-art dialogue models can only achieve a score of 0.17 while humans can reach a score of 0.91, which suggests significant opportunities for future improvement.", } ``` ### Contributions Thanks to [@skyprince999](https://github.com/skyprince999) for adding this dataset.
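As a practical note on the schema described under Data Fields, each `kb` entry is a flat dictionary of flight attributes, so agent-side constraint matching reduces to simple filtering. The sketch below is illustrative only (the `matching_flights` helper is hypothetical, not part of the dataset loader), using two of the sample flights shown in this card:

```python
# Hypothetical helper for filtering agent-context flights like those in
# the `kb` field of `air_dialogue_kb`; field names taken from the sample
# records in this card.

sample_kb = [
    {"airline": "Southwest", "departure_airport": "DTW", "return_airport": "IAD",
     "flight_number": 1028, "class": "economy", "num_connections": 1, "price": 100},
    {"airline": "Spirit", "departure_airport": "DTW", "return_airport": "DFW",
     "flight_number": 1029, "class": "business", "num_connections": 0, "price": 800},
]

def matching_flights(kb, max_price=None, travel_class=None):
    """Return the kb flights that satisfy the given (optional) constraints."""
    results = []
    for flight in kb:
        if max_price is not None and flight["price"] > max_price:
            continue
        if travel_class is not None and flight["class"] != travel_class:
            continue
        results.append(flight)
    return results

# Only flight 1028 costs at most 500.
print([f["flight_number"] for f in matching_flights(sample_kb, max_price=500)])  # [1028]
```

In the actual environment, whether the agent's final booking action matches the ground-truth state (see `expected_action` and `correct_sample` above) is what determines dialogue success.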
open-llm-leaderboard/details_OpenLemur__lemur-70b-chat-v1
--- pretty_name: Evaluation run of OpenLemur/lemur-70b-chat-v1 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [OpenLemur/lemur-70b-chat-v1](https://huggingface.co/OpenLemur/lemur-70b-chat-v1)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenLemur__lemur-70b-chat-v1\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-09-17T13:31:04.707005](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenLemur__lemur-70b-chat-v1/blob/main/results_2023-09-17T13-31-04.707005.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.006711409395973154,\n\ \ \"em_stderr\": 0.0008361500895152445,\n \"f1\": 0.0739702181208053,\n\ \ \"f1_stderr\": 0.001585201628872726,\n \"acc\": 0.5850941225115532,\n\ \ \"acc_stderr\": 0.01201805791264202\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.006711409395973154,\n \"em_stderr\": 0.0008361500895152445,\n\ \ \"f1\": 0.0739702181208053,\n \"f1_stderr\": 0.001585201628872726\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.35329795299469297,\n \ \ \"acc_stderr\": 0.013166337192115683\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.8168902920284136,\n \"acc_stderr\": 0.010869778633168358\n\ \ }\n}\n```" repo_url: https://huggingface.co/OpenLemur/lemur-70b-chat-v1 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|arc:challenge|25_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-08-24T04:11:57.870589.parquet' - config_name: harness_drop_3 data_files: - split: 2023_09_17T13_31_04.707005 path: - '**/details_harness|drop|3_2023-09-17T13-31-04.707005.parquet' - split: latest path: - '**/details_harness|drop|3_2023-09-17T13-31-04.707005.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_09_17T13_31_04.707005 path: - '**/details_harness|gsm8k|5_2023-09-17T13-31-04.707005.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-09-17T13-31-04.707005.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hellaswag|10_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_08_24T04_11_57.870589 
path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T04:11:57.870589.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-management|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T04:11:57.870589.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T04:11:57.870589.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T04:11:57.870589.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-management|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T04:11:57.870589.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-08-24T04:11:57.870589.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T04:11:57.870589.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-management|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-marketing|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-virology|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T04:11:57.870589.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_08_24T04_11_57.870589 path: - '**/details_harness|truthfulqa:mc|0_2023-08-24T04:11:57.870589.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-08-24T04:11:57.870589.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_09_17T13_31_04.707005 path: - '**/details_harness|winogrande|5_2023-09-17T13-31-04.707005.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-09-17T13-31-04.707005.parquet' - config_name: results data_files: - split: 2023_09_17T13_31_04.707005 path: - results_2023-09-17T13-31-04.707005.parquet - split: latest path: - results_2023-09-17T13-31-04.707005.parquet --- # Dataset Card for Evaluation run of OpenLemur/lemur-70b-chat-v1 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/OpenLemur/lemur-70b-chat-v1 - 
**Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model [OpenLemur/lemur-70b-chat-v1](https://huggingface.co/OpenLemur/lemur-70b-chat-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenLemur__lemur-70b-chat-v1",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-17T13:31:04.707005](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenLemur__lemur-70b-chat-v1/blob/main/results_2023-09-17T13-31-04.707005.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.006711409395973154,
        "em_stderr": 0.0008361500895152445,
        "f1": 0.0739702181208053,
        "f1_stderr": 0.001585201628872726,
        "acc": 0.5850941225115532,
        "acc_stderr": 0.01201805791264202
    },
    "harness|drop|3": {
        "em": 0.006711409395973154,
        "em_stderr": 0.0008361500895152445,
        "f1": 0.0739702181208053,
        "f1_stderr": 0.001585201628872726
    },
    "harness|gsm8k|5": {
        "acc": 0.35329795299469297,
        "acc_stderr": 0.013166337192115683
    },
    "harness|winogrande|5": {
        "acc": 0.8168902920284136,
        "acc_stderr": 0.010869778633168358
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
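The metrics in the "Latest results" block above are plain JSON and can be explored as an ordinary Python dict once parsed; a minimal sketch (the values are copied from the run above, and the variable names are illustrative, not part of the leaderboard API):

```python
import json

# Aggregated metrics copied from the "Latest results" block above.
results = json.loads("""
{
    "all": {"em": 0.006711409395973154, "f1": 0.0739702181208053, "acc": 0.5850941225115532},
    "harness|drop|3": {"em": 0.006711409395973154, "f1": 0.0739702181208053},
    "harness|gsm8k|5": {"acc": 0.35329795299469297, "acc_stderr": 0.013166337192115683},
    "harness|winogrande|5": {"acc": 0.8168902920284136, "acc_stderr": 0.010869778633168358}
}
""")

# Collect per-task accuracy, skipping the aggregate "all" entry and tasks
# (like drop) that report em/f1 instead of acc.
task_acc = {task: m["acc"] for task, m in results.items() if task != "all" and "acc" in m}
best_task = max(task_acc, key=task_acc.get)  # task with the highest accuracy
print(best_task, task_acc[best_task])
```

The same pattern applies to the full results parquet/JSON files linked above, which carry the complete per-task metric set.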
open-llm-leaderboard/details_Changgil__K2S3-Mistral-7b-v1.46
--- pretty_name: Evaluation run of Changgil/K2S3-Mistral-7b-v1.46 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Changgil/K2S3-Mistral-7b-v1.46](https://huggingface.co/Changgil/K2S3-Mistral-7b-v1.46)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Changgil__K2S3-Mistral-7b-v1.46\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-04-08T07:26:46.778850](https://huggingface.co/datasets/open-llm-leaderboard/details_Changgil__K2S3-Mistral-7b-v1.46/blob/main/results_2024-04-08T07-26-46.778850.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6267592703097891,\n\ \ \"acc_stderr\": 0.032306235635994135,\n \"acc_norm\": 0.6312457601112512,\n\ \ \"acc_norm_stderr\": 0.0329595615269791,\n \"mc1\": 0.3598531211750306,\n\ \ \"mc1_stderr\": 0.016801860466677157,\n \"mc2\": 0.5181608703310118,\n\ \ \"mc2_stderr\": 0.015007220671761173\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5691126279863481,\n \"acc_stderr\": 0.014471133392642466,\n\ \ \"acc_norm\": 0.6075085324232082,\n \"acc_norm_stderr\": 0.014269634635670731\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6340370444134634,\n\ \ \"acc_stderr\": 0.0048071469251620555,\n \"acc_norm\": 0.8348934475204143,\n\ \ \"acc_norm_stderr\": 0.0037051790292873276\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\ \ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\ \ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n\ \ \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\ \ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \ \ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n\ \ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\ \ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\ \ \"acc_norm_stderr\": 0.037455547914624555\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \ \ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\ \ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n\ \ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \ \ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \ \ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\ \ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\ \ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207763,\n\ \ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207763\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n\ \ \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\ \ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\ \ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\ \ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\ \ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.36772486772486773,\n \"acc_stderr\": 0.024833839825562424,\n \"\ acc_norm\": 0.36772486772486773,\n 
\"acc_norm_stderr\": 0.024833839825562424\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\ \ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\ \ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7548387096774194,\n\ \ \"acc_stderr\": 0.024472243840895518,\n \"acc_norm\": 0.7548387096774194,\n\ \ \"acc_norm_stderr\": 0.024472243840895518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175007,\n\ \ \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175007\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\ : 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n\ \ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\ : 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\ \ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \ \ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n\ \ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6333333333333333,\n \"acc_stderr\": 0.02443301646605246,\n \ \ \"acc_norm\": 0.6333333333333333,\n \"acc_norm_stderr\": 0.02443301646605246\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066485,\n \ \ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066485\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.03038835355188679,\n \ \ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03038835355188679\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\ acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612917,\n \"\ acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612917\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\ \ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\ : {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n\ \ \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \ \ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\ \ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\ \ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n\ \ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 
0.8264462809917356,\n \"acc_stderr\": 0.0345727283691767,\n \"acc_norm\"\ : 0.8264462809917356,\n \"acc_norm_stderr\": 0.0345727283691767\n },\n\ \ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\ \ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\ \ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n\ \ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\ \ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\ \ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\ \ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\ \ \"acc_stderr\": 0.022509033937077826,\n \"acc_norm\": 0.8632478632478633,\n\ \ \"acc_norm_stderr\": 0.022509033937077826\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8135376756066411,\n\ \ \"acc_stderr\": 0.013927751372001506,\n \"acc_norm\": 0.8135376756066411,\n\ \ \"acc_norm_stderr\": 0.013927751372001506\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n\ \ \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31508379888268156,\n\ \ \"acc_stderr\": 0.015536850852473636,\n \"acc_norm\": 0.31508379888268156,\n\ \ \"acc_norm_stderr\": 
0.015536850852473636\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729474,\n\ \ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729474\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\ \ \"acc_stderr\": 0.026003301117885142,\n \"acc_norm\": 0.7009646302250804,\n\ \ \"acc_norm_stderr\": 0.026003301117885142\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.025006469755799208,\n\ \ \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.025006469755799208\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \ \ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4491525423728814,\n\ \ \"acc_stderr\": 0.012704030518851488,\n \"acc_norm\": 0.4491525423728814,\n\ \ \"acc_norm_stderr\": 0.012704030518851488\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.02916312857067073,\n\ \ \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.02916312857067073\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6486928104575164,\n \"acc_stderr\": 0.01931267606578655,\n \ \ \"acc_norm\": 0.6486928104575164,\n \"acc_norm_stderr\": 0.01931267606578655\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\ \ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\ \ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.0293936093198798,\n\ \ \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.0293936093198798\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 
0.8557213930348259,\n\ \ \"acc_stderr\": 0.02484575321230604,\n \"acc_norm\": 0.8557213930348259,\n\ \ \"acc_norm_stderr\": 0.02484575321230604\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \ \ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\ \ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\ \ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\ \ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3598531211750306,\n\ \ \"mc1_stderr\": 0.016801860466677157,\n \"mc2\": 0.5181608703310118,\n\ \ \"mc2_stderr\": 0.015007220671761173\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.8042620363062352,\n \"acc_stderr\": 0.011151145042218329\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.42380591357088704,\n \ \ \"acc_stderr\": 0.01361163200881037\n }\n}\n```" repo_url: https://huggingface.co/Changgil/K2S3-Mistral-7b-v1.46 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|arc:challenge|25_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-04-08T07-26-46.778850.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|gsm8k|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_04_08T07_26_46.778850 path: - 
'**/details_harness|hellaswag|10_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T07-26-46.778850.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-management|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-04-08T07-26-46.778850.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T07-26-46.778850.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T07-26-46.778850.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-management|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T07-26-46.778850.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-04-08T07-26-46.778850.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T07-26-46.778850.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-international_law|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-management|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-marketing|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-sociology|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-virology|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T07-26-46.778850.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|truthfulqa:mc|0_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-04-08T07-26-46.778850.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_04_08T07_26_46.778850 path: - '**/details_harness|winogrande|5_2024-04-08T07-26-46.778850.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-04-08T07-26-46.778850.parquet' - config_name: results data_files: - split: 
2024_04_08T07_26_46.778850 path: - results_2024-04-08T07-26-46.778850.parquet - split: latest path: - results_2024-04-08T07-26-46.778850.parquet
---

# Dataset Card for Evaluation run of Changgil/K2S3-Mistral-7b-v1.46

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [Changgil/K2S3-Mistral-7b-v1.46](https://huggingface.co/Changgil/K2S3-Mistral-7b-v1.46) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Changgil__K2S3-Mistral-7b-v1.46",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2024-04-08T07:26:46.778850](https://huggingface.co/datasets/open-llm-leaderboard/details_Changgil__K2S3-Mistral-7b-v1.46/blob/main/results_2024-04-08T07-26-46.778850.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6267592703097891, "acc_stderr": 0.032306235635994135, "acc_norm": 0.6312457601112512, "acc_norm_stderr": 0.0329595615269791, "mc1": 0.3598531211750306, "mc1_stderr": 0.016801860466677157, "mc2": 0.5181608703310118, "mc2_stderr": 0.015007220671761173 }, "harness|arc:challenge|25": { "acc": 0.5691126279863481, "acc_stderr": 0.014471133392642466, "acc_norm": 0.6075085324232082, "acc_norm_stderr": 0.014269634635670731 }, "harness|hellaswag|10": { "acc": 0.6340370444134634, "acc_stderr": 0.0048071469251620555, "acc_norm": 0.8348934475204143, "acc_norm_stderr": 0.0037051790292873276 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6370370370370371, "acc_stderr": 0.04153948404742398, "acc_norm": 0.6370370370370371, "acc_norm_stderr": 0.04153948404742398 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.618421052631579, "acc_stderr": 0.03953173377749194, "acc_norm": 0.618421052631579, "acc_norm_stderr": 0.03953173377749194 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.57, "acc_stderr": 0.049756985195624284, "acc_norm": 0.57, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7094339622641509, "acc_stderr": 0.02794321998933714, "acc_norm": 0.7094339622641509, "acc_norm_stderr": 0.02794321998933714 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7222222222222222, "acc_stderr": 0.037455547914624555, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.037455547914624555 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.4, "acc_stderr": 0.04923659639173309, "acc_norm": 0.4, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, 
"acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6184971098265896, "acc_stderr": 0.03703851193099521, "acc_norm": 0.6184971098265896, "acc_norm_stderr": 0.03703851193099521 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.30392156862745096, "acc_stderr": 0.04576665403207763, "acc_norm": 0.30392156862745096, "acc_norm_stderr": 0.04576665403207763 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.8, "acc_stderr": 0.04020151261036846, "acc_norm": 0.8, "acc_norm_stderr": 0.04020151261036846 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5531914893617021, "acc_stderr": 0.0325005368436584, "acc_norm": 0.5531914893617021, "acc_norm_stderr": 0.0325005368436584 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4824561403508772, "acc_stderr": 0.04700708033551038, "acc_norm": 0.4824561403508772, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5448275862068965, "acc_stderr": 0.04149886942192117, "acc_norm": 0.5448275862068965, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.36772486772486773, "acc_stderr": 0.024833839825562424, "acc_norm": 0.36772486772486773, "acc_norm_stderr": 0.024833839825562424 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4603174603174603, "acc_stderr": 0.04458029125470973, "acc_norm": 0.4603174603174603, "acc_norm_stderr": 0.04458029125470973 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7548387096774194, "acc_stderr": 0.024472243840895518, "acc_norm": 0.7548387096774194, "acc_norm_stderr": 0.024472243840895518 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.47783251231527096, "acc_stderr": 0.03514528562175007, "acc_norm": 0.47783251231527096, "acc_norm_stderr": 0.03514528562175007 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7333333333333333, "acc_stderr": 0.03453131801885417, "acc_norm": 0.7333333333333333, "acc_norm_stderr": 0.03453131801885417 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.797979797979798, "acc_stderr": 0.02860620428922987, "acc_norm": 0.797979797979798, "acc_norm_stderr": 0.02860620428922987 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8652849740932642, "acc_stderr": 0.02463978909770944, "acc_norm": 0.8652849740932642, "acc_norm_stderr": 0.02463978909770944 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6333333333333333, "acc_stderr": 0.02443301646605246, "acc_norm": 0.6333333333333333, "acc_norm_stderr": 0.02443301646605246 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3148148148148148, "acc_stderr": 0.028317533496066485, "acc_norm": 0.3148148148148148, "acc_norm_stderr": 0.028317533496066485 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6764705882352942, "acc_stderr": 0.03038835355188679, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.03038835355188679 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2980132450331126, "acc_stderr": 0.037345356767871984, "acc_norm": 0.2980132450331126, "acc_norm_stderr": 0.037345356767871984 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8220183486238533, "acc_stderr": 0.016399436366612917, "acc_norm": 0.8220183486238533, "acc_norm_stderr": 0.016399436366612917 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5, "acc_stderr": 0.034099716973523674, "acc_norm": 
0.5, "acc_norm_stderr": 0.034099716973523674 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7892156862745098, "acc_stderr": 0.028626547912437406, "acc_norm": 0.7892156862745098, "acc_norm_stderr": 0.028626547912437406 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7552742616033755, "acc_stderr": 0.027985699387036423, "acc_norm": 0.7552742616033755, "acc_norm_stderr": 0.027985699387036423 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6771300448430493, "acc_stderr": 0.031381476375754995, "acc_norm": 0.6771300448430493, "acc_norm_stderr": 0.031381476375754995 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7786259541984732, "acc_stderr": 0.03641297081313729, "acc_norm": 0.7786259541984732, "acc_norm_stderr": 0.03641297081313729 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8264462809917356, "acc_stderr": 0.0345727283691767, "acc_norm": 0.8264462809917356, "acc_norm_stderr": 0.0345727283691767 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252626, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252626 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7484662576687117, "acc_stderr": 0.034089978868575295, "acc_norm": 0.7484662576687117, "acc_norm_stderr": 0.034089978868575295 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04697113923010212, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04697113923010212 }, "harness|hendrycksTest-management|5": { "acc": 0.8155339805825242, "acc_stderr": 0.03840423627288276, "acc_norm": 0.8155339805825242, "acc_norm_stderr": 0.03840423627288276 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8632478632478633, "acc_stderr": 0.022509033937077826, "acc_norm": 0.8632478632478633, "acc_norm_stderr": 0.022509033937077826 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, 
"acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8135376756066411, "acc_stderr": 0.013927751372001506, "acc_norm": 0.8135376756066411, "acc_norm_stderr": 0.013927751372001506 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7109826589595376, "acc_stderr": 0.02440517393578323, "acc_norm": 0.7109826589595376, "acc_norm_stderr": 0.02440517393578323 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.31508379888268156, "acc_stderr": 0.015536850852473636, "acc_norm": 0.31508379888268156, "acc_norm_stderr": 0.015536850852473636 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7352941176470589, "acc_stderr": 0.025261691219729474, "acc_norm": 0.7352941176470589, "acc_norm_stderr": 0.025261691219729474 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7009646302250804, "acc_stderr": 0.026003301117885142, "acc_norm": 0.7009646302250804, "acc_norm_stderr": 0.026003301117885142 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7191358024691358, "acc_stderr": 0.025006469755799208, "acc_norm": 0.7191358024691358, "acc_norm_stderr": 0.025006469755799208 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.49645390070921985, "acc_stderr": 0.02982674915328092, "acc_norm": 0.49645390070921985, "acc_norm_stderr": 0.02982674915328092 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4491525423728814, "acc_stderr": 0.012704030518851488, "acc_norm": 0.4491525423728814, "acc_norm_stderr": 0.012704030518851488 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6397058823529411, "acc_stderr": 0.02916312857067073, "acc_norm": 0.6397058823529411, "acc_norm_stderr": 0.02916312857067073 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6486928104575164, "acc_stderr": 0.01931267606578655, "acc_norm": 0.6486928104575164, "acc_norm_stderr": 0.01931267606578655 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7090909090909091, "acc_stderr": 0.04350271442923243, "acc_norm": 
0.7090909090909091, "acc_norm_stderr": 0.04350271442923243 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6979591836734694, "acc_stderr": 0.0293936093198798, "acc_norm": 0.6979591836734694, "acc_norm_stderr": 0.0293936093198798 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8557213930348259, "acc_stderr": 0.02484575321230604, "acc_norm": 0.8557213930348259, "acc_norm_stderr": 0.02484575321230604 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.03684529491774709, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774709 }, "harness|hendrycksTest-virology|5": { "acc": 0.5542168674698795, "acc_stderr": 0.03869543323472101, "acc_norm": 0.5542168674698795, "acc_norm_stderr": 0.03869543323472101 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8421052631578947, "acc_stderr": 0.027966785859160893, "acc_norm": 0.8421052631578947, "acc_norm_stderr": 0.027966785859160893 }, "harness|truthfulqa:mc|0": { "mc1": 0.3598531211750306, "mc1_stderr": 0.016801860466677157, "mc2": 0.5181608703310118, "mc2_stderr": 0.015007220671761173 }, "harness|winogrande|5": { "acc": 0.8042620363062352, "acc_stderr": 0.011151145042218329 }, "harness|gsm8k|5": { "acc": 0.42380591357088704, "acc_stderr": 0.01361163200881037 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. 
-->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->

[More Information Needed]

## Dataset Structure

<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->

[More Information Needed]

## Dataset Creation

### Curation Rationale

<!-- Motivation for the creation of this dataset. -->

[More Information Needed]

### Source Data

<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->

#### Data Collection and Processing

<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->

[More Information Needed]

#### Who are the source data producers?

<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->

[More Information Needed]

### Annotations [optional]

<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->

#### Annotation process

<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->

[More Information Needed]

#### Who are the annotators?

<!-- This section describes the people or systems who created the annotations.
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
pbevan11/image_gen_ocr_evaluation_data
--- license: apache-2.0 --- # image_gen_ocr_eval **Author:** Peter J. Bevan **Date:** 15/12/23 **github:** [https://github.com/pbevan1/image-gen-spelling-eval](https://github.com/pbevan1/image-gen-spelling-eval) --- *Table 1: Normalised Levenshtein similarity scores between instructed text and text present in image (as identified by OCR)* | Model | object | signage | natural | long | Overall | | --- | --- | --- | --- | --- | --- | | DALLE3 | 0.62 | 0.62 | 0.62 | 0.58 | 0.61 | | DeepFloydIF | 0.57 | 0.56 | 0.66 | 0.39 | 0.54 | | DALLE2 | 0.44 | 0.35 | 0.42 | 0.22 | 0.36 | | SDXL | 0.3 | 0.33 | 0.4 | 0.21 | 0.31 | | SD | 0.28 | 0.26 | 0.32 | 0.22 | 0.27 | | PlayGroundV2 | 0.19 | 0.23 | 0.17 | 0.2 | 0.2 | | Wuerstchen | 0.14 | 0.19 | 0.19 | 0.19 | 0.18 | | Kandinsky | 0.13 | 0.2 | 0.18 | 0.17 | 0.17 | --- This is a POC that calculates the normalised Levenshtein similarity between prompted text and the text present in the generated image (as recognised by OCR). To use this to create a metric, we create a dataset of prompts, each instructing the model to include some text in the image. We also provide a ground-truth column, which contains only the instructed text. The below scorer is then run on the generated images, comparing the target text with the actual text, outputting a score. The scores are then averaged to give a benchmark score. A score of 1 indicates a perfect match to the text. You can find the dataset at https://huggingface.co/datasets/pbevan11/image_gen_ocr_evaluation_data Since this metric solely looks at text within the generated images and not image quality as a whole, it should be used alongside other benchmarks such as those in https://karine-h.github.io/T2I-CompBench/. --- ![Image generation model spelling comparison](model_comparison.png) ``` @misc {peter_j._bevan_2024, author = { {Peter J.
Bevan} }, title = { image_gen_ocr_evaluation_data (Revision 6182779) }, year = 2024, url = { https://huggingface.co/datasets/pbevan11/image_gen_ocr_evaluation_data }, doi = { 10.57967/hf/1944 }, publisher = { Hugging Face } } ```
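For reference, the normalised Levenshtein similarity used above can be sketched in plain Python; this is an illustrative implementation, not the exact scorer from the linked repository:

```python
def levenshtein(a: str, b: str) -> int:
    # Classic dynamic-programming edit distance, row by row.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def normalised_similarity(target: str, ocr_text: str) -> float:
    # 1.0 means the OCR'd text matches the prompted text exactly.
    if not target and not ocr_text:
        return 1.0
    dist = levenshtein(target.lower(), ocr_text.lower())
    return 1.0 - dist / max(len(target), len(ocr_text))

print(normalised_similarity("hello world", "hello world"))  # 1.0
print(normalised_similarity("hello world", "helo world"))
```

Per-image scores like these would then be averaged over the dataset to give the benchmark numbers in Table 1.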
CyberHarem/focalors_genshin
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of focalors_genshin This is the dataset of focalors_genshin, containing 200 images and their tags. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). | Name | Images | Download | Description | |:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------| | raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. | | raw-stage3 | 464 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. | | 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. | | 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. | | 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. | | 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. | | 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. | | stage3-640 | 464 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. | | stage3-800 | 464 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. | | stage3-1200 | 464 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
alisson40889/friza
--- license: openrail ---
mteb/sts22-crosslingual-sts
--- language: - ar - de - en - es - fr - it - pl - ru - tr - zh --- Scores in this dataset have been inverted to be from least to most similar! The scores in the original STS22 task were from most to least similar.
vitruv/report_simplify
--- dataset_info: features: - name: name sequence: string - name: time sequence: int64 - name: score sequence: int64 - name: time_mean dtype: float64 - name: score_mean dtype: float64 - name: time_median dtype: int64 - name: score_median dtype: int64 - name: prompt dtype: string splits: - name: train num_bytes: 5260965 num_examples: 2900 - name: val num_bytes: 181435 num_examples: 100 - name: test num_bytes: 141592 num_examples: 100 download_size: 976397 dataset_size: 5583992 configs: - config_name: default data_files: - split: train path: data/train-* - split: val path: data/val-* - split: test path: data/test-* ---
bn22/dolphin-50k-mini
--- license: apache-2.0 ---
SuryaKrishna02/aya-telugu-jokes
--- annotations_creators: - expert-generated language: - te language_creators: - expert-generated license: - apache-2.0 multilinguality: - monolingual pretty_name: Telugu Jokes size_categories: - n<1K source_datasets: - original tags: - jokes - humor - fun conversations task_categories: - text-generation task_ids: - language-modeling --- # Summary `aya-telugu-jokes` is an open source dataset of instruct-style records generated by webscraping a Telugu Jokes website. This was created as part of [Aya Open Science Initiative](https://sites.google.com/cohere.com/aya-en/home) from Cohere For AI. This dataset can be used for any purpose, whether academic or commercial, under the terms of the [Apache 2.0](https://opensource.org/license/apache-2-0) License. Supported Tasks: - Training LLMs - Synthetic Data Generation - Data Augmentation Languages: Telugu Version: 1.0 # Dataset Overview `aya-telugu-jokes` is a corpus of more than 900 records generated by webscraping the Telugu Jokes website. This dataset can be used for the following task: - Given the title of a funny conversation, generate a funny conversation based on the title. # Intended Uses While immediately valuable as a corpus of instruction prompts for fine-tuning large language models, this dataset also presents a valuable opportunity for synthetic data generation. For example, prompt-completions could be submitted as few-shot examples to a large open language model to generate additional funny conversations and their titles. # Dataset ## Load with Datasets To load this dataset with Datasets, you'll just need to install Datasets as `pip install datasets --upgrade` and then use the following code: ```python from datasets import load_dataset ds = load_dataset('SuryaKrishna02/aya-telugu-jokes') ``` ## Purpose of Collection Telugu is a low-resource language for which, to the best of my knowledge, there is no instruct-style dataset for funny conversation generation.
This was created as part of [Aya Open Science Initiative](https://sites.google.com/cohere.com/aya-en/home) from Cohere For AI to make sure Telugu is well represented in the space of AI/ML. Unlike other datasets that are limited to non-commercial use, this dataset can be used, modified, and extended for any purpose, including academic or commercial applications. ## Sources - **Andhrajyothi Website**: Data was scraped from the [Andhrajyothi Website](https://lit.andhrajyothy.com/jokes/), which hosts funny conversations. Next, the scraped data was pre-processed, for example by removing unwanted characters. Finally, it was converted into instruct-style prompts and completions. ## Data Fields - `inputs` : Prompt or input to the language model. - `targets` : Completion or output of the language model. - `template_id` : Id of the template used in `inputs` and `targets`. - `template_lang`: ISO code of the language used in the `inputs` and `targets` where *tel* refers to Telugu. ## Templates For the creation of instruct-style prompts and completions from the scraped data, the following template category, with 14 different variations, was used: 1. Given the title of a funny conversation, generate a funny conversation based on the title.
| template_id | inputs | targets | |-------------|--------|---------| | 1 | ```{{Title}} అనే శీర్షిక తో జోక్ ఇవ్వు``` | ```శీర్షిక: {{Title}}\n\n{{Funny Conversation}}``` | | 2 | ```{{Title}} అనే టైటిల్ తో జోక్ ఇవ్వు``` | ```శీర్షిక: {{Title}}\n\n{{Funny Conversation}}``` | | 3 | ```ఒక హాస్య సంభాషణ ఇవ్వు మరియు దాని యొక్క శీర్షిక {{Title}} ఉండే లాగా ఇవ్వు.``` | ```శీర్షిక: {{Title}}\n\n{{Funny Conversation}}``` | | 4 | ```ఒక చిన్న హాస్య సన్నివేశం ఇవ్వు మరియు దాని యొక్క శీర్షిక {{Title}} ఉండే లాగా ఇవ్వు.``` | ```శీర్షిక: {{Title}}\n\n{{Funny Conversation}}``` | | 5 | ```ఒక చమత్కారమయిన సంభాషణ ఇవ్వు మరియు దాని యొక్క శీర్షిక {{Title}} ఉండే లాగా ఇవ్వు.``` | ```శీర్షిక: {{Title}}\n\n{{Funny Conversation}}``` | | 6 | ```ఒక చిన్న చమత్కారమయిన సన్నివేశం ఇవ్వు మరియు దాని యొక్క శీర్షిక {{Title}} ఉండే లాగా ఇవ్వు.``` | ```శీర్షిక: {{Title}}\n\n{{Funny Conversation}}``` | | 7 | ```ఒక తమాషా అయినా సంభాషణ ఇవ్వు మరియు దాని యొక్క శీర్షిక {{Title}} ఉండే లాగా ఇవ్వు.``` | ```శీర్షిక: {{Title}}\n\n{{Funny Conversation}}``` | | 8 | ```ఒక చిన్న తమాషా అయినా సన్నివేశం ఇవ్వు మరియు దాని యొక్క శీర్షిక {{Title}} ఉండే లాగా ఇవ్వు.``` | ```శీర్షిక: {{Title}}\n\n{{Funny Conversation}}``` | | 9 | ```ఒక హాస్య సంభాషణ ఇవ్వు మరియు దాని యొక్క టైటిల్ {{Title}} ఉండే లాగా ఇవ్వు.``` | ```శీర్షిక: {{Title}}\n\n{{Funny Conversation}}``` | | 10 | ```ఒక చిన్న హాస్య సన్నివేశం ఇవ్వు మరియు దాని యొక్క టైటిల్ {{Title}} ఉండే లాగా ఇవ్వు.``` | ```శీర్షిక: {{Title}}\n\n{{Funny Conversation}}``` | | 11 | ```ఒక చమత్కారమయిన సంభాషణ ఇవ్వు మరియు దాని యొక్క టైటిల్ {{Title}} ఉండే లాగా ఇవ్వు.``` | ```శీర్షిక: {{Title}}\n\n{{Funny Conversation}}``` | | 12 | ```ఒక చిన్న చమత్కారమయిన సన్నివేశం ఇవ్వు మరియు దాని యొక్క టైటిల్ {{Title}} ఉండే లాగా ఇవ్వు.``` | ```శీర్షిక: {{Title}}\n\n{{Funny Conversation}}``` | | 13 | ```ఒక తమాషా అయినా సంభాషణ ఇవ్వు మరియు దాని యొక్క టైటిల్ {{Title}} ఉండే లాగా ఇవ్వు.``` | ```శీర్షిక: {{Title}}\n\n{{Funny Conversation}}``` | | 14 | ```ఒక చిన్న తమాషా అయినా సన్నివేశం ఇవ్వు మరియు దాని యొక్క టైటిల్ {{Title}} ఉండే 
లాగా ఇవ్వు.``` | ```శీర్షిక: {{Title}}\n\n{{Funny Conversation}}``` | ## Personal or Sensitive Data This dataset contains public information. To our knowledge, it contains no personal identifiers or sensitive information about private individuals. ## Language Telugu # Known Limitations - The dataset is scraped from a jokes website, and its contents may reflect bias, factual errors, or inappropriate and sensitive material. - Although utmost care was taken to keep the dataset monolingual, some records may contain English alongside Telugu. # Contributors [SuryaKrishna02](https://github.com/SuryaKrishna02) and [Desik98](https://github.com/desik1998)
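For illustration, a record like those produced by the templates above could be assembled as follows. The `{title}`-style placeholders are a Python adaptation of the `{{Title}}` notation in the table, and the helper function is hypothetical, not part of the actual pipeline:

```python
# Template 1 from the table above, rewritten with Python str.format placeholders.
TEMPLATE_INPUT = "{title} అనే శీర్షిక తో జోక్ ఇవ్వు"
TEMPLATE_TARGET = "శీర్షిక: {title}\n\n{conversation}"  # real newlines, shown as \n\n in the table

def build_record(title: str, conversation: str) -> dict:
    # Produces one instruct-style row matching the documented data fields.
    return {
        "inputs": TEMPLATE_INPUT.format(title=title),
        "targets": TEMPLATE_TARGET.format(title=title, conversation=conversation),
        "template_id": 1,
        "template_lang": "tel",
    }

rec = build_record("ఉదాహరణ", "A: ...\nB: ...")
print(rec["template_id"])  # 1
```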
Intel/neural-chat-dataset-v1-1
--- license: apache-2.0 --- Here is a collective list of instruction datasets used for Neural Chat fine-tuning. The total numbers of instruction samples and tokens are about 1.1M and 326M, respectively. | Type | Language | Dataset | Number | |--| ---- |--------|----| | HC3 | en | [HC3](https://huggingface.co/datasets/Hello-SimpleAI/HC3) | 24K | | dolly | en | [databricks-dolly-15k](https://huggingface.co/datasets/databricks/databricks-dolly-15k) | 15K | | alpaca-zh | zh | [tigerbot-alpaca-zh-0.5m](https://huggingface.co/datasets/TigerResearch/tigerbot-alpaca-zh-0.5m) | 500K | | alpaca-en | en | [TigerResearch/tigerbot-alpaca-en-50k](https://huggingface.co/datasets/TigerResearch/tigerbot-alpaca-en-50k) | 50K | | math | en | [tigerbot-gsm-8k-en](https://huggingface.co/datasets/TigerResearch/tigerbot-gsm-8k-en) | 8K | | general | en | [tigerbot-stackexchange-qa-en-0.5m](https://huggingface.co/datasets/TigerResearch/tigerbot-stackexchange-qa-en-0.5m) | 500K | The collective dataset has been validated on multiple LLMs (such as MPT and LLaMA) by the NeuralChat team (Kaokao Lv, Wenxin Zhang, Xuhui Ren, and Haihao Shen) from Intel/SATG/AIA/AIPT. Thanks to [Hello-SimpleAI](https://huggingface.co/Hello-SimpleAI), [databricks](https://huggingface.co/databricks), [TigerResearch/TigerBot](https://github.com/TigerResearch/TigerBot) for releasing the open-source instruction datasets.
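Since the listed sources ship with different column layouts, combining them requires mapping each onto a shared schema first. The sketch below is a hypothetical illustration of that step; the field names in `FIELD_MAP` are assumptions for demonstration, not the actual columns of these datasets:

```python
# Map each source's field names onto a common (instruction, response) schema.
# These field names are illustrative assumptions, not the real columns.
FIELD_MAP = {
    "dolly":     ("instruction", "response"),
    "hc3":       ("question", "human_answers"),
    "alpaca-en": ("instruction", "output"),
}

def unify(source: str, record: dict) -> dict:
    instr_key, resp_key = FIELD_MAP[source]
    resp = record[resp_key]
    if isinstance(resp, list):  # some sources store a list of answers; take the first
        resp = resp[0]
    return {"instruction": record[instr_key], "response": resp}

corpus = [
    unify("dolly", {"instruction": "Define NLP.", "response": "Natural language processing."}),
    unify("hc3", {"question": "What is 2+2?", "human_answers": ["4"]}),
]
print(len(corpus))  # 2
```

A unified corpus like this could then be shuffled and fed to a standard instruction fine-tuning pipeline.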
atluzz/tuntun_faq_guanaco_compr
--- license: apache-2.0 ---
PavolPragoData/testSet
--- language: - en size_categories: - n<1K --- Q: What is the general workflow suggested for using the GitHub client? A: The general workflow for using the GitHub client is referred to as the "GitHub Flow," which involves committing to a branch and syncing up with a remote repository regularly. Q: How does branch management differ between the GitHub clients on macOS and Windows? A: On macOS, branch management in the GitHub client involves a button at the top of the window for creating a new branch. On Windows, creating a branch is done by typing the new branch’s name in the branch-switching widget. Q: What is the main way to interact with other repositories over the network in the GitHub client? A: The main way to interact with other repositories over the network in the GitHub client is through the “Sync” feature, which internally uses a combination of Git operations such as pull, push, fetch, merge, and rebase. Q: What happens when you click the Sync button in the GitHub client? A: When the Sync button is clicked, it first performs a git pull --rebase, and if that fails due to a merge conflict, it falls back to git pull --no-rebase. Then, it executes git push.
wolfbr950/dataset2
--- license: apache-2.0 ---
Ahmad-Sarmad-Ali/SHEAKESPEAR-QNA
--- license: mit ---
autoevaluate/autoeval-staging-eval-project-samsum-0c672345-10275367
--- type: predictions tags: - autotrain - evaluation datasets: - samsum eval_info: task: summarization model: facebook/bart-large-cnn metrics: [] dataset_name: samsum dataset_config: samsum dataset_split: train col_mapping: text: dialogue target: summary --- # Dataset Card for AutoTrain Evaluator This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset: * Task: Summarization * Model: facebook/bart-large-cnn * Dataset: samsum To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator). ## Contributions Thanks to [@ikadebi](https://huggingface.co/ikadebi) for evaluating this model.
bond005/sberdevices_golos_100h_farfield
--- pretty_name: Golos annotations_creators: - expert-generated language_creators: - crowdsourced - expert-generated language: - ru license: - other multilinguality: - monolingual paperswithcode_id: golos size_categories: - 10K<n<100k source_datasets: - extended task_categories: - automatic-speech-recognition - audio-classification --- # Dataset Card for sberdevices_golos_100h_farfield ## Table of Contents - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contributions](#contributions) ## Dataset Description - **Homepage:** [Golos ASR corpus](https://www.openslr.org/114) - **Repository:** [Golos dataset](https://github.com/sberdevices/golos) - **Paper:** [Golos: Russian Dataset for Speech Research](https://arxiv.org/pdf/2106.10161.pdf) - **Leaderboard:** [The 🤗 Speech Bench](https://huggingface.co/spaces/huggingface/hf-speech-bench) - **Point of Contact:** [Nikolay Karpov](mailto:karpnv@gmail.com) ### Dataset Summary Sberdevices Golos is a corpus of approximately 1200 hours of 16kHz Russian speech from crowd (reading speech) and farfield (communication with smart 
devices) domains, prepared by the SberDevices Team (Alexander Denisenko, Angelina Kovalenko, Fedor Minkin, and Nikolay Karpov). The data is derived from a crowd-sourcing platform and has been manually annotated. The authors divide the whole dataset into train and test subsets. The training subset includes approximately 1000 hours. For experiments with a limited number of records, the authors identified training subsets of shorter length: 100 hours, 10 hours, 1 hour, and 10 minutes. This dataset is a simpler version of the above-mentioned Golos: - it includes the farfield domain only (without any sound from the crowd domain); - the validation split is built on the 10-hour training subset; - the training split corresponds to the 100-hour training subset without sounds from the 10-hour training subset; - the test split is the full original test split. ### Supported Tasks and Leaderboards - `automatic-speech-recognition`: The dataset can be used to train a model for Automatic Speech Recognition (ASR). The model is presented with an audio file and asked to transcribe it to written text. The most common evaluation metric is the word error rate (WER). The task has an active Hugging Face leaderboard, which can be found at https://huggingface.co/spaces/huggingface/hf-speech-bench. The leaderboard ranks models uploaded to the Hub based on their WER. ### Languages The audio is in Russian. ## Dataset Structure ### Data Instances A typical data point comprises the audio data, usually called `audio`, and its transcription, called `transcription`. Any additional information about the speaker and the passage which contains the transcription is not provided.
``` {'audio': {'path': None, 'array': array([ 1.22070312e-04, 1.22070312e-04, 9.15527344e-05, ..., 6.10351562e-05, 6.10351562e-05, 3.05175781e-05]), dtype=float64), 'sampling_rate': 16000}, 'transcription': 'джой источники истории турции'} ``` ### Data Fields - audio: A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: `dataset[0]["audio"]` the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the `"audio"` column, *i.e.* `dataset[0]["audio"]` should **always** be preferred over `dataset["audio"][0]`. - transcription: the transcription of the audio file. ### Data Splits This dataset is a simpler version of the original Golos: - it includes the farfield domain only (without any sound from the crowd domain); - validation split is built on the 10-hour training subset; - training split corresponds to the 100-hour training subset without sounds from the 10-hour training subset; - test split is a full original test split. | | Train | Validation | Test | | ----- | ------ | ---------- | ----- | | examples | 9570 | 933 | 1916 | | hours | 10.3h | 1.0h | 1.4h | ## Dataset Creation ### Curation Rationale [Needs More Information] ### Source Data #### Initial Data Collection and Normalization [Needs More Information] #### Who are the source language producers? [Needs More Information] ### Annotations #### Annotation process All recorded audio files were manually annotated on the crowd-sourcing platform. #### Who are the annotators? [Needs More Information] ### Personal and Sensitive Information The dataset consists of people who have donated their voice. You agree to not attempt to determine the identity of speakers in this dataset. 
## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [Needs More Information] ## Additional Information ### Dataset Curators The dataset was initially created by Alexander Denisenko, Angelina Kovalenko, Fedor Minkin, and Nikolay Karpov. ### Licensing Information [Public license with attribution and conditions reserved](https://github.com/sberdevices/golos/blob/master/license/en_us.pdf) ### Citation Information ``` @misc{karpov2021golos, author = {Karpov, Nikolay and Denisenko, Alexander and Minkin, Fedor}, title = {Golos: Russian Dataset for Speech Research}, publisher = {arXiv}, year = {2021}, url = {https://arxiv.org/abs/2106.10161} } ``` ### Contributions Thanks to [@bond005](https://github.com/bond005) for adding this dataset.
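As a footnote to the WER metric named in the Supported Tasks section above, word error rate can be computed with a minimal sketch like the following (illustrative only; the leaderboard's scorer may additionally normalise text):

```python
def wer(reference: str, hypothesis: str) -> float:
    # Word error rate: word-level edit distance divided by reference length.
    r, h = reference.split(), hypothesis.split()
    prev = list(range(len(h) + 1))
    for i, rw in enumerate(r, 1):
        curr = [i]
        for j, hw in enumerate(h, 1):
            curr.append(min(prev[j] + 1,               # deletion
                            curr[j - 1] + 1,           # insertion
                            prev[j - 1] + (rw != hw))) # substitution
        prev = curr
    return prev[-1] / max(len(r), 1)

# One substituted word out of four -> WER 0.25.
print(wer("джой источники истории турции", "джой источники истории турция"))  # 0.25
```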
ovior/twitter_dataset_1713127382
--- dataset_info: features: - name: id dtype: string - name: tweet_content dtype: string - name: user_name dtype: string - name: user_id dtype: string - name: created_at dtype: string - name: url dtype: string - name: favourite_count dtype: int64 - name: scraped_at dtype: string - name: image_urls dtype: string splits: - name: train num_bytes: 2248910 num_examples: 7113 download_size: 1260609 dataset_size: 2248910 configs: - config_name: default data_files: - split: train path: data/train-* ---
vwxyzjn/ultrachat_200k_filtered_1707919115
--- dataset_info: features: - name: prompt dtype: string - name: prompt_id dtype: string - name: messages list: - name: content dtype: string - name: role dtype: string - name: query_reference_response list: - name: content dtype: string - name: role dtype: string - name: query_reference_response_token sequence: int64 - name: query_reference_response_token_len dtype: int64 - name: query list: - name: content dtype: string - name: role dtype: string - name: query_token sequence: int64 - name: query_token_len dtype: int64 - name: reference_response struct: - name: content dtype: string - name: role dtype: string - name: reference_response_token sequence: int64 - name: reference_response_token_len dtype: int64 splits: - name: test_gen num_bytes: 30484069 num_examples: 1000 - name: test_sft num_bytes: 39592502 num_examples: 1000 - name: train_gen num_bytes: 29613744 num_examples: 1000 - name: train_sft num_bytes: 39521233 num_examples: 1000 download_size: 50859072 dataset_size: 139211548 --- # Dataset Card for "ultrachat_200k_filtered_1707919115" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
davidfant/natural-questions-chunk-19
--- dataset_info: features: - name: id dtype: string - name: document struct: - name: html dtype: string - name: title dtype: string - name: tokens sequence: - name: end_byte dtype: int64 - name: is_html dtype: bool - name: start_byte dtype: int64 - name: token dtype: string - name: url dtype: string - name: question struct: - name: text dtype: string - name: tokens sequence: string - name: long_answer_candidates sequence: - name: end_byte dtype: int64 - name: end_token dtype: int64 - name: start_byte dtype: int64 - name: start_token dtype: int64 - name: top_level dtype: bool - name: annotations sequence: - name: id dtype: string - name: long_answer struct: - name: candidate_index dtype: int64 - name: end_byte dtype: int64 - name: end_token dtype: int64 - name: start_byte dtype: int64 - name: start_token dtype: int64 - name: short_answers sequence: - name: end_byte dtype: int64 - name: end_token dtype: int64 - name: start_byte dtype: int64 - name: start_token dtype: int64 - name: text dtype: string - name: yes_no_answer dtype: class_label: names: '0': 'NO' '1': 'YES' splits: - name: train num_bytes: 4658446669 num_examples: 10000 download_size: 1809697228 dataset_size: 4658446669 --- # Dataset Card for "natural-questions-chunk-19" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
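The `start_byte`/`end_byte` fields in the schema above index into the UTF-8 bytes of `document.html`; a minimal sketch of recovering an answer span (with a made-up document and hypothetical offsets, not real Natural Questions data) might look like:

```python
# Illustrative record shaped like the schema above; the document and offsets are made up.
doc_html = "<p>The capital of France is Paris.</p>".encode("utf-8")
short_answer = {"start_byte": 28, "end_byte": 33}  # hypothetical offsets covering "Paris"

def span_text(html_bytes: bytes, start: int, end: int) -> str:
    # Offsets are byte positions, so slice the raw bytes first, then decode.
    return html_bytes[start:end].decode("utf-8")

print(span_text(doc_html, short_answer["start_byte"], short_answer["end_byte"]))  # Paris
```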
GokhanAI/Synthetic
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* dataset_info: features: - name: prompt dtype: string - name: response dtype: string - name: chosen dtype: string - name: rejected dtype: string splits: - name: train num_bytes: 15850263.160279274 num_examples: 24354 - name: test num_bytes: 1301655.8397207255 num_examples: 2000 download_size: 5685426 dataset_size: 17151919.0 --- # Dataset Card for "AGENT_V3" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
KIND-Dataset/Open-ended_Questions_dialectal_data
--- license: cc-by-4.0 task_categories: - question-answering language: - ar size_categories: - n<1K --- ### Dataset Summary A collection of open-ended questions that was provided to the data-marathon competitors to populate the KIND dataset. It was designed to elicit longer responses and cultural, context-rich sentences. For more details, please check the paper [The KIND Dataset: A Social Collaboration Approach for Nuanced Dialect Data Collection](https://aclanthology.org/2024.eacl-srw.3/) ### Citation Information ``` @inproceedings{yamani-etal-2024-kind, title = "The {KIND} Dataset: A Social Collaboration Approach for Nuanced Dialect Data Collection", author = "Yamani, Asma and Alziyady, Raghad and AlYami, Reem and Albelali, Salma and Albelali, Leina and Almulhim, Jawharah and Alsulami, Amjad and Alfarraj, Motaz and Al-Zaidy, Rabeah", editor = "Falk, Neele and Papi, Sara and Zhang, Mike", booktitle = "Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics: Student Research Workshop", month = mar, year = "2024", address = "St. Julian{'}s, Malta", publisher = "Association for Computational Linguistics", url = "https://aclanthology.org/2024.eacl-srw.3", pages = "32--43", } ```
CyberHarem/miura_yumiko_yahariorenoseishunlovecomewamachigatteiru
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of Miura Yumiko (Yahari Ore no Seishun LoveCome wa Machigatte Iru) This is the dataset of Miura Yumiko (Yahari Ore no Seishun LoveCome wa Machigatte Iru), containing 190 images and their tags. The core tags of this character are `long_hair, blonde_hair, green_eyes, drill_hair`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 190 | 100.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/miura_yumiko_yahariorenoseishunlovecomewamachigatteiru/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 190 | 88.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/miura_yumiko_yahariorenoseishunlovecomewamachigatteiru/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 372 | 167.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/miura_yumiko_yahariorenoseishunlovecomewamachigatteiru/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. 
| | 1200 | 190 | 100.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/miura_yumiko_yahariorenoseishunlovecomewamachigatteiru/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 372 | 184.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/miura_yumiko_yahariorenoseishunlovecomewamachigatteiru/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/miura_yumiko_yahariorenoseishunlovecomewamachigatteiru', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 8 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, from_side, profile, bangs, blurry_background, indoors, closed_mouth, holding, open_mouth, reading, school_uniform, upper_body | | 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, solo, earrings, profile, school_uniform, from_side, upper_body, black_jacket, ribbon | | 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, blazer, indoors, school_uniform, solo, sitting, chair, classroom, school_bag, school_desk, black_jacket, collared_shirt, long_sleeves, open_mouth, plaid, red_ribbon, skirt, smartphone, white_shirt | | 3 | 23 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, solo, sobu_high_school_uniform, ribbon, blazer, black_jacket, shirt, upper_body | | 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, crossed_arms, indoors, school_uniform, solo, sweater_vest, earrings, 
long_sleeves, school_desk, white_shirt, chair, classroom, looking_at_viewer, sitting, closed_mouth, collared_shirt, open_mouth | | 5 | 5 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, cloud, day, earrings, tree, outdoors, school_uniform, solo, blue_sky, crossed_arms, ribbon, anime_coloring, blazer, chain-link_fence, frown, shirt | | 6 | 6 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, collarbone, upper_body, fur_trim, solo, striped_shirt, blurry, orange_jacket, looking_at_viewer | | 7 | 12 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, tennis_racket, tennis_uniform, solo, chain-link_fence, outdoors, skirt | | 8 | 6 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, earrings, off_shoulder, short_sleeves, bare_shoulders, closed_mouth, red_dress, bangs, crossed_arms, breasts, solo | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | from_side | profile | bangs | blurry_background | indoors | closed_mouth | holding | open_mouth | reading | school_uniform | upper_body | earrings | black_jacket | ribbon | blazer | sitting | chair | classroom | school_bag | school_desk | collared_shirt | long_sleeves | plaid | red_ribbon | skirt | smartphone | white_shirt | sobu_high_school_uniform | shirt | crossed_arms | sweater_vest | looking_at_viewer | cloud | day | tree | outdoors | blue_sky | anime_coloring | chain-link_fence | frown | collarbone | fur_trim | striped_shirt | blurry | orange_jacket | tennis_racket | tennis_uniform 
| off_shoulder | short_sleeves | bare_shoulders | red_dress | breasts | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:------------|:----------|:--------|:--------------------|:----------|:---------------|:----------|:-------------|:----------|:-----------------|:-------------|:-----------|:---------------|:---------|:---------|:----------|:--------|:------------|:-------------|:--------------|:-----------------|:---------------|:--------|:-------------|:--------|:-------------|:--------------|:---------------------------|:--------|:---------------|:---------------|:--------------------|:--------|:------|:-------|:-----------|:-----------|:-----------------|:-------------------|:--------|:-------------|:-----------|:----------------|:---------|:----------------|:----------------|:-----------------|:---------------|:----------------|:-----------------|:------------|:----------| | 0 | 8 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | | | | | X | | | X | | X | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | 
3 | 23 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | | | | | | | | | | | X | | X | X | X | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | | | | | X | X | | X | | X | | X | | | | X | X | X | | X | X | X | | | | | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | 5 | 5 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | X | | | | | | | | | | X | | X | | X | X | | | | | | | | | | | | | | X | X | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | 6 | 6 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | X | X | X | X | X | | | | | | | | | 7 | 12 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | | | X | | | | | | | X | X | | | | | | | 8 | 6 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | X | | | X | | | X | | | | | | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X |
AshtonIsNotHere/nlp_pp_code_dataset
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 2126529.0 num_examples: 1463 - name: test num_bytes: 528817.0 num_examples: 258 download_size: 948983 dataset_size: 2655346.0 --- # Dataset Card for "nlp_pp_code_dataset" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
zense-raaga-ai/raaga_dataset_v2
--- dataset_info: features: - name: audio dtype: audio - name: RaagaNumber dtype: int64 splits: - name: train num_bytes: 22119715558.812 num_examples: 86746 download_size: 29873744194 dataset_size: 22119715558.812 --- # Dataset Card for "raaga_dataset_v2" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
zooxufpb/NSText2SQL_1_column
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 405208506 num_examples: 289288 download_size: 0 dataset_size: 405208506 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "NSText2SQL_1_column" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
RUCAIBox/Style-Transfer
---
language:
- en
multilinguality:
- monolingual
task_categories:
- other
task_ids: []
tags:
- style-transfer
---

These are the text style transfer datasets collected by TextBox, including:

- GYAFC Entertainment & Music (gyafc_em).
- GYAFC Family & Relationships (gyafc_fr).

The details and leaderboard of each dataset can be found on the [TextBox page](https://github.com/RUCAIBox/TextBox#dataset).
hztttian/vits-simple-api
--- license: apache-2.0 ---
heliosprime/twitter_dataset_1713011915
--- dataset_info: features: - name: id dtype: string - name: tweet_content dtype: string - name: user_name dtype: string - name: user_id dtype: string - name: created_at dtype: string - name: url dtype: string - name: favourite_count dtype: int64 - name: scraped_at dtype: string - name: image_urls dtype: string splits: - name: train num_bytes: 12269 num_examples: 27 download_size: 10776 dataset_size: 12269 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "twitter_dataset_1713011915" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
ZhaofengWu/transparency-data
---
license: apache-2.0
---

Datasets used in our paper (https://arxiv.org/abs/2210.07468):

```bibtex
@inproceedings{wu-etal-2022-continued,
  title = {Transparency Helps Reveal When Language Models Learn Meaning},
  author = {Zhaofeng Wu and William Merrill and Hao Peng and Iz Beltagy and Noah A. Smith},
  url = {https://arxiv.org/abs/2210.07468},
  publisher = {arXiv},
  year = {2022},
  doi = {10.48550/ARXIV.2210.07468},
}
```

Please see the "Files and versions" tab for the data.
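The files behind the "Files and versions" tab can also be enumerated programmatically; a minimal sketch with `huggingface_hub` (the exact filenames are not listed on this card, so we simply list whatever the repo contains):

```python
from huggingface_hub import list_repo_files

# Enumerate the data files in the dataset repo.
files = list_repo_files("ZhaofengWu/transparency-data", repo_type="dataset")
for f in sorted(files):
    print(f)
```

Individual files can then be fetched with `hf_hub_download` as in the snippet earlier in this collection.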
open-llm-leaderboard/details_abideen__PhigRange-2.7B-slerp
--- pretty_name: Evaluation run of abideen/PhigRange-2.7B-slerp dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [abideen/PhigRange-2.7B-slerp](https://huggingface.co/abideen/PhigRange-2.7B-slerp)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abideen__PhigRange-2.7B-slerp\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-04-04T21:20:07.760171](https://huggingface.co/datasets/open-llm-leaderboard/details_abideen__PhigRange-2.7B-slerp/blob/main/results_2024-04-04T21-20-07.760171.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5767941428995237,\n\ \ \"acc_stderr\": 0.03387553112768988,\n \"acc_norm\": 0.5768839451218056,\n\ \ \"acc_norm_stderr\": 0.034576125147848885,\n \"mc1\": 0.3769889840881273,\n\ \ \"mc1_stderr\": 0.016965517578930354,\n \"mc2\": 0.5316928803057882,\n\ \ \"mc2_stderr\": 0.015529301409234428\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5921501706484642,\n \"acc_stderr\": 0.014361097288449703,\n\ \ \"acc_norm\": 0.6168941979522184,\n \"acc_norm_stderr\": 0.014206472661672874\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5813582951603267,\n\ \ \"acc_stderr\": 0.004923281841828515,\n \"acc_norm\": 0.7663811989643498,\n\ \ \"acc_norm_stderr\": 0.004222676709104567\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \ \ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\ \ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\ \ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779206,\n\ \ \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779206\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\ \ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \ \ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.5924528301886792,\n \"acc_stderr\": 0.030242233800854494,\n\ \ \"acc_norm\": 0.5924528301886792,\n \"acc_norm_stderr\": 0.030242233800854494\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n\ \ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n\ \ \"acc_norm_stderr\": 
0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \ \ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.45,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.45,\n\ \ \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \ \ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n\ \ \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.5895953757225434,\n\ \ \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.047551296160629475,\n\ \ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.047551296160629475\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n\ \ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.03261936918467382,\n\ \ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.03261936918467382\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n\ \ \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.38596491228070173,\n\ \ \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.04165774775728763,\n\ \ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.04165774775728763\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406772,\n \"\ acc_norm\": 
0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406772\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\ \ \"acc_stderr\": 0.04375888492727061,\n \"acc_norm\": 0.3968253968253968,\n\ \ \"acc_norm_stderr\": 0.04375888492727061\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6612903225806451,\n\ \ \"acc_stderr\": 0.026923446059302837,\n \"acc_norm\": 0.6612903225806451,\n\ \ \"acc_norm_stderr\": 0.026923446059302837\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162934,\n\ \ \"acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162934\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\ : 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.035886248000917075,\n\ \ \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.035886248000917075\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7222222222222222,\n \"acc_stderr\": 0.03191178226713547,\n \"\ acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.03191178226713547\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.7772020725388601,\n \"acc_stderr\": 0.03003114797764154,\n\ \ \"acc_norm\": 0.7772020725388601,\n \"acc_norm_stderr\": 0.03003114797764154\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.5769230769230769,\n \"acc_stderr\": 0.025049197876042338,\n\ \ \"acc_norm\": 0.5769230769230769,\n \"acc_norm_stderr\": 0.025049197876042338\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948503,\n \ \ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948503\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6134453781512605,\n \"acc_stderr\": 0.03163145807552379,\n \ \ \"acc_norm\": 0.6134453781512605,\n \"acc_norm_stderr\": 0.03163145807552379\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.4105960264900662,\n \"acc_stderr\": 0.04016689594849928,\n \"\ acc_norm\": 0.4105960264900662,\n \"acc_norm_stderr\": 0.04016689594849928\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8128440366972477,\n \"acc_stderr\": 0.016722684526200148,\n \"\ acc_norm\": 0.8128440366972477,\n \"acc_norm_stderr\": 0.016722684526200148\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.4537037037037037,\n \"acc_stderr\": 0.03395322726375797,\n \"\ acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.03395322726375797\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.6666666666666666,\n \"acc_stderr\": 0.03308611113236434,\n \"\ acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03308611113236434\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7341772151898734,\n \"acc_stderr\": 0.02875679962965834,\n \ \ \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.02875679962965834\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\ \ \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.6367713004484304,\n\ \ \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.040393149787245605,\n\ \ \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.040393149787245605\n\ \ },\n \"harness|hendrycksTest-international_law|5\": 
{\n \"acc\":\ \ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\"\ : 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n\ \ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\ \ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\ \ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\ \ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\ \ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\ \ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\ \ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8162393162393162,\n\ \ \"acc_stderr\": 0.025372139671722933,\n \"acc_norm\": 0.8162393162393162,\n\ \ \"acc_norm_stderr\": 0.025372139671722933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \ \ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\ \ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.70242656449553,\n\ \ \"acc_stderr\": 0.016349111912909425,\n \"acc_norm\": 0.70242656449553,\n\ \ \"acc_norm_stderr\": 0.016349111912909425\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.025305258131879702,\n\ \ \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.025305258131879702\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2782122905027933,\n\ \ \"acc_stderr\": 0.014987325439963544,\n \"acc_norm\": 0.2782122905027933,\n\ \ \"acc_norm_stderr\": 
0.014987325439963544\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.5915032679738562,\n \"acc_stderr\": 0.028146405993096358,\n\ \ \"acc_norm\": 0.5915032679738562,\n \"acc_norm_stderr\": 0.028146405993096358\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6366559485530546,\n\ \ \"acc_stderr\": 0.02731684767419271,\n \"acc_norm\": 0.6366559485530546,\n\ \ \"acc_norm_stderr\": 0.02731684767419271\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.026725868809100793,\n\ \ \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.026725868809100793\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.43617021276595747,\n \"acc_stderr\": 0.02958345203628407,\n \ \ \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.02958345203628407\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4152542372881356,\n\ \ \"acc_stderr\": 0.012585471793400659,\n \"acc_norm\": 0.4152542372881356,\n\ \ \"acc_norm_stderr\": 0.012585471793400659\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.030352303395351964,\n\ \ \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.030352303395351964\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.5310457516339869,\n \"acc_stderr\": 0.02018880445636189,\n \ \ \"acc_norm\": 0.5310457516339869,\n \"acc_norm_stderr\": 0.02018880445636189\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\ \ \"acc_stderr\": 0.04582004841505416,\n \"acc_norm\": 0.6454545454545455,\n\ \ \"acc_norm_stderr\": 0.04582004841505416\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.6612244897959184,\n \"acc_stderr\": 0.030299506562154185,\n\ \ \"acc_norm\": 0.6612244897959184,\n \"acc_norm_stderr\": 0.030299506562154185\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 
0.736318407960199,\n\ \ \"acc_stderr\": 0.031157150869355575,\n \"acc_norm\": 0.736318407960199,\n\ \ \"acc_norm_stderr\": 0.031157150869355575\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \ \ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\ \ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n\ \ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.0352821125824523,\n\ \ \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.0352821125824523\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3769889840881273,\n\ \ \"mc1_stderr\": 0.016965517578930354,\n \"mc2\": 0.5316928803057882,\n\ \ \"mc2_stderr\": 0.015529301409234428\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.755327545382794,\n \"acc_stderr\": 0.012082125654159738\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6027293404094011,\n \ \ \"acc_stderr\": 0.013478659652337794\n }\n}\n```" repo_url: https://huggingface.co/abideen/PhigRange-2.7B-slerp leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|arc:challenge|25_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-04-04T21-20-07.760171.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|gsm8k|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_04_04T21_20_07.760171 path: - 
'**/details_harness|hellaswag|10_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-04T21-20-07.760171.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-management|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-04-04T21-20-07.760171.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-04T21-20-07.760171.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-04T21-20-07.760171.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-management|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-04-04T21-20-07.760171.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-04-04T21-20-07.760171.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-04T21-20-07.760171.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-international_law|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-management|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-marketing|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-sociology|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-virology|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-04-04T21-20-07.760171.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|truthfulqa:mc|0_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-04-04T21-20-07.760171.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_04_04T21_20_07.760171 path: - '**/details_harness|winogrande|5_2024-04-04T21-20-07.760171.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-04-04T21-20-07.760171.parquet' - config_name: results data_files: - split: 
2024_04_04T21_20_07.760171 path: - results_2024-04-04T21-20-07.760171.parquet - split: latest path: - results_2024-04-04T21-20-07.760171.parquet ---

# Dataset Card for Evaluation run of abideen/PhigRange-2.7B-slerp

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [abideen/PhigRange-2.7B-slerp](https://huggingface.co/abideen/PhigRange-2.7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_abideen__PhigRange-2.7B-slerp",
    "harness_winogrande_5",
    split="train")
```

## Latest results

These are the [latest results from run 2024-04-04T21:20:07.760171](https://huggingface.co/datasets/open-llm-leaderboard/details_abideen__PhigRange-2.7B-slerp/blob/main/results_2024-04-04T21-20-07.760171.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5767941428995237, "acc_stderr": 0.03387553112768988, "acc_norm": 0.5768839451218056, "acc_norm_stderr": 0.034576125147848885, "mc1": 0.3769889840881273, "mc1_stderr": 0.016965517578930354, "mc2": 0.5316928803057882, "mc2_stderr": 0.015529301409234428 }, "harness|arc:challenge|25": { "acc": 0.5921501706484642, "acc_stderr": 0.014361097288449703, "acc_norm": 0.6168941979522184, "acc_norm_stderr": 0.014206472661672874 }, "harness|hellaswag|10": { "acc": 0.5813582951603267, "acc_stderr": 0.004923281841828515, "acc_norm": 0.7663811989643498, "acc_norm_stderr": 0.004222676709104567 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4740740740740741, "acc_stderr": 0.04313531696750574, "acc_norm": 0.4740740740740741, "acc_norm_stderr": 0.04313531696750574 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5855263157894737, "acc_stderr": 0.04008973785779206, "acc_norm": 0.5855263157894737, "acc_norm_stderr": 0.04008973785779206 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.59, "acc_stderr": 0.04943110704237102, "acc_norm": 0.59, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5924528301886792, "acc_stderr": 0.030242233800854494, "acc_norm": 0.5924528301886792, "acc_norm_stderr": 0.030242233800854494 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6180555555555556, "acc_stderr": 0.040629907841466674, "acc_norm": 0.6180555555555556, "acc_norm_stderr": 0.040629907841466674 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.4, "acc_stderr": 0.049236596391733084, "acc_norm": 0.4, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.45, "acc_stderr": 0.04999999999999999, "acc_norm": 0.45, 
"acc_norm_stderr": 0.04999999999999999 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.42, "acc_stderr": 0.04960449637488584, "acc_norm": 0.42, "acc_norm_stderr": 0.04960449637488584 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5895953757225434, "acc_stderr": 0.03750757044895537, "acc_norm": 0.5895953757225434, "acc_norm_stderr": 0.03750757044895537 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.35294117647058826, "acc_stderr": 0.047551296160629475, "acc_norm": 0.35294117647058826, "acc_norm_stderr": 0.047551296160629475 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.72, "acc_stderr": 0.04512608598542127, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.46808510638297873, "acc_stderr": 0.03261936918467382, "acc_norm": 0.46808510638297873, "acc_norm_stderr": 0.03261936918467382 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.38596491228070173, "acc_stderr": 0.04579639422070434, "acc_norm": 0.38596491228070173, "acc_norm_stderr": 0.04579639422070434 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.4896551724137931, "acc_stderr": 0.04165774775728763, "acc_norm": 0.4896551724137931, "acc_norm_stderr": 0.04165774775728763 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42328042328042326, "acc_stderr": 0.025446365634406772, "acc_norm": 0.42328042328042326, "acc_norm_stderr": 0.025446365634406772 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3968253968253968, "acc_stderr": 0.04375888492727061, "acc_norm": 0.3968253968253968, "acc_norm_stderr": 0.04375888492727061 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.34, "acc_stderr": 0.04760952285695236, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695236 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6612903225806451, "acc_stderr": 0.026923446059302837, "acc_norm": 0.6612903225806451, "acc_norm_stderr": 0.026923446059302837 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4433497536945813, "acc_stderr": 0.03495334582162934, "acc_norm": 0.4433497536945813, "acc_norm_stderr": 0.03495334582162934 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.66, "acc_stderr": 0.04760952285695237, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695237 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.696969696969697, "acc_stderr": 0.035886248000917075, "acc_norm": 0.696969696969697, "acc_norm_stderr": 0.035886248000917075 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7222222222222222, "acc_stderr": 0.03191178226713547, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.03191178226713547 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7772020725388601, "acc_stderr": 0.03003114797764154, "acc_norm": 0.7772020725388601, "acc_norm_stderr": 0.03003114797764154 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5769230769230769, "acc_stderr": 0.025049197876042338, "acc_norm": 0.5769230769230769, "acc_norm_stderr": 0.025049197876042338 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.028742040903948503, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.028742040903948503 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6134453781512605, "acc_stderr": 0.03163145807552379, "acc_norm": 0.6134453781512605, "acc_norm_stderr": 0.03163145807552379 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.4105960264900662, "acc_stderr": 0.04016689594849928, "acc_norm": 0.4105960264900662, "acc_norm_stderr": 0.04016689594849928 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8128440366972477, "acc_stderr": 0.016722684526200148, "acc_norm": 0.8128440366972477, "acc_norm_stderr": 0.016722684526200148 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4537037037037037, "acc_stderr": 
0.03395322726375797, "acc_norm": 0.4537037037037037, "acc_norm_stderr": 0.03395322726375797 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6666666666666666, "acc_stderr": 0.03308611113236434, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.03308611113236434 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7341772151898734, "acc_stderr": 0.02875679962965834, "acc_norm": 0.7341772151898734, "acc_norm_stderr": 0.02875679962965834 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6367713004484304, "acc_stderr": 0.032277904428505, "acc_norm": 0.6367713004484304, "acc_norm_stderr": 0.032277904428505 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6946564885496184, "acc_stderr": 0.040393149787245605, "acc_norm": 0.6946564885496184, "acc_norm_stderr": 0.040393149787245605 }, "harness|hendrycksTest-international_law|5": { "acc": 0.743801652892562, "acc_stderr": 0.03984979653302871, "acc_norm": 0.743801652892562, "acc_norm_stderr": 0.03984979653302871 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7129629629629629, "acc_stderr": 0.043733130409147614, "acc_norm": 0.7129629629629629, "acc_norm_stderr": 0.043733130409147614 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.754601226993865, "acc_stderr": 0.03380939813943354, "acc_norm": 0.754601226993865, "acc_norm_stderr": 0.03380939813943354 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4107142857142857, "acc_stderr": 0.04669510663875191, "acc_norm": 0.4107142857142857, "acc_norm_stderr": 0.04669510663875191 }, "harness|hendrycksTest-management|5": { "acc": 0.7281553398058253, "acc_stderr": 0.044052680241409216, "acc_norm": 0.7281553398058253, "acc_norm_stderr": 0.044052680241409216 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8162393162393162, "acc_stderr": 0.025372139671722933, "acc_norm": 0.8162393162393162, "acc_norm_stderr": 0.025372139671722933 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.65, "acc_stderr": 
0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.70242656449553, "acc_stderr": 0.016349111912909425, "acc_norm": 0.70242656449553, "acc_norm_stderr": 0.016349111912909425 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6705202312138728, "acc_stderr": 0.025305258131879702, "acc_norm": 0.6705202312138728, "acc_norm_stderr": 0.025305258131879702 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2782122905027933, "acc_stderr": 0.014987325439963544, "acc_norm": 0.2782122905027933, "acc_norm_stderr": 0.014987325439963544 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5915032679738562, "acc_stderr": 0.028146405993096358, "acc_norm": 0.5915032679738562, "acc_norm_stderr": 0.028146405993096358 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6366559485530546, "acc_stderr": 0.02731684767419271, "acc_norm": 0.6366559485530546, "acc_norm_stderr": 0.02731684767419271 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6388888888888888, "acc_stderr": 0.026725868809100793, "acc_norm": 0.6388888888888888, "acc_norm_stderr": 0.026725868809100793 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.43617021276595747, "acc_stderr": 0.02958345203628407, "acc_norm": 0.43617021276595747, "acc_norm_stderr": 0.02958345203628407 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4152542372881356, "acc_stderr": 0.012585471793400659, "acc_norm": 0.4152542372881356, "acc_norm_stderr": 0.012585471793400659 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5183823529411765, "acc_stderr": 0.030352303395351964, "acc_norm": 0.5183823529411765, "acc_norm_stderr": 0.030352303395351964 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5310457516339869, "acc_stderr": 0.02018880445636189, "acc_norm": 0.5310457516339869, "acc_norm_stderr": 0.02018880445636189 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6454545454545455, "acc_stderr": 
0.04582004841505416, "acc_norm": 0.6454545454545455, "acc_norm_stderr": 0.04582004841505416 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6612244897959184, "acc_stderr": 0.030299506562154185, "acc_norm": 0.6612244897959184, "acc_norm_stderr": 0.030299506562154185 }, "harness|hendrycksTest-sociology|5": { "acc": 0.736318407960199, "acc_stderr": 0.031157150869355575, "acc_norm": 0.736318407960199, "acc_norm_stderr": 0.031157150869355575 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.74, "acc_stderr": 0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-virology|5": { "acc": 0.4578313253012048, "acc_stderr": 0.038786267710023595, "acc_norm": 0.4578313253012048, "acc_norm_stderr": 0.038786267710023595 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.695906432748538, "acc_stderr": 0.0352821125824523, "acc_norm": 0.695906432748538, "acc_norm_stderr": 0.0352821125824523 }, "harness|truthfulqa:mc|0": { "mc1": 0.3769889840881273, "mc1_stderr": 0.016965517578930354, "mc2": 0.5316928803057882, "mc2_stderr": 0.015529301409234428 }, "harness|winogrande|5": { "acc": 0.755327545382794, "acc_stderr": 0.012082125654159738 }, "harness|gsm8k|5": { "acc": 0.6027293404094011, "acc_stderr": 0.013478659652337794 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
jijay/instructpix2pix-demov2
--- dataset_info: features: - name: input_image dtype: image - name: edit_prompt dtype: string - name: edited_image dtype: image splits: - name: train num_bytes: 4872706.0 num_examples: 4 download_size: 4873592 dataset_size: 4872706.0 --- # Dataset Card for "instructpix2pix-demov2" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
PocketDoc/text-splitter-alpaca
--- task_categories: - text-generation language: - en --- https://huggingface.co/datasets/mhenrichsen/context-aware-splits-english
jhoerr-livefront/livefront-illustrations
--- dataset_info: features: - name: image dtype: image splits: - name: train num_bytes: 1380652.0 num_examples: 29 download_size: 1372836 dataset_size: 1380652.0 configs: - config_name: default data_files: - split: train path: data/train-* ---
Kelvinloh/lady
--- license: mit ---
0x7o/GCRL-q
--- dataset_info: features: - name: type dtype: string - name: theme dtype: string - name: title dtype: string - name: text dtype: string - name: quality dtype: float64 - name: id dtype: string - name: answer dtype: string splits: - name: train num_bytes: 6519238 num_examples: 3975 download_size: 3383601 dataset_size: 6519238 --- # Dataset Card for "GCRL-q" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
ZiAngGu/scannet_3dbox_v2
--- dataset_info: features: - name: image dtype: image - name: conditioning_image dtype: image - name: text dtype: string splits: - name: train num_bytes: 2869658129.643 num_examples: 33421 download_size: 2788454287 dataset_size: 2869658129.643 --- # Dataset Card for "scannet_3dbox_v2" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
GalaktischeGurke/full_dataset_1510_lines_invoice_contract_mail_GPT3.5_test
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 6455.2913907284765 num_examples: 2 download_size: 12763 dataset_size: 6455.2913907284765 --- # Dataset Card for "full_dataset_1510_lines_invoice_contract_mail_GPT3.5_test" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Pravincoder/Indian_traffic_law_QA
--- license: bigcode-openrail-m task_categories: - question-answering language: - en tags: - traffic_rules - law - Indian_traffic_rules size_categories: - n<1K --- # Dataset Card for Indian Traffic Rules ### Dataset Summary This dataset is curated for training or fine-tuning LLMs on basic questions about Indian traffic rules. ### Licensing Information bigcode-openrail-m
roa7n/patched_test_p_40_f_ATCaseOTCase_m1_predictions
--- dataset_info: features: - name: id dtype: string - name: sequence_str dtype: string - name: label dtype: int64 - name: m1_preds dtype: float32 splits: - name: train num_bytes: 48235460 num_examples: 130287 download_size: 4618804 dataset_size: 48235460 --- # Dataset Card for "patched_test_p_40_f_ATCaseOTCase_m1_predictions" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
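The card above pairs a float `m1_preds` column with an integer `label` column, which suggests a simple evaluation: threshold the scores and compare against the labels. A minimal sketch, assuming scores in [0, 1], binary labels, and a 0.5 cutoff (the toy values below stand in for the real columns):

```python
def accuracy_at_threshold(preds, labels, threshold=0.5):
    """Fraction of examples where the thresholded score matches the binary label."""
    assert len(preds) == len(labels)
    hits = sum(int(p >= threshold) == l for p, l in zip(preds, labels))
    return hits / len(labels)

# Toy stand-ins for the m1_preds / label columns
preds = [0.91, 0.12, 0.65, 0.40]
labels = [1, 0, 1, 1]
print(accuracy_at_threshold(preds, labels))  # 0.75
```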
CyberHarem/nagato_azurlane
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of nagato/長門/长门 (Azur Lane) This is the dataset of nagato/長門/长门 (Azur Lane), containing 500 images and their tags. The core tags of this character are `animal_ears, fox_ears, black_hair, long_hair, bangs, animal_ear_fluff, yellow_eyes, hair_ornament, blunt_bangs, very_long_hair, brown_eyes, breasts`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 679.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nagato_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 397.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nagato_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1222 | 849.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nagato_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 603.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nagato_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. 
| | stage3-p480-1200 | 1222 | 1.15 GiB | [Download](https://huggingface.co/datasets/CyberHarem/nagato_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code: ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/nagato_azurlane', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering results; some outfits may be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, hair_bow, hairband, looking_at_viewer, solo, twintails, white_dress, bare_shoulders, blush, flower, mini_crown, ribbon, wings, frilled_dress, official_alternate_costume, strapless_dress, choker, sidelocks, sitting | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, bare_shoulders, detached_sleeves, looking_at_viewer, solo, strapless_dress, necklace, sidelocks, simple_background, white_background, red_dress, upper_body, blush, choker, collarbone, open_mouth, parted_lips, wide_sleeves | | 2 | 16 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, bare_shoulders, collarbone, red_dress, solo, strapless_dress, detached_sleeves, blush, white_sleeves, twitter_username, closed_mouth, long_sleeves, looking_at_viewer, simple_background, upper_body, smile, wide_sleeves, pleated_dress, white_background | | 3 | 10 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | 
![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, bare_shoulders, fox_mask, looking_at_viewer, mask_on_head, necklace, red_panties, side-tie_panties, solo, white_thighhighs, blush, navel, red_bra, see-through, camisole, strap_slip, simple_background, white_background, ass_visible_through_thighs, collarbone, ribbon-trimmed_legwear | | 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, bare_shoulders, blush, chick, fox_mask, looking_at_viewer, mask_on_head, necklace, red_panties, ribbon-trimmed_legwear, see-through, side-tie_panties, solo, strap_slip, white_thighhighs, camisole, daruma_doll, no_shoes, sanshoku_dango, tatami, collarbone, flower, jingle_bell, red_bra, small_breasts, wariza, ass, ball, knees_up, navel, smile, tail, thighs, wind_chime | | 5 | 16 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, blush, solo, wataboushi, bare_shoulders, looking_at_viewer, uchikake, detached_sleeves, wide_sleeves, white_kimono, dress, smile, ribbon-trimmed_sleeves, x_hair_ornament, hood_up, long_sleeves, jingle_bell, sitting, closed_mouth, ears_through_headwear | | 6 | 11 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, hair_flower, looking_at_viewer, solo, bare_shoulders, off_shoulder, official_alternate_costume, fox_tail, lantern, necklace, sitting, multiple_tails, blush, full_body, pink_kimono, small_breasts | | 7 | 6 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | :d, bare_shoulders, blush, 
detached_sleeves, long_sleeves, open_mouth, sleeveless_kimono, white_kimono, 1girl, headpiece, solo, twitter_username, white_sleeves, wide_sleeves, jingle_bell, simple_background, upper_body, ^_^, chibi, hands_up, ribbon_trim, short_hair, white_background | | 8 | 5 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, alternate_costume, fox_girl, fox_tail, looking_at_viewer, signature, solo, blush, contemporary, sidelocks, casual, long_sleeves, school_uniform, skirt, blurry_background, bokeh, bow, holding, parted_lips, pink_cardigan, smile | | 9 | 7 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1girl, alternate_costume, looking_at_viewer, sidelocks, solo, blue_sky, cloudy_sky, ocean, outdoors, blue_one-piece_swimsuit, day, signature, alternate_hairstyle, depth_of_field, horizon, ass, fox_girl, hair_bow, hair_ribbon, holding, looking_back, old_school_swimsuit, parted_lips, ponytail | | 10 | 6 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | 1girl, alternate_costume, looking_at_viewer, petals, sidelocks, solo, dress, flower, signature, casual, collarbone, fox_girl, bare_shoulders, contemporary, fox_tail, parted_lips, simple_background, spaghetti_strap, white_background | | 11 | 7 | ![](samples/11/clu11-sample0.png) | ![](samples/11/clu11-sample1.png) | ![](samples/11/clu11-sample2.png) | ![](samples/11/clu11-sample3.png) | ![](samples/11/clu11-sample4.png) | 1girl, looking_at_viewer, short_sleeves, sidelocks, solo, dress, fox_girl, signature, simple_background, enmaided, fox_tail, maid_apron, frills, parted_lips, white_background, blush, holding, paw_pose | | 12 | 5 | 
![](samples/12/clu12-sample0.png) | ![](samples/12/clu12-sample1.png) | ![](samples/12/clu12-sample2.png) | ![](samples/12/clu12-sample3.png) | ![](samples/12/clu12-sample4.png) | 1girl, ass, looking_at_viewer, small_breasts, solo, blush, fox_tail, from_behind, looking_back, open_mouth, red_bikini, simple_background, white_background, butt_crack, covering_breasts, fox_girl, from_side, nude, short_hair, sideboob, smile | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | hair_bow | hairband | looking_at_viewer | solo | twintails | white_dress | bare_shoulders | blush | flower | mini_crown | ribbon | wings | frilled_dress | official_alternate_costume | strapless_dress | choker | sidelocks | sitting | detached_sleeves | necklace | simple_background | white_background | red_dress | upper_body | collarbone | open_mouth | parted_lips | wide_sleeves | white_sleeves | twitter_username | closed_mouth | long_sleeves | smile | pleated_dress | fox_mask | mask_on_head | red_panties | side-tie_panties | white_thighhighs | navel | red_bra | see-through | camisole | strap_slip | ass_visible_through_thighs | ribbon-trimmed_legwear | chick | daruma_doll | no_shoes | sanshoku_dango | tatami | jingle_bell | small_breasts | wariza | ass | ball | knees_up | tail | thighs | wind_chime | wataboushi | uchikake | white_kimono | dress | ribbon-trimmed_sleeves | x_hair_ornament | hood_up | ears_through_headwear | hair_flower | off_shoulder | fox_tail | lantern | multiple_tails | full_body | pink_kimono | :d | sleeveless_kimono | headpiece | ^_^ | chibi | hands_up | ribbon_trim | short_hair | alternate_costume | fox_girl | signature | contemporary | casual | school_uniform | skirt | blurry_background | bokeh | bow | holding | pink_cardigan | blue_sky | cloudy_sky | ocean | outdoors | blue_one-piece_swimsuit | day | alternate_hairstyle | depth_of_field | horizon | hair_ribbon | looking_back | old_school_swimsuit | ponytail | petals | spaghetti_strap | 
short_sleeves | enmaided | maid_apron | frills | paw_pose | from_behind | red_bikini | butt_crack | covering_breasts | from_side | nude | sideboob | |----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:-----------|:-----------|:--------------------|:-------|:------------|:--------------|:-----------------|:--------|:---------|:-------------|:---------|:--------|:----------------|:-----------------------------|:------------------|:---------|:------------|:----------|:-------------------|:-----------|:--------------------|:-------------------|:------------|:-------------|:-------------|:-------------|:--------------|:---------------|:----------------|:-------------------|:---------------|:---------------|:--------|:----------------|:-----------|:---------------|:--------------|:-------------------|:-------------------|:--------|:----------|:--------------|:-----------|:-------------|:-----------------------------|:-------------------------|:--------|:--------------|:-----------|:-----------------|:---------|:--------------|:----------------|:---------|:------|:-------|:-----------|:-------|:---------|:-------------|:-------------|:-----------|:---------------|:--------|:-------------------------|:------------------|:----------|:------------------------|:--------------|:---------------|:-----------|:----------|:-----------------|:------------|:--------------|:-----|:--------------------|:------------|:------|:--------|:-----------|:--------------|:-------------|:--------------------|:-----------|:------------|:---------------|:---------|:-----------------|:--------|:--------------------|:--------|:------|:----------|:----------------|:-----------|:-------------|:--------|:-----------|:--------------------------|:------|:----------------------|:-----------------|:----------|:--------------|:---------------|:--------
--------------|:-----------|:---------|:------------------|:----------------|:-----------|:-------------|:---------|:-----------|:--------------|:-------------|:-------------|:-------------------|:------------|:-------|:-----------| | 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | | X | X | | | X | X | | | | | | | X | X | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 16 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | | X | X | | | X | X | | | | | | | X | | | | X | | X | X | X | X | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 10 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | | X | X | | | X | X | | | | | | | | | | | | X | X | X | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 
| | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | | X | X | | | X | X | X | | | | | | | | | | | X | | | | | X | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 16 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | | X | X | | | X | X | | | | | | | | | | X | X | | | | | | | | | X | | | X | X | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 11 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | | X | X | | | X | X | | | | | | X | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 7 | 6 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | | | | X | | | X | X | | | | | | | | | | | X | | X | X | | X | | X | | X | X | X | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 8 | 5 | 
![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | | | X | X | | | | X | | | | | | | | | X | | | | | | | | | | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 9 | 7 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | X | | X | X | | | | | | | | | | | | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | 10 | 6 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | X | | | X | X | | | X | | X | | | | | | | | X | | | | X | X | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | X | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | 11 | 7 | ![](samples/11/clu11-sample0.png) | ![](samples/11/clu11-sample1.png) | ![](samples/11/clu11-sample2.png) | ![](samples/11/clu11-sample3.png) | ![](samples/11/clu11-sample4.png) | X | | | X | X | | | | X | | | | | | | | | X | | | | X | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | X | X | | | | | | | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | 12 | 5 | ![](samples/12/clu12-sample0.png) | ![](samples/12/clu12-sample1.png) | ![](samples/12/clu12-sample2.png) | 
![](samples/12/clu12-sample3.png) | ![](samples/12/clu12-sample4.png) | X | | | X | X | | | | X | | | | | | | | | | | | | X | X | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X |
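The non-raw packages above (800, 1200, stage3) are plain IMG+TXT archives rather than waifuc datasets, so after extracting one you pair each image with its tag file yourself. A minimal sketch of the pairing step, assuming the conventional layout where every image has a same-stem `.txt` tag file (the helper name and toy filenames are illustrative):

```python
import os

IMG_EXTS = {".png", ".jpg", ".jpeg", ".webp"}

def pair_images_with_tags(filenames):
    """Pair each image file with its same-stem .txt tag file, skipping unpaired files."""
    stems = {}
    for name in filenames:
        stem, ext = os.path.splitext(name)
        stems.setdefault(stem, set()).add(ext.lower())
    return sorted(
        (stem + next(e for e in stems[stem] if e in IMG_EXTS), stem + ".txt")
        for stem in stems
        if ".txt" in stems[stem] and stems[stem] & IMG_EXTS
    )

# In practice you would pass os.listdir() of the extracted archive directory.
print(pair_images_with_tags(["0001.png", "0001.txt", "0002.jpg", "0002.txt"]))
# [('0001.png', '0001.txt'), ('0002.jpg', '0002.txt')]
```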
simbolo-ai/burmese-hate-speech-small
--- license: gpl language: - my --- ### What is Hate Speech? Hate speech is any toxic communication used to attack individuals or groups directly, especially based on characteristics including (but not limited to): physical deficiency, mental deficiency, moral deficiency, age, ethnicity, race, national origin, caste, religion, disability, serious disease, sex, gender, gender identity, gender reassignment, sexual orientation, and immigration status. ### Disclaimer The dataset may contain toxic content, such as rude words, that does not align with the definition of hate speech given above. ### Contributors: Main Contributor: Sa Phyo Thu Htet Other Contributors: Ei Thandar Aung, Naing Linn Phyo, Yang Ni Linn Lat, Chaw Su Thwe, Thiha Nyein, Hnin Aye Thant Data Collectors: Sa Phyo Thu Htet, Students from Simbolo, Club Members of Data Science and Machine Learning Club, University of Technology, Yatanarpon Cyber City, Myanmar ### Cite As: @misc{burmese-hate-speech-small, author = {{Sa Phyo Thu Htet, Data Science and ML Club, UTYCC}}, title = {burmese-hate-speech-small}, url = {https://huggingface.co/datasets/simbolo-ai/burmese-hate-speech-data}, urldate = {2024-3-1}, date = {2024-3-1} }
Mlxa/java_methods
--- license: apache-2.0 ---
LxYxvv/ChinaDaily_EN_ZH
--- license: mit ---
HydraLM/GPTeacher-General-Instruct_standardized
--- dataset_info: features: - name: message dtype: string - name: message_type dtype: string - name: message_id dtype: int64 - name: conversation_id dtype: int64 splits: - name: train num_bytes: 62696353 num_examples: 267780 download_size: 0 dataset_size: 62696353 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "GPTeacher-General-Instruct_standardized" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
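The schema above stores one row per message (`message_id` ordered within `conversation_id`) rather than one row per dialogue, so reconstructing conversations means grouping and sorting rows. A minimal sketch, with toy rows standing in for the real `load_dataset` output (field names follow the schema above; the `message_type` values are illustrative):

```python
from collections import defaultdict

# Toy rows mimicking the schema: message, message_type, message_id, conversation_id
rows = [
    {"message": "Explain recursion.", "message_type": "instruction", "message_id": 0, "conversation_id": 7},
    {"message": "Recursion is ...",   "message_type": "output",      "message_id": 1, "conversation_id": 7},
    {"message": "List three fruits.", "message_type": "instruction", "message_id": 0, "conversation_id": 8},
]

def group_conversations(rows):
    """Group per-message rows into conversations ordered by message_id."""
    convs = defaultdict(list)
    for r in rows:
        convs[r["conversation_id"]].append(r)
    for cid in convs:
        convs[cid].sort(key=lambda r: r["message_id"])
    return dict(convs)

convs = group_conversations(rows)
print(len(convs), len(convs[7]))  # 2 2
```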
open-llm-leaderboard/details_openagi-project__OpenAGI-7B-v0.2
--- pretty_name: Evaluation run of openagi-project/OpenAGI-7B-v0.2 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [openagi-project/OpenAGI-7B-v0.2](https://huggingface.co/openagi-project/OpenAGI-7B-v0.2)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openagi-project__OpenAGI-7B-v0.2\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-02-01T18:03:01.560923](https://huggingface.co/datasets/open-llm-leaderboard/details_openagi-project__OpenAGI-7B-v0.2/blob/main/results_2024-02-01T18-03-01.560923.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.632889617920343,\n\ \ \"acc_stderr\": 0.03254482538442493,\n \"acc_norm\": 0.63500329056049,\n\ \ \"acc_norm_stderr\": 0.0331984716026879,\n \"mc1\": 0.5593635250917993,\n\ \ \"mc1_stderr\": 0.01737969755543745,\n \"mc2\": 0.7204002538063481,\n\ \ \"mc2_stderr\": 0.015000816890913878\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6621160409556314,\n \"acc_stderr\": 0.013822047922283512,\n\ \ \"acc_norm\": 0.6851535836177475,\n \"acc_norm_stderr\": 0.013572657703084948\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6982672774347739,\n\ \ \"acc_stderr\": 0.004580718115992504,\n \"acc_norm\": 0.8602867954590719,\n\ \ \"acc_norm_stderr\": 0.003459806991389835\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\ \ \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n\ \ \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316092,\n\ \ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316092\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\ \ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \ \ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n\ \ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\ \ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \ \ \"acc_norm_stderr\": 0.03621034121889507\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\ : 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\ : {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\ : 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\ \ \"acc_stderr\": 0.037038511930995215,\n \"acc_norm\": 0.6184971098265896,\n\ \ \"acc_norm_stderr\": 0.037038511930995215\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287533,\n\ \ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287533\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n\ \ \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.032469569197899575,\n\ \ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.032469569197899575\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\ \ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\ \ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\ \ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.3994708994708995,\n \"acc_stderr\": 0.025225450284067884,\n \"\ acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.025225450284067884\n\ \ },\n 
\"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\ \ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\ \ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \ \ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7258064516129032,\n\ \ \"acc_stderr\": 0.0253781399708852,\n \"acc_norm\": 0.7258064516129032,\n\ \ \"acc_norm_stderr\": 0.0253781399708852\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\ \ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\ : 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\ \ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790486,\n \"\ acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790486\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758723,\n\ \ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758723\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6076923076923076,\n \"acc_stderr\": 0.02475600038213095,\n \ \ \"acc_norm\": 0.6076923076923076,\n \"acc_norm_stderr\": 0.02475600038213095\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 
0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \ \ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \ \ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\ acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.818348623853211,\n \"acc_stderr\": 0.016530617409266857,\n \"\ acc_norm\": 0.818348623853211,\n \"acc_norm_stderr\": 0.016530617409266857\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\ acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"\ acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467617,\n \ \ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467617\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\ \ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\ \ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\ \ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.8264462809917356,\n \"acc_stderr\": 
0.03457272836917669,\n \"\ acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917669\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\ \ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\ \ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n\ \ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\ \ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\ \ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\ \ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\ \ \"acc_stderr\": 0.020237149008990925,\n \"acc_norm\": 0.8931623931623932,\n\ \ \"acc_norm_stderr\": 0.020237149008990925\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\ \ \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n\ \ \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n\ \ \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3776536312849162,\n\ \ \"acc_stderr\": 0.01621414875213663,\n \"acc_norm\": 0.3776536312849162,\n\ \ \"acc_norm_stderr\": 0.01621414875213663\n },\n 
\"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.02625605383571896,\n\ \ \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.02625605383571896\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\ \ \"acc_stderr\": 0.026160584450140446,\n \"acc_norm\": 0.6945337620578779,\n\ \ \"acc_norm_stderr\": 0.026160584450140446\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.025171041915309684,\n\ \ \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.025171041915309684\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \ \ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45827900912646674,\n\ \ \"acc_stderr\": 0.01272570165695364,\n \"acc_norm\": 0.45827900912646674,\n\ \ \"acc_norm_stderr\": 0.01272570165695364\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.029029422815681397,\n\ \ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.029029422815681397\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6584967320261438,\n \"acc_stderr\": 0.019184639328092487,\n \ \ \"acc_norm\": 0.6584967320261438,\n \"acc_norm_stderr\": 0.019184639328092487\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\ \ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\ \ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128438,\n\ \ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128438\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7860696517412935,\n\ \ 
\"acc_stderr\": 0.02899690969332891,\n \"acc_norm\": 0.7860696517412935,\n\ \ \"acc_norm_stderr\": 0.02899690969332891\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \ \ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\ \ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\ \ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.02753912288906145,\n\ \ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.02753912288906145\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5593635250917993,\n\ \ \"mc1_stderr\": 0.01737969755543745,\n \"mc2\": 0.7204002538063481,\n\ \ \"mc2_stderr\": 0.015000816890913878\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7916337805840569,\n \"acc_stderr\": 0.011414554399987736\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5344958301743745,\n \ \ \"acc_stderr\": 0.013739668147545913\n }\n}\n```" repo_url: https://huggingface.co/openagi-project/OpenAGI-7B-v0.2 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|arc:challenge|25_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-02-01T18-03-01.560923.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|gsm8k|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hellaswag|10_2024-02-01T18-03-01.560923.parquet' - split: 
latest path: - '**/details_harness|hellaswag|10_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T18-03-01.560923.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-management|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-02-01T18-03-01.560923.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T18-03-01.560923.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T18-03-01.560923.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-management|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T18-03-01.560923.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-02-01T18-03-01.560923.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T18-03-01.560923.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-international_law|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-management|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-marketing|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-sociology|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-virology|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T18-03-01.560923.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|truthfulqa:mc|0_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-02-01T18-03-01.560923.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_02_01T18_03_01.560923 path: - '**/details_harness|winogrande|5_2024-02-01T18-03-01.560923.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-02-01T18-03-01.560923.parquet' - config_name: results data_files: - split: 
2024_02_01T18_03_01.560923 path: - results_2024-02-01T18-03-01.560923.parquet - split: latest path: - results_2024-02-01T18-03-01.560923.parquet --- # Dataset Card for Evaluation run of openagi-project/OpenAGI-7B-v0.2 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [openagi-project/OpenAGI-7B-v0.2](https://huggingface.co/openagi-project/OpenAGI-7B-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_openagi-project__OpenAGI-7B-v0.2", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-01T18:03:01.560923](https://huggingface.co/datasets/open-llm-leaderboard/details_openagi-project__OpenAGI-7B-v0.2/blob/main/results_2024-02-01T18-03-01.560923.json)(note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.632889617920343, "acc_stderr": 0.03254482538442493, "acc_norm": 0.63500329056049, "acc_norm_stderr": 0.0331984716026879, "mc1": 0.5593635250917993, "mc1_stderr": 0.01737969755543745, "mc2": 0.7204002538063481, "mc2_stderr": 0.015000816890913878 }, "harness|arc:challenge|25": { "acc": 0.6621160409556314, "acc_stderr": 0.013822047922283512, "acc_norm": 0.6851535836177475, "acc_norm_stderr": 0.013572657703084948 }, "harness|hellaswag|10": { "acc": 0.6982672774347739, "acc_stderr": 0.004580718115992504, "acc_norm": 0.8602867954590719, "acc_norm_stderr": 0.003459806991389835 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.562962962962963, "acc_stderr": 0.04284958639753401, "acc_norm": 0.562962962962963, "acc_norm_stderr": 0.04284958639753401 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6578947368421053, "acc_stderr": 0.03860731599316092, "acc_norm": 0.6578947368421053, "acc_norm_stderr": 0.03860731599316092 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7018867924528301, "acc_stderr": 0.028152837942493864, "acc_norm": 0.7018867924528301, "acc_norm_stderr": 0.028152837942493864 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.75, "acc_stderr": 0.03621034121889507, "acc_norm": 0.75, "acc_norm_stderr": 0.03621034121889507 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.55, "acc_stderr": 0.05, "acc_norm": 0.55, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.3, 
"acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6184971098265896, "acc_stderr": 0.037038511930995215, "acc_norm": 0.6184971098265896, "acc_norm_stderr": 0.037038511930995215 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.43137254901960786, "acc_stderr": 0.04928099597287533, "acc_norm": 0.43137254901960786, "acc_norm_stderr": 0.04928099597287533 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.044084400227680794, "acc_norm": 0.74, "acc_norm_stderr": 0.044084400227680794 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5574468085106383, "acc_stderr": 0.032469569197899575, "acc_norm": 0.5574468085106383, "acc_norm_stderr": 0.032469569197899575 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4473684210526316, "acc_stderr": 0.04677473004491199, "acc_norm": 0.4473684210526316, "acc_norm_stderr": 0.04677473004491199 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878151, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878151 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3994708994708995, "acc_stderr": 0.025225450284067884, "acc_norm": 0.3994708994708995, "acc_norm_stderr": 0.025225450284067884 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4365079365079365, "acc_stderr": 0.04435932892851466, "acc_norm": 0.4365079365079365, "acc_norm_stderr": 0.04435932892851466 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.04725815626252606, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252606 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7258064516129032, "acc_stderr": 0.0253781399708852, "acc_norm": 0.7258064516129032, "acc_norm_stderr": 0.0253781399708852 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5073891625615764, "acc_stderr": 0.035176035403610105, "acc_norm": 
0.5073891625615764, "acc_norm_stderr": 0.035176035403610105 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.66, "acc_stderr": 0.04760952285695237, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695237 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7818181818181819, "acc_stderr": 0.03225078108306289, "acc_norm": 0.7818181818181819, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7777777777777778, "acc_stderr": 0.029620227874790486, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.029620227874790486 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8704663212435233, "acc_stderr": 0.024233532297758723, "acc_norm": 0.8704663212435233, "acc_norm_stderr": 0.024233532297758723 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6076923076923076, "acc_stderr": 0.02475600038213095, "acc_norm": 0.6076923076923076, "acc_norm_stderr": 0.02475600038213095 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.337037037037037, "acc_stderr": 0.028820884666253255, "acc_norm": 0.337037037037037, "acc_norm_stderr": 0.028820884666253255 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6680672268907563, "acc_stderr": 0.03058869701378364, "acc_norm": 0.6680672268907563, "acc_norm_stderr": 0.03058869701378364 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3576158940397351, "acc_stderr": 0.03913453431177258, "acc_norm": 0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.818348623853211, "acc_stderr": 0.016530617409266857, "acc_norm": 0.818348623853211, "acc_norm_stderr": 0.016530617409266857 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.49537037037037035, "acc_stderr": 0.03409825519163572, "acc_norm": 0.49537037037037035, "acc_norm_stderr": 0.03409825519163572 }, 
"harness|hendrycksTest-high_school_us_history|5": { "acc": 0.803921568627451, "acc_stderr": 0.027865942286639325, "acc_norm": 0.803921568627451, "acc_norm_stderr": 0.027865942286639325 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7848101265822784, "acc_stderr": 0.02675082699467617, "acc_norm": 0.7848101265822784, "acc_norm_stderr": 0.02675082699467617 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.672645739910314, "acc_stderr": 0.03149384670994131, "acc_norm": 0.672645739910314, "acc_norm_stderr": 0.03149384670994131 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7862595419847328, "acc_stderr": 0.0359546161177469, "acc_norm": 0.7862595419847328, "acc_norm_stderr": 0.0359546161177469 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8264462809917356, "acc_stderr": 0.03457272836917669, "acc_norm": 0.8264462809917356, "acc_norm_stderr": 0.03457272836917669 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.040191074725573483, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.040191074725573483 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7484662576687117, "acc_stderr": 0.034089978868575295, "acc_norm": 0.7484662576687117, "acc_norm_stderr": 0.034089978868575295 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.45535714285714285, "acc_stderr": 0.047268355537191, "acc_norm": 0.45535714285714285, "acc_norm_stderr": 0.047268355537191 }, "harness|hendrycksTest-management|5": { "acc": 0.7281553398058253, "acc_stderr": 0.044052680241409216, "acc_norm": 0.7281553398058253, "acc_norm_stderr": 0.044052680241409216 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8931623931623932, "acc_stderr": 0.020237149008990925, "acc_norm": 0.8931623931623932, "acc_norm_stderr": 0.020237149008990925 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, 
"harness|hendrycksTest-miscellaneous|5": { "acc": 0.8212005108556832, "acc_stderr": 0.013702643715368983, "acc_norm": 0.8212005108556832, "acc_norm_stderr": 0.013702643715368983 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7109826589595376, "acc_stderr": 0.02440517393578323, "acc_norm": 0.7109826589595376, "acc_norm_stderr": 0.02440517393578323 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3776536312849162, "acc_stderr": 0.01621414875213663, "acc_norm": 0.3776536312849162, "acc_norm_stderr": 0.01621414875213663 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6993464052287581, "acc_stderr": 0.02625605383571896, "acc_norm": 0.6993464052287581, "acc_norm_stderr": 0.02625605383571896 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6945337620578779, "acc_stderr": 0.026160584450140446, "acc_norm": 0.6945337620578779, "acc_norm_stderr": 0.026160584450140446 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7129629629629629, "acc_stderr": 0.025171041915309684, "acc_norm": 0.7129629629629629, "acc_norm_stderr": 0.025171041915309684 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4929078014184397, "acc_stderr": 0.02982449855912901, "acc_norm": 0.4929078014184397, "acc_norm_stderr": 0.02982449855912901 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.45827900912646674, "acc_stderr": 0.01272570165695364, "acc_norm": 0.45827900912646674, "acc_norm_stderr": 0.01272570165695364 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6470588235294118, "acc_stderr": 0.029029422815681397, "acc_norm": 0.6470588235294118, "acc_norm_stderr": 0.029029422815681397 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6584967320261438, "acc_stderr": 0.019184639328092487, "acc_norm": 0.6584967320261438, "acc_norm_stderr": 0.019184639328092487 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7181818181818181, "acc_stderr": 0.043091187099464585, "acc_norm": 0.7181818181818181, "acc_norm_stderr": 
0.043091187099464585 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.726530612244898, "acc_stderr": 0.028535560337128438, "acc_norm": 0.726530612244898, "acc_norm_stderr": 0.028535560337128438 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7860696517412935, "acc_stderr": 0.02899690969332891, "acc_norm": 0.7860696517412935, "acc_norm_stderr": 0.02899690969332891 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.03684529491774708, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774708 }, "harness|hendrycksTest-virology|5": { "acc": 0.5180722891566265, "acc_stderr": 0.03889951252827216, "acc_norm": 0.5180722891566265, "acc_norm_stderr": 0.03889951252827216 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.847953216374269, "acc_stderr": 0.02753912288906145, "acc_norm": 0.847953216374269, "acc_norm_stderr": 0.02753912288906145 }, "harness|truthfulqa:mc|0": { "mc1": 0.5593635250917993, "mc1_stderr": 0.01737969755543745, "mc2": 0.7204002538063481, "mc2_stderr": 0.015000816890913878 }, "harness|winogrande|5": { "acc": 0.7916337805840569, "acc_stderr": 0.011414554399987736 }, "harness|gsm8k|5": { "acc": 0.5344958301743745, "acc_stderr": 0.013739668147545913 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. 
--> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
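The "all" block in the "Latest results" section above is an aggregate over the individual task scores. As a minimal illustrative sketch (using three of the per-task accuracies copied from the JSON above; the leaderboard's exact aggregation procedure may differ), a macro-average accuracy can be recomputed like this:

```python
# Illustrative only: recompute an unweighted macro-average accuracy from a
# small subset of the per-task scores listed in the results JSON above.
task_acc = {
    "hendrycksTest-abstract_algebra": 0.37,
    "hendrycksTest-anatomy": 0.562962962962963,
    "hendrycksTest-astronomy": 0.6578947368421053,
}

# Unweighted mean over tasks (each task counts equally, regardless of size).
macro_avg = sum(task_acc.values()) / len(task_acc)
print(round(macro_avg, 4))
```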
Yehor/ukrainian-tts-oleksa
--- license: apache-2.0 task_categories: - text-to-speech language: - uk --- # 🇺🇦 Open Source Ukrainian Text-to-Speech dataset named OLEKSA Join the Ukrainian community - https://t.me/speech_synthesis_uk More details about this dataset - https://github.com/egorsmkv/ukrainian-tts-datasets/tree/main/oleksa # Voice OLEKSA (male) The voice of: https://twitter.com/CHOBUDA ## Features - Quality: high - Duration: 6h - Audio formats: OPUS - Text format: JSONL (a `metadata.jsonl` file) - Frequency: 48000 Hz
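The card above says the text annotations ship as a JSONL `metadata.jsonl` file. A minimal sketch of parsing such a file, assuming each line carries a file name and a transcription field (the sample line and both key names here are hypothetical; the real dataset's keys may differ):

```python
import json
from io import StringIO

# Hypothetical sample line standing in for metadata.jsonl; the actual
# field names in the dataset may differ.
sample = StringIO('{"file_name": "oleksa_0001.opus", "transcription": "Привіт, світе!"}\n')

# JSON Lines: one JSON object per line, blank lines skipped.
records = [json.loads(line) for line in sample if line.strip()]
for rec in records:
    print(rec["file_name"], "->", rec["transcription"])
```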
result-kand2-sdxl-wuerst-karlo/4e08d540
--- dataset_info: features: - name: result dtype: string - name: id dtype: int64 splits: - name: train num_bytes: 207 num_examples: 10 download_size: 1373 dataset_size: 207 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "4e08d540" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-94000
--- dataset_info: features: - name: input_ids sequence: sequence: int32 - name: attention_mask sequence: sequence: int8 - name: labels sequence: sequence: int64 splits: - name: train num_bytes: 13336000 num_examples: 1000 download_size: 653970 dataset_size: 13336000 configs: - config_name: default data_files: - split: train path: data/train-* ---
banloada/banloda
--- license: other ---
open-llm-leaderboard/details_doas__test5
--- pretty_name: Evaluation run of doas/test5 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [doas/test5](https://huggingface.co/doas/test5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_doas__test5\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-09-25T05:43:32.139729](https://huggingface.co/datasets/open-llm-leaderboard/details_doas__test5/blob/main/results_2023-09-25T05-43-32.139729.json)(note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0,\n \"\ em_stderr\": 0.0,\n \"f1\": 4.5092281879194636e-05,\n \"f1_stderr\"\ : 2.699913059109046e-05,\n \"acc\": 0.2632202052091555,\n \"acc_stderr\"\ : 0.007016411937203614\n },\n \"harness|drop|3\": {\n \"em\": 0.0,\n\ \ \"em_stderr\": 0.0,\n \"f1\": 4.5092281879194636e-05,\n \"\ f1_stderr\": 2.699913059109046e-05\n },\n \"harness|gsm8k|5\": {\n \ \ \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.526440410418311,\n \"acc_stderr\": 0.014032823874407229\n\ \ }\n}\n```" repo_url: https://huggingface.co/doas/test5 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|arc:challenge|25_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-08-23T10:59:05.636787.parquet' - config_name: harness_drop_3 data_files: - split: 2023_09_25T05_43_32.139729 path: - '**/details_harness|drop|3_2023-09-25T05-43-32.139729.parquet' - split: latest path: - '**/details_harness|drop|3_2023-09-25T05-43-32.139729.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_09_25T05_43_32.139729 path: - '**/details_harness|gsm8k|5_2023-09-25T05-43-32.139729.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-09-25T05-43-32.139729.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hellaswag|10_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T10:59:05.636787.parquet' - 
'**/details_harness|hendrycksTest-anatomy|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T10:59:05.636787.parquet' - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-management|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T10:59:05.636787.parquet' - 
'**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T10:59:05.636787.parquet' 
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T10:59:05.636787.parquet' - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-management|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T10:59:05.636787.parquet' - 
'**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-08-23T10:59:05.636787.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - 
'**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T10:59:05.636787.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-management|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-marketing|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-virology|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T10:59:05.636787.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_08_23T10_59_05.636787 path: - '**/details_harness|truthfulqa:mc|0_2023-08-23T10:59:05.636787.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-08-23T10:59:05.636787.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_09_25T05_43_32.139729 path: - '**/details_harness|winogrande|5_2023-09-25T05-43-32.139729.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-09-25T05-43-32.139729.parquet' - config_name: results data_files: - split: 2023_09_25T05_43_32.139729 path: - results_2023-09-25T05-43-32.139729.parquet - split: latest path: - results_2023-09-25T05-43-32.139729.parquet --- # Dataset Card for Evaluation run of doas/test5 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/doas/test5 - **Paper:** - **Leaderboard:** 
https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model [doas/test5](https://huggingface.co/doas/test5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_doas__test5",
                    "harness_winogrande_5",
                    split="latest")
```

## Latest results

These are the [latest results from run 2023-09-25T05:43:32.139729](https://huggingface.co/datasets/open-llm-leaderboard/details_doas__test5/blob/main/results_2023-09-25T05-43-32.139729.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0,
        "em_stderr": 0.0,
        "f1": 4.5092281879194636e-05,
        "f1_stderr": 2.699913059109046e-05,
        "acc": 0.2632202052091555,
        "acc_stderr": 0.007016411937203614
    },
    "harness|drop|3": {
        "em": 0.0,
        "em_stderr": 0.0,
        "f1": 4.5092281879194636e-05,
        "f1_stderr": 2.699913059109046e-05
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.526440410418311,
        "acc_stderr": 0.014032823874407229
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
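As a quick sanity check on the "Latest results" section, the `"all"` entry is just the per-task average of each metric; a minimal sketch that reproduces the headline accuracy from the reported task metrics (the JSON literal below copies the run shown in this card):

```python
import json

# Per-run metrics as reported in "Latest results" above.
results_json = '''
{
  "all": {"em": 0.0, "em_stderr": 0.0,
          "f1": 4.5092281879194636e-05, "f1_stderr": 2.699913059109046e-05,
          "acc": 0.2632202052091555, "acc_stderr": 0.007016411937203614},
  "harness|drop|3": {"em": 0.0, "em_stderr": 0.0,
                     "f1": 4.5092281879194636e-05,
                     "f1_stderr": 2.699913059109046e-05},
  "harness|gsm8k|5": {"acc": 0.0, "acc_stderr": 0.0},
  "harness|winogrande|5": {"acc": 0.526440410418311,
                           "acc_stderr": 0.014032823874407229}
}
'''
results = json.loads(results_json)

# "all" averages each metric over the tasks that report it:
# here, accuracy comes from gsm8k and winogrande only.
accs = [v["acc"] for k, v in results.items() if k != "all" and "acc" in v]
mean_acc = sum(accs) / len(accs)
assert abs(mean_acc - results["all"]["acc"]) < 1e-12
```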
David19930/test_david_v1
---
dataset_info:
  features:
  - name: instruction
    dtype: string
  - name: output
    dtype: string
  splits:
  - name: train
    num_bytes: 91266
    num_examples: 590
  download_size: 9428
  dataset_size: 91266
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
SergeyKarpenko1/autotrain-data-nlp
--- language: - en --- # AutoTrain Dataset for project: nlp ## Dataset Description This dataset has been automatically processed by AutoTrain for project nlp. ### Languages The BCP-47 code for the dataset's language is en. ## Dataset Structure ### Data Instances A sample from this dataset looks as follows: ```json [ { "context": "\u0417\u0434\u0440\u0430\u0432\u0441\u0442\u0432\u0443\u0439\u0442\u0435 \ud83e\udd1d \u041d\u0430 \u0441\u043f\u0435\u0446\u0438\u0430\u043b\u044c\u043d\u043e\u043c \u0433\u0435\u043b\u0435, \u0432 \u043a\u043e\u0442\u043e\u0440\u043e\u043c \u043f\u043e\u0434\u043e\u0431\u0440\u0430\u043d\u044b \u0438\u043d\u0433\u0440\u0435\u0434\u0438\u0435\u043d\u0442\u044b \u043e\u0442\u0432\u0435\u0447\u0430\u044e\u0449\u0438\u0435 \u0437\u0430 \u043f\u043e\u0434\u0442\u044f\u0436\u043a\u0443, \u0430\u043d\u0442\u0438\u0446\u0435\u043b\u043b\u044e\u043b\u0438\u0442\u043d\u044b\u0439 \u044d\u0444\u0444\u0435\u043a\u0442 \u0438 \u0436\u0438\u0440\u043e\u0437\u0436\u0438\u0433\u0430\u043d\u0438\u0435 ! \u0412\u044b\u043f\u043e\u043b\u043d\u044f\u0442\u044c\u0441\u044f \u043e\u043f\u0440\u0435\u0434\u0435\u043b\u0451\u043d\u043d\u044b\u0435 \u0442\u0435\u0445\u043d\u0438\u043a\u0438 \u043c\u0430\u0441\u0441\u0430\u0436\u043d\u044b\u0435, \u043f\u043e \u043e\u043f\u0440\u0435\u0434\u0435\u043b\u0451\u043d\u043d\u044b\u043c \u043b\u0438\u043d\u0438\u044f\u043c, \u043f\u0438\u043b\u0438\u043d\u0433 \u0441\u043d\u0430\u0447\u0430\u043b\u0430, \u043f\u043e\u0442\u043e\u043c \u0432\u0431\u0438\u0432\u0430\u043d\u0438\u0435 \u0413\u0435\u043b\u044f \u0432 \u043f\u043e\u0440\u044b \u0438 \u043e\u0431\u0435\u0440\u0442\u044b\u0432\u0430\u043d\u0438\u0435 \u0432 \u0438\u043d\u0444\u0440\u0430\u043a\u0440\u0430\u0441\u043d\u043e\u0435 \u043e\u0434\u0435\u044f\u043b\u043e! 
Не больно)", "question": "Здравствуйте, расскажите про процедуру карамельная липаксация , как это делается , больно это ?", "answers.text": [ "Здравствуйте 🤝 На специальном геле, в котором подобраны ингредиенты отвечающие за подтяжку, антицеллюлитный эффект и жирозжигание !
Выполняться определённые техники массажные, по определённым линиям, пилинг сначала, потом вбивание Геля в поры и обертывание в инфракрасное одеяло! Не больно)" ], "answers.answer_start": [ -1 ] }, { "context": "Здравствуйте 🤝 Да, на что хотите?) с какого числа? В какое время удобно?", "question": "А можно на январь уже записаться ?", "answers.text": [ "Здравствуйте 🤝 Да, на что хотите?) с какого числа? В какое время удобно?" ], "answers.answer_start": [ -1 ] } ]
```

### Dataset Fields

The dataset has the following fields (also called "features"):

```json
{
  "context": "Value(dtype='string', id=None)",
  "question": "Value(dtype='string', id=None)",
  "answers.text": "Sequence(feature=Value(dtype='string', id=None), length=-1, id=None)",
  "answers.answer_start": "Sequence(feature=Value(dtype='int32', id=None), length=-1, id=None)"
}
```

### Dataset Splits

This dataset is split into a train and validation split. The split sizes are as follows:

| Split name | Num samples |
| ---------- | ----------- |
| train      | 248         |
| valid      | 63          |
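The sample records above all carry an `answers.answer_start` of `-1`, i.e. the answer is not anchored at a character offset inside the context. A minimal sketch of how a consumer might check this, using an illustrative record (the dict contents below are made up, not taken from the dataset, and the reading of `-1` as "no extractive span" is an assumption based on the samples shown):

```python
# Illustrative record in the schema described under "Dataset Fields".
record = {
    "context": "Hello! Yes, what would you like?",
    "question": "Can I book an appointment for January?",
    "answers.text": ["Hello! Yes, what would you like?"],
    "answers.answer_start": [-1],
}

def is_extractive(rec):
    # An answer_start of -1 means the answer text is not a character
    # offset into the context (i.e. the answer is abstractive).
    return all(start >= 0 for start in rec["answers.answer_start"])

print(is_extractive(record))  # False for records like the samples above
```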
adleme94/borges
---
license: apache-2.0
task_categories:
- text-generation
language:
- es
size_categories:
- n<1K
---
gguichard/coref_dataset
---
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: test
    path: data/test-*
dataset_info:
  features:
  - name: input_ids
    sequence: int32
  - name: labels
    sequence: int64
  splits:
  - name: train
    num_bytes: 595904106.6677296
    num_examples: 101287
  - name: test
    num_bytes: 31363993.332270347
    num_examples: 5331
  download_size: 94532509
  dataset_size: 627268100.0
---

# Dataset Card for "coref_dataset"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
vikp/code_search_net_filtered_34k
---
license: cc-by-4.0
dataset_info:
  features:
  - name: code
    dtype: string
  - name: signature
    dtype: string
  - name: docstring
    dtype: string
  - name: loss_without_docstring
    dtype: float64
  - name: loss_with_docstring
    dtype: float64
  - name: factor
    dtype: float64
  - name: rendered
    dtype: string
  - name: quality_prob
    dtype: float64
  - name: learning_prob
    dtype: float64
  splits:
  - name: train
    num_bytes: 59688305.816243604
    num_examples: 34488
  download_size: 30704027
  dataset_size: 59688305.816243604
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---

Filtered version of the Code Search Net Python subset. Filtering is based on perplexity with/without the docstring, learning-value and quality classifiers, and manual review. The original perplexity-filtered data is from [here](https://huggingface.co/datasets/bjoernp/code_search_net_python_processed_400k), with credit to bjoernp.
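The fields `loss_without_docstring`, `loss_with_docstring`, `factor`, `quality_prob`, and `learning_prob` suggest a filter of roughly this shape. This is a hedged sketch only: the definition of `factor` as a loss ratio and all thresholds below are assumptions, not the dataset authors' published recipe.

```python
def keep_sample(loss_without_docstring, loss_with_docstring,
                quality_prob, learning_prob,
                min_factor=1.05, min_prob=0.5):
    # Assumed definition: a docstring that lowers the model's loss on
    # the code yields factor > 1 (the docstring is informative).
    factor = loss_without_docstring / loss_with_docstring
    # Keep only samples where the docstring helps and both classifier
    # probabilities clear an (assumed) threshold.
    return (factor >= min_factor
            and quality_prob >= min_prob
            and learning_prob >= min_prob)

print(keep_sample(2.4, 2.0, 0.9, 0.8))  # docstring lowers loss -> True
print(keep_sample(2.0, 2.1, 0.9, 0.8))  # docstring raises loss -> False
```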
jxu9001/conll_v3
---
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: validation
    path: data/validation-*
  - split: test
    path: data/test-*
dataset_info:
  features:
  - name: tokens
    sequence: string
  - name: tags
    sequence: int64
  splits:
  - name: train
    num_bytes: 3445822
    num_examples: 14041
  - name: validation
    num_bytes: 866541
    num_examples: 3250
  - name: test
    num_bytes: 784956
    num_examples: 3453
  download_size: 1247438
  dataset_size: 5097319
---

# Dataset Card for "conll_v3"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
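The `tokens` (sequence of strings) / `tags` (sequence of int64) schema is the usual shape for CoNLL-style token classification. A hedged sketch of how such records can be assembled from token-per-line CoNLL input; the tag-to-id mapping below is illustrative, since the card does not specify the label set behind the integer ids:

```python
# Illustrative label mapping (assumed, not from the dataset card).
TAG2ID = {"O": 0, "B-PER": 1, "I-PER": 2, "B-LOC": 3}

def parse_conll(lines):
    # Each input line holds one "token tag" pair; one call produces
    # one record in the tokens/tags schema described above.
    tokens, tags = [], []
    for line in lines:
        token, tag = line.split()
        tokens.append(token)
        tags.append(TAG2ID[tag])
    return {"tokens": tokens, "tags": tags}

example = parse_conll(["John B-PER", "lives O", "in O", "Paris B-LOC"])
print(example)  # {'tokens': ['John', 'lives', 'in', 'Paris'], 'tags': [1, 0, 0, 3]}
```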