| column | type | min | max |
|:---|:---|---:|---:|
| modelId | string | 4 | 111 |
| lastModified | string | 24 | 24 |
| tags | list | | |
| pipeline_tag | string | 5 | 30 |
| author | string | 2 | 34 |
| config | null | | |
| securityStatus | null | | |
| id | string | 4 | 111 |
| likes | int64 | 0 | 9.53k |
| downloads | int64 | 2 | 73.6M |
| library_name | string | 2 | 84 |
| created | timestamp[us] | | |
| card | string | 101 | 901k |
| card_len | int64 | 101 | 901k |
| embeddings | list | | |
- modelId: t5-large
- lastModified: 2023-04-06T13:42:27.000Z
- tags: [ "transformers", "pytorch", "tf", "jax", "safetensors", "t5", "text2text-generation", "summarization", "translation", "en", "fr", "ro", "de", "multilingual", "dataset:c4", "arxiv:1805.12471", "arxiv:1708.00055", "arxiv:1704.05426", "arxiv:1606.05250", "arxiv:1808.09121", "arxiv:1810.12885", "arxiv:1905.10044", "arxiv:1910.09700", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
- pipeline_tag: translation
- author: null
- config: null
- securityStatus: null
- id: t5-large
- likes: 109
- downloads: 311,081
- library_name: transformers
- created: 2022-03-02T23:29:04
---
language:
- en
- fr
- ro
- de
- multilingual
license: apache-2.0
tags:
- summarization
- translation
datasets:
- c4
---

# Model Card for T5 Large

![model image](https://camo.githubusercontent.com/623b4dea0b653f2ad3f36c71ebfe749a677ac0a1/68747470733a2f2f6d69726f2e6d656469756d2e636f6d2f6d61782f343030362f312a44304a31674e51663876727255704b657944387750412e706e67)

# Table of Contents

1. [Model Details](#model-details)
2. [Uses](#uses)
3. [Bias, Risks, and Limitations](#bias-risks-and-limitations)
4. [Training Details](#training-details)
5. [Evaluation](#evaluation)
6. [Environmental Impact](#environmental-impact)
7. [Citation](#citation)
8. [Model Card Authors](#model-card-authors)
9. [How To Get Started With the Model](#how-to-get-started-with-the-model)

# Model Details

## Model Description

The developers of the Text-To-Text Transfer Transformer (T5) [write](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html):

> With T5, we propose reframing all NLP tasks into a unified text-to-text format where the input and output are always text strings, in contrast to BERT-style models that can only output either a class label or a span of the input. Our text-to-text framework allows us to use the same model, loss function, and hyperparameters on any NLP task.

T5-Large is the checkpoint with 770 million parameters.

- **Developed by:** Colin Raffel, Noam Shazeer, Adam Roberts, Katherine Lee, Sharan Narang, Michael Matena, Yanqi Zhou, Wei Li, and Peter J. Liu. See the [associated paper](https://jmlr.org/papers/volume21/20-074/20-074.pdf) and [GitHub repo](https://github.com/google-research/text-to-text-transfer-transformer#released-model-checkpoints)
- **Model type:** Language model
- **Language(s) (NLP):** English, French, Romanian, German
- **License:** Apache 2.0
- **Related Models:** [All T5 Checkpoints](https://huggingface.co/models?search=t5)
- **Resources for more information:**
  - [Research paper](https://jmlr.org/papers/volume21/20-074/20-074.pdf)
  - [Google's T5 Blog Post](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html)
  - [GitHub Repo](https://github.com/google-research/text-to-text-transfer-transformer)
  - [Hugging Face T5 Docs](https://huggingface.co/docs/transformers/model_doc/t5)

# Uses

## Direct Use and Downstream Use

In a [blog post](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html), the developers write:

> Our text-to-text framework allows us to use the same model, loss function, and hyperparameters on any NLP task, including machine translation, document summarization, question answering, and classification tasks (e.g., sentiment analysis). We can even apply T5 to regression tasks by training it to predict the string representation of a number instead of the number itself.

See the [blog post](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) and [research paper](https://jmlr.org/papers/volume21/20-074/20-074.pdf) for further details.

## Out-of-Scope Use

More information needed.

# Bias, Risks, and Limitations

More information needed.

## Recommendations

More information needed.

# Training Details

## Training Data

The model is pre-trained on the [Colossal Clean Crawled Corpus (C4)](https://www.tensorflow.org/datasets/catalog/c4), which was developed and released in the context of the same [research paper](https://jmlr.org/papers/volume21/20-074/20-074.pdf) as T5.
The model was pre-trained on a **multi-task mixture of unsupervised (1.) and supervised tasks (2.)**. The following datasets were used for (1.) and (2.):

1. **Datasets used for the unsupervised denoising objective:**
   - [C4](https://huggingface.co/datasets/c4)
   - [Wiki-DPR](https://huggingface.co/datasets/wiki_dpr)
2. **Datasets used for the supervised text-to-text language modeling objective:**
   - Sentence acceptability judgment
     - CoLA [Warstadt et al., 2018](https://arxiv.org/abs/1805.12471)
   - Sentiment analysis
     - SST-2 [Socher et al., 2013](https://nlp.stanford.edu/~socherr/EMNLP2013_RNTN.pdf)
   - Paraphrasing/sentence similarity
     - MRPC [Dolan and Brockett, 2005](https://aclanthology.org/I05-5002)
     - STS-B [Cer et al., 2017](https://arxiv.org/abs/1708.00055)
     - QQP [Iyer et al., 2017](https://quoradata.quora.com/First-Quora-Dataset-Release-Question-Pairs)
   - Natural language inference
     - MNLI [Williams et al., 2017](https://arxiv.org/abs/1704.05426)
     - QNLI [Rajpurkar et al., 2016](https://arxiv.org/abs/1606.05250)
     - RTE [Dagan et al., 2005](https://link.springer.com/chapter/10.1007/11736790_9)
     - CB [De Marneffe et al., 2019](https://semanticsarchive.net/Archive/Tg3ZGI2M/Marneffe.pdf)
   - Sentence completion
     - COPA [Roemmele et al., 2011](https://www.researchgate.net/publication/221251392_Choice_of_Plausible_Alternatives_An_Evaluation_of_Commonsense_Causal_Reasoning)
   - Word sense disambiguation
     - WIC [Pilehvar and Camacho-Collados, 2018](https://arxiv.org/abs/1808.09121)
   - Question answering
     - MultiRC [Khashabi et al., 2018](https://aclanthology.org/N18-1023)
     - ReCoRD [Zhang et al., 2018](https://arxiv.org/abs/1810.12885)
     - BoolQ [Clark et al., 2019](https://arxiv.org/abs/1905.10044)

## Training Procedure

In their [abstract](https://jmlr.org/papers/volume21/20-074/20-074.pdf), the model developers write:

> In this paper, we explore the landscape of transfer learning techniques for NLP by introducing a unified framework that converts every language problem into a text-to-text format. Our systematic study compares pre-training objectives, architectures, unlabeled datasets, transfer approaches, and other factors on dozens of language understanding tasks.

The framework introduced, the T5 framework, involves a training procedure that brings together the approaches studied in the paper. See the [research paper](https://jmlr.org/papers/volume21/20-074/20-074.pdf) for further details.

# Evaluation

## Testing Data, Factors & Metrics

The developers evaluated the model on 24 tasks; see the [research paper](https://jmlr.org/papers/volume21/20-074/20-074.pdf) for full details.

## Results

For full results for T5-Large, see the [research paper](https://jmlr.org/papers/volume21/20-074/20-074.pdf), Table 14.

# Environmental Impact

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- **Hardware Type:** Google Cloud TPU Pods
- **Hours used:** More information needed
- **Cloud Provider:** GCP
- **Compute Region:** More information needed
- **Carbon Emitted:** More information needed

# Citation

**BibTeX:**

```bibtex
@article{2020t5,
  author  = {Colin Raffel and Noam Shazeer and Adam Roberts and Katherine Lee and Sharan Narang and Michael Matena and Yanqi Zhou and Wei Li and Peter J. Liu},
  title   = {Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer},
  journal = {Journal of Machine Learning Research},
  year    = {2020},
  volume  = {21},
  number  = {140},
  pages   = {1-67},
  url     = {http://jmlr.org/papers/v21/20-074.html}
}
```

**APA:**
- Raffel, C., Shazeer, N., Roberts, A., Lee, K., Narang, S., Matena, M., ... & Liu, P. J. (2020). Exploring the limits of transfer learning with a unified text-to-text transformer. J. Mach. Learn. Res., 21(140), 1-67.

# Model Card Authors

This model card was written by the team at Hugging Face.
# How to Get Started with the Model

Use the code below to get started with the model.

<details>
<summary> Click to expand </summary>

```python
from transformers import T5Tokenizer, T5Model

tokenizer = T5Tokenizer.from_pretrained("t5-large")
model = T5Model.from_pretrained("t5-large")

input_ids = tokenizer(
    "Studies have been shown that owning a dog is good for you", return_tensors="pt"
).input_ids  # Batch size 1
decoder_input_ids = tokenizer("Studies show that", return_tensors="pt").input_ids  # Batch size 1

# forward pass
outputs = model(input_ids=input_ids, decoder_input_ids=decoder_input_ids)
last_hidden_states = outputs.last_hidden_state
```

See the [Hugging Face T5](https://huggingface.co/docs/transformers/model_doc/t5#transformers.T5Model) docs and a [Colab Notebook](https://colab.research.google.com/github/google-research/text-to-text-transfer-transformer/blob/main/notebooks/t5-trivia.ipynb) created by the model developers for more examples.

</details>
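The text-to-text framing described in the card means each task is selected purely by a plain-text prefix on the input string. A minimal sketch of that convention (the `t5_format` helper is ours, not part of the card; the prefix strings follow the conventions described in the T5 paper):

```python
def t5_format(task_prefix: str, text: str) -> str:
    """Build a T5-style text-to-text input by prepending a task prefix."""
    return f"{task_prefix}: {text}"

# Prefixes of the kind used in the T5 paper: the same checkpoint handles
# translation, summarization, or classification depending only on this string.
translation_input = t5_format("translate English to German", "The house is wonderful.")
summarization_input = t5_format("summarize", "state authorities dispatched emergency crews ...")

print(translation_input)  # translate English to German: The house is wonderful.
```

The resulting string would then be tokenized and passed to a seq2seq `generate` call (e.g. with `T5ForConditionalGeneration` rather than the bare `T5Model` shown above), so no task-specific head is needed.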
- card_len: 8,473
- embeddings: [embedding vector omitted]
- modelId: facebook/dpr-ctx_encoder-single-nq-base
- lastModified: 2022-12-21T15:16:53.000Z
- tags: [ "transformers", "pytorch", "tf", "dpr", "en", "dataset:nq_open", "arxiv:2004.04906", "arxiv:1702.08734", "arxiv:1910.09700", "license:cc-by-nc-4.0", "has_space", "region:us" ]
- pipeline_tag: null
- author: facebook
- config: null
- securityStatus: null
- id: facebook/dpr-ctx_encoder-single-nq-base
- likes: 15
- downloads: 310,231
- library_name: transformers
- created: 2022-03-02T23:29:05
---
language: en
license: cc-by-nc-4.0
tags:
- dpr
datasets:
- nq_open
inference: false
---

# `dpr-ctx_encoder-single-nq-base`

## Table of Contents

- [Model Details](#model-details)
- [How To Get Started With the Model](#how-to-get-started-with-the-model)
- [Uses](#uses)
- [Risks, Limitations and Biases](#risks-limitations-and-biases)
- [Training](#training)
- [Evaluation](#evaluation-results)
- [Environmental Impact](#environmental-impact)
- [Technical Specifications](#technical-specifications)
- [Citation Information](#citation-information)
- [Model Card Authors](#model-card-authors)

## Model Details

**Model Description:** [Dense Passage Retrieval (DPR)](https://github.com/facebookresearch/DPR) is a set of tools and models for state-of-the-art open-domain Q&A research. `dpr-ctx_encoder-single-nq-base` is the Context Encoder trained using the [Natural Questions (NQ) dataset](https://huggingface.co/datasets/nq_open) ([Lee et al., 2019](https://aclanthology.org/P19-1612/); [Kwiatkowski et al., 2019](https://aclanthology.org/Q19-1026/)).
- **Developed by:** See [GitHub repo](https://github.com/facebookresearch/DPR) for model developers
- **Model Type:** BERT-based encoder
- **Language(s):** English
- **License:** [CC-BY-NC-4.0](https://github.com/facebookresearch/DPR/blob/main/LICENSE); also see [Code of Conduct](https://github.com/facebookresearch/DPR/blob/main/CODE_OF_CONDUCT.md)
- **Related Models:**
  - [`dpr-question_encoder-single-nq-base`](https://huggingface.co/facebook/dpr-question_encoder-single-nq-base)
  - [`dpr-reader-single-nq-base`](https://huggingface.co/facebook/dpr-reader-single-nq-base)
  - [`dpr-ctx_encoder-multiset-base`](https://huggingface.co/facebook/dpr-ctx_encoder-multiset-base)
  - [`dpr-question_encoder-multiset-base`](https://huggingface.co/facebook/dpr-question_encoder-multiset-base)
  - [`dpr-reader-multiset-base`](https://huggingface.co/facebook/dpr-reader-multiset-base)
- **Resources for more information:**
  - [Research Paper](https://arxiv.org/abs/2004.04906)
  - [GitHub Repo](https://github.com/facebookresearch/DPR)
  - [Hugging Face DPR docs](https://huggingface.co/docs/transformers/main/en/model_doc/dpr)
  - [BERT Base Uncased Model Card](https://huggingface.co/bert-base-uncased)

## How to Get Started with the Model

Use the code below to get started with the model.
```python
>>> from transformers import DPRContextEncoder, DPRContextEncoderTokenizer

>>> tokenizer = DPRContextEncoderTokenizer.from_pretrained("facebook/dpr-ctx_encoder-single-nq-base")
>>> model = DPRContextEncoder.from_pretrained("facebook/dpr-ctx_encoder-single-nq-base")
>>> input_ids = tokenizer("Hello, is my dog cute ?", return_tensors="pt")["input_ids"]
>>> embeddings = model(input_ids).pooler_output
```

## Uses

#### Direct Use

`dpr-ctx_encoder-single-nq-base`, [`dpr-question_encoder-single-nq-base`](https://huggingface.co/facebook/dpr-question_encoder-single-nq-base), and [`dpr-reader-single-nq-base`](https://huggingface.co/facebook/dpr-reader-single-nq-base) can be used for the task of open-domain question answering.

#### Misuse and Out-of-scope Use

The model should not be used to intentionally create hostile or alienating environments for people. In addition, the set of DPR models was not trained to be factual or true representations of people or events, and therefore using the models to generate such content is out-of-scope for the abilities of this model.

## Risks, Limitations and Biases

**CONTENT WARNING: Readers should be aware this section may contain content that is disturbing, offensive, and can propagate historical and current stereotypes.**

Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)). Predictions generated by the model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.

## Training

#### Training Data

This model was trained using the [Natural Questions (NQ) dataset](https://huggingface.co/datasets/nq_open) ([Lee et al., 2019](https://aclanthology.org/P19-1612/); [Kwiatkowski et al., 2019](https://aclanthology.org/Q19-1026/)).
The model authors write that:

> [The dataset] was designed for end-to-end question answering. The questions were mined from real Google search queries and the answers were spans in Wikipedia articles identified by annotators.

#### Training Procedure

The training procedure is described in the [associated paper](https://arxiv.org/pdf/2004.04906.pdf):

> Given a collection of M text passages, the goal of our dense passage retriever (DPR) is to index all the passages in a low-dimensional and continuous space, such that it can retrieve efficiently the top k passages relevant to the input question for the reader at run-time.

> Our dense passage retriever (DPR) uses a dense encoder EP(·) which maps any text passage to a d-dimensional real-valued vector and builds an index for all the M passages that we will use for retrieval. At run-time, DPR applies a different encoder EQ(·) that maps the input question to a d-dimensional vector, and retrieves k passages whose vectors are the closest to the question vector.

The authors report that for encoders, they used two independent BERT ([Devlin et al., 2019](https://aclanthology.org/N19-1423/)) networks (base, uncased) and used FAISS ([Johnson et al., 2017](https://arxiv.org/abs/1702.08734)) at inference time to encode and index passages. See the paper for further details on training, including encoders, inference, positive and negative passages, and in-batch negatives.

## Evaluation

The following evaluation information is extracted from the [associated paper](https://arxiv.org/pdf/2004.04906.pdf).

#### Testing Data, Factors and Metrics

The model developers report the performance of the model on five QA datasets, using top-k accuracy (k ∈ {20, 100}).
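Top-k accuracy counts a question as answered when at least one of the k highest-ranked passages contains the gold answer. A hedged pure-Python sketch of the metric (the function name and data below are illustrative, not from the DPR codebase):

```python
def top_k_accuracy(ranked_hits, k):
    """Fraction of questions with an answer-bearing passage among the top k.

    ranked_hits: one list per question, booleans in rank order, True where the
    passage at that rank contains the gold answer.
    """
    correct = sum(1 for hits in ranked_hits if any(hits[:k]))
    return correct / len(ranked_hits)

# Three questions: hit at rank 1, hit at rank 3, no hit in the top ranks.
ranked = [
    [True, False, False],
    [False, False, True],
    [False, False, False],
]
print(top_k_accuracy(ranked, k=1))  # 1 of 3 questions answered in the top 1
print(top_k_accuracy(ranked, k=3))  # 2 of 3 questions answered in the top 3
```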
The datasets were [NQ](https://huggingface.co/datasets/nq_open), [TriviaQA](https://huggingface.co/datasets/trivia_qa), [WebQuestions (WQ)](https://huggingface.co/datasets/web_questions), [CuratedTREC (TREC)](https://huggingface.co/datasets/trec), and [SQuAD v1.1](https://huggingface.co/datasets/squad).

#### Results

Top-k retrieval accuracy (%):

|         | NQ   | TriviaQA | WQ   | TREC | SQuAD |
|:-------:|:----:|:--------:|:----:|:----:|:-----:|
| Top 20  | 78.4 | 79.4     | 73.2 | 79.8 | 63.2  |
| Top 100 | 85.4 | 85.0     | 81.4 | 89.1 | 77.2  |

## Environmental Impact

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). We present the hardware type based on the [associated paper](https://arxiv.org/abs/2004.04906).

- **Hardware Type:** 8 32GB GPUs
- **Hours used:** Unknown
- **Cloud Provider:** Unknown
- **Compute Region:** Unknown
- **Carbon Emitted:** Unknown

## Technical Specifications

See the [associated paper](https://arxiv.org/abs/2004.04906) for details on the modeling architecture, objective, compute infrastructure, and training details.

## Citation Information

```bibtex
@inproceedings{karpukhin-etal-2020-dense,
    title = "Dense Passage Retrieval for Open-Domain Question Answering",
    author = "Karpukhin, Vladimir and Oguz, Barlas and Min, Sewon and Lewis, Patrick and Wu, Ledell and Edunov, Sergey and Chen, Danqi and Yih, Wen-tau",
    booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)",
    month = nov,
    year = "2020",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://www.aclweb.org/anthology/2020.emnlp-main.550",
    doi = "10.18653/v1/2020.emnlp-main.550",
    pages = "6769--6781",
}
```

## Model Card Authors

This model card was written by the team at Hugging Face.
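The retrieval scheme quoted in the Training Procedure section — encode all M passages once, then rank them by similarity to the question embedding — reduces at inference time to an inner-product search. A minimal pure-Python sketch with stand-in vectors (in the real system the vectors come from the DPR encoders and the index is FAISS; the helpers below are ours):

```python
def dot(u, v):
    """Inner product of two equal-length vectors."""
    return sum(a * b for a, b in zip(u, v))

def retrieve_top_k(question_vec, passage_vecs, k):
    """Return (index, score) pairs for the k passages closest to the question."""
    scores = [(i, dot(question_vec, p)) for i, p in enumerate(passage_vecs)]
    return sorted(scores, key=lambda pair: -pair[1])[:k]

# Stand-in 4-dimensional embeddings for M = 3 passages (real DPR vectors are 768-d).
passages = [
    [1.0, 0.0, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.0],
    [0.5, 0.5, 0.0, 0.0],
]
question = [1.0, 0.2, 0.0, 0.0]

top2 = retrieve_top_k(question, passages, k=2)
# Passage 0 scores 1.0 and passage 2 scores 0.6, so both outrank passage 1 (0.2).
```

In production the same ranking is served by FAISS over precomputed passage vectors, which is what makes the "index once, query many times" design efficient.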
8,189
gpt2-large
2023-06-30T02:33:40.000Z
[ "transformers", "pytorch", "tf", "jax", "rust", "onnx", "safetensors", "gpt2", "text-generation", "en", "arxiv:1910.09700", "license:mit", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
text-generation
null
null
null
gpt2-large
172
309,768
transformers
2022-03-02T23:29:04
---
language: en
license: mit
---

# GPT-2 Large

## Table of Contents
- [Model Details](#model-details)
- [How To Get Started With the Model](#how-to-get-started-with-the-model)
- [Uses](#uses)
- [Risks, Limitations and Biases](#risks-limitations-and-biases)
- [Training](#training)
- [Evaluation](#evaluation)
- [Environmental Impact](#environmental-impact)
- [Technical Specifications](#technical-specifications)
- [Citation Information](#citation-information)
- [Model Card Authors](#model-card-author)

## Model Details

**Model Description:** GPT-2 Large is the **774M parameter** version of GPT-2, a transformer-based language model created and released by OpenAI. It is pretrained on English text with a causal language modeling (CLM) objective.

- **Developed by:** OpenAI, see [associated research paper](https://d4mucfpksywv.cloudfront.net/better-language-models/language_models_are_unsupervised_multitask_learners.pdf) and [GitHub repo](https://github.com/openai/gpt-2) for model developers.
- **Model Type:** Transformer-based language model
- **Language(s):** English
- **License:** [Modified MIT License](https://github.com/openai/gpt-2/blob/master/LICENSE)
- **Related Models:** [GPT-2](https://huggingface.co/gpt2), [GPT-2 Medium](https://huggingface.co/gpt2-medium) and [GPT-2 XL](https://huggingface.co/gpt2-xl)
- **Resources for more information:**
  - [Research Paper](https://d4mucfpksywv.cloudfront.net/better-language-models/language_models_are_unsupervised_multitask_learners.pdf)
  - [OpenAI Blog Post](https://openai.com/blog/better-language-models/)
  - [GitHub Repo](https://github.com/openai/gpt-2)
  - [OpenAI Model Card for GPT-2](https://github.com/openai/gpt-2/blob/master/model_card.md)
  - Test the full generation capabilities here: https://transformer.huggingface.co/doc/gpt2-large

## How to Get Started with the Model

Use the code below to get started with the model. You can use this model directly with a pipeline for text generation.
Since the generation relies on some randomness, we set a seed for reproducibility:

```python
>>> from transformers import pipeline, set_seed
>>> generator = pipeline('text-generation', model='gpt2-large')
>>> set_seed(42)
>>> generator("Hello, I'm a language model,", max_length=30, num_return_sequences=5)

[{'generated_text': "Hello, I'm a language model, I can do language modeling. In fact, this is one of the reasons I use languages. To get a"},
 {'generated_text': "Hello, I'm a language model, which in its turn implements a model of how a human can reason about a language, and is in turn an"},
 {'generated_text': "Hello, I'm a language model, why does this matter for you?\n\nWhen I hear new languages, I tend to start thinking in terms"},
 {'generated_text': "Hello, I'm a language model, a functional language...\n\nI don't need to know anything else. If I want to understand about how"},
 {'generated_text': "Hello, I'm a language model, not a toolbox.\n\nIn a nutshell, a language model is a set of attributes that define how"}]
```

Here is how to use this model to get the features of a given text in PyTorch:

```python
from transformers import GPT2Tokenizer, GPT2Model

tokenizer = GPT2Tokenizer.from_pretrained('gpt2-large')
model = GPT2Model.from_pretrained('gpt2-large')
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```

and in TensorFlow:

```python
from transformers import GPT2Tokenizer, TFGPT2Model

tokenizer = GPT2Tokenizer.from_pretrained('gpt2-large')
model = TFGPT2Model.from_pretrained('gpt2-large')
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```

## Uses

#### Direct Use

In their [model card about GPT-2](https://github.com/openai/gpt-2/blob/master/model_card.md), OpenAI wrote:

> The primary intended users of these models are AI researchers and practitioners.
>
> We primarily imagine these language models will be used by researchers to better understand the behaviors, capabilities, biases, and constraints of large-scale generative language models.

#### Downstream Use

In their [model card about GPT-2](https://github.com/openai/gpt-2/blob/master/model_card.md), OpenAI wrote:

> Here are some secondary use cases we believe are likely:
>
> - Writing assistance: Grammar assistance, autocompletion (for normal prose or code)
> - Creative writing and art: exploring the generation of creative, fictional texts; aiding creation of poetry and other literary art.
> - Entertainment: Creation of games, chat bots, and amusing generations.

#### Misuse and Out-of-scope Use

In their [model card about GPT-2](https://github.com/openai/gpt-2/blob/master/model_card.md), OpenAI wrote:

> Because large-scale language models like GPT-2 do not distinguish fact from fiction, we don't support use-cases that require the generated text to be true.
>
> Additionally, language models like GPT-2 reflect the biases inherent to the systems they were trained on, so we do not recommend that they be deployed into systems that interact with humans unless the deployers first carry out a study of biases relevant to the intended use-case. We found no statistically significant difference in gender, race, and religious bias probes between 774M and 1.5B, implying all versions of GPT-2 should be approached with similar levels of caution around use cases that are sensitive to biases around human attributes.

## Risks, Limitations and Biases

**CONTENT WARNING: Readers should be aware this section contains content that is disturbing, offensive, and can propagate historical and current stereotypes.**

Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)).
The training data used for this model has not been released as a dataset one can browse. We know it contains a lot of unfiltered content from the internet, which is far from neutral. Predictions generated by the model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups. For example:

```python
>>> from transformers import pipeline, set_seed
>>> generator = pipeline('text-generation', model='gpt2-large')
>>> set_seed(42)
>>> generator("The man worked as a", max_length=10, num_return_sequences=5)

[{'generated_text': 'The man worked as a security guard in a hotel'},
 {'generated_text': 'The man worked as a salesman in Mexico and in'},
 {'generated_text': 'The man worked as a supervisor at the warehouse for'},
 {'generated_text': "The man worked as a cleaner for the store's"},
 {'generated_text': 'The man worked as a barbershop apprentice.'}]

>>> set_seed(42)
>>> generator("The woman worked as a", max_length=10, num_return_sequences=5)

[{'generated_text': 'The woman worked as a clerk at the bank.'},
 {'generated_text': 'The woman worked as a caregiver, and her'},
 {'generated_text': 'The woman worked as a customer service agent for a'},
 {'generated_text': 'The woman worked as a cleaner at the store,'},
 {'generated_text': 'The woman worked as a barista and was "'}]
```

This bias will also affect all fine-tuned versions of this model. Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model.

## Training

#### Training Data

The OpenAI team wanted to train this model on a corpus as large as possible. To build it, they scraped all the web pages from outbound links on Reddit which received at least 3 karma. Note that all Wikipedia pages were removed from this dataset, so the model was not trained on any part of Wikipedia. The resulting dataset (called WebText) weighs 40GB of text but has not been publicly released.
You can find a list of the top 1,000 domains present in WebText [here](https://github.com/openai/gpt-2/blob/master/domains.txt).

#### Training Procedure

The model is pretrained on a very large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data), with an automatic process to generate inputs and labels from those texts. More precisely, it was trained to guess the next word in sentences: inputs are sequences of continuous text of a certain length, and the targets are the same sequences shifted one token (word or piece of word) to the right. The model internally uses a mask mechanism to make sure the predictions for token `i` only use the inputs from `1` to `i` and not the future tokens.

This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks.

The texts are tokenized using a byte-level version of Byte Pair Encoding (BPE) (for unicode characters) with a vocabulary size of 50,257. The inputs are sequences of 1024 consecutive tokens.

## Evaluation

The following evaluation information is extracted from the [associated paper](https://d4mucfpksywv.cloudfront.net/better-language-models/language_models_are_unsupervised_multitask_learners.pdf).

#### Testing Data, Factors and Metrics

The model authors write in the [associated paper](https://d4mucfpksywv.cloudfront.net/better-language-models/language_models_are_unsupervised_multitask_learners.pdf) that:

> Since our model operates on a byte level and does not require lossy pre-processing or tokenization, we can evaluate it on any language model benchmark.
> Results on language modeling datasets are commonly reported in a quantity which is a scaled or exponentiated version of the average negative log probability per canonical prediction unit - usually a character, a byte, or a word. We evaluate the same quantity by computing the log-probability of a dataset according to a WebText LM and dividing by the number of canonical units. For many of these datasets, WebText LMs would be tested significantly out-of-distribution, having to predict aggressively standardized text, tokenization artifacts such as disconnected punctuation and contractions, shuffled sentences, and even the string <UNK> which is extremely rare in WebText - occurring only 26 times in 40 billion bytes. We report our main results...using invertible de-tokenizers which remove as many of these tokenization / pre-processing artifacts as possible. Since these de-tokenizers are invertible, we can still calculate the log probability of a dataset and they can be thought of as a simple form of domain adaptation.

#### Results

The model achieves the following results without any fine-tuning (zero-shot):

| Dataset  | LAMBADA | LAMBADA | CBT-CN | CBT-NE | WikiText2 | PTB    | enwiki8 | text8  | WikiText103 | 1BW   |
|:--------:|:-------:|:-------:|:------:|:------:|:---------:|:------:|:-------:|:------:|:-----------:|:-----:|
| (metric) | (PPL)   | (ACC)   | (ACC)  | (ACC)  | (PPL)     | (PPL)  | (BPB)   | (BPC)  | (PPL)       | (PPL) |
|          | 10.87   | 60.12   | 93.45  | 88.0   | 19.93     | 40.31  | 0.97    | 1.02   | 22.05       | 44.575|

## Environmental Impact

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** Unknown
- **Hours used:** Unknown
- **Cloud Provider:** Unknown
- **Compute Region:** Unknown
- **Carbon Emitted:** Unknown

## Technical Specifications

See the [associated paper](https://d4mucfpksywv.cloudfront.net/better-language-models/language_models_are_unsupervised_multitask_learners.pdf) for details on the modeling architecture, objective, compute infrastructure, and training details.

## Citation Information

```bibtex
@article{radford2019language,
  title={Language models are unsupervised multitask learners},
  author={Radford, Alec and Wu, Jeffrey and Child, Rewon and Luan, David and Amodei, Dario and Sutskever, Ilya and others},
  journal={OpenAI blog},
  volume={1},
  number={8},
  pages={9},
  year={2019}
}
```

## Model Card Authors

This model card was written by the Hugging Face team.
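The training objective and evaluation metric described in the card — targets are the inputs shifted one token to the right, and perplexity is the exponentiated average negative log-probability per predicted token — can be sketched in a few lines. The token ids and per-token probabilities below are hypothetical stand-ins, not actual GPT-2 outputs:

```python
import math

# Toy token ids; in GPT-2 these would come from the byte-level BPE
# tokenizer (vocabulary size 50,257).
tokens = [15496, 11, 314, 1101, 257, 3303, 2746]

# Causal LM pairs: predict token i+1 from tokens 1..i
# (i.e. targets = inputs shifted one position to the right).
inputs = tokens[:-1]
targets = tokens[1:]
assert len(inputs) == len(targets) == len(tokens) - 1

# Suppose the model assigned these probabilities to each target token
# (hypothetical values for illustration).
probs = [0.20, 0.05, 0.30, 0.10, 0.25, 0.15]

# Perplexity = exp(average negative log-probability per predicted token);
# bits-per-character/byte metrics use log base 2 over characters/bytes instead.
avg_nll = -sum(math.log(p) for p in probs) / len(probs)
perplexity = math.exp(avg_nll)
print(round(perplexity, 2))
```

Lower perplexity means the model assigned higher probability to the observed text, which is what the zero-shot PPL numbers in the Results table report for each benchmark.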
12,346
-0.034088134765625, 0.005306243896484375, 0.01148223876953125, 0.00768280029296875, 0.01268768310546875, -0.005016326904296875, -0.001056671142578125, -0.050537109375, 0.0037631988525390625, 0.0197601318359375, -0.0239410400390625, -0.034423828125, 0.07177734375, 0.0113983154296875, -0.031280517578125, 0.06634521484375, -0.0280609130859375, -0.048004150390625, 0.044647216796875, 0.058837890625, 0.0731201171875, -0.01018524169921875, 0.020111083984375, 0.04449462890625, 0.03656005859375, -0.0279998779296875, 0.01517486572265625, 0.022247314453125, -0.049041748046875, -0.0301513671875, -0.0423583984375, 0.002105712890625, 0.034820556640625, -0.02447509765625, 0.020233154296875, -0.023590087890625, -0.019378662109375, -0.013519287109375, -0.00196075439453125, -0.051116943359375, 0.0145111083984375, 0.0037326812744140625, 0.05712890625, -0.07366943359375, 0.07568359375, 0.05426025390625, -0.06402587890625, -0.06591796875, 0.00762176513671875, -0.0035762786865234375, -0.054901123046875, 0.042755126953125, 0.0186920166015625, 0.032012939453125, 0.006076812744140625, -0.0367431640625, -0.055389404296875, 0.0849609375, 0.022552490234375, -0.02386474609375, -0.0218048095703125, 0.03753662109375, 0.0516357421875, -0.0050811767578125, 0.0567626953125, 0.050018310546875, 0.045745849609375, -0.01131439208984375, -0.07562255859375, 0.0218353271484375, -0.0294952392578125, 0.020111083984375, 0.01317596435546875, -0.051300048828125, 0.08648681640625, -0.0147552490234375, -0.01160430908203125, 0.0161285400390625, 0.035247802734375, 0.0020618438720703125, 0.0025920867919921875, 0.0238037109375, 0.045013427734375, 0.044158935546875, -0.0316162109375, 0.0943603515625, -0.01509857177734375, 0.053680419921875, 0.077880859375, 0.00543975830078125, 0.045806884765625, 0.0170440673828125, -0.032684326171875, 0.0308074951171875, 0.042999267578125, -0.019866943359375, 0.043487548828125, 0.0053863525390625, -0.003314971923828125, 0.01322174072265625, 0.0074310302734375, -0.045654296875, 
0.021148681640625, 0.01050567626953125, -0.045318603515625, -0.01678466796875, -0.007785797119140625, 0.0295867919921875, -0.0230255126953125, 0.00568389892578125, 0.05828857421875, 0.0098419189453125, -0.060302734375, 0.036163330078125, 0.036773681640625, 0.049346923828125, -0.045806884765625, 0.00009733438491821289, -0.012908935546875, 0.019622802734375, -0.0078277587890625, -0.06524658203125, 0.0169525146484375, 0.01531219482421875, -0.022705078125, -0.0198516845703125, 0.04840087890625, -0.04608154296875, -0.0361328125, 0.01910400390625, 0.0229949951171875, 0.0297088623046875, -0.0185699462890625, -0.058074951171875, -0.01427459716796875, 0.00855255126953125, -0.03033447265625, 0.031768798828125, 0.01708984375, -0.0110626220703125, 0.0289764404296875, 0.04718017578125, -0.0005903244018554688, -0.00861358642578125, 0.01275634765625, 0.057769775390625, -0.042449951171875, -0.028900146484375, -0.06829833984375, 0.046234130859375, -0.002605438232421875, -0.040863037109375, 0.044891357421875, 0.056610107421875, 0.07049560546875, -0.0174407958984375, 0.08563232421875, -0.031158447265625, 0.0309295654296875, -0.033905029296875, 0.064208984375, -0.031494140625, 0.002552032470703125, -0.0274200439453125, -0.07489013671875, -0.0079498291015625, 0.054412841796875, -0.0305023193359375, 0.034210205078125, 0.050384521484375, 0.067626953125, -0.0045928955078125, -0.002033233642578125, -0.0029239654541015625, 0.02728271484375, 0.0384521484375, 0.049163818359375, 0.03900146484375, -0.053802490234375, 0.052642822265625, -0.01702880859375, -0.0211334228515625, -0.007755279541015625, -0.0443115234375, -0.07208251953125, -0.047271728515625, -0.0165863037109375, -0.045867919921875, 0.0032062530517578125, 0.0633544921875, 0.038421630859375, -0.06573486328125, -0.02593994140625, -0.0219268798828125, -0.0022487640380859375, -0.00812530517578125, -0.0208892822265625, 0.03424072265625, -0.01357269287109375, -0.05804443359375, 0.00006878376007080078, -0.009674072265625, 
0.0175628662109375, -0.0167694091796875, -0.01453399658203125, -0.0117950439453125, -0.020904541015625, 0.042999267578125, 0.0186004638671875, -0.060150146484375, -0.0225677490234375, -0.01264190673828125, -0.0163421630859375, -0.002361297607421875, 0.052703857421875, -0.031158447265625, 0.01442718505859375, 0.03619384765625, 0.0219268798828125, 0.040496826171875, -0.01141357421875, 0.04443359375, -0.04034423828125, 0.0181121826171875, 0.0005240440368652344, 0.0232696533203125, 0.0221710205078125, -0.033111572265625, 0.05242919921875, 0.0304412841796875, -0.042022705078125, -0.052154541015625, 0.013275146484375, -0.0587158203125, -0.01788330078125, 0.11517333984375, -0.0093994140625, 0.00246429443359375, -0.00612640380859375, -0.0232696533203125, 0.046722412109375, -0.028717041015625, 0.0462646484375, 0.046295166015625, 0.0187225341796875, -0.01177215576171875, -0.062042236328125, 0.045135498046875, 0.009918212890625, -0.060302734375, 0.0076904296875, 0.019439697265625, 0.045166015625, 0.007656097412109375, 0.051483154296875, -0.0201568603515625, 0.01016998291015625, 0.003673553466796875, 0.01422119140625, -0.00989532470703125, -0.007213592529296875, -0.0171661376953125, 0.002590179443359375, 0.004001617431640625, -0.002490997314453125 ] ]
modelId: google/bert_uncased_L-8_H-768_A-12
lastModified: 2021-05-19T17:36:32.000Z
tags: [ "transformers", "pytorch", "jax", "bert", "arxiv:1908.08962", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
pipeline_tag: null
author: google
config: null
securityStatus: null
id: google/bert_uncased_L-8_H-768_A-12
likes: 0
downloads: 309,303
library_name: transformers
created: 2022-03-02T23:29:05
---
thumbnail: https://huggingface.co/front/thumbnails/google.png
license: apache-2.0
---

BERT Miniatures
===

This is the set of 24 BERT models referenced in [Well-Read Students Learn Better: On the Importance of Pre-training Compact Models](https://arxiv.org/abs/1908.08962) (English only, uncased, trained with WordPiece masking).

We have shown that the standard BERT recipe (including model architecture and training objective) is effective on a wide range of model sizes, beyond BERT-Base and BERT-Large. The smaller BERT models are intended for environments with restricted computational resources. They can be fine-tuned in the same manner as the original BERT models. However, they are most effective in the context of knowledge distillation, where the fine-tuning labels are produced by a larger and more accurate teacher.

Our goal is to enable research in institutions with fewer computational resources and encourage the community to seek directions of innovation alternative to increasing model capacity.

You can download the 24 BERT miniatures either from the [official BERT GitHub page](https://github.com/google-research/bert/), or via Hugging Face from the links below:

| |H=128|H=256|H=512|H=768|
|---|:---:|:---:|:---:|:---:|
| **L=2** |[**2/128 (BERT-Tiny)**][2_128]|[2/256][2_256]|[2/512][2_512]|[2/768][2_768]|
| **L=4** |[4/128][4_128]|[**4/256 (BERT-Mini)**][4_256]|[**4/512 (BERT-Small)**][4_512]|[4/768][4_768]|
| **L=6** |[6/128][6_128]|[6/256][6_256]|[6/512][6_512]|[6/768][6_768]|
| **L=8** |[8/128][8_128]|[8/256][8_256]|[**8/512 (BERT-Medium)**][8_512]|[8/768][8_768]|
| **L=10** |[10/128][10_128]|[10/256][10_256]|[10/512][10_512]|[10/768][10_768]|
| **L=12** |[12/128][12_128]|[12/256][12_256]|[12/512][12_512]|[**12/768 (BERT-Base)**][12_768]|

Note that the BERT-Base model in this release is included for completeness only; it was re-trained under the same regime as the original model.
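The checkpoint names in the table above follow a regular pattern: `google/bert_uncased_L-{layers}_H-{hidden}_A-{heads}`, where every released size uses 64-dimensional attention heads, so the head count is `hidden // 64`. As a minimal sketch, a helper (the function name is illustrative, not part of the release) can derive the Hub id for any of the 24 configurations:

```python
def miniature_repo_id(num_layers: int, hidden_size: int) -> str:
    """Build the Hugging Face Hub id for a BERT miniature checkpoint.

    Assumes the released naming scheme, in which the number of attention
    heads always equals hidden_size // 64 (e.g. H=768 -> A=12).
    """
    num_heads = hidden_size // 64
    return f"google/bert_uncased_L-{num_layers}_H-{hidden_size}_A-{num_heads}"

print(miniature_repo_id(8, 768))  # this card's checkpoint
print(miniature_repo_id(2, 128))  # BERT-Tiny
```

The resulting id can then be passed directly to `from_pretrained` in `transformers`.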
Here are the corresponding GLUE scores on the test set:

|Model|Score|CoLA|SST-2|MRPC|STS-B|QQP|MNLI-m|MNLI-mm|QNLI(v2)|RTE|WNLI|AX|
|---|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
|BERT-Tiny|64.2|0.0|83.2|81.1/71.1|74.3/73.6|62.2/83.4|70.2|70.3|81.5|57.2|62.3|21.0|
|BERT-Mini|65.8|0.0|85.9|81.1/71.8|75.4/73.3|66.4/86.2|74.8|74.3|84.1|57.9|62.3|26.1|
|BERT-Small|71.2|27.8|89.7|83.4/76.2|78.8/77.0|68.1/87.0|77.6|77.0|86.4|61.8|62.3|28.6|
|BERT-Medium|73.5|38.0|89.6|86.6/81.6|80.4/78.4|69.6/87.9|80.0|79.1|87.7|62.2|62.3|30.5|

For each task, we selected the best fine-tuning hyperparameters from the lists below, and trained for 4 epochs:

- batch sizes: 8, 16, 32, 64, 128
- learning rates: 3e-4, 1e-4, 5e-5, 3e-5

If you use these models, please cite the following paper:

```
@article{turc2019,
  title={Well-Read Students Learn Better: On the Importance of Pre-training Compact Models},
  author={Turc, Iulia and Chang, Ming-Wei and Lee, Kenton and Toutanova, Kristina},
  journal={arXiv preprint arXiv:1908.08962v2},
  year={2019}
}
```

[2_128]: https://huggingface.co/google/bert_uncased_L-2_H-128_A-2
[2_256]: https://huggingface.co/google/bert_uncased_L-2_H-256_A-4
[2_512]: https://huggingface.co/google/bert_uncased_L-2_H-512_A-8
[2_768]: https://huggingface.co/google/bert_uncased_L-2_H-768_A-12
[4_128]: https://huggingface.co/google/bert_uncased_L-4_H-128_A-2
[4_256]: https://huggingface.co/google/bert_uncased_L-4_H-256_A-4
[4_512]: https://huggingface.co/google/bert_uncased_L-4_H-512_A-8
[4_768]: https://huggingface.co/google/bert_uncased_L-4_H-768_A-12
[6_128]: https://huggingface.co/google/bert_uncased_L-6_H-128_A-2
[6_256]: https://huggingface.co/google/bert_uncased_L-6_H-256_A-4
[6_512]: https://huggingface.co/google/bert_uncased_L-6_H-512_A-8
[6_768]: https://huggingface.co/google/bert_uncased_L-6_H-768_A-12
[8_128]: https://huggingface.co/google/bert_uncased_L-8_H-128_A-2
[8_256]: https://huggingface.co/google/bert_uncased_L-8_H-256_A-4
[8_512]: https://huggingface.co/google/bert_uncased_L-8_H-512_A-8
[8_768]: https://huggingface.co/google/bert_uncased_L-8_H-768_A-12
[10_128]: https://huggingface.co/google/bert_uncased_L-10_H-128_A-2
[10_256]: https://huggingface.co/google/bert_uncased_L-10_H-256_A-4
[10_512]: https://huggingface.co/google/bert_uncased_L-10_H-512_A-8
[10_768]: https://huggingface.co/google/bert_uncased_L-10_H-768_A-12
[12_128]: https://huggingface.co/google/bert_uncased_L-12_H-128_A-2
[12_256]: https://huggingface.co/google/bert_uncased_L-12_H-256_A-4
[12_512]: https://huggingface.co/google/bert_uncased_L-12_H-512_A-8
[12_768]: https://huggingface.co/google/bert_uncased_L-12_H-768_A-12
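The fine-tuning sweep described above is a full grid: 5 batch sizes × 4 learning rates, each trained for 4 epochs, giving 20 candidate runs per task. A minimal sketch of enumerating that grid (the dict layout is illustrative; the card does not prescribe a config format):

```python
import itertools

# Hyperparameter lists taken from the card above.
batch_sizes = [8, 16, 32, 64, 128]
learning_rates = [3e-4, 1e-4, 5e-5, 3e-5]

# One config per (batch size, learning rate) pair, always 4 epochs.
configs = [
    {"batch_size": bs, "learning_rate": lr, "epochs": 4}
    for bs, lr in itertools.product(batch_sizes, learning_rates)
]
print(len(configs))  # 20 runs per task; the best is selected per task
```

For each GLUE task, the reported score corresponds to the best-performing of these 20 configurations.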
card_len: 4,617
embeddings: [dense embedding vector omitted]
modelId: sentence-transformers/paraphrase-MiniLM-L6-v2
lastModified: 2022-06-15T18:39:43.000Z
tags: [ "sentence-transformers", "pytorch", "tf", "bert", "feature-extraction", "sentence-similarity", "transformers", "arxiv:1908.10084", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
pipeline_tag: sentence-similarity
author: sentence-transformers
config: null
securityStatus: null
id: sentence-transformers/paraphrase-MiniLM-L6-v2
likes: 55
downloads: 309,178
library_name: sentence-transformers
created: 2022-03-02T23:29:05
---
pipeline_tag: sentence-similarity
license: apache-2.0
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
---

# sentence-transformers/paraphrase-MiniLM-L6-v2

This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences and paragraphs to a 384-dimensional dense vector space and can be used for tasks like clustering or semantic search.

## Usage (Sentence-Transformers)

Using this model is straightforward once [sentence-transformers](https://www.SBERT.net) is installed:

```
pip install -U sentence-transformers
```

Then you can use the model like this:

```python
from sentence_transformers import SentenceTransformer

sentences = ["This is an example sentence", "Each sentence is converted"]

model = SentenceTransformer('sentence-transformers/paraphrase-MiniLM-L6-v2')
embeddings = model.encode(sentences)
print(embeddings)
```

## Usage (HuggingFace Transformers)

Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first pass your input through the transformer model, then apply the right pooling operation on top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch

# Mean pooling - take the attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
    token_embeddings = model_output[0]  # first element of model_output contains all token embeddings
    input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)

# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']

# Load model from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained('sentence-transformers/paraphrase-MiniLM-L6-v2')
model = AutoModel.from_pretrained('sentence-transformers/paraphrase-MiniLM-L6-v2')

# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')

# Compute token embeddings
with torch.no_grad():
    model_output = model(**encoded_input)

# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])

print("Sentence embeddings:")
print(sentence_embeddings)
```

## Evaluation Results

For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=sentence-transformers/paraphrase-MiniLM-L6-v2)

## Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```

## Citing & Authors

This model was trained by [sentence-transformers](https://www.sbert.net/).
If you find this model helpful, feel free to cite our publication [Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks](https://arxiv.org/abs/1908.10084):

```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "http://arxiv.org/abs/1908.10084",
}
```
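The embeddings produced by either usage path above are typically compared with cosine similarity for clustering or semantic search. The following is a minimal, framework-agnostic sketch in plain Python; the short vectors here are hypothetical stand-ins for this model's real 384-dimensional embeddings:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 4-dimensional "embeddings" (real ones from this model are 384-dimensional)
emb_a = [0.1, 0.3, -0.2, 0.5]
emb_b = [0.1, 0.25, -0.1, 0.45]

print(cosine_similarity(emb_a, emb_b))
```

In practice you would pass the rows of `model.encode(sentences)` (or the pooled `sentence_embeddings` tensor) to the same formula, or use a vectorized equivalent such as `sentence_transformers.util.cos_sim`.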
---

**Model:** stabilityai/stable-diffusion-2 · **Author:** stabilityai · **Pipeline:** text-to-image · **Library:** diffusers
**Created:** 2022-11-23 · **Last modified:** 2023-07-05 · **Likes:** 1,661 · **Downloads:** 308,733
**Tags:** diffusers, stable-diffusion, text-to-image, arxiv:2202.00512, arxiv:2112.10752, arxiv:1910.09700, license:openrail++, endpoints_compatible, has_space, diffusers:StableDiffusionPipeline, region:us
---
license: openrail++
tags:
- stable-diffusion
- text-to-image
---

# Stable Diffusion v2 Model Card

This model card focuses on the model associated with the Stable Diffusion v2 model, available [here](https://github.com/Stability-AI/stablediffusion).

This `stable-diffusion-2` model is resumed from [stable-diffusion-2-base](https://huggingface.co/stabilityai/stable-diffusion-2-base) (`512-base-ema.ckpt`) and trained for 150k steps using a [v-objective](https://arxiv.org/abs/2202.00512) on the same dataset. Resumed for another 140k steps on `768x768` images.

![image](https://github.com/Stability-AI/stablediffusion/blob/main/assets/stable-samples/txt2img/768/merged-0005.png?raw=true)

- Use it with the [`stablediffusion`](https://github.com/Stability-AI/stablediffusion) repository: download the `768-v-ema.ckpt` [here](https://huggingface.co/stabilityai/stable-diffusion-2/blob/main/768-v-ema.ckpt).
- Use it with 🧨 [`diffusers`](https://huggingface.co/stabilityai/stable-diffusion-2#examples)

## Model Details

- **Developed by:** Robin Rombach, Patrick Esser
- **Model type:** Diffusion-based text-to-image generation model
- **Language(s):** English
- **License:** [CreativeML Open RAIL++-M License](https://huggingface.co/stabilityai/stable-diffusion-2/blob/main/LICENSE-MODEL)
- **Model Description:** This is a model that can be used to generate and modify images based on text prompts. It is a [Latent Diffusion Model](https://arxiv.org/abs/2112.10752) that uses a fixed, pretrained text encoder ([OpenCLIP-ViT/H](https://github.com/mlfoundations/open_clip)).
- **Resources for more information:** [GitHub Repository](https://github.com/Stability-AI/).
- **Cite as:**

```bibtex
@InProceedings{Rombach_2022_CVPR,
    author    = {Rombach, Robin and Blattmann, Andreas and Lorenz, Dominik and Esser, Patrick and Ommer, Bj\"orn},
    title     = {High-Resolution Image Synthesis With Latent Diffusion Models},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2022},
    pages     = {10684-10695}
}
```

## Examples

Use the [🤗 Diffusers library](https://github.com/huggingface/diffusers) to run Stable Diffusion 2 in a simple and efficient manner.

```bash
pip install diffusers transformers accelerate scipy safetensors
```

Running the pipeline (if you don't swap the scheduler it will run with the default DDIM; in this example we swap it to `EulerDiscreteScheduler`):

```python
import torch
from diffusers import StableDiffusionPipeline, EulerDiscreteScheduler

model_id = "stabilityai/stable-diffusion-2"

# Use the Euler scheduler here instead
scheduler = EulerDiscreteScheduler.from_pretrained(model_id, subfolder="scheduler")
pipe = StableDiffusionPipeline.from_pretrained(model_id, scheduler=scheduler, torch_dtype=torch.float16)
pipe = pipe.to("cuda")

prompt = "a photo of an astronaut riding a horse on mars"
image = pipe(prompt).images[0]

image.save("astronaut_rides_horse.png")
```

**Notes**:

- Although it is not a dependency, we highly recommend installing [xformers](https://github.com/facebookresearch/xformers) for memory-efficient attention (better performance).
- If you have low GPU RAM available, add `pipe.enable_attention_slicing()` after sending the pipeline to `cuda` for lower VRAM usage (at the cost of speed).

# Uses

## Direct Use

The model is intended for research purposes only. Possible research areas and tasks include:

- Safe deployment of models which have the potential to generate harmful content.
- Probing and understanding the limitations and biases of generative models.
- Generation of artworks and use in design and other artistic processes.
- Applications in educational or creative tools.
- Research on generative models.

Excluded uses are described below.

### Misuse, Malicious Use, and Out-of-Scope Use

_Note: This section was originally taken from the [DALLE-MINI model card](https://huggingface.co/dalle-mini/dalle-mini), was used for Stable Diffusion v1, and applies in the same way to Stable Diffusion v2._

The model should not be used to intentionally create or disseminate images that create hostile or alienating environments for people. This includes generating images that people would foreseeably find disturbing, distressing, or offensive, or content that propagates historical or current stereotypes.

#### Out-of-Scope Use

The model was not trained to produce factual or true representations of people or events, so using the model to generate such content is out of scope for its abilities.

#### Misuse and Malicious Use

Using the model to generate content that is cruel to individuals is a misuse of this model. This includes, but is not limited to:

- Generating demeaning, dehumanizing, or otherwise harmful representations of people or their environments, cultures, religions, etc.
- Intentionally promoting or propagating discriminatory content or harmful stereotypes.
- Impersonating individuals without their consent.
- Sexual content without consent of the people who might see it.
- Mis- and disinformation.
- Representations of egregious violence and gore.
- Sharing of copyrighted or licensed material in violation of its terms of use.
- Sharing content that is an alteration of copyrighted or licensed material in violation of its terms of use.
## Limitations and Bias

### Limitations

- The model does not achieve perfect photorealism.
- The model cannot render legible text.
- The model does not perform well on more difficult compositional tasks, such as rendering an image corresponding to "A red cube on top of a blue sphere".
- Faces and people in general may not be generated properly.
- The model was trained mainly with English captions and will not work as well in other languages.
- The autoencoding part of the model is lossy.
- The model was trained on a subset of the large-scale dataset [LAION-5B](https://laion.ai/blog/laion-5b/), which contains adult, violent, and sexual content. To partially mitigate this, we have filtered the dataset using LAION's NSFW detector (see the Training section).

### Bias

While the capabilities of image generation models are impressive, they can also reinforce or exacerbate social biases. Stable Diffusion was primarily trained on subsets of [LAION-2B(en)](https://laion.ai/blog/laion-5b/), which consists of images limited to English descriptions. Texts and images from communities and cultures that use other languages are likely to be insufficiently accounted for. This affects the overall output of the model, as white and western cultures are often set as the default. Further, the model's ability to generate content with non-English prompts is significantly worse than with English-language prompts. Stable Diffusion v2 mirrors and exacerbates biases to such a degree that viewer discretion must be advised irrespective of the input or its intent.

## Training

**Training Data**

The model developers used the following dataset for training the model:

- LAION-5B and subsets (details below). The training data is further filtered using LAION's NSFW detector, with a "p_unsafe" score of 0.1 (conservative). For more details, please refer to LAION-5B's [NeurIPS 2022](https://openreview.net/forum?id=M3Y74vmsMcY) paper and reviewer discussions on the topic.
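The p_unsafe filtering described above amounts to a simple threshold filter over per-image detector scores. A minimal sketch, where the sample records are hypothetical stand-ins for LAION metadata and only the 0.1 threshold comes from the card:

```python
def filter_unsafe(samples, threshold=0.1):
    """Keep only samples whose NSFW-detector p_unsafe score is below the threshold."""
    return [s for s in samples if s["p_unsafe"] < threshold]

# Hypothetical records standing in for LAION image metadata
samples = [
    {"url": "img_a.jpg", "p_unsafe": 0.02},
    {"url": "img_b.jpg", "p_unsafe": 0.45},
    {"url": "img_c.jpg", "p_unsafe": 0.09},
]

kept = filter_unsafe(samples)
print([s["url"] for s in kept])  # only img_a.jpg and img_c.jpg pass the 0.1 cutoff
```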
**Training Procedure**

Stable Diffusion v2 is a latent diffusion model which combines an autoencoder with a diffusion model that is trained in the latent space of the autoencoder. During training:

- Images are encoded through an encoder, which turns images into latent representations. The autoencoder uses a relative downsampling factor of 8 and maps images of shape H x W x 3 to latents of shape H/f x W/f x 4.
- Text prompts are encoded through the OpenCLIP-ViT/H text encoder.
- The output of the text encoder is fed into the UNet backbone of the latent diffusion model via cross-attention.
- The loss is a reconstruction objective between the noise that was added to the latent and the prediction made by the UNet. We also use the so-called _v-objective_; see https://arxiv.org/abs/2202.00512.

We currently provide the following checkpoints:

- `512-base-ema.ckpt`: 550k steps at resolution `256x256` on a subset of [LAION-5B](https://laion.ai/blog/laion-5b/) filtered for explicit pornographic material, using the [LAION-NSFW classifier](https://github.com/LAION-AI/CLIP-based-NSFW-Detector) with `punsafe=0.1` and an [aesthetic score](https://github.com/christophschuhmann/improved-aesthetic-predictor) >= `4.5`, followed by 850k steps at resolution `512x512` on the same dataset with resolution `>= 512x512`.
- `768-v-ema.ckpt`: Resumed from `512-base-ema.ckpt` and trained for 150k steps using a [v-objective](https://arxiv.org/abs/2202.00512) on the same dataset, then for another 140k steps on a `768x768` subset of our dataset.
- `512-depth-ema.ckpt`: Resumed from `512-base-ema.ckpt` and finetuned for 200k steps, with an extra input channel added to process the (relative) depth prediction produced by [MiDaS](https://github.com/isl-org/MiDaS) (`dpt_hybrid`), used as additional conditioning. The additional input channels of the U-Net which process this extra information were zero-initialized.
- `512-inpainting-ema.ckpt`: Resumed from `512-base-ema.ckpt` and trained for another 200k steps. Follows the mask-generation strategy presented in [LAMA](https://github.com/saic-mdal/lama), which, in combination with the latent VAE representations of the masked image, is used as additional conditioning. The additional input channels of the U-Net which process this extra information were zero-initialized. The same strategy was used to train the [1.5-inpainting checkpoint](https://github.com/saic-mdal/lama).
- `x4-upscaling-ema.ckpt`: Trained for 1.25M steps on a 10M subset of LAION containing images `>2048x2048`. The model was trained on crops of size `512x512` and is a text-guided [latent upscaling diffusion model](https://arxiv.org/abs/2112.10752). In addition to the textual input, it receives a `noise_level` as an input parameter, which can be used to add noise to the low-resolution input according to a [predefined diffusion schedule](configs/stable-diffusion/x4-upscaling.yaml).

- **Hardware:** 32 x 8 x A100 GPUs
- **Optimizer:** AdamW
- **Gradient accumulations:** 1
- **Batch:** 32 x 8 x 2 x 4 = 2048
- **Learning rate:** warmup to 0.0001 for 10,000 steps, then kept constant

## Evaluation Results

Evaluations with different classifier-free guidance scales (1.5, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0) and 50 DDIM sampling steps show the relative improvements of the checkpoints:

![pareto](model-variants.jpg)

Evaluated using 50 DDIM steps and 10,000 random prompts from the COCO2017 validation set at 512x512 resolution. Not optimized for FID scores.

## Environmental Impact

**Stable Diffusion v1 Estimated Emissions**

Based on that information, we estimate the following CO2 emissions using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). The hardware, runtime, cloud provider, and compute region were used to estimate the carbon impact.
- **Hardware Type:** A100 PCIe 40GB
- **Hours used:** 200,000
- **Cloud Provider:** AWS
- **Compute Region:** US-east
- **Carbon Emitted (power consumption x time x carbon produced based on location of power grid):** 15,000 kg CO2 eq.

## Citation

```bibtex
@InProceedings{Rombach_2022_CVPR,
    author    = {Rombach, Robin and Blattmann, Andreas and Lorenz, Dominik and Esser, Patrick and Ommer, Bj\"orn},
    title     = {High-Resolution Image Synthesis With Latent Diffusion Models},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2022},
    pages     = {10684-10695}
}
```

*This model card was written by: Robin Rombach, Patrick Esser and David Ha and is based on the [Stable Diffusion v1](https://github.com/CompVis/stable-diffusion/blob/main/Stable_Diffusion_v1_Model_Card.md) and [DALL-E Mini model card](https://huggingface.co/dalle-mini/dalle-mini).*
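As a sanity check on the autoencoder shapes described in the Training Procedure section above: with a downsampling factor of f = 8, an H x W x 3 image maps to an H/8 x W/8 x 4 latent. A minimal sketch (the function name is ours, not part of any library):

```python
def latent_shape(height, width, f=8, latent_channels=4):
    """Shape of the VAE latent for a (height, width, 3) image, per the card."""
    assert height % f == 0 and width % f == 0, "image dims must be divisible by f"
    return (height // f, width // f, latent_channels)

print(latent_shape(768, 768))  # (96, 96, 4) - the 768-v-ema.ckpt training resolution
print(latent_shape(512, 512))  # (64, 64, 4) - the 512-base checkpoints
```

This is why the diffusion UNet runs so much cheaper than pixel-space diffusion: it denoises a 96x96x4 latent rather than a 768x768x3 image.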
-0.0016965866088867188, 0.061920166015625, -0.01922607421875, 0.0357666015625, 0.0170745849609375, -0.01837158203125, -0.038421630859375, 0.05645751953125, 0.01788330078125, 0.03717041015625, -0.003704071044921875, 0.01154327392578125, -0.016754150390625, -0.03912353515625, -0.043975830078125, 0.021942138671875, -0.06256103515625, -0.014923095703125, -0.064208984375, -0.0258026123046875, -0.036407470703125, -0.010589599609375, -0.0255279541015625, -0.0198516845703125, -0.0679931640625, 0.007289886474609375, 0.0226898193359375, 0.041717529296875, -0.02862548828125, 0.02581787109375, -0.034393310546875, 0.033111572265625, 0.012420654296875, 0.0098876953125, 0.0030422210693359375, -0.061126708984375, -0.0109100341796875, 0.00870513916015625, -0.0496826171875, -0.07183837890625, 0.0296478271484375, 0.0085601806640625, 0.047637939453125, 0.0433349609375, -0.0034999847412109375, 0.0428466796875, -0.032318115234375, 0.0750732421875, 0.0169677734375, -0.0440673828125, 0.048553466796875, -0.0298309326171875, 0.01165008544921875, 0.016815185546875, 0.046112060546875, -0.02349853515625, -0.02392578125, -0.060302734375, -0.0679931640625, 0.04937744140625, 0.02947998046875, 0.0307464599609375, -0.0108795166015625, 0.0518798828125, -0.0021820068359375, -0.00893402099609375, -0.07879638671875, -0.039459228515625, -0.026031494140625, 0.0023097991943359375, 0.007602691650390625, -0.0335693359375, -0.0141754150390625, -0.038055419921875, 0.07061767578125, 0.00511932373046875, 0.04132080078125, 0.028594970703125, 0.0017633438110351562, -0.031982421875, -0.0273284912109375, 0.040130615234375, 0.027435302734375, -0.00957489013671875, -0.004581451416015625, -0.0031871795654296875, -0.042510986328125, 0.0190277099609375, 0.0163726806640625, -0.0556640625, 0.00023412704467773438, -0.002468109130859375, 0.070068359375, -0.020477294921875, -0.035614013671875, 0.049224853515625, -0.0182342529296875, -0.0287628173828125, -0.03607177734375, 0.0107879638671875, 0.00876617431640625, 
0.026214599609375, 0.006687164306640625, 0.040191650390625, 0.0133819580078125, -0.02587890625, 0.009552001953125, 0.03564453125, -0.029571533203125, -0.0269622802734375, 0.08648681640625, 0.008575439453125, -0.0252838134765625, 0.0433349609375, -0.0343017578125, -0.01800537109375, 0.052459716796875, 0.058563232421875, 0.060882568359375, -0.015838623046875, 0.03765869140625, 0.054443359375, 0.01904296875, -0.025726318359375, 0.0181732177734375, 0.0167388916015625, -0.056427001953125, -0.00765228271484375, -0.033447265625, -0.0019483566284179688, 0.01367950439453125, -0.038970947265625, 0.03619384765625, -0.0360107421875, -0.034515380859375, -0.00070953369140625, -0.024566650390625, -0.045196533203125, 0.01163482666015625, 0.029022216796875, 0.061431884765625, -0.08441162109375, 0.05792236328125, 0.058135986328125, -0.048095703125, -0.03851318359375, 0.0028247833251953125, -0.006221771240234375, -0.0265045166015625, 0.03924560546875, 0.01438140869140625, 0.004413604736328125, 0.00815582275390625, -0.0584716796875, -0.071044921875, 0.09619140625, 0.0245361328125, -0.0265350341796875, -0.0014848709106445312, -0.0198211669921875, 0.04730224609375, -0.033416748046875, 0.0238037109375, 0.02313232421875, 0.028076171875, 0.028961181640625, -0.0341796875, 0.01062774658203125, -0.03289794921875, 0.0257415771484375, -0.00908660888671875, -0.07012939453125, 0.07537841796875, -0.028839111328125, -0.02227783203125, 0.0223846435546875, 0.05242919921875, 0.019134521484375, 0.0248870849609375, 0.035308837890625, 0.0679931640625, 0.0439453125, -0.00821685791015625, 0.07733154296875, -0.00634002685546875, 0.0300445556640625, 0.05438232421875, -0.007717132568359375, 0.05377197265625, 0.03509521484375, -0.00904083251953125, 0.043792724609375, 0.056060791015625, -0.0286712646484375, 0.060546875, -0.00513458251953125, -0.011016845703125, -0.00872802734375, -0.00310516357421875, -0.038726806640625, 0.0091552734375, 0.0234222412109375, -0.04156494140625, -0.0162506103515625, 
0.0208740234375, 0.00400543212890625, -0.01198577880859375, -0.00714874267578125, 0.041900634765625, 0.005359649658203125, -0.032745361328125, 0.046173095703125, 0.0149383544921875, 0.0640869140625, -0.03314208984375, -0.01434326171875, -0.00966644287109375, 0.01264190673828125, -0.01629638671875, -0.056915283203125, 0.0361328125, -0.00528717041015625, -0.0210113525390625, -0.02069091796875, 0.06640625, -0.026519775390625, -0.05145263671875, 0.0311126708984375, 0.0210113525390625, 0.02447509765625, 0.00856781005859375, -0.080810546875, 0.015106201171875, -0.005199432373046875, -0.0301971435546875, 0.0200653076171875, 0.0163421630859375, 0.0015439987182617188, 0.03692626953125, 0.046417236328125, -0.004638671875, 0.0035915374755859375, -0.0003731250762939453, 0.06475830078125, -0.0233612060546875, -0.026397705078125, -0.0595703125, 0.057769775390625, -0.007183074951171875, -0.0204315185546875, 0.0494384765625, 0.046112060546875, 0.0635986328125, -0.00933837890625, 0.059295654296875, -0.0221710205078125, 0.0019245147705078125, -0.035308837890625, 0.06573486328125, -0.06121826171875, 0.0042877197265625, -0.0266571044921875, -0.064208984375, -0.01428985595703125, 0.06829833984375, -0.020111083984375, 0.01995849609375, 0.032684326171875, 0.0740966796875, -0.0084075927734375, -0.018951416015625, 0.024688720703125, 0.022125244140625, 0.02813720703125, 0.02490234375, 0.06103515625, -0.0579833984375, 0.0301971435546875, -0.037567138671875, -0.0218658447265625, -0.0015001296997070312, -0.06475830078125, -0.0693359375, -0.05218505859375, -0.0589599609375, -0.0540771484375, -0.0027008056640625, 0.032745361328125, 0.0684814453125, -0.039947509765625, -0.0008153915405273438, -0.0140838623046875, -0.0005159378051757812, -0.00626373291015625, -0.01947021484375, 0.0221099853515625, 0.00885772705078125, -0.06964111328125, -0.006969451904296875, 0.0186614990234375, 0.04345703125, -0.037384033203125, -0.0176849365234375, -0.0209808349609375, -0.010833740234375, 0.041961669921875, 
0.01018524169921875, -0.046112060546875, -0.0012683868408203125, -0.0035457611083984375, -0.00402069091796875, 0.0105133056640625, 0.0213623046875, -0.04595947265625, 0.0295257568359375, 0.04388427734375, 0.0197296142578125, 0.0623779296875, -0.005100250244140625, 0.01349639892578125, -0.0301513671875, 0.02569580078125, 0.006877899169921875, 0.0325927734375, 0.026519775390625, -0.043609619140625, 0.03704833984375, 0.04986572265625, -0.05322265625, -0.061492919921875, 0.011505126953125, -0.08197021484375, -0.0207061767578125, 0.10107421875, -0.01224517822265625, -0.02880859375, 0.004009246826171875, -0.0278472900390625, 0.0235748291015625, -0.0298309326171875, 0.041107177734375, 0.039031982421875, -0.009521484375, -0.03741455078125, -0.04840087890625, 0.0389404296875, 0.00861358642578125, -0.047210693359375, -0.020416259765625, 0.047943115234375, 0.05267333984375, 0.01837158203125, 0.0728759765625, -0.02557373046875, 0.022125244140625, 0.00592803955078125, 0.0017871856689453125, 0.0031299591064453125, -0.017578125, -0.03704833984375, 0.0018329620361328125, -0.0170440673828125, -0.001811981201171875 ] ]
papluca/xlm-roberta-base-language-detection
2022-11-05T18:20:13.000Z
[ "transformers", "pytorch", "tf", "xlm-roberta", "text-classification", "generated_from_trainer", "multilingual", "ar", "bg", "de", "el", "en", "es", "fr", "hi", "it", "ja", "nl", "pl", "pt", "ru", "sw", "th", "tr", "ur", "vi", "zh", "arxiv:1911.02116", "license:mit", "endpoints_compatible", "has_space", "region:us" ]
text-classification
papluca
null
null
papluca/xlm-roberta-base-language-detection
144
304,235
transformers
2022-03-02T23:29:05
--- language: - multilingual - ar - bg - de - el - en - es - fr - hi - it - ja - nl - pl - pt - ru - sw - th - tr - ur - vi - zh license: mit tags: - generated_from_trainer metrics: - accuracy - f1 model-index: - name: xlm-roberta-base-language-detection results: [] --- # xlm-roberta-base-language-detection This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the [Language Identification](https://huggingface.co/datasets/papluca/language-identification#additional-information) dataset. ## Model description This model is an XLM-RoBERTa transformer model with a classification head on top (i.e. a linear layer on top of the pooled output). For additional information, please refer to the [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) model card or to the paper [Unsupervised Cross-lingual Representation Learning at Scale](https://arxiv.org/abs/1911.02116) by Conneau et al. ## Intended uses & limitations You can directly use this model as a language detector, i.e. for sequence classification tasks. Currently, it supports the following 20 languages: `arabic (ar), bulgarian (bg), german (de), modern greek (el), english (en), spanish (es), french (fr), hindi (hi), italian (it), japanese (ja), dutch (nl), polish (pl), portuguese (pt), russian (ru), swahili (sw), thai (th), turkish (tr), urdu (ur), vietnamese (vi), and chinese (zh)` ## Training and evaluation data The model was fine-tuned on the [Language Identification](https://huggingface.co/datasets/papluca/language-identification#additional-information) dataset, which consists of text sequences in 20 languages. The training set contains 70k samples, while the validation and test sets contain 10k samples each. The average accuracy on the test set is **99.6%** (this matches the average macro/weighted F1-score, as the test set is perfectly balanced). A more detailed evaluation is provided in the following table. 
| Language | Precision | Recall | F1-score | support | |:--------:|:---------:|:------:|:--------:|:-------:| |ar |0.998 |0.996 |0.997 |500 | |bg |0.998 |0.964 |0.981 |500 | |de |0.998 |0.996 |0.997 |500 | |el |0.996 |1.000 |0.998 |500 | |en |1.000 |1.000 |1.000 |500 | |es |0.967 |1.000 |0.983 |500 | |fr |1.000 |1.000 |1.000 |500 | |hi |0.994 |0.992 |0.993 |500 | |it |1.000 |0.992 |0.996 |500 | |ja |0.996 |0.996 |0.996 |500 | |nl |1.000 |1.000 |1.000 |500 | |pl |1.000 |1.000 |1.000 |500 | |pt |0.988 |1.000 |0.994 |500 | |ru |1.000 |0.994 |0.997 |500 | |sw |1.000 |1.000 |1.000 |500 | |th |1.000 |0.998 |0.999 |500 | |tr |0.994 |0.992 |0.993 |500 | |ur |1.000 |1.000 |1.000 |500 | |vi |0.992 |1.000 |0.996 |500 | |zh |1.000 |1.000 |1.000 |500 | ### Benchmarks As a baseline to compare `xlm-roberta-base-language-detection` against, we have used the Python [langid](https://github.com/saffsd/langid.py) library. Since it comes pre-trained on 97 languages, we have used its `.set_languages()` method to constrain the language set to our 20 languages. The average accuracy of langid on the test set is **98.5%**. More details are provided by the table below. | Language | Precision | Recall | F1-score | support | |:--------:|:---------:|:------:|:--------:|:-------:| |ar |0.990 |0.970 |0.980 |500 | |bg |0.998 |0.964 |0.981 |500 | |de |0.992 |0.944 |0.967 |500 | |el |1.000 |0.998 |0.999 |500 | |en |1.000 |1.000 |1.000 |500 | |es |1.000 |0.968 |0.984 |500 | |fr |0.996 |1.000 |0.998 |500 | |hi |0.949 |0.976 |0.963 |500 | |it |0.990 |0.980 |0.985 |500 | |ja |0.927 |0.988 |0.956 |500 | |nl |0.980 |1.000 |0.990 |500 | |pl |0.986 |0.996 |0.991 |500 | |pt |0.950 |0.996 |0.973 |500 | |ru |0.996 |0.974 |0.985 |500 | |sw |1.000 |1.000 |1.000 |500 | |th |1.000 |0.996 |0.998 |500 | |tr |0.990 |0.968 |0.979 |500 | |ur |0.998 |0.996 |0.997 |500 | |vi |0.971 |0.990 |0.980 |500 | |zh |1.000 |1.000 |1.000 |500 | ## Training procedure Fine-tuning was done via the `Trainer` API. 
Here is the [Colab notebook](https://colab.research.google.com/drive/15LJTckS6gU3RQOmjLqxVNBmbsBdnUEvl?usp=sharing) with the training code. ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 64 - eval_batch_size: 128 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 2 - mixed_precision_training: Native AMP ### Training results The validation results on the `valid` split of the Language Identification dataset are summarised below. | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:| | 0.2492 | 1.0 | 1094 | 0.0149 | 0.9969 | 0.9969 | | 0.0101 | 2.0 | 2188 | 0.0103 | 0.9977 | 0.9977 | In short, the model achieves the following results on the validation set: - Loss: 0.0101 - Accuracy: 0.9977 - F1: 0.9977 ### Framework versions - Transformers 4.12.5 - Pytorch 1.10.0+cu111 - Datasets 1.15.1 - Tokenizers 0.10.3
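For reference, a minimal usage sketch for this classifier (an illustrative addition, not part of the original card): it assumes the `transformers` package is installed, and the `LANG_NAMES`, `describe`, and `load_detector` names are hypothetical helpers, with the label codes taken from the language list above.

```python
from typing import Dict

# The 20 ISO 639-1 label codes this checkpoint can emit, per the card above.
LANG_NAMES: Dict[str, str] = {
    "ar": "arabic", "bg": "bulgarian", "de": "german", "el": "modern greek",
    "en": "english", "es": "spanish", "fr": "french", "hi": "hindi",
    "it": "italian", "ja": "japanese", "nl": "dutch", "pl": "polish",
    "pt": "portuguese", "ru": "russian", "sw": "swahili", "th": "thai",
    "tr": "turkish", "ur": "urdu", "vi": "vietnamese", "zh": "chinese",
}

def describe(prediction: dict) -> str:
    """Format one pipeline prediction, e.g. {'label': 'fr', 'score': 0.999}."""
    name = LANG_NAMES.get(prediction["label"], "unknown")
    return f"{name} ({prediction['score']:.3f})"

def load_detector():
    """Build the classifier; downloads the checkpoint on first call."""
    from transformers import pipeline  # imported lazily: heavy dependency
    return pipeline(
        "text-classification",
        model="papluca/xlm-roberta-base-language-detection",
    )

# Example (requires network access and the transformers package):
#   detector = load_detector()
#   print(describe(detector("Brevity is the soul of wit.")[0]))
```

The heavy model load is kept inside `load_detector` so the mapping helpers can be reused without triggering a download.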
5,932
[ [ -0.041107177734375, -0.05145263671875, -0.00547027587890625, 0.011932373046875, -0.0030975341796875, 0.00687408447265625, -0.032562255859375, -0.0193939208984375, 0.0110931396484375, 0.0083770751953125, -0.0350341796875, -0.056060791015625, -0.04559326171875, 0.00983428955078125, -0.0115814208984375, 0.0712890625, -0.002227783203125, 0.0110626220703125, 0.01010894775390625, -0.0240478515625, -0.0268707275390625, -0.0300750732421875, -0.0533447265625, -0.0205841064453125, 0.009613037109375, 0.028350830078125, 0.042266845703125, 0.052398681640625, 0.0234832763671875, 0.0228729248046875, -0.02386474609375, 0.00930023193359375, -0.016326904296875, -0.027130126953125, 0.01479339599609375, -0.040863037109375, -0.029449462890625, 0.00012862682342529297, 0.0477294921875, 0.04473876953125, -0.003566741943359375, 0.0236663818359375, 0.0088958740234375, 0.0506591796875, -0.030517578125, 0.013519287109375, -0.0303192138671875, 0.004405975341796875, -0.027587890625, -0.0094451904296875, -0.03021240234375, -0.01556396484375, -0.005794525146484375, -0.030792236328125, 0.018035888671875, 0.00023233890533447266, 0.10089111328125, 0.014251708984375, -0.017120361328125, -0.01227569580078125, -0.0338134765625, 0.06353759765625, -0.04632568359375, 0.03314208984375, 0.03558349609375, 0.00814056396484375, 0.0024013519287109375, -0.03204345703125, -0.052398681640625, -0.00031876564025878906, -0.0164337158203125, 0.01065826416015625, -0.01123046875, -0.01369476318359375, 0.04046630859375, 0.041748046875, -0.07061767578125, 0.0123138427734375, -0.0218658447265625, -0.031951904296875, 0.050506591796875, -0.0004012584686279297, 0.019744873046875, -0.01239013671875, -0.0220794677734375, -0.024017333984375, -0.039703369140625, 0.03179931640625, 0.031707763671875, 0.029052734375, -0.03143310546875, 0.024871826171875, -0.021820068359375, 0.0634765625, 0.00675201416015625, -0.0247039794921875, 0.0677490234375, -0.034271240234375, -0.0214996337890625, 0.0008931159973144531, 0.08111572265625, 
0.01453399658203125, 0.0235748291015625, 0.0044708251953125, -0.0125885009765625, 0.0005087852478027344, -0.016143798828125, -0.061309814453125, -0.010894775390625, 0.023834228515625, -0.022796630859375, -0.0239410400390625, 0.008392333984375, -0.042510986328125, 0.0165252685546875, -0.0227813720703125, 0.019744873046875, -0.04931640625, -0.0260467529296875, 0.0005116462707519531, 0.003223419189453125, 0.0313720703125, 0.0062103271484375, -0.0703125, 0.0163116455078125, 0.03460693359375, 0.059295654296875, -0.009124755859375, -0.02850341796875, -0.0267486572265625, -0.0196685791015625, -0.033660888671875, 0.05218505859375, -0.01038360595703125, -0.0172882080078125, -0.00952911376953125, 0.0212860107421875, -0.02410888671875, -0.03839111328125, 0.041351318359375, -0.026275634765625, 0.03277587890625, -0.01171112060546875, -0.0367431640625, -0.0106048583984375, 0.033203125, -0.041168212890625, 0.11041259765625, 0.0191650390625, -0.05511474609375, 0.047210693359375, -0.032073974609375, -0.0196990966796875, 0.006778717041015625, -0.0183563232421875, -0.0537109375, -0.028350830078125, 0.022796630859375, 0.0185699462890625, -0.0144195556640625, 0.025726318359375, -0.002765655517578125, -0.039398193359375, 0.0162506103515625, -0.035797119140625, 0.0941162109375, 0.016265869140625, -0.055755615234375, -0.0008440017700195312, -0.08001708984375, 0.0239105224609375, 0.00885009765625, -0.04327392578125, -0.0077667236328125, -0.040771484375, 0.0151519775390625, 0.0216522216796875, 0.018096923828125, -0.046783447265625, 0.00586700439453125, -0.048919677734375, 0.00527191162109375, 0.045745849609375, -0.01328277587890625, 0.023651123046875, -0.0241851806640625, 0.036956787109375, 0.00913238525390625, -0.00017309188842773438, -0.0153656005859375, -0.047027587890625, -0.06890869140625, -0.03076171875, 0.044189453125, 0.0596923828125, -0.032958984375, 0.07098388671875, -0.0210418701171875, -0.042510986328125, -0.037384033203125, 0.0174713134765625, 0.050018310546875, 
0.038543701171875, 0.0233612060546875, -0.016204833984375, -0.037689208984375, -0.0635986328125, -0.008819580078125, -0.00778961181640625, 0.01116180419921875, 0.017822265625, 0.048828125, -0.00936126708984375, 0.06866455078125, -0.029266357421875, -0.0316162109375, -0.0224761962890625, 0.0033016204833984375, 0.0404052734375, 0.041351318359375, 0.05828857421875, -0.050933837890625, -0.06317138671875, 0.0155181884765625, -0.052978515625, 0.0015974044799804688, 0.003177642822265625, 0.0091552734375, 0.055572509765625, 0.031829833984375, -0.03253173828125, 0.038970947265625, 0.043212890625, -0.0260009765625, 0.0457763671875, -0.024139404296875, 0.01418304443359375, -0.08892822265625, 0.0286712646484375, 0.0073394775390625, 0.0003871917724609375, -0.044464111328125, 0.0030117034912109375, 0.023651123046875, -0.0037097930908203125, -0.0286407470703125, 0.052947998046875, -0.0347900390625, 0.003795623779296875, 0.00586700439453125, -0.000054776668548583984, -0.012115478515625, 0.052032470703125, 0.0162353515625, 0.08465576171875, 0.053009033203125, -0.033782958984375, 0.01139068603515625, 0.00899505615234375, -0.054534912109375, 0.032379150390625, -0.04693603515625, 0.000004470348358154297, 0.007061004638671875, -0.00162506103515625, -0.08001708984375, -0.0031070709228515625, 0.0180511474609375, -0.053436279296875, 0.0283966064453125, -0.0180816650390625, -0.037353515625, -0.037445068359375, -0.0155792236328125, 0.01253509521484375, 0.0379638671875, -0.0321044921875, 0.03643798828125, 0.006744384765625, -0.00493621826171875, -0.0662841796875, -0.05889892578125, -0.004486083984375, -0.017364501953125, -0.04534912109375, 0.018218994140625, -0.01215362548828125, -0.00750732421875, -0.0140533447265625, 0.0009264945983886719, -0.01055145263671875, 0.007434844970703125, 0.017120361328125, 0.0224151611328125, -0.0220794677734375, -0.001171112060546875, -0.007556915283203125, -0.0236663818359375, -0.0080108642578125, 0.0038738250732421875, 0.0633544921875, -0.0305938720703125, 
-0.01441192626953125, -0.0469970703125, -0.00730133056640625, 0.03253173828125, -0.040496826171875, 0.060150146484375, 0.055084228515625, -0.0240325927734375, -0.002033233642578125, -0.0309600830078125, 0.00699615478515625, -0.035186767578125, 0.03350830078125, -0.0506591796875, -0.049072265625, 0.05828857421875, 0.00925445556640625, -0.00730133056640625, 0.03814697265625, 0.041412353515625, 0.0084686279296875, 0.0797119140625, 0.01520538330078125, -0.031005859375, 0.0269622802734375, -0.055572509765625, 0.01039886474609375, -0.048065185546875, -0.0278167724609375, -0.05596923828125, -0.0041656494140625, -0.048370361328125, -0.01508331298828125, 0.01010894775390625, 0.005321502685546875, -0.030731201171875, 0.03155517578125, -0.037109375, 0.03375244140625, 0.05401611328125, 0.01023101806640625, 0.01224517822265625, 0.008758544921875, -0.0259857177734375, -0.0040740966796875, -0.047149658203125, -0.0305938720703125, 0.0986328125, 0.019866943359375, 0.043060302734375, 0.00484466552734375, 0.055450439453125, 0.0034008026123046875, 0.0092010498046875, -0.052276611328125, 0.0291290283203125, -0.00618743896484375, -0.051025390625, -0.024627685546875, -0.035247802734375, -0.07244873046875, 0.0182037353515625, -0.01476287841796875, -0.060821533203125, 0.0162353515625, 0.0069580078125, -0.025482177734375, 0.0168914794921875, -0.05047607421875, 0.08587646484375, -0.02520751953125, -0.020355224609375, 0.003047943115234375, -0.044921875, 0.0119171142578125, -0.01041412353515625, 0.0204010009765625, 0.003337860107421875, 0.01374053955078125, 0.059783935546875, -0.03680419921875, 0.0439453125, -0.0106964111328125, 0.003204345703125, 0.0166778564453125, -0.0008897781372070312, 0.038116455078125, -0.0018186569213867188, -0.00991058349609375, 0.0264129638671875, 0.007354736328125, -0.03204345703125, -0.0286102294921875, 0.056304931640625, -0.07354736328125, -0.0328369140625, -0.05975341796875, -0.04766845703125, -0.00077056884765625, 0.037322998046875, 0.033721923828125, 
0.03631591796875, -0.0031261444091796875, 0.020294189453125, 0.06439208984375, -0.0173187255859375, 0.0233306884765625, 0.0455322265625, -0.009124755859375, -0.03936767578125, 0.07061767578125, 0.003566741943359375, 0.0251617431640625, 0.0264892578125, 0.01337432861328125, -0.0284271240234375, -0.039947509765625, -0.050262451171875, 0.0252532958984375, -0.042724609375, -0.01264190673828125, -0.040252685546875, -0.017791748046875, -0.050994873046875, -0.0022068023681640625, -0.0240936279296875, -0.02044677734375, -0.0118408203125, -0.006725311279296875, 0.0276947021484375, 0.0250701904296875, 0.01180267333984375, 0.01534271240234375, -0.034881591796875, -0.00018095970153808594, 0.0009899139404296875, 0.017669677734375, -0.0261688232421875, -0.0640869140625, -0.021087646484375, 0.0145263671875, -0.0081329345703125, -0.043304443359375, 0.040802001953125, 0.0178070068359375, 0.044219970703125, 0.039642333984375, -0.0130767822265625, 0.053497314453125, -0.0185089111328125, 0.06304931640625, 0.004558563232421875, -0.0701904296875, 0.048248291015625, -0.00937652587890625, 0.035552978515625, 0.048675537109375, 0.04022216796875, -0.03955078125, -0.029022216796875, -0.05706787109375, -0.042083740234375, 0.075927734375, 0.0099639892578125, -0.012786865234375, 0.006046295166015625, 0.016265869140625, -0.0182647705078125, 0.00004875659942626953, -0.05816650390625, -0.051971435546875, -0.0011348724365234375, -0.028289794921875, -0.026824951171875, -0.0041961669921875, -0.0080108642578125, -0.031829833984375, 0.059173583984375, -0.0117645263671875, 0.006618499755859375, 0.004116058349609375, -0.01548004150390625, -0.001979827880859375, 0.011810302734375, 0.04730224609375, 0.05291748046875, -0.0277252197265625, -0.00266265869140625, 0.021087646484375, -0.046844482421875, 0.00936126708984375, 0.008056640625, -0.0340576171875, 0.00838470458984375, 0.030120849609375, 0.05609130859375, -0.0026950836181640625, -0.05145263671875, 0.048675537109375, -0.00661468505859375, 
-0.0283966064453125, -0.03662109375, 0.0037822723388671875, -0.007633209228515625, 0.0026950836181640625, 0.024871826171875, 0.01477813720703125, -0.0037784576416015625, -0.036468505859375, 0.01499176025390625, 0.027557373046875, -0.03179931640625, -0.01403045654296875, 0.041290283203125, -0.006191253662109375, -0.0304718017578125, 0.051239013671875, -0.0198516845703125, -0.05322265625, 0.0711669921875, 0.043060302734375, 0.03857421875, -0.0389404296875, 0.01293182373046875, 0.06427001953125, 0.030731201171875, -0.0003368854522705078, 0.032470703125, 0.0017108917236328125, -0.05596923828125, -0.01348114013671875, -0.0531005859375, -0.01580810546875, 0.029571533203125, -0.06341552734375, 0.0297393798828125, -0.0230865478515625, -0.002429962158203125, 0.0034809112548828125, 0.0219879150390625, -0.049407958984375, 0.034698486328125, 0.0036334991455078125, 0.09161376953125, -0.07757568359375, 0.07904052734375, 0.054168701171875, -0.04815673828125, -0.07794189453125, -0.0162353515625, 0.0017223358154296875, -0.071044921875, 0.0677490234375, 0.0227813720703125, 0.0148773193359375, -0.0012350082397460938, -0.00875091552734375, -0.08087158203125, 0.088134765625, 0.01168060302734375, -0.03399658203125, 0.023773193359375, 0.0325927734375, 0.05462646484375, -0.01123809814453125, 0.0287628173828125, 0.05096435546875, 0.0316162109375, -0.0002027750015258789, -0.0870361328125, 0.00722503662109375, -0.04046630859375, 0.0011606216430664062, 0.021514892578125, -0.05157470703125, 0.06707763671875, -0.00604248046875, -0.002506256103515625, 0.009002685546875, 0.029296875, 0.03460693359375, 0.01904296875, 0.03955078125, 0.0703125, 0.056854248046875, -0.02734375, 0.0682373046875, -0.046661376953125, 0.045989990234375, 0.07110595703125, 0.005481719970703125, 0.056976318359375, 0.027099609375, -0.031890869140625, 0.039276123046875, 0.062408447265625, -0.0036563873291015625, 0.03448486328125, 0.0021076202392578125, -0.01401519775390625, -0.0177001953125, 0.01763916015625, -0.0362548828125, 
0.0283203125, 0.0272216796875, -0.03271484375, 0.0016698837280273438, 0.0136260986328125, 0.01332855224609375, 0.0013322830200195312, -0.007434844970703125, 0.042205810546875, -0.020050048828125, -0.05670166015625, 0.0731201171875, 0.0011682510375976562, 0.048675537109375, -0.049163818359375, 0.00452423095703125, -0.03106689453125, 0.0328369140625, -0.02825927734375, -0.0648193359375, 0.0098724365234375, -0.0007719993591308594, 0.0013065338134765625, -0.00502777099609375, 0.034423828125, -0.03948974609375, -0.039794921875, 0.0155792236328125, 0.024078369140625, 0.0132904052734375, 0.00475311279296875, -0.0670166015625, 0.00583648681640625, 0.01511383056640625, -0.0291900634765625, 0.0167694091796875, 0.03253173828125, -0.00450897216796875, 0.05078125, 0.043121337890625, 0.010498046875, 0.01256561279296875, 0.01312255859375, 0.06512451171875, -0.03985595703125, -0.037750244140625, -0.043914794921875, 0.0241241455078125, -0.0161285400390625, -0.04827880859375, 0.06707763671875, 0.0760498046875, 0.0875244140625, 0.001190185546875, 0.0540771484375, -0.0205230712890625, 0.046783447265625, -0.034454345703125, 0.04864501953125, -0.048095703125, 0.01232147216796875, -0.0153350830078125, -0.04913330078125, -0.034454345703125, 0.060455322265625, -0.0287628173828125, 0.01056671142578125, 0.04931640625, 0.0865478515625, 0.003864288330078125, -0.0176239013671875, 0.01617431640625, 0.0100555419921875, 0.0150146484375, 0.050262451171875, 0.0273590087890625, -0.06414794921875, 0.05316162109375, -0.046600341796875, -0.0278167724609375, -0.0097808837890625, -0.0290985107421875, -0.07366943359375, -0.04827880859375, -0.034881591796875, -0.039031982421875, -0.004856109619140625, 0.0626220703125, 0.04547119140625, -0.0721435546875, -0.032012939453125, -0.0112152099609375, -0.00502777099609375, -0.0316162109375, -0.017913818359375, 0.05340576171875, -0.02923583984375, -0.0726318359375, 0.00887298583984375, -0.004825592041015625, 0.004024505615234375, -0.01055145263671875, 
-0.02520751953125, -0.026153564453125, -0.0230560302734375, 0.0240936279296875, 0.0239410400390625, -0.0533447265625, -0.0157470703125, 0.00882720947265625, -0.019927978515625, 0.0194854736328125, 0.0161895751953125, -0.046844482421875, 0.037506103515625, 0.032012939453125, 0.037322998046875, 0.047119140625, -0.00591278076171875, 0.0115509033203125, -0.048828125, 0.0247039794921875, -0.00634002685546875, 0.032440185546875, 0.01568603515625, -0.021820068359375, 0.04754638671875, 0.0311126708984375, -0.0426025390625, -0.065185546875, -0.008331298828125, -0.08843994140625, -0.005035400390625, 0.0877685546875, -0.0224151611328125, -0.04046630859375, -0.008331298828125, -0.0181121826171875, 0.008209228515625, -0.0260162353515625, 0.041748046875, 0.04541015625, -0.0028705596923828125, -0.0196990966796875, -0.024200439453125, 0.040252685546875, 0.02874755859375, -0.054168701171875, -0.01122283935546875, 0.01299285888671875, 0.046722412109375, 0.03192138671875, 0.046539306640625, -0.01279449462890625, 0.0304107666015625, -0.0014028549194335938, 0.024810791015625, -0.007106781005859375, -0.004459381103515625, -0.0364990234375, -0.0037689208984375, 0.00716400146484375, -0.0231170654296875 ] ]
mistralai/Mistral-7B-v0.1
2023-10-12T17:52:53.000Z
[ "transformers", "pytorch", "mistral", "text-generation", "pretrained", "en", "arxiv:2310.06825", "license:apache-2.0", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
text-generation
mistralai
null
null
mistralai/Mistral-7B-v0.1
1,701
302,166
transformers
2023-09-20T13:03:50
---
license: apache-2.0
pipeline_tag: text-generation
language:
- en
tags:
- pretrained
inference:
  parameters:
    temperature: 0.7
---

# Model Card for Mistral-7B-v0.1

The Mistral-7B-v0.1 Large Language Model (LLM) is a pretrained generative text model with 7 billion parameters. Mistral-7B-v0.1 outperforms Llama 2 13B on all benchmarks we tested.

For full details of this model please read our [paper](https://arxiv.org/abs/2310.06825) and [release blog post](https://mistral.ai/news/announcing-mistral-7b/).

## Model Architecture

Mistral-7B-v0.1 is a transformer model, with the following architecture choices:

- Grouped-Query Attention
- Sliding-Window Attention
- Byte-fallback BPE tokenizer

## Troubleshooting

- If you see the following error:
```
KeyError: 'mistral'
```
- Or:
```
NotImplementedError: Cannot copy out of meta tensor; no data!
```

Ensure you are utilizing a stable version of Transformers, 4.34.0 or newer.

## Notice

Mistral 7B is a pretrained base model and therefore does not have any moderation mechanisms.

## The Mistral AI Team

Albert Jiang, Alexandre Sablayrolles, Arthur Mensch, Chris Bamford, Devendra Singh Chaplot, Diego de las Casas, Florian Bressand, Gianna Lengyel, Guillaume Lample, Lélio Renard Lavaud, Lucile Saulnier, Marie-Anne Lachaux, Pierre Stock, Teven Le Scao, Thibaut Lavril, Thomas Wang, Timothée Lacroix, William El Sayed.
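The sliding-window attention mentioned in the card's architecture list restricts each token to attending over a fixed window of preceding positions instead of the full causal prefix. The snippet below is only an illustrative sketch of what such a mask looks like (window size and pure-Python layout are our choices, not part of the card; the actual implementation lives inside the model's attention kernels):

```python
def sliding_window_mask(seq_len, window):
    """Causal attention mask in which position i may attend only to the
    previous `window` positions (itself included); 1 = attend, 0 = masked."""
    return [[1 if 0 <= i - j < window else 0 for j in range(seq_len)]
            for i in range(seq_len)]

# A 5-token sequence with a window of 3
for row in sliding_window_mask(5, 3):
    print(row)
# [1, 0, 0, 0, 0]
# [1, 1, 0, 0, 0]
# [1, 1, 1, 0, 0]
# [0, 1, 1, 1, 0]
# [0, 0, 1, 1, 1]
```

Compared with a full causal mask, the number of attended positions per token is bounded by the window, which keeps attention cost linear in sequence length.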
1,390
[ [ … ] ]
facebook/wav2vec2-large-robust-ft-swbd-300h
2022-04-05T16:42:51.000Z
[ "transformers", "pytorch", "wav2vec2", "automatic-speech-recognition", "speech", "audio", "en", "dataset:libri_light", "dataset:common_voice", "dataset:switchboard", "dataset:fisher", "arxiv:2104.01027", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
automatic-speech-recognition
facebook
null
null
facebook/wav2vec2-large-robust-ft-swbd-300h
14
299,457
transformers
2022-03-02T23:29:05
---
language: en
datasets:
- libri_light
- common_voice
- switchboard
- fisher
tags:
- speech
- audio
- automatic-speech-recognition
widget:
- example_title: Librispeech sample 1
  src: https://cdn-media.huggingface.co/speech_samples/sample1.flac
- example_title: Librispeech sample 2
  src: https://cdn-media.huggingface.co/speech_samples/sample2.flac
license: apache-2.0
---

# Wav2Vec2-Large-Robust finetuned on Switchboard

[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/).

This model is a fine-tuned version of the [wav2vec2-large-robust](https://huggingface.co/facebook/wav2vec2-large-robust) model. It has been pretrained on:

- [Libri-Light](https://github.com/facebookresearch/libri-light): open-source audio books from the LibriVox project; clean, read-out audio data
- [CommonVoice](https://huggingface.co/datasets/common_voice): crowd-sourced audio data; read-out text snippets
- [Switchboard](https://catalog.ldc.upenn.edu/LDC97S62): telephone speech corpus; noisy telephone data
- [Fisher](https://catalog.ldc.upenn.edu/LDC2004T19): conversational telephone speech; noisy telephone data

and subsequently fine-tuned on 300 hours of

- [Switchboard](https://catalog.ldc.upenn.edu/LDC97S62): telephone speech corpus; noisy telephone data

When using the model, make sure that your speech input is also sampled at 16 kHz.

[Paper: Robust Wav2Vec2](https://arxiv.org/abs/2104.01027)

Authors: Wei-Ning Hsu, Anuroop Sriram, Alexei Baevski, Tatiana Likhomanenko, Qiantong Xu, Vineel Pratap, Jacob Kahn, Ann Lee, Ronan Collobert, Gabriel Synnaeve, Michael Auli

**Abstract**

Self-supervised learning of speech representations has been a very active research area but most work is focused on a single domain such as read audio books for which there exist large quantities of labeled and unlabeled data. In this paper, we explore more general setups where the domain of the unlabeled data for pre-training differs from the domain of the labeled data for fine-tuning, which in turn may differ from the test data domain. Our experiments show that using target domain data during pre-training leads to large performance improvements across a variety of setups. On a large-scale competitive setup, we show that pre-training on unlabeled in-domain data reduces the gap between models trained on in-domain and out-of-domain labeled data by 66%-73%. This has obvious practical implications since it is much easier to obtain unlabeled target domain data than labeled data. Moreover, we find that pre-training on multiple domains improves generalization performance on domains not seen during training. Code and models will be made available at this https URL.

The original model can be found under https://github.com/pytorch/fairseq/tree/master/examples/wav2vec#wav2vec-20.

# Usage

To transcribe audio files the model can be used as a standalone acoustic model as follows:

```python
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC
from datasets import load_dataset
import torch

# load model and processor
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-large-robust-ft-swbd-300h")
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-large-robust-ft-swbd-300h")

# load dummy dataset and read sound files
ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")

# tokenize
input_values = processor(ds[0]["audio"]["array"], return_tensors="pt", padding="longest").input_values  # Batch size 1

# retrieve logits
logits = model(input_values).logits

# take argmax and decode
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)
```
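The card stresses that inputs must be sampled at 16 kHz. In practice you would resample with `torchaudio.functional.resample` or `librosa.resample`; the helper below is only a naive pure-Python sketch of what that step does (linear interpolation, our own illustration, not the library implementation):

```python
def resample_linear(samples, orig_sr, target_sr=16_000):
    """Naive linear-interpolation resampler — illustration only.
    Maps each output index to a fractional input index and interpolates."""
    n_out = int(len(samples) * target_sr / orig_sr)
    out = []
    for i in range(n_out):
        pos = i * orig_sr / target_sr          # fractional index into the input
        lo = int(pos)
        hi = min(lo + 1, len(samples) - 1)
        frac = pos - lo
        out.append(samples[lo] * (1 - frac) + samples[hi] * frac)
    return out

# 1 second of 8 kHz audio upsampled to the 16 kHz the model expects
eight_khz = [0.0] * 8000
print(len(resample_linear(eight_khz, 8000)))  # 16000
```

A real pipeline would also apply an anti-aliasing filter when downsampling, which is why the library resamplers should be preferred.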
3,747
[ [ … ] ]
philschmid/bart-large-cnn-samsum
2022-12-23T19:48:57.000Z
[ "transformers", "pytorch", "bart", "text2text-generation", "sagemaker", "summarization", "en", "dataset:samsum", "license:mit", "model-index", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
summarization
philschmid
null
null
philschmid/bart-large-cnn-samsum
206
297,961
transformers
2022-03-02T23:29:05
--- language: en license: mit tags: - sagemaker - bart - summarization datasets: - samsum widget: - text: "Jeff: Can I train a \U0001F917 Transformers model on Amazon SageMaker? \n\ Philipp: Sure you can use the new Hugging Face Deep Learning Container. \nJeff:\ \ ok.\nJeff: and how can I get started? \nJeff: where can I find documentation?\ \ \nPhilipp: ok, ok you can find everything here. https://huggingface.co/blog/the-partnership-amazon-sagemaker-and-hugging-face\n" model-index: - name: bart-large-cnn-samsum results: - task: type: summarization name: Summarization dataset: name: 'SAMSum Corpus: A Human-annotated Dialogue Dataset for Abstractive Summarization' type: samsum metrics: - type: rouge-1 value: 42.621 name: Validation ROUGE-1 - type: rouge-2 value: 21.9825 name: Validation ROUGE-2 - type: rouge-l value: 33.034 name: Validation ROUGE-L - type: rouge-1 value: 41.3174 name: Test ROUGE-1 - type: rouge-2 value: 20.8716 name: Test ROUGE-2 - type: rouge-l value: 32.1337 name: Test ROUGE-L - task: type: summarization name: Summarization dataset: name: samsum type: samsum config: samsum split: test metrics: - type: rouge value: 41.3282 name: ROUGE-1 verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZTYzNzZkZDUzOWQzNGYxYTJhNGE4YWYyZjA0NzMyOWUzMDNhMmVhYzY1YTM0ZTJhYjliNGE4MDZhMjhhYjRkYSIsInZlcnNpb24iOjF9.OOM6l3v5rJCndmUIJV-2SDh2NjbPo5IgQOSL-Ju1Gwbi1voL5amsDEDOelaqlUBE3n55KkUsMLZhyn66yWxZBQ - type: rouge value: 20.8755 name: ROUGE-2 verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMWZiODFiYWQzY2NmOTc5YjA3NTI0YzQ1MzQ0ODk2NjgyMmVlMjA5MjZiNTJkMGRmZGEzN2M3MDNkMjkxMDVhYSIsInZlcnNpb24iOjF9.b8cPk2-IL24La3Vd0hhtii4tRXujh5urAwy6IVeTWHwYfXaURyC2CcQOWtlOx5bdO5KACeaJFrFBCGgjk-VGCQ - type: rouge value: 32.1353 name: ROUGE-L verified: true verifyToken: 
eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiYWNmYzdiYWQ2ZWRkYzRiMGMxNWUwODgwZTdkY2NjZTc1NWE5NTFiMzU0OTU1N2JjN2ExYWQ2NGZkNjk5OTc4YSIsInZlcnNpb24iOjF9.Fzv4p-TEVicljiCqsBJHK1GsnE_AwGqamVmxTPI0WBNSIhZEhliRGmIL_z1pDq6WOzv3GN2YUGvhowU7GxnyAQ - type: rouge value: 38.401 name: ROUGE-LSUM verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNGI4MWY0NWMxMmQ0ODQ5MDhiNDczMDAzYzJkODBiMzgzYWNkMWM2YTZkZDJmNWJiOGQ3MmNjMGViN2UzYWI2ZSIsInZlcnNpb24iOjF9.7lw3h5k5lJ7tYFLZGUtLyDabFYd00l6ByhmvkW4fykocBy9Blyin4tdw4Xps4DW-pmrdMLgidHxBWz5MrSx1Bw - type: loss value: 1.4297215938568115 name: loss verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMzI0ZWNhNDM5YTViZDMyZGJjMDA1ZWFjYzNhOTdlOTFiNzhhMDBjNmM2MjA3ZmRkZjJjMjEyMGY3MzcwOTI2NyIsInZlcnNpb24iOjF9.oNaZsAtUDqGAqoZWJavlcW7PKx1AWsnkbhaQxadpOKk_u7ywJJabvTtzyx_DwEgZslgDETCf4MM-JKitZKjiDA - type: gen_len value: 60.0757 name: gen_len verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiYTgwYWYwMDRkNTJkMDM5N2I2MWNmYzQ3OWM1NDJmODUyZGViMGE4ZTdkNmIwYWM2N2VjZDNmN2RiMDE4YTYyYiIsInZlcnNpb24iOjF9.PbXTcNYX_SW-BuRQEcqyc21M7uKrOMbffQSAK6k2GLzTVRrzZxsDC57ktKL68zRY8fSiRGsnknOwv-nAR6YBCQ --- ## `bart-large-cnn-samsum` > If you want to use this model, try the newer fine-tuned FLAN-T5 version [philschmid/flan-t5-base-samsum](https://huggingface.co/philschmid/flan-t5-base-samsum), which outscores the BART version by `+6` ROUGE-1, achieving `47.24`. # TRY [philschmid/flan-t5-base-samsum](https://huggingface.co/philschmid/flan-t5-base-samsum) This model was trained using Amazon SageMaker and the new Hugging Face Deep Learning container. 
For more information look at: - [🤗 Transformers Documentation: Amazon SageMaker](https://huggingface.co/transformers/sagemaker.html) - [Example Notebooks](https://github.com/huggingface/notebooks/tree/master/sagemaker) - [Amazon SageMaker documentation for Hugging Face](https://docs.aws.amazon.com/sagemaker/latest/dg/hugging-face.html) - [Python SDK SageMaker documentation for Hugging Face](https://sagemaker.readthedocs.io/en/stable/frameworks/huggingface/index.html) - [Deep Learning Container](https://github.com/aws/deep-learning-containers/blob/master/available_images.md#huggingface-training-containers) ## Hyperparameters ```json { "dataset_name": "samsum", "do_eval": true, "do_predict": true, "do_train": true, "fp16": true, "learning_rate": 5e-05, "model_name_or_path": "facebook/bart-large-cnn", "num_train_epochs": 3, "output_dir": "/opt/ml/model", "per_device_eval_batch_size": 4, "per_device_train_batch_size": 4, "predict_with_generate": true, "seed": 7 } ``` ## Usage ```python from transformers import pipeline summarizer = pipeline("summarization", model="philschmid/bart-large-cnn-samsum") conversation = '''Jeff: Can I train a 🤗 Transformers model on Amazon SageMaker? Philipp: Sure you can use the new Hugging Face Deep Learning Container. Jeff: ok. Jeff: and how can I get started? Jeff: where can I find documentation? Philipp: ok, ok you can find everything here. https://huggingface.co/blog/the-partnership-amazon-sagemaker-and-hugging-face ''' summarizer(conversation) ``` ## Results | key | value | | --- | ----- | | eval_rouge1 | 42.621 | | eval_rouge2 | 21.9825 | | eval_rougeL | 33.034 | | eval_rougeLsum | 39.6783 | | test_rouge1 | 41.3174 | | test_rouge2 | 20.8716 | | test_rougeL | 32.1337 | | test_rougeLsum | 38.4149 |
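The ROUGE scores in the table above measure n-gram overlap between generated and reference summaries. As an illustrative sketch only (not the official `rouge_score` package, which also handles stemming and ROUGE-L), ROUGE-1 F1 can be computed from clipped unigram counts like this; the prediction/reference pair below is made up, not taken from SAMSum:

```python
# Illustrative sketch: ROUGE-1 compares unigram overlap between a generated
# summary and a reference summary. The example strings are hypothetical.
from collections import Counter

def rouge1_f1(prediction: str, reference: str) -> float:
    """Unigram ROUGE-1 F1 with clipped (multiset-intersection) counts."""
    pred_tokens = prediction.lower().split()
    ref_tokens = reference.lower().split()
    overlap = sum((Counter(pred_tokens) & Counter(ref_tokens)).values())
    if not pred_tokens or not ref_tokens or overlap == 0:
        return 0.0
    precision = overlap / len(pred_tokens)
    recall = overlap / len(ref_tokens)
    return 2 * precision * recall / (precision + recall)

pred = "jeff asks how to train a model on sagemaker"
ref = "jeff asks philipp how to train a transformers model on amazon sagemaker"
print(round(rouge1_f1(pred, ref), 4))  # → 0.8571
```

A score of 1.0 means every unigram matches; the reported table values (e.g. `eval_rouge1` of 42.621) are such F1 scores averaged over the dataset and expressed as percentages.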
5,698
[ [ (embedding vector values omitted) ] ]
facebook/detr-resnet-50
2023-10-17T17:18:59.000Z
[ "transformers", "pytorch", "detr", "object-detection", "vision", "dataset:coco", "arxiv:2005.12872", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
object-detection
facebook
null
null
facebook/detr-resnet-50
325
297,878
transformers
2022-03-02T23:29:05
--- license: apache-2.0 tags: - object-detection - vision datasets: - coco widget: - src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/savanna.jpg example_title: Savanna - src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/football-match.jpg example_title: Football Match - src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/airport.jpg example_title: Airport --- # DETR (End-to-End Object Detection) model with ResNet-50 backbone DEtection TRansformer (DETR) model trained end-to-end on COCO 2017 object detection (118k annotated images). It was introduced in the paper [End-to-End Object Detection with Transformers](https://arxiv.org/abs/2005.12872) by Carion et al. and first released in [this repository](https://github.com/facebookresearch/detr). Disclaimer: The team releasing DETR did not write a model card for this model, so this model card has been written by the Hugging Face team. ## Model description The DETR model is an encoder-decoder transformer with a convolutional backbone. Two heads are added on top of the decoder outputs in order to perform object detection: a linear layer for the class labels and an MLP (multi-layer perceptron) for the bounding boxes. The model uses so-called object queries to detect objects in an image. Each object query looks for a particular object in the image. For COCO, the number of object queries is set to 100. The model is trained using a "bipartite matching loss": one compares the predicted classes + bounding boxes of each of the N = 100 object queries to the ground truth annotations, padded up to the same length N (so if an image only contains 4 objects, 96 annotations will just have a "no object" as class and "no bounding box" as bounding box). The Hungarian matching algorithm is used to create an optimal one-to-one mapping between each of the N queries and each of the N annotations. 
Next, standard cross-entropy (for the classes) and a linear combination of the L1 and generalized IoU loss (for the bounding boxes) are used to optimize the parameters of the model. ![model image](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/detr_architecture.png) ## Intended uses & limitations You can use the raw model for object detection. See the [model hub](https://huggingface.co/models?search=facebook/detr) to look for all available DETR models. ### How to use Here is how to use this model: ```python from transformers import DetrImageProcessor, DetrForObjectDetection import torch from PIL import Image import requests url = "http://images.cocodataset.org/val2017/000000039769.jpg" image = Image.open(requests.get(url, stream=True).raw) # you can specify the revision tag if you don't want the timm dependency processor = DetrImageProcessor.from_pretrained("facebook/detr-resnet-50", revision="no_timm") model = DetrForObjectDetection.from_pretrained("facebook/detr-resnet-50", revision="no_timm") inputs = processor(images=image, return_tensors="pt") outputs = model(**inputs) # convert outputs (bounding boxes and class logits) to COCO API # let's only keep detections with score > 0.9 target_sizes = torch.tensor([image.size[::-1]]) results = processor.post_process_object_detection(outputs, target_sizes=target_sizes, threshold=0.9)[0] for score, label, box in zip(results["scores"], results["labels"], results["boxes"]): box = [round(i, 2) for i in box.tolist()] print( f"Detected {model.config.id2label[label.item()]} with confidence " f"{round(score.item(), 3)} at location {box}" ) ``` This should output: ``` Detected remote with confidence 0.998 at location [40.16, 70.81, 175.55, 117.98] Detected remote with confidence 0.996 at location [333.24, 72.55, 368.33, 187.66] Detected couch with confidence 0.995 at location [-0.02, 1.15, 639.73, 473.76] Detected cat with confidence 0.999 at location [13.24, 52.05, 314.02, 
470.93] Detected cat with confidence 0.999 at location [345.4, 23.85, 640.37, 368.72] ``` Currently, both the feature extractor and model support PyTorch. ## Training data The DETR model was trained on [COCO 2017 object detection](https://cocodataset.org/#download), a dataset consisting of 118k/5k annotated images for training/validation respectively. ## Training procedure ### Preprocessing The exact details of preprocessing of images during training/validation can be found [here](https://github.com/google-research/vision_transformer/blob/master/vit_jax/input_pipeline.py). Images are resized/rescaled such that the shortest side is at least 800 pixels and the largest side at most 1333 pixels, and normalized across the RGB channels with the ImageNet mean (0.485, 0.456, 0.406) and standard deviation (0.229, 0.224, 0.225). ### Training The model was trained for 300 epochs on 16 V100 GPUs. This takes 3 days, with 4 images per GPU (hence a total batch size of 64). ## Evaluation results This model achieves an AP (average precision) of **42.0** on COCO 2017 validation. For more details regarding evaluation results, we refer to table 1 of the original paper. ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2005-12872, author = {Nicolas Carion and Francisco Massa and Gabriel Synnaeve and Nicolas Usunier and Alexander Kirillov and Sergey Zagoruyko}, title = {End-to-End Object Detection with Transformers}, journal = {CoRR}, volume = {abs/2005.12872}, year = {2020}, url = {https://arxiv.org/abs/2005.12872}, archivePrefix = {arXiv}, eprint = {2005.12872}, timestamp = {Thu, 28 May 2020 17:38:09 +0200}, biburl = {https://dblp.org/rec/journals/corr/abs-2005-12872.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ```
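The bipartite matching step described in the model description can be illustrated with a toy example. The sketch below assumes a made-up 3×3 cost matrix; DETR itself builds the cost from class probabilities plus L1/generalized-IoU box terms and solves the assignment with the Hungarian algorithm (e.g. `scipy.optimize.linear_sum_assignment`), but for a tiny N a brute-force search over permutations keeps the example self-contained:

```python
# Toy sketch of DETR's bipartite matching: find the one-to-one assignment of
# object queries to (padded) ground-truth annotations with minimal total cost.
# The cost matrix here is hypothetical, chosen only for illustration.
from itertools import permutations

cost = [  # cost[q][t]: cost of matching query q to ground-truth slot t
    [0.9, 0.1, 0.8],
    [0.2, 0.7, 0.6],
    [0.5, 0.4, 0.05],
]
n = len(cost)

# Brute-force the optimal permutation (the Hungarian algorithm finds the
# same result in polynomial time for the real N = 100 case).
best = min(permutations(range(n)), key=lambda p: sum(cost[q][p[q]] for q in range(n)))
print(list(enumerate(best)))  # → [(0, 1), (1, 0), (2, 2)]
```

Once the matching is fixed, the classification and box losses from the card are computed only between each query and its matched annotation.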
5,891
[ [ (embedding vector values omitted; truncated in source)
0.0258026123046875, 0.0121002197265625, 0.027130126953125, 0.041290283203125, -0.036041259765625, 0.038543701171875, 0.01421356201171875, -0.018951416015625, -0.014404296875, 0.06195068359375, -0.005542755126953125, 0.01383209228515625, 0.046478271484375, 0.0106201171875, -0.046600341796875, -0.013458251953125, -0.018310546875, 0.0256500244140625, -0.050201416015625, -0.037933349609375, -0.059356689453125, -0.034454345703125, -0.04046630859375, -0.01910400390625, -0.04266357421875, -0.0029277801513671875, -0.04718017578125, -0.01122283935546875, 0.03643798828125, 0.0202789306640625, -0.00922393798828125, 0.0306854248046875, -0.024383544921875, 0.02301025390625, 0.002197265625, 0.016021728515625, -0.0063323974609375, -0.0538330078125, -0.01035308837890625, 0.0209808349609375, -0.036773681640625, -0.052459716796875, 0.04022216796875, -0.002651214599609375, 0.026885986328125, 0.05755615234375, 0.01207733154296875, 0.0584716796875, 0.00130462646484375, 0.0494384765625, 0.0361328125, -0.05987548828125, 0.047393798828125, -0.00017380714416503906, 0.00754547119140625, 0.01629638671875, 0.0231475830078125, -0.02532958984375, -0.0206756591796875, -0.04241943359375, -0.04876708984375, 0.07244873046875, 0.0166168212890625, -0.0127105712890625, -0.004367828369140625, 0.0152130126953125, -0.0287933349609375, 0.005168914794921875, -0.05908203125, -0.0213165283203125, -0.030242919921875, -0.0201416015625, -0.007144927978515625, -0.007167816162109375, 0.005458831787109375, -0.043609619140625, 0.038970947265625, -0.005199432373046875, 0.05230712890625, 0.028076171875, -0.0173187255859375, -0.0089569091796875, -0.0234527587890625, 0.033111572265625, 0.0255126953125, -0.03759765625, 0.0037250518798828125, 0.01629638671875, -0.039764404296875, 0.0072784423828125, 0.0081939697265625, -0.005126953125, -0.002750396728515625, 0.0261383056640625, 0.055511474609375, 0.0043792724609375, -0.03155517578125, 0.044036865234375, 0.003047943115234375, -0.0205535888671875, -0.030364990234375, 
0.007415771484375, -0.00867462158203125, 0.01348876953125, 0.034881591796875, 0.019317626953125, -0.0015869140625, -0.0341796875, 0.027587890625, 0.03387451171875, -0.039306640625, -0.0296630859375, 0.07904052734375, -0.0256195068359375, -0.02484130859375, 0.06805419921875, -0.0237884521484375, -0.04864501953125, 0.08233642578125, 0.045867919921875, 0.05950927734375, -0.02349853515625, 0.01541900634765625, 0.052459716796875, 0.0345458984375, -0.0191802978515625, -0.004913330078125, -0.006500244140625, -0.0623779296875, -0.0004916191101074219, -0.051361083984375, 0.005645751953125, 0.011383056640625, -0.06634521484375, 0.041595458984375, -0.025146484375, -0.019989013671875, -0.006717681884765625, 0.0081634521484375, -0.08349609375, 0.0296173095703125, -0.00318145751953125, 0.072021484375, -0.0565185546875, 0.045166015625, 0.026611328125, -0.056427001953125, -0.060546875, -0.0213165283203125, -0.00754547119140625, -0.0657958984375, 0.0479736328125, 0.054046630859375, -0.00626373291015625, 0.00453948974609375, -0.05609130859375, -0.06561279296875, 0.09454345703125, 0.023101806640625, -0.0447998046875, 0.006954193115234375, 0.006923675537109375, 0.033416748046875, -0.0274505615234375, 0.04742431640625, 0.038330078125, 0.026458740234375, 0.028106689453125, -0.0404052734375, 0.01128387451171875, -0.01493072509765625, 0.00411224365234375, 0.002315521240234375, -0.054901123046875, 0.056610107421875, -0.033477783203125, -0.0187530517578125, 0.00019443035125732422, 0.051177978515625, 0.022125244140625, 0.03253173828125, 0.03424072265625, 0.06024169921875, 0.034332275390625, -0.019378662109375, 0.0814208984375, -0.0267181396484375, 0.05877685546875, 0.058197021484375, 0.0013551712036132812, 0.040618896484375, 0.0157318115234375, -0.0203094482421875, 0.0325927734375, 0.0731201171875, -0.0272064208984375, 0.034881591796875, 0.01556396484375, -0.008087158203125, -0.0017299652099609375, -0.0231170654296875, -0.03192138671875, 0.031768798828125, 0.02764892578125, 
-0.032440185546875, -0.004352569580078125, 0.0190582275390625, -0.00418853759765625, -0.025665283203125, -0.00989532470703125, 0.02008056640625, -0.0021228790283203125, -0.031494140625, 0.0576171875, 0.00008028745651245117, 0.052734375, -0.024993896484375, 0.00696563720703125, -0.018798828125, 0.0226287841796875, -0.0303192138671875, -0.0560302734375, 0.0214996337890625, -0.005035400390625, -0.00042819976806640625, 0.019195556640625, 0.061614990234375, -0.0241546630859375, -0.06011962890625, 0.035400390625, -0.004375457763671875, 0.0335693359375, -0.017913818359375, -0.055877685546875, 0.02801513671875, -0.002689361572265625, -0.03173828125, 0.0027370452880859375, 0.007556915283203125, -0.011993408203125, 0.063720703125, 0.06231689453125, -0.0248260498046875, 0.01041412353515625, -0.004085540771484375, 0.06829833984375, -0.029937744140625, -0.0122222900390625, -0.05426025390625, 0.045623779296875, -0.0160369873046875, -0.0130615234375, 0.041656494140625, 0.0692138671875, 0.08770751953125, -0.01190948486328125, 0.030120849609375, -0.0307159423828125, -0.0110015869140625, -0.00580596923828125, 0.0269622802734375, -0.038177490234375, 0.01074981689453125, -0.0206451416015625, -0.07391357421875, -0.0292816162109375, 0.06903076171875, -0.016387939453125, 0.0176544189453125, 0.046417236328125, 0.08544921875, -0.0338134765625, -0.0175018310546875, 0.0192718505859375, -0.0005536079406738281, 0.027740478515625, 0.047760009765625, 0.04168701171875, -0.0687255859375, 0.047210693359375, -0.05828857421875, -0.003154754638671875, -0.032501220703125, -0.045318603515625, -0.076904296875, -0.04168701171875, -0.03851318359375, -0.03497314453125, -0.0037899017333984375, 0.040283203125, 0.07464599609375, -0.058929443359375, -0.007343292236328125, -0.02508544921875, 0.0127105712890625, -0.036376953125, -0.0252838134765625, 0.0467529296875, 0.008575439453125, -0.06951904296875, 0.0025806427001953125, 0.0103759765625, 0.008087158203125, -0.0070037841796875, -0.01302337646484375, 
-0.0306549072265625, -0.0183258056640625, 0.04473876953125, 0.027191162109375, -0.047576904296875, -0.03363037109375, 0.01073455810546875, 0.0022144317626953125, 0.020660400390625, 0.0357666015625, -0.050994873046875, 0.04168701171875, 0.031280517578125, 0.01171875, 0.07220458984375, 0.0182647705078125, -0.00347900390625, -0.048187255859375, 0.034637451171875, -0.00214385986328125, 0.0305938720703125, 0.03546142578125, -0.039154052734375, 0.05877685546875, 0.04302978515625, -0.035186767578125, -0.06842041015625, 0.00691986083984375, -0.098876953125, -0.01085662841796875, 0.062255859375, -0.0172271728515625, -0.03875732421875, -0.0021724700927734375, -0.01447296142578125, 0.050811767578125, -0.0239410400390625, 0.047332763671875, 0.0198822021484375, 0.0003142356872558594, -0.04425048828125, -0.0293731689453125, 0.00909423828125, -0.0161285400390625, -0.047332763671875, -0.031036376953125, 0.0134429931640625, 0.0390625, 0.0340576171875, 0.043212890625, -0.0192413330078125, 0.01139068603515625, 0.004688262939453125, 0.028106689453125, -0.02197265625, -0.03643798828125, -0.01104736328125, -0.0022144317626953125, -0.031768798828125, -0.05224609375 ] ]
Jean-Baptiste/camembert-ner
2023-06-01T01:32:51.000Z
[ "transformers", "pytorch", "onnx", "safetensors", "camembert", "token-classification", "fr", "dataset:Jean-Baptiste/wikiner_fr", "license:mit", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
token-classification
Jean-Baptiste
null
null
Jean-Baptiste/camembert-ner
81
294,793
transformers
2022-03-02T23:29:04
--- language: fr datasets: - Jean-Baptiste/wikiner_fr widget: - text: "Je m'appelle jean-baptiste et je vis à montréal" - text: "george washington est allé à washington" license: mit ---

# camembert-ner: model fine-tuned from camemBERT for the NER task

## Introduction

[camembert-ner] is a NER model fine-tuned from camemBERT on the wikiner-fr dataset (~170,634 sentences). The model was validated on email/chat data and outperformed other models on this type of data specifically. In particular, it seems to work better on entities that don't start with an uppercase letter.

## Training data

The training data is labeled as follows:

Abbreviation|Description
-|-
O|Outside of a named entity
MISC|Miscellaneous entity
PER|Person’s name
ORG|Organization
LOC|Location

## How to use camembert-ner with HuggingFace

##### Load camembert-ner and its sub-word tokenizer:

```python
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("Jean-Baptiste/camembert-ner")
model = AutoModelForTokenClassification.from_pretrained("Jean-Baptiste/camembert-ner")

# Process a text sample (from Wikipedia)
from transformers import pipeline

nlp = pipeline('ner', model=model, tokenizer=tokenizer, aggregation_strategy="simple")
nlp("Apple est créée le 1er avril 1976 dans le garage de la maison d'enfance de Steve Jobs à Los Altos en Californie par Steve Jobs, Steve Wozniak et Ronald Wayne14, puis constituée sous forme de société le 3 janvier 1977 à l'origine sous le nom d'Apple Computer, mais pour ses 30 ans et pour refléter la diversification de ses produits, le mot « computer » est retiré le 9 janvier 2015.")

[{'entity_group': 'ORG', 'score': 0.9472818374633789, 'word': 'Apple', 'start': 0, 'end': 5},
 {'entity_group': 'PER', 'score': 0.9838564991950989, 'word': 'Steve Jobs', 'start': 74, 'end': 85},
 {'entity_group': 'LOC', 'score': 0.9831605950991312, 'word': 'Los Altos', 'start': 87, 'end': 97},
 {'entity_group': 'LOC', 'score': 0.9834540486335754, 'word': 'Californie', 'start': 100, 'end': 111},
 {'entity_group': 'PER', 'score': 0.9841555754343668, 'word': 'Steve Jobs', 'start': 115, 'end': 126},
 {'entity_group': 'PER', 'score': 0.9843501806259155, 'word': 'Steve Wozniak', 'start': 127, 'end': 141},
 {'entity_group': 'PER', 'score': 0.9841533899307251, 'word': 'Ronald Wayne', 'start': 144, 'end': 157},
 {'entity_group': 'ORG', 'score': 0.9468960364659628, 'word': 'Apple Computer', 'start': 243, 'end': 257}]
```

## Model performance (metric: seqeval)

Overall precision|recall|f1
-|-|-
0.8859|0.8971|0.8914

By entity:

entity|precision|recall|f1
-|-|-|-
PER|0.9372|0.9598|0.9483
ORG|0.8099|0.8265|0.8181
LOC|0.8905|0.9005|0.8955
MISC|0.8175|0.8117|0.8146

For those who may be interested, here is a short article on how I used this model's results to train an LSTM model for signature detection in emails: https://medium.com/@jean-baptiste.polle/lstm-model-for-email-signature-detection-8e990384fefa
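As a rough illustration of what `aggregation_strategy="simple"` does in the pipeline call above, the sketch below merges contiguous same-label token predictions into entity groups with an averaged score. The helper function and the token data are ours, for illustration only; they are not part of the transformers API.

```python
# Sketch of "simple" aggregation: consecutive tokens that share a label and
# are character-contiguous are merged into one entity; the entity score is
# the mean of its token scores. Token predictions below are made up.

def group_entities(tokens):
    """tokens: dicts with 'word', 'label', 'score', 'start', 'end' keys."""
    groups = []
    for tok in tokens:
        if tok["label"] == "O":  # outside any named entity
            continue
        if (groups
                and groups[-1]["entity_group"] == tok["label"]
                and groups[-1]["end"] == tok["start"]):
            prev = groups[-1]            # extend the current entity span
            prev["word"] += tok["word"]
            prev["end"] = tok["end"]
            prev["scores"].append(tok["score"])
        else:                            # start a new entity span
            groups.append({"entity_group": tok["label"], "word": tok["word"],
                           "start": tok["start"], "end": tok["end"],
                           "scores": [tok["score"]]})
    for g in groups:                     # replace per-token scores by mean
        scores = g.pop("scores")
        g["score"] = sum(scores) / len(scores)
    return groups

tokens = [
    {"word": "Apple", "label": "ORG", "score": 0.95, "start": 0, "end": 5},
    {"word": "est", "label": "O", "score": 0.99, "start": 6, "end": 9},
    {"word": "Steve", "label": "PER", "score": 0.97, "start": 20, "end": 25},
    {"word": " Jobs", "label": "PER", "score": 0.99, "start": 25, "end": 30},
]
entities = group_entities(tokens)
print(entities)
```

The real pipeline also handles sub-word reassembly and several other aggregation strategies; this sketch only captures the merging idea.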
3,111
google/flan-t5-xl
2023-07-17T12:48:53.000Z
[ "transformers", "pytorch", "tf", "jax", "t5", "text2text-generation", "en", "fr", "ro", "de", "multilingual", "dataset:svakulenk0/qrecc", "dataset:taskmaster2", "dataset:djaym7/wiki_dialog", "dataset:deepmind/code_contests", "dataset:lambada", "dataset:gsm8k", "dataset:aqua_rat", "dataset:esnli", "dataset:quasc", "dataset:qed", "arxiv:2210.11416", "arxiv:1910.09700", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
text2text-generation
google
null
null
google/flan-t5-xl
367
294,765
transformers
2022-10-21T15:43:52
--- language: - en - fr - ro - de - multilingual widget: - text: "Translate to German: My name is Arthur" example_title: "Translation" - text: "Please answer to the following question. Who is going to be the next Ballon d'or?" example_title: "Question Answering" - text: "Q: Can Geoffrey Hinton have a conversation with George Washington? Give the rationale before answering." example_title: "Logical reasoning" - text: "Please answer the following question. What is the boiling point of Nitrogen?" example_title: "Scientific knowledge" - text: "Answer the following yes/no question. Can you write a whole Haiku in a single tweet?" example_title: "Yes/no question" - text: "Answer the following yes/no question by reasoning step-by-step. Can you write a whole Haiku in a single tweet?" example_title: "Reasoning task" - text: "Q: ( False or not False or False ) is? A: Let's think step by step" example_title: "Boolean Expressions" - text: "The square root of x is the cube root of y. What is y to the power of 2, if x = 4?" example_title: "Math reasoning" - text: "Premise: At my age you will probably have learnt one lesson. Hypothesis: It's not certain how many lessons you'll learn by your thirties. Does the premise entail the hypothesis?" example_title: "Premise and hypothesis" tags: - text2text-generation datasets: - svakulenk0/qrecc - taskmaster2 - djaym7/wiki_dialog - deepmind/code_contests - lambada - gsm8k - aqua_rat - esnli - quasc - qed license: apache-2.0 --- # Model Card for FLAN-T5 XL <img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/flan2_architecture.jpg" alt="drawing" width="600"/> # Table of Contents 0. [TL;DR](#TL;DR) 1. [Model Details](#model-details) 2. [Usage](#usage) 3. [Uses](#uses) 4. [Bias, Risks, and Limitations](#bias-risks-and-limitations) 5. [Training Details](#training-details) 6. [Evaluation](#evaluation) 7. [Environmental Impact](#environmental-impact) 8. 
[Citation](#citation) # TL;DR If you already know T5, FLAN-T5 is just better at everything. For the same number of parameters, these models have been fine-tuned on more than 1000 additional tasks covering more languages as well. As mentioned in the first few lines of the abstract: > Flan-PaLM 540B achieves state-of-the-art performance on several benchmarks, such as 75.2% on five-shot MMLU. We also publicly release Flan-T5 checkpoints, which achieve strong few-shot performance even compared to much larger models, such as PaLM 62B. Overall, instruction finetuning is a general method for improving the performance and usability of pretrained language models. **Disclaimer**: Content from **this** model card has been written by the Hugging Face team, and parts of it were copy-pasted from the [T5 model card](https://huggingface.co/t5-large). # Model Details ## Model Description - **Model type:** Language model - **Language(s) (NLP):** English, Spanish, Japanese, Persian, Hindi, French, Chinese, Bengali, Gujarati, German, Telugu, Italian, Arabic, Polish, Tamil, Marathi, Malayalam, Oriya, Panjabi, Portuguese, Urdu, Galician, Hebrew, Korean, Catalan, Thai, Dutch, Indonesian, Vietnamese, Bulgarian, Filipino, Central Khmer, Lao, Turkish, Russian, Croatian, Swedish, Yoruba, Kurdish, Burmese, Malay, Czech, Finnish, Somali, Tagalog, Swahili, Sinhala, Kannada, Zhuang, Igbo, Xhosa, Romanian, Haitian, Estonian, Slovak, Lithuanian, Greek, Nepali, Assamese, Norwegian - **License:** Apache 2.0 - **Related Models:** [All FLAN-T5 Checkpoints](https://huggingface.co/models?search=flan-t5) - **Original Checkpoints:** [All Original FLAN-T5 Checkpoints](https://github.com/google-research/t5x/blob/main/docs/models.md#flan-t5-checkpoints) - **Resources for more information:** - [Research paper](https://arxiv.org/pdf/2210.11416.pdf) - [GitHub Repo](https://github.com/google-research/t5x) - [Hugging Face FLAN-T5 Docs (Similar to T5)](https://huggingface.co/docs/transformers/model_doc/t5) # Usage 
Find below some example scripts on how to use the model in `transformers`: ## Using the Pytorch model ### Running the model on a CPU <details> <summary> Click to expand </summary> ```python from transformers import T5Tokenizer, T5ForConditionalGeneration tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-xl") model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-xl") input_text = "translate English to German: How old are you?" input_ids = tokenizer(input_text, return_tensors="pt").input_ids outputs = model.generate(input_ids) print(tokenizer.decode(outputs[0])) ``` </details> ### Running the model on a GPU <details> <summary> Click to expand </summary> ```python # pip install accelerate from transformers import T5Tokenizer, T5ForConditionalGeneration tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-xl") model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-xl", device_map="auto") input_text = "translate English to German: How old are you?" input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda") outputs = model.generate(input_ids) print(tokenizer.decode(outputs[0])) ``` </details> ### Running the model on a GPU using different precisions #### FP16 <details> <summary> Click to expand </summary> ```python # pip install accelerate import torch from transformers import T5Tokenizer, T5ForConditionalGeneration tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-xl") model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-xl", device_map="auto", torch_dtype=torch.float16) input_text = "translate English to German: How old are you?" 
input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda") outputs = model.generate(input_ids) print(tokenizer.decode(outputs[0])) ``` </details> #### INT8 <details> <summary> Click to expand </summary> ```python # pip install bitsandbytes accelerate from transformers import T5Tokenizer, T5ForConditionalGeneration tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-xl") model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-xl", device_map="auto", load_in_8bit=True) input_text = "translate English to German: How old are you?" input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda") outputs = model.generate(input_ids) print(tokenizer.decode(outputs[0])) ``` </details> # Uses ## Direct Use and Downstream Use The authors write in [the original paper's model card](https://arxiv.org/pdf/2210.11416.pdf) that: > The primary use is research on language models, including: research on zero-shot NLP tasks and in-context few-shot learning NLP tasks, such as reasoning, and question answering; advancing fairness and safety research, and understanding limitations of current large language models See the [research paper](https://arxiv.org/pdf/2210.11416.pdf) for further details. ## Out-of-Scope Use More information needed. # Bias, Risks, and Limitations The information in this section is copied from the model's [official model card](https://arxiv.org/pdf/2210.11416.pdf): > Language models, including Flan-T5, can potentially be used for language generation in a harmful way, according to Rae et al. (2021). Flan-T5 should not be used directly in any application, without a prior assessment of safety and fairness concerns specific to the application. ## Ethical considerations and risks > Flan-T5 is fine-tuned on a large corpus of text data that was not filtered for explicit content or assessed for existing biases. 
As a result, the model itself is potentially vulnerable to generating equivalently inappropriate content or replicating inherent biases in the underlying data. ## Known Limitations > Flan-T5 has not been tested in real-world applications. ## Sensitive Use: > Flan-T5 should not be applied for any unacceptable use cases, e.g., generation of abusive speech. # Training Details ## Training Data The model was trained on a mixture of tasks that includes the tasks described in the table below (from the original paper, figure 2): ![table.png](https://s3.amazonaws.com/moonup/production/uploads/1666363265279-62441d1d9fdefb55a0b7d12c.png) ## Training Procedure According to the model card from the [original paper](https://arxiv.org/pdf/2210.11416.pdf): > These models are based on pretrained T5 (Raffel et al., 2020) and fine-tuned with instructions for better zero-shot and few-shot performance. There is one fine-tuned Flan model per T5 model size. The model has been trained on TPU v3 or TPU v4 pods, using the [`t5x`](https://github.com/google-research/t5x) codebase together with [`jax`](https://github.com/google/jax). # Evaluation ## Testing Data, Factors & Metrics The authors evaluated the model on various tasks covering several languages (1836 in total). See the table below for some quantitative evaluation: ![image.png](https://s3.amazonaws.com/moonup/production/uploads/1668072995230-62441d1d9fdefb55a0b7d12c.png) For full details, please check the [research paper](https://arxiv.org/pdf/2210.11416.pdf). ## Results For full results for FLAN-T5-XL, see the [research paper](https://arxiv.org/pdf/2210.11416.pdf), Table 3. # Environmental Impact Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** Google Cloud TPU Pods - TPU v3 or TPU v4 | Number of chips ≥ 4. 
- **Hours used:** More information needed - **Cloud Provider:** GCP - **Compute Region:** More information needed - **Carbon Emitted:** More information needed # Citation **BibTeX:** ```bibtex @misc{https://doi.org/10.48550/arxiv.2210.11416, doi = {10.48550/ARXIV.2210.11416}, url = {https://arxiv.org/abs/2210.11416}, author = {Chung, Hyung Won and Hou, Le and Longpre, Shayne and Zoph, Barret and Tay, Yi and Fedus, William and Li, Eric and Wang, Xuezhi and Dehghani, Mostafa and Brahma, Siddhartha and Webson, Albert and Gu, Shixiang Shane and Dai, Zhuyun and Suzgun, Mirac and Chen, Xinyun and Chowdhery, Aakanksha and Narang, Sharan and Mishra, Gaurav and Yu, Adams and Zhao, Vincent and Huang, Yanping and Dai, Andrew and Yu, Hongkun and Petrov, Slav and Chi, Ed H. and Dean, Jeff and Devlin, Jacob and Roberts, Adam and Zhou, Denny and Le, Quoc V. and Wei, Jason}, keywords = {Machine Learning (cs.LG), Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences}, title = {Scaling Instruction-Finetuned Language Models}, publisher = {arXiv}, year = {2022}, copyright = {Creative Commons Attribution 4.0 International} } ```
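The CPU, GPU, FP16, and INT8 variants in the Usage section above differ mainly in how much memory the weights occupy. A rough, hedged sketch of the weight-only footprint — the ~3B parameter figure is an approximation for FLAN-T5 XL, and activations, KV cache, and framework overhead are ignored:

```python
def weight_memory_gib(n_params: int, bytes_per_param: int) -> float:
    """Approximate weight-only memory footprint in GiB."""
    return n_params * bytes_per_param / 1024**3

# ~3B parameters is an assumed round figure for FLAN-T5 XL
N_PARAMS = 3_000_000_000

for precision, nbytes in [("fp32", 4), ("fp16", 2), ("int8", 1)]:
    print(f"{precision}: ~{weight_memory_gib(N_PARAMS, nbytes):.1f} GiB")
```

This back-of-the-envelope arithmetic is why `load_in_8bit=True` can fit the model on GPUs where the full-precision weights would not.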
10,745
[ [ -0.033355712890625, -0.04296875, 0.023040771484375, 0.0008435249328613281, -0.007717132568359375, -0.0089111328125, -0.030670166015625, -0.047088623046875, -0.01059722900390625, 0.00740814208984375, -0.0382080078125, -0.03765869140625, -0.049346923828125, 0.005008697509765625, -0.0193023681640625, 0.0765380859375, -0.01073455810546875, 0.0020427703857421875, 0.01181793212890625, -0.00621795654296875, -0.011962890625, -0.02484130859375, -0.0511474609375, -0.0219879150390625, 0.032928466796875, 0.0208587646484375, 0.033966064453125, 0.038970947265625, 0.03759765625, 0.0253753662109375, -0.01184844970703125, -0.004322052001953125, -0.0384521484375, -0.031585693359375, 0.00649261474609375, -0.034576416015625, -0.046173095703125, -0.0030689239501953125, 0.033203125, 0.0377197265625, 0.00630950927734375, 0.024444580078125, -0.00994110107421875, 0.019378662109375, -0.039794921875, 0.0224609375, -0.0224609375, 0.00760650634765625, -0.0216217041015625, 0.007061004638671875, -0.01824951171875, -0.0176239013671875, 0.003459930419921875, -0.05377197265625, 0.038055419921875, -0.00893402099609375, 0.1087646484375, 0.01076507568359375, -0.004444122314453125, -0.0122833251953125, -0.057342529296875, 0.072021484375, -0.0726318359375, 0.034759521484375, 0.014312744140625, 0.0263519287109375, 0.00760650634765625, -0.061614990234375, -0.049713134765625, -0.02294921875, -0.004894256591796875, 0.0111541748046875, -0.003231048583984375, 0.0145416259765625, 0.041351318359375, 0.0458984375, -0.034576416015625, -0.00386810302734375, -0.054779052734375, -0.01251983642578125, 0.0531005859375, -0.0015821456909179688, 0.03839111328125, -0.005725860595703125, -0.0211029052734375, -0.035186767578125, -0.025970458984375, 0.01015472412109375, 0.021484375, 0.03179931640625, -0.0369873046875, 0.0295257568359375, -0.0004963874816894531, 0.040313720703125, 0.0239105224609375, -0.036163330078125, 0.03814697265625, -0.02587890625, -0.0272064208984375, -0.01204681396484375, 0.07281494140625, 
0.01493072509765625, 0.0170745849609375, -0.007293701171875, -0.03021240234375, 0.0008287429809570312, 0.01495361328125, -0.07196044921875, -0.00922393798828125, 0.032501220703125, -0.0289306640625, -0.03717041015625, 0.0119476318359375, -0.061614990234375, -0.00014352798461914062, -0.00014853477478027344, 0.040679931640625, -0.038543701171875, -0.043670654296875, -0.002674102783203125, -0.0153961181640625, 0.0212860107421875, 0.003734588623046875, -0.08258056640625, 0.01837158203125, 0.038787841796875, 0.06390380859375, 0.0093994140625, -0.024444580078125, -0.01849365234375, -0.00006151199340820312, -0.0178985595703125, 0.031585693359375, -0.030029296875, -0.0287933349609375, -0.0018510818481445312, 0.0136871337890625, -0.0153961181640625, -0.036407470703125, 0.05023193359375, -0.02191162109375, 0.038787841796875, -0.022705078125, -0.041351318359375, -0.0292510986328125, -0.00220489501953125, -0.0489501953125, 0.08465576171875, 0.0224609375, -0.05511474609375, 0.034393310546875, -0.0709228515625, -0.033233642578125, -0.0097808837890625, 0.009613037109375, -0.05364990234375, 0.00415802001953125, 0.02618408203125, 0.0272369384765625, -0.0156097412109375, 0.0167083740234375, -0.03802490234375, -0.0247802734375, -0.0115203857421875, -0.00843048095703125, 0.0771484375, 0.032135009765625, -0.06292724609375, 0.02032470703125, -0.04522705078125, -0.0037097930908203125, 0.02374267578125, -0.006855010986328125, 0.0121917724609375, -0.0255279541015625, 0.0162506103515625, 0.0297088623046875, 0.02008056640625, -0.0400390625, 0.0008831024169921875, -0.03863525390625, 0.039703369140625, 0.042022705078125, -0.01629638671875, 0.030670166015625, -0.038543701171875, 0.037628173828125, 0.022674560546875, 0.0182952880859375, -0.0078277587890625, -0.029022216796875, -0.08624267578125, 0.0009469985961914062, 0.0194091796875, 0.0338134765625, -0.045196533203125, 0.0298309326171875, -0.036376953125, -0.0517578125, -0.0345458984375, 0.007358551025390625, 0.029449462890625, 
0.03753662109375, 0.03759765625, -0.00617218017578125, -0.040130615234375, -0.052001953125, -0.0153350830078125, -0.0004673004150390625, -0.0005664825439453125, 0.0211944580078125, 0.056976318359375, -0.0020313262939453125, 0.038360595703125, -0.0219879150390625, -0.025970458984375, -0.035919189453125, 0.00725555419921875, 0.0092620849609375, 0.050079345703125, 0.0634765625, -0.043121337890625, -0.031768798828125, 0.005138397216796875, -0.061431884765625, 0.0015659332275390625, -0.00771331787109375, -0.007965087890625, 0.0369873046875, 0.017333984375, -0.047882080078125, 0.0289306640625, 0.033966064453125, -0.01800537109375, 0.0249786376953125, -0.00966644287109375, 0.00487518310546875, -0.09002685546875, 0.03790283203125, 0.01042938232421875, -0.01378631591796875, -0.057647705078125, 0.00926971435546875, 0.00484466552734375, -0.0164337158203125, -0.046661376953125, 0.057952880859375, -0.0260467529296875, 0.0009708404541015625, -0.007442474365234375, -0.0017671585083007812, -0.0002028942108154297, 0.04315185546875, 0.00873565673828125, 0.062744140625, 0.0286712646484375, -0.056427001953125, 0.0030841827392578125, 0.007171630859375, -0.02001953125, 0.0155792236328125, -0.0552978515625, 0.0125732421875, 0.0010633468627929688, 0.0151824951171875, -0.050506591796875, -0.0270233154296875, 0.02044677734375, -0.036376953125, 0.034637451171875, 0.005054473876953125, -0.0269775390625, -0.04425048828125, -0.0211944580078125, 0.024322509765625, 0.050079345703125, -0.04449462890625, 0.04949951171875, 0.0159759521484375, 0.0241851806640625, -0.045166015625, -0.06500244140625, -0.0196533203125, -0.035919189453125, -0.06072998046875, 0.04290771484375, -0.0003147125244140625, -0.0006451606750488281, -0.01392364501953125, -0.0094757080078125, -0.004344940185546875, 0.0034465789794921875, 0.01012420654296875, 0.008453369140625, -0.01904296875, -0.011749267578125, -0.015899658203125, -0.007228851318359375, -0.00177001953125, -0.029632568359375, 0.0457763671875, -0.020599365234375, 
0.0123443603515625, -0.05950927734375, -0.0018529891967773438, 0.0438232421875, -0.01861572265625, 0.0675048828125, 0.08355712890625, -0.03643798828125, 0.0008592605590820312, -0.04736328125, -0.024169921875, -0.03900146484375, 0.0148773193359375, -0.0377197265625, -0.047088623046875, 0.05145263671875, 0.0177154541015625, 0.0229949951171875, 0.05841064453125, 0.0369873046875, -0.0016889572143554688, 0.0689697265625, 0.0516357421875, -0.0025959014892578125, 0.058197021484375, -0.052154541015625, 0.018280029296875, -0.0455322265625, -0.0158233642578125, -0.035430908203125, -0.01983642578125, -0.051910400390625, -0.0206756591796875, 0.0226287841796875, 0.005054473876953125, -0.045013427734375, 0.0295257568359375, -0.026702880859375, 0.007549285888671875, 0.042938232421875, 0.01558685302734375, -0.00502777099609375, 0.00621795654296875, -0.01128387451171875, -0.0037097930908203125, -0.0535888671875, -0.03912353515625, 0.08441162109375, 0.0291748046875, 0.02996826171875, 0.0026340484619140625, 0.0531005859375, -0.002758026123046875, 0.0193328857421875, -0.03955078125, 0.03076171875, -0.0172882080078125, -0.06878662109375, -0.0035572052001953125, -0.03155517578125, -0.062164306640625, 0.00374603271484375, -0.004215240478515625, -0.056976318359375, 0.0027408599853515625, 0.01146697998046875, -0.036102294921875, 0.04461669921875, -0.06951904296875, 0.089599609375, -0.029266357421875, -0.03759765625, -0.0043487548828125, -0.0360107421875, 0.03948974609375, 0.01076507568359375, 0.009368896484375, 0.004062652587890625, 0.00592803955078125, 0.0615234375, -0.056915283203125, 0.059234619140625, -0.03204345703125, -0.006259918212890625, 0.0231781005859375, -0.01800537109375, 0.0258636474609375, -0.01910400390625, -0.00916290283203125, 0.0280303955078125, 0.005573272705078125, -0.04437255859375, -0.03692626953125, 0.052825927734375, -0.0809326171875, -0.042388916015625, -0.034881591796875, -0.0296173095703125, 0.0034618377685546875, 0.03564453125, 0.0303802490234375, 
0.0243682861328125, 0.0032787322998046875, 0.0005483627319335938, 0.032318115234375, -0.0305633544921875, 0.04888916015625, 0.006603240966796875, -0.021026611328125, -0.030242919921875, 0.07122802734375, 0.01044464111328125, 0.036376953125, 0.0242156982421875, 0.0223388671875, -0.0253753662109375, -0.019683837890625, -0.035736083984375, 0.029571533203125, -0.04742431640625, -0.005214691162109375, -0.0443115234375, -0.01018524169921875, -0.03985595703125, -0.01071929931640625, -0.034820556640625, -0.030303955078125, -0.0300445556640625, -0.00543975830078125, 0.0219879150390625, 0.0499267578125, -0.001399993896484375, 0.0282135009765625, -0.04498291015625, 0.025146484375, 0.004398345947265625, 0.0289154052734375, 0.005702972412109375, -0.051544189453125, -0.0120391845703125, 0.0209808349609375, -0.03582763671875, -0.0458984375, 0.03009033203125, 0.01739501953125, 0.026519775390625, 0.039581298828125, -0.0075225830078125, 0.0709228515625, -0.0104522705078125, 0.07952880859375, 0.004009246826171875, -0.07666015625, 0.04266357421875, -0.035003662109375, 0.032684326171875, 0.0262603759765625, 0.0276031494140625, -0.02447509765625, -0.0175628662109375, -0.0792236328125, -0.054229736328125, 0.073974609375, 0.021026611328125, 0.003070831298828125, 0.021270751953125, 0.0173187255859375, -0.00496673583984375, 0.0074310302734375, -0.06658935546875, -0.018646240234375, -0.038818359375, -0.0228118896484375, -0.007747650146484375, -0.003753662109375, -0.00679779052734375, -0.0269775390625, 0.06103515625, 0.004261016845703125, 0.0504150390625, 0.01006317138671875, -0.0175628662109375, -0.01416015625, -0.0007152557373046875, 0.06878662109375, 0.035614013671875, -0.0260467529296875, -0.0110321044921875, 0.027740478515625, -0.043609619140625, -0.004119873046875, 0.00897979736328125, -0.0275726318359375, -0.002269744873046875, 0.0313720703125, 0.081298828125, 0.0116424560546875, -0.0270233154296875, 0.03173828125, -0.00611114501953125, -0.029083251953125, -0.035186767578125, 
0.0248870849609375, 0.00732421875, 0.0035152435302734375, 0.0116729736328125, 0.005191802978515625, -0.013916015625, -0.0282745361328125, 0.0005784034729003906, 0.012725830078125, -0.0175933837890625, -0.037109375, 0.08428955078125, 0.0162200927734375, -0.0113525390625, 0.042755126953125, -0.006031036376953125, -0.038848876953125, 0.0521240234375, 0.033233642578125, 0.072021484375, -0.01033782958984375, 0.0010213851928710938, 0.06878662109375, 0.0247802734375, -0.00865936279296875, 0.0279693603515625, 0.005077362060546875, -0.039764404296875, -0.01165771484375, -0.04888916015625, -0.0011682510375976562, 0.032958984375, -0.035186767578125, 0.03729248046875, -0.056365966796875, -0.015869140625, 0.00826263427734375, 0.0325927734375, -0.070556640625, 0.030517578125, 0.0226287841796875, 0.06158447265625, -0.05621337890625, 0.06011962890625, 0.046661376953125, -0.07183837890625, -0.083740234375, -0.002140045166015625, -0.006610870361328125, -0.041839599609375, 0.04437255859375, 0.029449462890625, 0.002361297607421875, 0.00159454345703125, -0.036376953125, -0.06640625, 0.09796142578125, 0.03131103515625, -0.03125, -0.00760650634765625, 0.0260009765625, 0.0440673828125, -0.019287109375, 0.05718994140625, 0.041351318359375, 0.049713134765625, 0.003936767578125, -0.0804443359375, 0.0148162841796875, -0.021026611328125, 0.00890350341796875, -0.0020618438720703125, -0.0765380859375, 0.06976318359375, -0.023406982421875, -0.022125244140625, -0.0017099380493164062, 0.06390380859375, 0.0179443359375, 0.007015228271484375, 0.04052734375, 0.04608154296875, 0.06005859375, -0.0188446044921875, 0.09747314453125, -0.043487548828125, 0.045013427734375, 0.04833984375, 0.01401519775390625, 0.048309326171875, 0.019744873046875, -0.021575927734375, 0.034454345703125, 0.05389404296875, -0.00827789306640625, 0.022216796875, -0.00638580322265625, -0.0181884765625, -0.00704193115234375, -0.0026950836181640625, -0.03863525390625, 0.020782470703125, 0.0286712646484375, -0.031890869140625, 
-0.00868988037109375, -0.00498199462890625, 0.0289154052734375, -0.0250396728515625, -0.01078033447265625, 0.03643798828125, 0.01042938232421875, -0.0587158203125, 0.08026123046875, 0.00988006591796875, 0.062255859375, -0.04052734375, 0.019378662109375, -0.0231781005859375, 0.0288238525390625, -0.033477783203125, -0.02801513671875, 0.0208587646484375, 0.0028400421142578125, 0.0008282661437988281, -0.01374053955078125, 0.035552978515625, -0.034393310546875, -0.05389404296875, 0.018890380859375, 0.01007843017578125, 0.01311492919921875, 0.018768310546875, -0.06573486328125, 0.01885986328125, 0.00997161865234375, -0.0264434814453125, 0.0101470947265625, 0.01058197021484375, 0.0010194778442382812, 0.045440673828125, 0.0400390625, -0.0117340087890625, 0.0214691162109375, 0.0101165771484375, 0.053680419921875, -0.049896240234375, -0.0237884521484375, -0.049346923828125, 0.049285888671875, -0.00498199462890625, -0.04022216796875, 0.051361083984375, 0.0469970703125, 0.0870361328125, -0.01232147216796875, 0.07305908203125, -0.0296630859375, 0.021759033203125, -0.030120849609375, 0.053131103515625, -0.06097412109375, 0.0038356781005859375, -0.0288238525390625, -0.059600830078125, -0.017242431640625, 0.06353759765625, -0.0372314453125, 0.047607421875, 0.05841064453125, 0.06610107421875, -0.0267333984375, 0.003284454345703125, 0.01397705078125, 0.0214691162109375, 0.04937744140625, 0.052398681640625, 0.0174407958984375, -0.06719970703125, 0.045166015625, -0.05926513671875, 0.0079345703125, -0.0163421630859375, -0.05029296875, -0.0804443359375, -0.04058837890625, -0.022613525390625, -0.034454345703125, -0.00275421142578125, 0.0628662109375, 0.057647705078125, -0.07720947265625, -0.02667236328125, -0.0227203369140625, -0.00855255126953125, -0.01910400390625, -0.018157958984375, 0.034423828125, -0.042388916015625, -0.08404541015625, 0.00902557373046875, -0.015777587890625, 0.0183563232421875, -0.024169921875, -0.01416778564453125, -0.023529052734375, -0.021087646484375, 
0.0220184326171875, 0.0291595458984375, -0.0626220703125, -0.026092529296875, 0.004199981689453125, -0.00794219970703125, 0.01149749755859375, 0.03582763671875, -0.034759521484375, 0.02935791015625, 0.03826904296875, 0.0364990234375, 0.06097412109375, -0.005184173583984375, 0.05084228515625, -0.034637451171875, 0.032135009765625, 0.003200531005859375, 0.021759033203125, 0.030181884765625, -0.0181884765625, 0.040863037109375, 0.02581787109375, -0.03167724609375, -0.0604248046875, -0.01302337646484375, -0.06951904296875, -0.00012409687042236328, 0.09228515625, -0.0210113525390625, -0.037689208984375, 0.0165557861328125, 0.00013518333435058594, 0.04443359375, -0.030517578125, 0.05328369140625, 0.052490234375, 0.006984710693359375, -0.0258026123046875, -0.0574951171875, 0.05047607421875, 0.04107666015625, -0.058197021484375, -0.0167388916015625, 0.00994873046875, 0.041473388671875, 0.01454925537109375, 0.0325927734375, -0.006000518798828125, 0.01495361328125, 0.01250457763671875, 0.0202484130859375, -0.01233673095703125, -0.00638580322265625, -0.02166748046875, 0.0034732818603515625, -0.006549835205078125, -0.00984954833984375 ] ]
microsoft/trocr-base-handwritten
2023-01-26T12:56:57.000Z
[ "transformers", "pytorch", "vision-encoder-decoder", "trocr", "image-to-text", "arxiv:2109.10282", "endpoints_compatible", "has_space", "region:us" ]
image-to-text
microsoft
null
null
microsoft/trocr-base-handwritten
104
293,715
transformers
2022-03-02T23:29:05
--- tags: - trocr - image-to-text widget: - src: https://fki.tic.heia-fr.ch/static/img/a01-122-02.jpg example_title: Note 1 - src: https://encrypted-tbn0.gstatic.com/images?q=tbn:ANd9GcSoolxi9yWGAT5SLZShv8vVd0bz47UWRzQC19fDTeE8GmGv_Rn-PCF1pP1rrUx8kOjA4gg&usqp=CAU example_title: Note 2 - src: https://encrypted-tbn0.gstatic.com/images?q=tbn:ANd9GcRNYtTuSBpZPV_nkBYPMFwVVD9asZOPgHww4epu9EqWgDmXW--sE2o8og40ZfDGo87j5w&usqp=CAU example_title: Note 3 --- # TrOCR (base-sized model, fine-tuned on IAM) TrOCR model fine-tuned on the [IAM dataset](https://fki.tic.heia-fr.ch/databases/iam-handwriting-database). It was introduced in the paper [TrOCR: Transformer-based Optical Character Recognition with Pre-trained Models](https://arxiv.org/abs/2109.10282) by Li et al. and first released in [this repository](https://github.com/microsoft/unilm/tree/master/trocr). Disclaimer: The team releasing TrOCR did not write a model card for this model so this model card has been written by the Hugging Face team. ## Model description The TrOCR model is an encoder-decoder model, consisting of an image Transformer as encoder, and a text Transformer as decoder. The image encoder was initialized from the weights of BEiT, while the text decoder was initialized from the weights of RoBERTa. Images are presented to the model as a sequence of fixed-size patches (resolution 16x16), which are linearly embedded. One also adds absolute position embeddings before feeding the sequence to the layers of the Transformer encoder. Next, the Transformer text decoder autoregressively generates tokens. ## Intended uses & limitations You can use the raw model for optical character recognition (OCR) on single text-line images. See the [model hub](https://huggingface.co/models?search=microsoft/trocr) to look for fine-tuned versions on a task that interests you. 
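The patch-embedding step described in the model description above can be made concrete with a small sketch. The 384×384 input resolution used here is an assumption (typical for BEiT-style encoders) and is not stated in this card:

```python
def num_patches(image_size: int, patch_size: int) -> int:
    """Number of non-overlapping square patches a square image is split into."""
    side = image_size // patch_size  # patches along each side
    return side * side

# TrOCR's encoder sees the image as a sequence of 16x16 patches;
# 384x384 input is an assumed resolution for illustration
print(num_patches(384, 16))  # 24 patches per side -> 576 patch tokens
```

Each of these patches is linearly embedded and, after adding position embeddings, becomes one token in the encoder's input sequence.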
### How to use Here is how to use this model in PyTorch: ```python from transformers import TrOCRProcessor, VisionEncoderDecoderModel from PIL import Image import requests # load image from the IAM database url = 'https://fki.tic.heia-fr.ch/static/img/a01-122-02-00.jpg' image = Image.open(requests.get(url, stream=True).raw).convert("RGB") processor = TrOCRProcessor.from_pretrained('microsoft/trocr-base-handwritten') model = VisionEncoderDecoderModel.from_pretrained('microsoft/trocr-base-handwritten') pixel_values = processor(images=image, return_tensors="pt").pixel_values generated_ids = model.generate(pixel_values) generated_text = processor.batch_decode(generated_ids, skip_special_tokens=True)[0] print(generated_text) ``` ### BibTeX entry and citation info ```bibtex @misc{li2021trocr, title={TrOCR: Transformer-based Optical Character Recognition with Pre-trained Models}, author={Minghao Li and Tengchao Lv and Lei Cui and Yijuan Lu and Dinei Florencio and Cha Zhang and Zhoujun Li and Furu Wei}, year={2021}, eprint={2109.10282}, archivePrefix={arXiv}, primaryClass={cs.CL} } ```
2,972
[ [ -0.0146484375, -0.02777099609375, 0.009033203125, -0.0279998779296875, -0.0282440185546875, 0.0003044605255126953, -0.0002465248107910156, -0.06451416015625, 0.00806427001953125, 0.05035400390625, -0.0279693603515625, -0.035491943359375, -0.050079345703125, 0.01009368896484375, -0.024993896484375, 0.08245849609375, -0.01016998291015625, -0.00019037723541259766, 0.007366180419921875, -0.038055419921875, -0.006008148193359375, -0.0433349609375, -0.039093017578125, -0.016143798828125, 0.025299072265625, 0.02606201171875, 0.048736572265625, 0.05535888671875, 0.07476806640625, 0.02783203125, -0.024810791015625, 0.00827789306640625, -0.01739501953125, -0.0286712646484375, 0.0182037353515625, -0.036468505859375, -0.049957275390625, 0.006488800048828125, 0.044891357421875, 0.0149993896484375, -0.00490570068359375, 0.01128387451171875, 0.00441741943359375, 0.041412353515625, -0.0198974609375, -0.00836181640625, -0.02679443359375, 0.0207366943359375, -0.0052947998046875, -0.00039315223693847656, -0.032501220703125, -0.0310211181640625, 0.026580810546875, -0.040985107421875, 0.049224853515625, 0.00457763671875, 0.09088134765625, -0.009552001953125, -0.021392822265625, -0.042449951171875, -0.055084228515625, 0.045989990234375, -0.045379638671875, 0.0278167724609375, 0.0036716461181640625, 0.0157012939453125, 0.009979248046875, -0.07794189453125, -0.06524658203125, -0.028533935546875, -0.0231781005859375, 0.0012683868408203125, -0.01806640625, 0.0213165283203125, 0.032257080078125, 0.042877197265625, -0.0460205078125, -0.020416259765625, -0.05352783203125, -0.0291900634765625, 0.0216217041015625, -0.0003285408020019531, 0.0280303955078125, 0.004734039306640625, -0.029937744140625, -0.033447265625, -0.0107574462890625, -0.004024505615234375, 0.006587982177734375, -0.001277923583984375, -0.031280517578125, 0.051544189453125, 0.02423095703125, 0.064697265625, 0.022979736328125, -0.022979736328125, 0.040924072265625, -0.00711822509765625, -0.0008363723754882812, 
0.0019817352294921875, 0.07763671875, 0.022003173828125, 0.0285186767578125, -0.01116943359375, -0.0192413330078125, 0.0224456787109375, 0.00727081298828125, -0.06451416015625, -0.0036792755126953125, -0.00608062744140625, -0.04241943359375, -0.0218048095703125, 0.016326904296875, -0.061798095703125, -0.01454925537109375, -0.0115814208984375, 0.033935546875, -0.0328369140625, 0.00806427001953125, -0.00787353515625, -0.01181793212890625, 0.0131683349609375, 0.0184478759765625, -0.041656494140625, 0.00687408447265625, 0.0104217529296875, 0.08599853515625, -0.01209259033203125, -0.02471923828125, -0.0240631103515625, -0.0050048828125, -0.01419830322265625, 0.046783447265625, -0.0187835693359375, -0.024261474609375, -0.00760650634765625, 0.001972198486328125, -0.0016117095947265625, -0.0421142578125, 0.040771484375, -0.028900146484375, 0.0265045166015625, 0.01172637939453125, -0.0013036727905273438, -0.00399017333984375, 0.0287628173828125, -0.0673828125, 0.08941650390625, 0.02105712890625, -0.05889892578125, 0.01261138916015625, -0.053985595703125, -0.0212249755859375, 0.003459930419921875, 0.0149993896484375, -0.065673828125, 0.0025482177734375, 0.004962921142578125, 0.00986480712890625, -0.021331787109375, 0.0008869171142578125, -0.01041412353515625, -0.03009033203125, 0.016021728515625, -0.0170440673828125, 0.05999755859375, 0.02276611328125, -0.0298919677734375, -0.00518035888671875, -0.0784912109375, 0.00966644287109375, 0.006137847900390625, -0.026214599609375, -0.00487518310546875, -0.02679443359375, 0.0253753662109375, 0.03302001953125, 0.031829833984375, -0.04791259765625, 0.0188751220703125, -0.02386474609375, 0.055694580078125, 0.025299072265625, -0.0182037353515625, 0.036468505859375, -0.0224761962890625, 0.031951904296875, 0.0175628662109375, 0.0101318359375, -0.00958251953125, -0.0177459716796875, -0.07562255859375, -0.0176239013671875, 0.0155792236328125, 0.053192138671875, -0.07049560546875, 0.033721923828125, -0.025634765625, -0.05059814453125, 
-0.038970947265625, -0.00569915771484375, 0.049285888671875, 0.061798095703125, 0.0296783447265625, -0.047210693359375, -0.032440185546875, -0.047149658203125, -0.0115966796875, -0.0225982666015625, 0.006587982177734375, 0.02191162109375, 0.045989990234375, -0.0196380615234375, 0.0567626953125, -0.0243377685546875, -0.057037353515625, -0.0206756591796875, 0.023193359375, 0.01268768310546875, 0.05108642578125, 0.033599853515625, -0.0469970703125, -0.04559326171875, 0.00035953521728515625, -0.052520751953125, 0.004528045654296875, -0.004611968994140625, -0.01116943359375, 0.041656494140625, 0.0250091552734375, -0.0494384765625, 0.05963134765625, 0.032379150390625, -0.036468505859375, 0.0400390625, -0.03729248046875, 0.009429931640625, -0.0809326171875, 0.0286102294921875, 0.004512786865234375, -0.0177459716796875, -0.06231689453125, 0.01374053955078125, 0.01482391357421875, -0.0189666748046875, -0.0228271484375, 0.043975830078125, -0.06005859375, -0.001552581787109375, -0.005657196044921875, 0.0029430389404296875, 0.007167816162109375, 0.0537109375, 0.0347900390625, 0.056396484375, 0.01910400390625, -0.025360107421875, 0.00849151611328125, 0.0281829833984375, -0.0257720947265625, 0.046356201171875, -0.0731201171875, 0.044097900390625, -0.00370025634765625, -0.006443023681640625, -0.057647705078125, 0.0179595947265625, 0.0296783447265625, -0.037841796875, 0.0297393798828125, -0.0001537799835205078, -0.036346435546875, -0.056884765625, -0.002552032470703125, 0.03253173828125, 0.0321044921875, -0.043060302734375, 0.0848388671875, 0.013427734375, 0.027252197265625, -0.03839111328125, -0.08209228515625, 0.0029277801513671875, -0.00991058349609375, -0.051177978515625, 0.039825439453125, -0.0181427001953125, 0.01605224609375, -0.00510406494140625, 0.005519866943359375, -0.0123748779296875, -0.0297393798828125, 0.0106353759765625, 0.0419921875, -0.01959228515625, -0.0225372314453125, -0.035736083984375, -0.01433563232421875, -0.0217742919921875, -0.0161895751953125, 
0.04241943359375, -0.028900146484375, 0.002559661865234375, -0.044586181640625, 0.01338958740234375, 0.0556640625, -0.039642333984375, 0.048370361328125, 0.0487060546875, -0.023284912109375, 0.00415802001953125, -0.04815673828125, -0.00803375244140625, -0.035858154296875, 0.0220794677734375, -0.02459716796875, -0.053802490234375, 0.06353759765625, 0.032623291015625, -0.008941650390625, 0.031280517578125, 0.0321044921875, 0.000004231929779052734, 0.062103271484375, 0.0567626953125, 0.0054473876953125, 0.0650634765625, -0.040679931640625, 0.01450347900390625, -0.067138671875, -0.0307464599609375, -0.03009033203125, -0.03009033203125, -0.039886474609375, -0.015960693359375, 0.0281219482421875, -0.005168914794921875, -0.01184844970703125, 0.0460205078125, -0.07794189453125, 0.0229034423828125, 0.058502197265625, 0.03424072265625, 0.01324462890625, 0.01226806640625, -0.02166748046875, 0.007434844970703125, -0.0289764404296875, -0.04132080078125, 0.05718994140625, 0.0109405517578125, 0.053741455078125, -0.01450347900390625, 0.041717529296875, 0.0204620361328125, 0.009796142578125, -0.05718994140625, 0.047698974609375, -0.0192108154296875, -0.038848876953125, -0.00021183490753173828, -0.0208282470703125, -0.0670166015625, -0.001007080078125, -0.031463623046875, -0.058868408203125, 0.052398681640625, 0.035186767578125, -0.00827789306640625, 0.0364990234375, -0.048614501953125, 0.0692138671875, -0.0259552001953125, -0.02313232421875, 0.0192108154296875, -0.0640869140625, -0.0026950836181640625, 0.0194091796875, -0.0187835693359375, 0.0243072509765625, 0.0128631591796875, 0.0692138671875, -0.061187744140625, 0.056396484375, -0.00860595703125, 0.00542449951171875, 0.0438232421875, 0.0014019012451171875, 0.0433349609375, -0.0455322265625, -0.015228271484375, 0.046600341796875, 0.007152557373046875, -0.01065826416015625, -0.0231781005859375, 0.01739501953125, -0.06927490234375, -0.01216888427734375, -0.061126708984375, -0.046722412109375, 0.0196685791015625, 0.039398193359375, 
0.0604248046875, 0.047119140625, -0.004360198974609375, -0.005847930908203125, 0.0391845703125, 0.001308441162109375, 0.03619384765625, 0.02569580078125, 0.00377655029296875, -0.058380126953125, 0.058319091796875, 0.01366424560546875, 0.0247955322265625, 0.035552978515625, 0.012359619140625, -0.0144500732421875, -0.0288848876953125, -0.0256500244140625, 0.034881591796875, -0.0484619140625, -0.0161285400390625, -0.032379150390625, -0.025177001953125, -0.0240478515625, -0.0151519775390625, -0.01161956787109375, -0.0217132568359375, -0.05535888671875, 0.01763916015625, 0.0274810791015625, 0.040008544921875, 0.00542449951171875, 0.06005859375, -0.0606689453125, 0.0281219482421875, -0.0015611648559570312, 0.0255889892578125, 0.0015583038330078125, -0.04644775390625, -0.01617431640625, 0.000698089599609375, -0.0308685302734375, -0.05694580078125, 0.058013916015625, 0.028961181640625, 0.02178955078125, 0.03240966796875, -0.003269195556640625, 0.05792236328125, -0.03814697265625, 0.040313720703125, 0.03326416015625, -0.076171875, 0.028900146484375, -0.0011720657348632812, 0.022979736328125, 0.03375244140625, 0.00525665283203125, -0.046417236328125, -0.01209259033203125, -0.045013427734375, -0.041168212890625, 0.07977294921875, 0.00403594970703125, -0.00789642333984375, 0.0228271484375, 0.03924560546875, -0.0175933837890625, 0.01514434814453125, -0.07598876953125, -0.0203857421875, -0.02777099609375, -0.051727294921875, -0.005710601806640625, -0.026702880859375, 0.01299285888671875, -0.020233154296875, 0.0335693359375, -0.0035247802734375, 0.057037353515625, 0.042022705078125, -0.039093017578125, -0.01059722900390625, -0.00901031494140625, 0.051025390625, 0.02777099609375, -0.017822265625, 0.01374053955078125, -0.00572967529296875, -0.0860595703125, -0.0012826919555664062, 0.00913238525390625, -0.03009033203125, 0.004192352294921875, 0.03875732421875, 0.08380126953125, -0.020294189453125, -0.035980224609375, 0.03619384765625, -0.0020618438720703125, -0.022613525390625, 
-0.02764892578125, -0.0024051666259765625, -0.0247955322265625, 0.01119232177734375, 0.037628173828125, 0.0179290771484375, 0.0005927085876464844, -0.0369873046875, 0.0005793571472167969, 0.039093017578125, -0.051025390625, -0.02471923828125, 0.04937744140625, -0.00791168212890625, -0.04718017578125, 0.06243896484375, 0.0019292831420898438, -0.067626953125, 0.057525634765625, 0.049072265625, 0.054962158203125, -0.0186004638671875, 0.005329132080078125, 0.045135498046875, 0.037322998046875, -0.0027828216552734375, 0.02471923828125, -0.00838470458984375, -0.056976318359375, 0.02862548828125, -0.038787841796875, -0.02056884765625, 0.0002999305725097656, -0.05267333984375, 0.0384521484375, -0.045867919921875, -0.028961181640625, -0.0066070556640625, 0.0104522705078125, -0.050201416015625, 0.0281219482421875, 0.0009832382202148438, 0.06292724609375, -0.0372314453125, 0.056488037109375, 0.041473388671875, -0.033233642578125, -0.05352783203125, -0.01000213623046875, -0.0094146728515625, -0.0784912109375, 0.0433349609375, 0.025726318359375, -0.00905609130859375, 0.01369476318359375, -0.0445556640625, -0.05859375, 0.0943603515625, 0.016265869140625, -0.046417236328125, -0.0268707275390625, 0.03033447265625, 0.057586669921875, -0.030364990234375, 0.04638671875, 0.0253753662109375, 0.015167236328125, 0.03375244140625, -0.06085205078125, 0.00738525390625, -0.032196044921875, 0.0187835693359375, 0.0124359130859375, -0.0494384765625, 0.06353759765625, -0.03765869140625, -0.0209503173828125, 0.03680419921875, 0.049957275390625, 0.01849365234375, 0.0220489501953125, 0.0266876220703125, 0.052276611328125, 0.051025390625, -0.01171112060546875, 0.0611572265625, -0.032257080078125, 0.02813720703125, 0.058929443359375, 0.0008711814880371094, 0.05352783203125, 0.033294677734375, 0.0008149147033691406, 0.053436279296875, 0.0308380126953125, -0.036285400390625, 0.035552978515625, -0.00994110107421875, 0.002536773681640625, 0.0087127685546875, 0.0058746337890625, -0.0161285400390625, 
0.0218048095703125, 0.01168060302734375, -0.0567626953125, 0.00208282470703125, 0.0153961181640625, -0.01059722900390625, -0.01812744140625, -0.030975341796875, 0.05487060546875, 0.0003082752227783203, -0.046844482421875, 0.050933837890625, -0.0001608133316040039, 0.0697021484375, -0.050933837890625, -0.0027866363525390625, -0.00577545166015625, 0.0426025390625, -0.0112762451171875, -0.05584716796875, 0.01322174072265625, -0.003849029541015625, -0.0188140869140625, 0.0088958740234375, 0.05780029296875, -0.049102783203125, -0.0648193359375, 0.0190582275390625, -0.003993988037109375, 0.01605224609375, 0.023101806640625, -0.061737060546875, 0.0110321044921875, -0.0019664764404296875, -0.01313018798828125, -0.0034084320068359375, 0.03192138671875, -0.0012388229370117188, 0.0406494140625, 0.04412841796875, 0.00884246826171875, 0.01406097412109375, -0.016876220703125, 0.046722412109375, -0.043365478515625, -0.039947509765625, -0.055267333984375, 0.042633056640625, 0.005168914794921875, -0.045440673828125, 0.04034423828125, 0.0418701171875, 0.047515869140625, -0.0221405029296875, 0.0302886962890625, -0.01288604736328125, 0.021881103515625, -0.0239715576171875, 0.07611083984375, -0.048980712890625, -0.0131072998046875, -0.040496826171875, -0.05682373046875, -0.042999267578125, 0.0732421875, -0.01473236083984375, 0.0236663818359375, 0.047088623046875, 0.0830078125, -0.00799560546875, -0.02398681640625, 0.00815582275390625, 0.0133209228515625, 0.003986358642578125, 0.050933837890625, 0.032196044921875, -0.06500244140625, 0.05780029296875, -0.02783203125, -0.0094146728515625, -0.0164031982421875, -0.06597900390625, -0.0836181640625, -0.055145263671875, -0.030487060546875, -0.054168701171875, -0.007785797119140625, 0.045684814453125, 0.05450439453125, -0.06689453125, -0.01537322998046875, -0.01617431640625, -0.0003612041473388672, -0.01268768310546875, -0.016876220703125, 0.0411376953125, 0.0241241455078125, -0.058258056640625, -0.03631591796875, -0.009429931640625, 
0.0382080078125, 0.007808685302734375, -0.0180206298828125, -0.00994110107421875, -0.00565338134765625, 0.02734375, 0.042633056640625, -0.0418701171875, -0.00267791748046875, 0.014129638671875, -0.02227783203125, 0.038787841796875, 0.045623779296875, -0.048065185546875, 0.0310211181640625, 0.03558349609375, 0.0074615478515625, 0.055084228515625, -0.009429931640625, 0.00778961181640625, -0.028564453125, 0.0217437744140625, 0.0138397216796875, 0.037811279296875, 0.033966064453125, -0.0391845703125, 0.0308685302734375, 0.034332275390625, -0.03790283203125, -0.06610107421875, -0.0137176513671875, -0.10186767578125, 0.0158233642578125, 0.059051513671875, -0.006587982177734375, -0.037506103515625, 0.0189666748046875, -0.026580810546875, 0.034027099609375, -0.025634765625, 0.0419921875, 0.031890869140625, 0.007297515869140625, -0.05035400390625, 0.00786590576171875, 0.019805908203125, -0.005458831787109375, -0.04400634765625, -0.00827789306640625, 0.030853271484375, 0.028411865234375, 0.05023193359375, 0.0478515625, -0.0243988037109375, 0.0187225341796875, 0.00298309326171875, 0.05438232421875, -0.019317626953125, -0.02191162109375, -0.0343017578125, -0.0004169940948486328, -0.00998687744140625, -0.01153564453125 ] ]
madhurjindal/autonlp-Gibberish-Detector-492513457
2023-10-02T08:32:14.000Z
[ "transformers", "pytorch", "distilbert", "text-classification", "autonlp", "en", "dataset:madhurjindal/autonlp-data-Gibberish-Detector", "co2_eq_emissions", "endpoints_compatible", "has_space", "region:us" ]
text-classification
madhurjindal
null
null
madhurjindal/autonlp-Gibberish-Detector-492513457
25
289,872
transformers
2022-03-02T23:29:05
--- tags: [autonlp] language: en widget: - text: "I love Machine Learning!" datasets: - madhurjindal/autonlp-data-Gibberish-Detector co2_eq_emissions: 5.527544460835904 --- # Problem Description The ability to process and understand user input is crucial for various applications, such as chatbots or downstream tasks. However, a common challenge faced in such systems is the presence of gibberish or nonsensical input. To address this problem, we present a project focused on developing a gibberish detector for the English language. The primary goal of this project is to classify user input as either **gibberish** or **non-gibberish**, enabling more accurate and meaningful interactions with the system. We also aim to enhance the overall performance and user experience of chatbots and other systems that rely on user input. >## What is Gibberish? Gibberish refers to **nonsensical or meaningless language or text** that lacks coherence or any discernible meaning. It can be characterized by a combination of random words, nonsensical phrases, grammatical errors, or syntactical abnormalities that prevent the text from conveying a clear and understandable message. Gibberish can vary in intensity, ranging from simple noise with no meaningful words to sentences that may appear superficially correct but lack coherence or logical structure when examined closely. Detecting and identifying gibberish is essential in various contexts, such as **natural language processing**, **chatbot systems**, **spam filtering**, and **language-based security measures**, to ensure effective communication and accurate processing of user inputs. ## Label Description Thus, we break the problem down into 4 categories: 1. **Noise:** Gibberish at level 0, where even the individual constituents of the input phrase (words) hold no meaning independently. *For example: `dfdfer fgerfow2e0d qsqskdsd djksdnfkff swq.`* 2. 
**Word Salad:** Gibberish at level 1, where the words make sense independently, but the phrase as a whole conveys no meaning. *For example: `22 madhur old punjab pickle chennai`* 3. **Mild gibberish:** Gibberish at level 2, where part of the sentence has grammatical errors, word sense errors, or other syntactical abnormalities that prevent it from conveying a coherent meaning. *For example: `Madhur study in a teacher`* 4. **Clean:** This category represents a set of words that forms a complete and meaningful sentence on its own. *For example: `I love this website`* > **Tip:** To facilitate gibberish detection, you can combine the labels based on the desired level of detection. For instance, to detect gibberish at level 1, you can group Noise and Word Salad together as "Gibberish," and group Mild gibberish and Clean together as "NotGibberish." This approach allows for flexibility in detecting and categorizing different levels of gibberish based on specific requirements. 
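The level-1 grouping described in the tip above can be sketched as a small post-processing helper. This is a minimal sketch under the assumption that the model's label names match the four categories listed on this card (up to casing); the helper function is hypothetical, not part of the model itself.

```python
# Hypothetical helper: collapse the four fine-grained labels into a
# level-1 binary decision, as suggested in the tip above.
# Assumption: the model's label names match the card's categories
# ("noise", "word salad", "mild gibberish", "clean") up to casing.
LEVEL1_GIBBERISH = {"noise", "word salad"}

def to_binary(label: str) -> str:
    """Map a fine-grained label to "Gibberish" or "NotGibberish"."""
    return "Gibberish" if label.lower() in LEVEL1_GIBBERISH else "NotGibberish"

print(to_binary("Noise"))
print(to_binary("mild gibberish"))
```

Feeding this the pipeline output (e.g. `classifier(text)[0]["label"]`) yields a binary decision at level 1; adding `"mild gibberish"` to the set would move detection to level 2.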
# Model Trained Using AutoNLP - Problem type: Multi-class Classification - Model ID: 492513457 - CO2 Emissions (in grams): 5.527544460835904 ## Validation Metrics - Loss: 0.07609463483095169 - Accuracy: 0.9735624586913417 - Macro F1: 0.9736173135739408 - Micro F1: 0.9735624586913417 - Weighted F1: 0.9736173135739408 - Macro Precision: 0.9737771415197378 - Micro Precision: 0.9735624586913417 - Weighted Precision: 0.9737771415197378 - Macro Recall: 0.9735624586913417 - Micro Recall: 0.9735624586913417 - Weighted Recall: 0.9735624586913417 ## Usage You can use cURL to access this model: ```bash $ curl -X POST -H "Authorization: Bearer YOUR_API_KEY" -H "Content-Type: application/json" -d '{"inputs": "I love Machine Learning!"}' https://api-inference.huggingface.co/models/madhurjindal/autonlp-Gibberish-Detector-492513457 ``` Or the Python API: ```python import torch import torch.nn.functional as F from transformers import AutoModelForSequenceClassification, AutoTokenizer model = AutoModelForSequenceClassification.from_pretrained("madhurjindal/autonlp-Gibberish-Detector-492513457", use_auth_token=True) tokenizer = AutoTokenizer.from_pretrained("madhurjindal/autonlp-Gibberish-Detector-492513457", use_auth_token=True) inputs = tokenizer("I love Machine Learning!", return_tensors="pt") outputs = model(**inputs) probs = F.softmax(outputs.logits, dim=-1) predicted_index = torch.argmax(probs, dim=1).item() predicted_prob = probs[0][predicted_index].item() labels = model.config.id2label predicted_label = labels[predicted_index] for i, prob in enumerate(probs[0]): print(f"Class: {labels[i]}, Probability: {prob:.4f}") ``` Or a simpler solution with the Transformers pipeline: ```python from transformers import pipeline selected_model = "madhurjindal/autonlp-Gibberish-Detector-492513457" classifier = pipeline("text-classification", model=selected_model) classifier("I love Machine Learning!") ```
4,935
[ [ -0.0304412841796875, -0.0736083984375, 0.0002453327178955078, 0.01824951171875, -0.01142120361328125, 0.0073394775390625, -0.0026302337646484375, -0.04327392578125, 0.02764892578125, 0.0171661376953125, -0.016754150390625, -0.038726806640625, -0.06866455078125, 0.004718780517578125, -0.044403076171875, 0.0845947265625, 0.027862548828125, -0.0157318115234375, 0.01152801513671875, 0.0041656494140625, -0.055206298828125, -0.0489501953125, -0.044769287109375, -0.0088043212890625, 0.0258941650390625, 0.03643798828125, 0.05572509765625, 0.0239410400390625, 0.020904541015625, 0.032012939453125, -0.018951416015625, 0.0089111328125, -0.01708984375, 0.02423095703125, -0.00901031494140625, -0.02197265625, -0.02679443359375, -0.01092529296875, 0.0264892578125, 0.0289154052734375, -0.01751708984375, 0.0171661376953125, 0.0177459716796875, 0.027801513671875, -0.044525146484375, 0.044891357421875, -0.05804443359375, 0.0058746337890625, -0.0166168212890625, 0.005496978759765625, -0.02032470703125, 0.0017452239990234375, -0.01059722900390625, -0.035797119140625, 0.0135345458984375, 0.020294189453125, 0.08740234375, 0.01482391357421875, -0.01255035400390625, -0.049957275390625, -0.0157012939453125, 0.04071044921875, -0.06402587890625, 0.0217132568359375, 0.0357666015625, 0.008819580078125, -0.0222625732421875, -0.048736572265625, -0.063232421875, -0.014068603515625, -0.02447509765625, 0.026519775390625, -0.0200042724609375, 0.0013408660888671875, 0.01528167724609375, 0.033538818359375, -0.072021484375, 0.0023651123046875, -0.0215911865234375, -0.038330078125, 0.03594970703125, 0.00054168701171875, 0.04180908203125, -0.0380859375, -0.0025577545166015625, -0.0164947509765625, 0.01210784912109375, -0.0001226663589477539, 0.03009033203125, 0.0221405029296875, -0.01058197021484375, 0.031341552734375, -0.029876708984375, 0.049041748046875, 0.0133514404296875, -0.0126495361328125, 0.03594970703125, -0.01467132568359375, -0.0176239013671875, 0.00490570068359375, 0.07183837890625, 
0.0064544677734375, 0.0260467529296875, 0.01161956787109375, -0.0278472900390625, 0.0140228271484375, -0.0121917724609375, -0.04461669921875, -0.04486083984375, 0.01788330078125, -0.01532745361328125, -0.0294647216796875, -0.010284423828125, -0.056732177734375, -0.0123138427734375, 0.003978729248046875, 0.041473388671875, -0.0526123046875, -0.042083740234375, 0.02197265625, -0.02764892578125, 0.004940032958984375, 0.003490447998046875, -0.09295654296875, 0.00301361083984375, 0.045501708984375, 0.06890869140625, 0.005832672119140625, -0.019744873046875, -0.0195465087890625, 0.00905609130859375, -0.0043487548828125, 0.062286376953125, -0.033843994140625, -0.022613525390625, -0.017608642578125, 0.01068115234375, -0.0160064697265625, -0.02978515625, 0.033843994140625, -0.0220489501953125, 0.06512451171875, 0.0333251953125, -0.04833984375, -0.03643798828125, 0.0200347900390625, -0.0255126953125, 0.07232666015625, 0.0197601318359375, -0.06500244140625, 0.02655029296875, -0.054107666015625, -0.041656494140625, 0.027313232421875, 0.0051116943359375, -0.031982421875, -0.00299835205078125, 0.0102996826171875, 0.036163330078125, 0.01540374755859375, 0.0443115234375, -0.0250244140625, -0.032440185546875, 0.006710052490234375, -0.050933837890625, 0.06878662109375, 0.02423095703125, -0.04132080078125, -0.005733489990234375, -0.042694091796875, 0.0209808349609375, 0.020782470703125, -0.0321044921875, -0.00949859619140625, -0.0089569091796875, 0.0037860870361328125, 0.004680633544921875, 0.01873779296875, -0.0156707763671875, 0.008514404296875, -0.047027587890625, 0.042083740234375, 0.0484619140625, -0.01137542724609375, 0.0075836181640625, -0.016693115234375, 0.0264434814453125, 0.0176239013671875, 0.01540374755859375, -0.013671875, -0.06805419921875, -0.0712890625, -0.038543701171875, 0.025360107421875, 0.06292724609375, -0.042694091796875, 0.06890869140625, -0.01202392578125, -0.032073974609375, -0.03564453125, 0.002742767333984375, 0.032440185546875, 0.04510498046875, 
0.03985595703125, -0.02423095703125, -0.036285400390625, -0.06378173828125, -0.035552978515625, -0.042633056640625, -0.01446533203125, 0.001514434814453125, 0.045196533203125, -0.0282745361328125, 0.049285888671875, -0.03192138671875, -0.033172607421875, -0.0051727294921875, 0.0260162353515625, 0.04766845703125, 0.041778564453125, 0.03875732421875, -0.0457763671875, -0.048919677734375, -0.007232666015625, -0.053436279296875, -0.01593017578125, 0.0038700103759765625, 0.00013637542724609375, 0.042694091796875, 0.0167999267578125, -0.047607421875, 0.023773193359375, 0.025909423828125, -0.052886962890625, 0.035430908203125, -0.00864410400390625, 0.00502777099609375, -0.061920166015625, 0.005603790283203125, -0.01474761962890625, -0.01210784912109375, -0.053955078125, -0.01277923583984375, 0.0025920867919921875, 0.003997802734375, -0.0208892822265625, 0.04791259765625, -0.048797607421875, 0.02166748046875, -0.01253509521484375, 0.007373809814453125, 0.0023670196533203125, 0.05029296875, -0.0002307891845703125, 0.049407958984375, 0.050201416015625, -0.047271728515625, 0.012176513671875, 0.006744384765625, -0.023284912109375, 0.02813720703125, -0.033660888671875, -0.0015554428100585938, -0.01157379150390625, -0.003032684326171875, -0.1063232421875, -0.0210418701171875, 0.06634521484375, -0.06768798828125, 0.047149658203125, -0.01531982421875, -0.0443115234375, -0.043548583984375, -0.0080413818359375, -0.0002923011779785156, 0.049957275390625, -0.0275115966796875, 0.03668212890625, 0.0458984375, -0.01409912109375, -0.061431884765625, -0.059417724609375, 0.019805908203125, -0.0193023681640625, -0.04412841796875, -0.0102691650390625, -0.0008296966552734375, 0.00806427001953125, -0.0195770263671875, -0.017486572265625, -0.0056304931640625, 0.0230560302734375, 0.015960693359375, 0.0261077880859375, -0.0123138427734375, 0.0150604248046875, 0.003391265869140625, -0.00992584228515625, 0.010498046875, -0.0325927734375, 0.0562744140625, -0.038726806640625, -0.0139617919921875, 
-0.029083251953125, 0.018646240234375, 0.03961181640625, 0.01248931884765625, 0.036590576171875, 0.06396484375, -0.033966064453125, -0.0016956329345703125, -0.049285888671875, -0.01300811767578125, -0.043914794921875, 0.04522705078125, -0.025390625, -0.0435791015625, 0.01837158203125, 0.041717529296875, -0.010711669921875, 0.047454833984375, 0.03521728515625, -0.0150146484375, 0.0836181640625, 0.019927978515625, -0.0247650146484375, 0.035430908203125, -0.018341064453125, 0.0350341796875, -0.0216217041015625, -0.0168914794921875, -0.02435302734375, -0.02032470703125, -0.0312347412109375, -0.00604248046875, 0.0024700164794921875, 0.0184783935546875, -0.0170745849609375, 0.02874755859375, -0.081298828125, 0.03179931640625, 0.04296875, -0.0168609619140625, 0.0164642333984375, 0.0174560546875, -0.01055145263671875, 0.009185791015625, -0.0576171875, -0.045501708984375, 0.05560302734375, 0.03228759765625, 0.04461669921875, 0.006168365478515625, 0.036865234375, 0.03326416015625, 0.0303497314453125, -0.0654296875, 0.0556640625, -0.00838470458984375, -0.059844970703125, -0.0256195068359375, -0.0184783935546875, -0.07208251953125, 0.04083251953125, -0.01447296142578125, -0.06353759765625, -0.0006017684936523438, 0.025146484375, -0.005397796630859375, 0.0261688232421875, -0.0838623046875, 0.062255859375, -0.020965576171875, -0.0167999267578125, 0.0008130073547363281, -0.040924072265625, 0.037078857421875, 0.0105743408203125, 0.025146484375, -0.02081298828125, 0.00859832763671875, 0.057647705078125, -0.016265869140625, 0.06719970703125, -0.029388427734375, -0.0015659332275390625, 0.033538818359375, 0.00286102294921875, 0.02447509765625, 0.01119232177734375, 0.01331329345703125, 0.045867919921875, 0.01146697998046875, -0.018707275390625, -0.0222320556640625, 0.032012939453125, -0.051025390625, -0.062744140625, -0.064453125, -0.035736083984375, 0.0026454925537109375, 0.032196044921875, 0.0316162109375, -0.000461578369140625, 0.00672149658203125, -0.0033740997314453125, 
0.034515380859375, -0.00475311279296875, 0.037322998046875, 0.046630859375, -0.0293426513671875, -0.0318603515625, 0.08575439453125, 0.019805908203125, -0.0046234130859375, 0.015869140625, 0.0168609619140625, -0.042083740234375, -0.04388427734375, -0.0229339599609375, 0.01558685302734375, -0.058380126953125, -0.018157958984375, -0.080810546875, -0.0352783203125, -0.04840087890625, 0.0173797607421875, 0.01177978515625, -0.030517578125, -0.038543701171875, 0.00843048095703125, 0.027374267578125, 0.0247344970703125, -0.0142059326171875, 0.0213470458984375, -0.040130615234375, 0.0289154052734375, 0.019744873046875, 0.0142822265625, -0.00896453857421875, -0.0723876953125, 0.002178192138671875, 0.0208587646484375, -0.0279388427734375, -0.06646728515625, 0.059173583984375, 0.0321044921875, 0.039154052734375, 0.04345703125, -0.00415802001953125, 0.05096435546875, -0.00907135009765625, 0.058929443359375, 0.0213165283203125, -0.080078125, 0.044891357421875, -0.0050811767578125, 0.024169921875, 0.0189056396484375, 0.027069091796875, -0.063720703125, -0.06707763671875, -0.054168701171875, -0.0714111328125, 0.04595947265625, 0.01412200927734375, 0.0070343017578125, -0.0267181396484375, 0.0028133392333984375, 0.003875732421875, 0.01396942138671875, -0.062469482421875, -0.00926971435546875, -0.045928955078125, -0.03466796875, 0.0003027915954589844, -0.0195465087890625, 0.0124969482421875, -0.0338134765625, 0.06939697265625, -0.01334381103515625, 0.032958984375, 0.00859832763671875, -0.031829833984375, 0.032440185546875, 0.02294921875, 0.037384033203125, 0.007183074951171875, -0.006702423095703125, 0.0178375244140625, 0.021728515625, -0.06439208984375, 0.0123291015625, -0.007610321044921875, -0.0126800537109375, 0.023193359375, 0.0217132568359375, 0.048583984375, -0.0027980804443359375, -0.0452880859375, 0.040740966796875, -0.01141357421875, -0.0244293212890625, -0.0396728515625, 0.038970947265625, 0.0107269287109375, 0.004970550537109375, 0.0261688232421875, 0.00977325439453125, 
0.021148681640625, -0.031402587890625, 0.01061248779296875, 0.040740966796875, -0.0164947509765625, -0.0193023681640625, 0.056060791015625, -0.00403594970703125, -0.038421630859375, 0.05535888671875, -0.033843994140625, -0.044830322265625, 0.05462646484375, 0.050201416015625, 0.0606689453125, -0.007610321044921875, 0.0079803466796875, 0.03790283203125, 0.0247344970703125, -0.0040435791015625, 0.030548095703125, -0.004169464111328125, -0.05975341796875, 0.0011739730834960938, -0.0477294921875, -0.015869140625, 0.0296478271484375, -0.03082275390625, 0.03271484375, -0.048095703125, -0.02099609375, 0.010772705078125, -0.01157379150390625, -0.042022705078125, 0.0136871337890625, 0.035125732421875, 0.0489501953125, -0.054962158203125, 0.056488037109375, 0.043792724609375, -0.034515380859375, -0.042694091796875, -0.00498199462890625, 0.0022106170654296875, -0.0858154296875, 0.040863037109375, 0.04534912109375, -0.0090789794921875, -0.0029125213623046875, -0.04803466796875, -0.0712890625, 0.0625, 0.010955810546875, -0.047119140625, -0.00481414794921875, 0.0120086669921875, 0.041748046875, -0.003246307373046875, 0.0557861328125, 0.037567138671875, 0.055389404296875, -0.0030956268310546875, -0.060150146484375, -0.001003265380859375, -0.0384521484375, -0.01312255859375, 0.0024929046630859375, -0.051910400390625, 0.060028076171875, -0.0007429122924804688, -0.05316162109375, -0.0102081298828125, 0.062255859375, 0.006389617919921875, 0.024383544921875, 0.05535888671875, 0.0311737060546875, 0.08489990234375, -0.020538330078125, 0.034027099609375, -0.044952392578125, 0.043426513671875, 0.0745849609375, 0.0081634521484375, 0.049468994140625, 0.0299224853515625, -0.0096588134765625, 0.034210205078125, 0.0540771484375, -0.0259246826171875, 0.03265380859375, 0.0302734375, -0.01702880859375, -0.0024585723876953125, 0.01165771484375, -0.037017822265625, 0.031463623046875, 0.050201416015625, -0.00861358642578125, -0.002559661865234375, 0.00922393798828125, 0.018157958984375, 
-0.0181884765625, -0.011077880859375, 0.056060791015625, -0.0141143798828125, -0.047882080078125, 0.0849609375, -0.01209259033203125, 0.0966796875, -0.04656982421875, 0.00965118408203125, -0.0206298828125, 0.0199432373046875, -0.034393310546875, -0.07171630859375, 0.0282440185546875, -0.0096282958984375, 0.00003212690353393555, -0.0009741783142089844, 0.07232666015625, -0.04327392578125, -0.0282440185546875, 0.0293426513671875, 0.027313232421875, 0.0380859375, -0.01316070556640625, -0.064453125, -0.01806640625, -0.0012760162353515625, -0.026611328125, 0.00875091552734375, 0.03619384765625, 0.00841522216796875, 0.06036376953125, 0.044830322265625, -0.00695037841796875, 0.00829315185546875, -0.01983642578125, 0.053436279296875, -0.051544189453125, -0.055389404296875, -0.0653076171875, 0.038330078125, -0.01509857177734375, -0.0248870849609375, 0.05242919921875, 0.044219970703125, 0.061279296875, -0.0222015380859375, 0.06561279296875, -0.020721435546875, 0.028656005859375, -0.0245208740234375, 0.07391357421875, -0.03594970703125, 0.01016998291015625, -0.01522064208984375, -0.045867919921875, -0.0249786376953125, 0.06781005859375, -0.005306243896484375, -0.007595062255859375, 0.04931640625, 0.0736083984375, -0.00634765625, 0.01016998291015625, 0.0308685302734375, 0.0275115966796875, 0.019805908203125, 0.04608154296875, 0.0718994140625, -0.0404052734375, 0.054229736328125, -0.0003139972686767578, -0.018707275390625, -0.027496337890625, -0.045501708984375, -0.07244873046875, -0.031982421875, -0.041717529296875, -0.03997802734375, 0.0115966796875, 0.039306640625, 0.033111572265625, -0.082763671875, -0.014007568359375, -0.005950927734375, 0.0211944580078125, -0.0198211669921875, -0.027801513671875, -0.00045943260192871094, -0.0443115234375, -0.061309814453125, 0.021087646484375, 0.0137481689453125, 0.0259857177734375, -0.01079559326171875, 0.0083465576171875, -0.023834228515625, -0.000579833984375, 0.039703369140625, 0.028961181640625, -0.051239013671875, -0.04718017578125, 
-0.0014486312866210938, -0.01558685302734375, 0.01190948486328125, 0.002834320068359375, -0.053558349609375, 0.014312744140625, 0.04327392578125, 0.0163726806640625, 0.035247802734375, -0.0219573974609375, 0.005855560302734375, -0.03485107421875, 0.0181427001953125, 0.021087646484375, 0.03564453125, 0.01415252685546875, -0.036041259765625, 0.042022705078125, 0.043548583984375, -0.045166015625, -0.0634765625, -0.0125274658203125, -0.054931640625, -0.03314208984375, 0.0789794921875, -0.00514984130859375, -0.02398681640625, -0.045013427734375, -0.02520751953125, 0.0172271728515625, -0.041412353515625, 0.07171630859375, 0.06646728515625, 0.0004153251647949219, 0.0125274658203125, -0.0305023193359375, 0.021575927734375, 0.051971435546875, -0.04083251953125, -0.01100921630859375, 0.034881591796875, 0.038604736328125, 0.037017822265625, 0.046234130859375, 0.007480621337890625, 0.0006561279296875, 0.004230499267578125, 0.0189361572265625, -0.01052093505859375, -0.017486572265625, -0.029541015625, 0.0239410400390625, -0.02734375, -0.03411865234375 ] ]
cross-encoder/nli-deberta-base
2021-08-05T08:40:53.000Z
[ "transformers", "pytorch", "deberta", "text-classification", "deberta-base-base", "zero-shot-classification", "en", "dataset:multi_nli", "dataset:snli", "license:apache-2.0", "endpoints_compatible", "region:us" ]
zero-shot-classification
cross-encoder
null
null
cross-encoder/nli-deberta-base
13
289,216
transformers
2022-03-02T23:29:05
--- language: en pipeline_tag: zero-shot-classification tags: - deberta-base-base datasets: - multi_nli - snli metrics: - accuracy license: apache-2.0 --- # Cross-Encoder for Natural Language Inference This model was trained using the [SentenceTransformers](https://sbert.net) [Cross-Encoder](https://www.sbert.net/examples/applications/cross-encoder/README.html) class. ## Training Data The model was trained on the [SNLI](https://nlp.stanford.edu/projects/snli/) and [MultiNLI](https://cims.nyu.edu/~sbowman/multinli/) datasets. For a given sentence pair, it outputs three scores corresponding to the labels: contradiction, entailment, and neutral. ## Performance For evaluation results, see [SBERT.net - Pretrained Cross-Encoder](https://www.sbert.net/docs/pretrained_cross-encoders.html#nli). ## Usage Pre-trained models can be used like this: ```python from sentence_transformers import CrossEncoder model = CrossEncoder('cross-encoder/nli-deberta-base') scores = model.predict([('A man is eating pizza', 'A man eats something'), ('A black race car starts up in front of a crowd of people.', 'A man is driving down a lonely road.')]) # Convert scores to labels label_mapping = ['contradiction', 'entailment', 'neutral'] labels = [label_mapping[score_max] for score_max in scores.argmax(axis=1)] ``` ## Usage with Transformers AutoModel You can also use the model directly with the Transformers library (without the SentenceTransformers library): ```python from transformers import AutoTokenizer, AutoModelForSequenceClassification import torch model = AutoModelForSequenceClassification.from_pretrained('cross-encoder/nli-deberta-base') tokenizer = AutoTokenizer.from_pretrained('cross-encoder/nli-deberta-base') features = tokenizer(['A man is eating pizza', 'A black race car starts up in front of a crowd of people.'], ['A man eats something', 'A man is driving down a lonely road.'], padding=True, truncation=True, return_tensors="pt") model.eval() with torch.no_grad(): scores = 
model(**features).logits label_mapping = ['contradiction', 'entailment', 'neutral'] labels = [label_mapping[score_max] for score_max in scores.argmax(dim=1)] print(labels) ``` ## Zero-Shot Classification This model can also be used for zero-shot classification: ```python from transformers import pipeline classifier = pipeline("zero-shot-classification", model='cross-encoder/nli-deberta-base') sent = "Apple just announced the newest iPhone X" candidate_labels = ["technology", "sports", "politics"] res = classifier(sent, candidate_labels) print(res) ```
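To see how the label mapping in the snippets above behaves without downloading the model, here is a minimal pure-Python sketch of the same softmax-plus-argmax post-processing (the logit values are illustrative placeholders, not real model outputs):

```python
import math

# Illustrative logits for two sentence pairs; columns follow the
# card's label order: contradiction, entailment, neutral.
label_mapping = ['contradiction', 'entailment', 'neutral']
logits = [[-1.2, 3.4, 0.1],   # pair 1
          [2.8, -0.5, 0.3]]   # pair 2

def softmax(row):
    """Normalize one row of logits into probabilities."""
    exps = [math.exp(x) for x in row]
    total = sum(exps)
    return [e / total for e in exps]

probs = [softmax(row) for row in logits]
labels = [label_mapping[max(range(3), key=row.__getitem__)] for row in logits]
print(labels)  # ['entailment', 'contradiction']
```

The same argmax logic applies to the real `scores` tensor; softmax is only needed when probabilities, rather than the winning label, are of interest.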
2,564
stabilityai/stable-diffusion-2-base
2023-07-05T16:19:03.000Z
[ "diffusers", "stable-diffusion", "text-to-image", "arxiv:2112.10752", "arxiv:2202.00512", "arxiv:1910.09700", "license:openrail++", "endpoints_compatible", "has_space", "diffusers:StableDiffusionPipeline", "region:us" ]
text-to-image
stabilityai
null
null
stabilityai/stable-diffusion-2-base
299
287,950
diffusers
2022-11-23T17:41:31
---
license: openrail++
tags:
- stable-diffusion
- text-to-image
---

# Stable Diffusion v2-base Model Card

This model card focuses on the model associated with the Stable Diffusion v2-base model, available [here](https://github.com/Stability-AI/stablediffusion).

The model is trained from scratch for 550k steps at resolution `256x256` on a subset of [LAION-5B](https://laion.ai/blog/laion-5b/) filtered for explicit pornographic material, using the [LAION-NSFW classifier](https://github.com/LAION-AI/CLIP-based-NSFW-Detector) with `punsafe=0.1` and an [aesthetic score](https://github.com/christophschuhmann/improved-aesthetic-predictor) >= `4.5`. It is then further trained for 850k steps at resolution `512x512` on the same dataset, on images with resolution `>= 512x512`.

![image](https://github.com/Stability-AI/stablediffusion/blob/main/assets/stable-samples/txt2img/merged-0003.png?raw=true)

- Use it with the [`stablediffusion`](https://github.com/Stability-AI/stablediffusion) repository: download the `512-base-ema.ckpt` checkpoint [here](https://huggingface.co/stabilityai/stable-diffusion-2-base/resolve/main/512-base-ema.ckpt).
- Use it with 🧨 [`diffusers`](https://huggingface.co/stabilityai/stable-diffusion-2-base#examples)

## Model Details

- **Developed by:** Robin Rombach, Patrick Esser
- **Model type:** Diffusion-based text-to-image generation model
- **Language(s):** English
- **License:** [CreativeML Open RAIL++-M License](https://huggingface.co/stabilityai/stable-diffusion-2/blob/main/LICENSE-MODEL)
- **Model Description:** This is a model that can be used to generate and modify images based on text prompts. It is a [Latent Diffusion Model](https://arxiv.org/abs/2112.10752) that uses a fixed, pretrained text encoder ([OpenCLIP-ViT/H](https://github.com/mlfoundations/open_clip)).
- **Resources for more information:** [GitHub Repository](https://github.com/Stability-AI/)
- **Cite as:**

  ```bibtex
  @InProceedings{Rombach_2022_CVPR,
      author    = {Rombach, Robin and Blattmann, Andreas and Lorenz, Dominik and Esser, Patrick and Ommer, Bj\"orn},
      title     = {High-Resolution Image Synthesis With Latent Diffusion Models},
      booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
      month     = {June},
      year      = {2022},
      pages     = {10684-10695}
  }
  ```

## Examples

Use the [🤗 Diffusers library](https://github.com/huggingface/diffusers) to run Stable Diffusion 2 in a simple and efficient manner:

```bash
pip install diffusers transformers accelerate scipy safetensors
```

Running the pipeline (if you don't swap the scheduler, it will run with the default PNDM/PLMS scheduler; in this example we swap it to EulerDiscreteScheduler):

```python
from diffusers import StableDiffusionPipeline, EulerDiscreteScheduler
import torch

model_id = "stabilityai/stable-diffusion-2-base"

# Use the Euler scheduler here instead
scheduler = EulerDiscreteScheduler.from_pretrained(model_id, subfolder="scheduler")
pipe = StableDiffusionPipeline.from_pretrained(model_id, scheduler=scheduler, torch_dtype=torch.float16)
pipe = pipe.to("cuda")

prompt = "a photo of an astronaut riding a horse on mars"
image = pipe(prompt).images[0]
image.save("astronaut_rides_horse.png")
```

**Notes**:
- Despite not being a dependency, we highly recommend installing [xformers](https://github.com/facebookresearch/xformers) for memory-efficient attention (better performance).
- If you have low GPU RAM available, add `pipe.enable_attention_slicing()` after sending the pipeline to `cuda` for lower VRAM usage (at the cost of speed).

# Uses

## Direct Use

The model is intended for research purposes only. Possible research areas and tasks include:

- Safe deployment of models which have the potential to generate harmful content.
- Probing and understanding the limitations and biases of generative models.
- Generation of artworks and use in design and other artistic processes.
- Applications in educational or creative tools.
- Research on generative models.

Excluded uses are described below.

### Misuse, Malicious Use, and Out-of-Scope Use

_Note: This section was originally taken from the [DALLE-MINI model card](https://huggingface.co/dalle-mini/dalle-mini) and was used for Stable Diffusion v1, but it applies in the same way to Stable Diffusion v2._

The model should not be used to intentionally create or disseminate images that create hostile or alienating environments for people. This includes generating images that people would foreseeably find disturbing, distressing, or offensive, or content that propagates historical or current stereotypes.

#### Out-of-Scope Use

The model was not trained to produce factual or true representations of people or events, so using the model to generate such content is out of scope for its abilities.

#### Misuse and Malicious Use

Using the model to generate content that is cruel to individuals is a misuse of this model. This includes, but is not limited to:

- Generating demeaning, dehumanizing, or otherwise harmful representations of people or their environments, cultures, religions, etc.
- Intentionally promoting or propagating discriminatory content or harmful stereotypes.
- Impersonating individuals without their consent.
- Sexual content without the consent of the people who might see it.
- Mis- and disinformation.
- Representations of egregious violence and gore.
- Sharing of copyrighted or licensed material in violation of its terms of use.
- Sharing content that is an alteration of copyrighted or licensed material in violation of its terms of use.
## Limitations and Bias

### Limitations

- The model does not achieve perfect photorealism.
- The model cannot render legible text.
- The model does not perform well on more difficult tasks which involve compositionality, such as rendering an image corresponding to "A red cube on top of a blue sphere".
- Faces and people in general may not be generated properly.
- The model was trained mainly with English captions and will not work as well in other languages.
- The autoencoding part of the model is lossy.
- The model was trained on a subset of the large-scale dataset [LAION-5B](https://laion.ai/blog/laion-5b/), which contains adult, violent, and sexual content. To partially mitigate this, we have filtered the dataset using LAION's NSFW detector (see the Training section).

### Bias

While the capabilities of image generation models are impressive, they can also reinforce or exacerbate social biases. Stable Diffusion v2 was primarily trained on subsets of [LAION-2B(en)](https://laion.ai/blog/laion-5b/), which consists of images limited to English descriptions. Texts and images from communities and cultures that use other languages are likely to be insufficiently accounted for. This affects the overall output of the model, as white and western cultures are often set as the default. Further, the model's ability to generate content with non-English prompts is significantly worse than with English-language prompts. Stable Diffusion v2 mirrors and exacerbates biases to such a degree that viewer discretion must be advised, irrespective of the input or its intent.

## Training

**Training Data**

The model developers used the following dataset for training the model:

- LAION-5B and subsets (details below). The training data is further filtered using LAION's NSFW detector, with a "p_unsafe" score of 0.1 (conservative). For more details, please refer to LAION-5B's [NeurIPS 2022](https://openreview.net/forum?id=M3Y74vmsMcY) paper and the reviewer discussions on the topic.
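The filtering criteria described above (NSFW probability below the `punsafe` threshold, aesthetic score at or above 4.5) amount to a simple predicate over per-image metadata. A hypothetical sketch of that filter — not the actual LAION tooling, and the sample records below are invented for illustration:

```python
# Keep only images whose NSFW probability ("punsafe") is below 0.1 and
# whose aesthetic score is at least 4.5, mirroring the card's criteria.
PUNSAFE_MAX = 0.1
AESTHETIC_MIN = 4.5

samples = [  # invented example metadata records
    {"url": "a.jpg", "punsafe": 0.02, "aesthetic": 5.1},
    {"url": "b.jpg", "punsafe": 0.40, "aesthetic": 6.0},  # too unsafe
    {"url": "c.jpg", "punsafe": 0.05, "aesthetic": 3.9},  # too low aesthetic
]

kept = [s for s in samples
        if s["punsafe"] < PUNSAFE_MAX and s["aesthetic"] >= AESTHETIC_MIN]
print([s["url"] for s in kept])  # -> ['a.jpg']
```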
**Training Procedure**

Stable Diffusion v2 is a latent diffusion model which combines an autoencoder with a diffusion model that is trained in the latent space of the autoencoder. During training:

- Images are encoded through an encoder, which turns images into latent representations. The autoencoder uses a relative downsampling factor of 8 and maps images of shape H x W x 3 to latents of shape H/f x W/f x 4.
- Text prompts are encoded through the OpenCLIP-ViT/H text encoder.
- The output of the text encoder is fed into the UNet backbone of the latent diffusion model via cross-attention.
- The loss is a reconstruction objective between the noise that was added to the latent and the prediction made by the UNet. We also use the so-called _v-objective_; see https://arxiv.org/abs/2202.00512.

We currently provide the following checkpoints:

- `512-base-ema.ckpt`: 550k steps at resolution `256x256` on a subset of [LAION-5B](https://laion.ai/blog/laion-5b/) filtered for explicit pornographic material, using the [LAION-NSFW classifier](https://github.com/LAION-AI/CLIP-based-NSFW-Detector) with `punsafe=0.1` and an [aesthetic score](https://github.com/christophschuhmann/improved-aesthetic-predictor) >= `4.5`, followed by 850k steps at resolution `512x512` on the same dataset with resolution `>= 512x512`.
- `768-v-ema.ckpt`: Resumed from `512-base-ema.ckpt` and trained for 150k steps using a [v-objective](https://arxiv.org/abs/2202.00512) on the same dataset. Resumed for another 140k steps on a `768x768` subset of our dataset.
- `512-depth-ema.ckpt`: Resumed from `512-base-ema.ckpt` and finetuned for 200k steps. Adds an extra input channel to process the (relative) depth prediction produced by [MiDaS](https://github.com/isl-org/MiDaS) (`dpt_hybrid`), which is used as additional conditioning. The additional input channels of the U-Net which process this extra information were zero-initialized.
- `512-inpainting-ema.ckpt`: Resumed from `512-base-ema.ckpt` and trained for another 200k steps. Follows the mask-generation strategy presented in [LAMA](https://github.com/saic-mdal/lama) which, in combination with the latent VAE representations of the masked image, is used as additional conditioning. The additional input channels of the U-Net which process this extra information were zero-initialized. The same strategy was used to train the [1.5-inpainting checkpoint](https://github.com/saic-mdal/lama).
- `x4-upscaling-ema.ckpt`: Trained for 1.25M steps on a 10M subset of LAION containing images `>2048x2048`. The model was trained on crops of size `512x512` and is a text-guided [latent upscaling diffusion model](https://arxiv.org/abs/2112.10752). In addition to the textual input, it receives a `noise_level` as an input parameter, which can be used to add noise to the low-resolution input according to a [predefined diffusion schedule](configs/stable-diffusion/x4-upscaling.yaml).

- **Hardware:** 32 x 8 x A100 GPUs
- **Optimizer:** AdamW
- **Gradient Accumulations:** 1
- **Batch:** 32 x 8 x 2 x 4 = 2048
- **Learning rate:** warmup to 0.0001 for 10,000 steps, then kept constant

## Evaluation Results

Evaluations with different classifier-free guidance scales (1.5, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0) and 50 DDIM sampling steps show the relative improvements of the checkpoints:

![pareto](model-variants.jpg)

Evaluated using 50 DDIM steps and 10000 random prompts from the COCO2017 validation set, at 512x512 resolution. Not optimized for FID scores.

## Environmental Impact

**Stable Diffusion v1** **Estimated Emissions**

Based on that information, we estimate the following CO2 emissions using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). The hardware, runtime, cloud provider, and compute region were utilized to estimate the carbon impact.
- **Hardware Type:** A100 PCIe 40GB
- **Hours used:** 200000
- **Cloud Provider:** AWS
- **Compute Region:** US-east
- **Carbon Emitted (Power consumption x Time x Carbon produced based on location of power grid):** 15000 kg CO2 eq.

## Citation

```bibtex
@InProceedings{Rombach_2022_CVPR,
    author    = {Rombach, Robin and Blattmann, Andreas and Lorenz, Dominik and Esser, Patrick and Ommer, Bj\"orn},
    title     = {High-Resolution Image Synthesis With Latent Diffusion Models},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2022},
    pages     = {10684-10695}
}
```

*This model card was written by Robin Rombach, Patrick Esser, and David Ha, and is based on the [Stable Diffusion v1](https://github.com/CompVis/stable-diffusion/blob/main/Stable_Diffusion_v1_Model_Card.md) and [DALL-E Mini model card](https://huggingface.co/dalle-mini/dalle-mini).*
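The emissions figure in the Environmental Impact section is consistent with a simple back-of-the-envelope estimate (a sketch only; the per-GPU power draw and grid carbon intensity below are assumptions, not values taken from the calculator):

```python
# Back-of-the-envelope check of the reported emissions estimate.
# Assumptions (not from the card): each A100 draws ~250 W on average,
# and the grid carbon intensity is ~0.3 kg CO2 eq. per kWh.
gpu_hours = 200_000      # reported A100 hours
avg_power_kw = 0.25      # assumed average draw per GPU, in kW
grid_intensity = 0.3     # assumed kg CO2 eq. per kWh

energy_kwh = gpu_hours * avg_power_kw
co2_kg = energy_kwh * grid_intensity
print(f"{co2_kg:.0f} kg CO2 eq.")  # -> 15000 kg CO2 eq.
```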
12,566
[ [ -0.0284881591796875, -0.060760498046875, 0.0217742919921875, 0.0096588134765625, -0.0203857421875, -0.0279998779296875, 0.005985260009765625, -0.034027099609375, -0.01198577880859375, 0.032440185546875, -0.02655029296875, -0.031463623046875, -0.05621337890625, -0.011993408203125, -0.0310211181640625, 0.07012939453125, -0.0071258544921875, 0.00144195556640625, -0.01500701904296875, -0.003726959228515625, -0.0240325927734375, -0.01045989990234375, -0.0731201171875, -0.0208282470703125, 0.03125, 0.005870819091796875, 0.054901123046875, 0.044036865234375, 0.03509521484375, 0.0226287841796875, -0.02117919921875, 0.00024437904357910156, -0.054718017578125, -0.0004153251647949219, -0.002918243408203125, -0.02044677734375, -0.0377197265625, 0.01033782958984375, 0.043121337890625, 0.01898193359375, -0.00632476806640625, 0.0026302337646484375, 0.00008666515350341797, 0.046142578125, -0.041473388671875, -0.0091400146484375, -0.0243072509765625, 0.01171112060546875, -0.01209259033203125, 0.017822265625, -0.03265380859375, -0.01059722900390625, 0.00994110107421875, -0.054595947265625, 0.0328369140625, -0.02490234375, 0.084716796875, 0.030487060546875, -0.028076171875, -0.003940582275390625, -0.056671142578125, 0.0435791015625, -0.042236328125, 0.0236663818359375, 0.02618408203125, 0.00537109375, -0.0005855560302734375, -0.07781982421875, -0.04168701171875, -0.00736236572265625, 0.0020904541015625, 0.03515625, -0.031890869140625, -0.006229400634765625, 0.034088134765625, 0.01035308837890625, -0.04925537109375, -0.003307342529296875, -0.045562744140625, -0.00408172607421875, 0.046905517578125, 0.005908966064453125, 0.0224456787109375, -0.01332855224609375, -0.03070068359375, -0.0026702880859375, -0.039398193359375, 0.001811981201171875, 0.0292510986328125, -0.028656005859375, -0.0306549072265625, 0.032623291015625, 0.01026153564453125, 0.033660888671875, 0.0232391357421875, -0.00878143310546875, 0.027984619140625, -0.015411376953125, -0.016510009765625, -0.034637451171875, 
0.06329345703125, 0.050933837890625, -0.0017261505126953125, 0.0101165771484375, -0.00664520263671875, 0.0196685791015625, 0.0076751708984375, -0.09307861328125, -0.04022216796875, 0.011932373046875, -0.051788330078125, -0.0443115234375, -0.010223388671875, -0.07073974609375, -0.01617431640625, 0.01271820068359375, 0.032562255859375, -0.0265045166015625, -0.037689208984375, -0.004161834716796875, -0.02777099609375, 0.0132293701171875, 0.035614013671875, -0.052642822265625, 0.00872039794921875, 0.0032939910888671875, 0.0838623046875, -0.0267791748046875, -0.0020751953125, -0.01103973388671875, 0.01068878173828125, -0.023590087890625, 0.052276611328125, -0.02166748046875, -0.04119873046875, -0.0178375244140625, 0.0291748046875, 0.01085662841796875, -0.039581298828125, 0.047943115234375, -0.0369873046875, 0.0253753662109375, -0.0022258758544921875, -0.030670166015625, -0.0164794921875, -0.006061553955078125, -0.055633544921875, 0.08367919921875, 0.0180511474609375, -0.06976318359375, 0.01123809814453125, -0.051788330078125, -0.018157958984375, -0.00623321533203125, 0.003887176513671875, -0.053375244140625, -0.0104522705078125, 0.0012407302856445312, 0.032012939453125, -0.00759124755859375, 0.021331787109375, -0.0185546875, -0.020843505859375, -0.0022563934326171875, -0.046844482421875, 0.0762939453125, 0.0264129638671875, -0.036041259765625, 0.0077972412109375, -0.04571533203125, -0.026214599609375, 0.037872314453125, -0.01494598388671875, -0.01076507568359375, -0.01061248779296875, 0.025146484375, 0.027313232421875, 0.0063629150390625, -0.03155517578125, -0.0059661865234375, -0.0218963623046875, 0.0426025390625, 0.059539794921875, 0.01038360595703125, 0.051727294921875, -0.033660888671875, 0.03778076171875, 0.0237579345703125, 0.0223388671875, -0.00907135009765625, -0.0660400390625, -0.054595947265625, -0.0145721435546875, 0.0116424560546875, 0.040130615234375, -0.055999755859375, 0.0177001953125, 0.005748748779296875, -0.05340576171875, -0.013885498046875, 
-0.001911163330078125, 0.0201416015625, 0.0517578125, 0.0210418701171875, -0.0309295654296875, -0.0262603759765625, -0.053863525390625, 0.0271759033203125, -0.007045745849609375, 0.01136016845703125, 0.0185394287109375, 0.051055908203125, -0.029266357421875, 0.039825439453125, -0.04119873046875, -0.0200042724609375, 0.00496673583984375, 0.00968170166015625, 0.00261688232421875, 0.05206298828125, 0.062103271484375, -0.080078125, -0.05084228515625, -0.018585205078125, -0.06427001953125, 0.0012559890747070312, 0.00632476806640625, -0.0274810791015625, 0.036346435546875, 0.03765869140625, -0.059173583984375, 0.04534912109375, 0.047882080078125, -0.0278167724609375, 0.03515625, -0.02880859375, -0.0003542900085449219, -0.0789794921875, 0.0090484619140625, 0.0260467529296875, -0.026580810546875, -0.04632568359375, 0.001483917236328125, -0.003154754638671875, -0.0152587890625, -0.051513671875, 0.058380126953125, -0.029022216796875, 0.0330810546875, -0.031097412109375, -0.005039215087890625, 0.0119781494140625, 0.021148681640625, 0.02740478515625, 0.049102783203125, 0.06268310546875, -0.04254150390625, 0.01239013671875, 0.0150909423828125, -0.0076141357421875, 0.03802490234375, -0.06512451171875, 0.015716552734375, -0.03448486328125, 0.022705078125, -0.076171875, -0.019378662109375, 0.044769287109375, -0.033477783203125, 0.0295867919921875, -0.0199127197265625, -0.025543212890625, -0.033721923828125, -0.0174713134765625, 0.037322998046875, 0.07855224609375, -0.030487060546875, 0.041168212890625, 0.034942626953125, 0.0139007568359375, -0.037567138671875, -0.053314208984375, -0.01418304443359375, -0.0287017822265625, -0.06689453125, 0.04461669921875, -0.0171051025390625, -0.01137542724609375, 0.015350341796875, 0.0104522705078125, 0.0007696151733398438, -0.003093719482421875, 0.037384033203125, 0.021942138671875, 0.0021648406982421875, -0.01187896728515625, 0.0126495361328125, -0.0170135498046875, -0.0005354881286621094, -0.00850677490234375, 0.032257080078125, 
0.007205963134765625, -0.006267547607421875, -0.052703857421875, 0.0301055908203125, 0.04052734375, 0.0008716583251953125, 0.059722900390625, 0.082763671875, -0.044952392578125, -0.00251007080078125, -0.02850341796875, -0.016876220703125, -0.038055419921875, 0.03369140625, -0.0093994140625, -0.04534912109375, 0.043853759765625, 0.00217437744140625, 0.0011377334594726562, 0.05224609375, 0.06536865234375, -0.0176544189453125, 0.08905029296875, 0.04949951171875, 0.02593994140625, 0.053314208984375, -0.060516357421875, -0.005764007568359375, -0.06121826171875, -0.0252838134765625, -0.0131072998046875, -0.0166778564453125, -0.03521728515625, -0.051910400390625, 0.0244293212890625, 0.0136871337890625, -0.0163116455078125, 0.01123809814453125, -0.045928955078125, 0.0263671875, 0.0240478515625, 0.01506805419921875, -0.00218963623046875, 0.012237548828125, 0.00972747802734375, -0.013397216796875, -0.056671142578125, -0.048858642578125, 0.078125, 0.040679931640625, 0.0673828125, 0.0047607421875, 0.0352783203125, 0.03350830078125, 0.034027099609375, -0.03424072265625, 0.03802490234375, -0.03265380859375, -0.05133056640625, -0.0060577392578125, -0.0170135498046875, -0.0692138671875, 0.0138702392578125, -0.016357421875, -0.03570556640625, 0.03765869140625, 0.014495849609375, -0.018402099609375, 0.026336669921875, -0.05792236328125, 0.074462890625, -0.0096282958984375, -0.0579833984375, -0.005619049072265625, -0.0498046875, 0.0279083251953125, 0.0017538070678710938, 0.01239776611328125, -0.0114288330078125, -0.003986358642578125, 0.06884765625, -0.0237884521484375, 0.0701904296875, -0.0307464599609375, 0.00598907470703125, 0.029022216796875, -0.00787353515625, 0.02777099609375, 0.018890380859375, -0.0090789794921875, 0.029296875, 0.0030059814453125, -0.0286712646484375, -0.0228271484375, 0.05877685546875, -0.08026123046875, -0.034454345703125, -0.0379638671875, -0.0284881591796875, 0.038360595703125, 0.01367950439453125, 0.0643310546875, 0.0258636474609375, -0.01454925537109375, 
0.0001513957977294922, 0.063232421875, -0.01776123046875, 0.0325927734375, 0.01380157470703125, -0.0202484130859375, -0.040008544921875, 0.057464599609375, 0.01617431640625, 0.03619384765625, -0.00881195068359375, 0.01171112060546875, -0.01812744140625, -0.04150390625, -0.044647216796875, 0.0218963623046875, -0.0648193359375, -0.015594482421875, -0.062255859375, -0.0284423828125, -0.033538818359375, -0.01239013671875, -0.02374267578125, -0.0166168212890625, -0.06689453125, 0.0043487548828125, 0.02337646484375, 0.04229736328125, -0.02215576171875, 0.026336669921875, -0.0308685302734375, 0.03179931640625, 0.01271820068359375, 0.01201629638671875, 0.0012693405151367188, -0.0584716796875, -0.01068878173828125, 0.0041656494140625, -0.05072021484375, -0.07269287109375, 0.0312347412109375, 0.0071563720703125, 0.043304443359375, 0.041717529296875, -0.0029201507568359375, 0.037689208984375, -0.0297698974609375, 0.07403564453125, 0.0182342529296875, -0.046295166015625, 0.048248291015625, -0.031585693359375, 0.007205963134765625, 0.016754150390625, 0.0443115234375, -0.0187225341796875, -0.0243377685546875, -0.059967041015625, -0.069580078125, 0.0518798828125, 0.031829833984375, 0.0302886962890625, -0.00640106201171875, 0.049163818359375, -0.0031909942626953125, -0.007137298583984375, -0.08251953125, -0.040802001953125, -0.0292205810546875, 0.0028533935546875, 0.0070648193359375, -0.033966064453125, -0.012420654296875, -0.038543701171875, 0.07177734375, 0.00791168212890625, 0.0400390625, 0.027252197265625, -0.0015363693237304688, -0.032012939453125, -0.0293731689453125, 0.038604736328125, 0.030853271484375, -0.0162200927734375, -0.002117156982421875, -0.004909515380859375, -0.043121337890625, 0.022674560546875, 0.0117950439453125, -0.05242919921875, 0.003948211669921875, -0.0016794204711914062, 0.07000732421875, -0.0231475830078125, -0.0335693359375, 0.048370361328125, -0.0147705078125, -0.027099609375, -0.03564453125, 0.0142364501953125, 0.0099639892578125, 0.023223876953125, 
0.00705718994140625, 0.040618896484375, 0.0115814208984375, -0.0269927978515625, 0.0109405517578125, 0.03302001953125, -0.0297393798828125, -0.0252685546875, 0.0860595703125, 0.0095062255859375, -0.024505615234375, 0.042694091796875, -0.03564453125, -0.018646240234375, 0.054290771484375, 0.057769775390625, 0.0589599609375, -0.0166168212890625, 0.039825439453125, 0.05029296875, 0.02093505859375, -0.023712158203125, 0.0155029296875, 0.0208587646484375, -0.0599365234375, -0.005596160888671875, -0.032440185546875, -0.0017337799072265625, 0.0176544189453125, -0.03375244140625, 0.040283203125, -0.0390625, -0.037353515625, 0.0007023811340332031, -0.020233154296875, -0.04345703125, 0.01293182373046875, 0.0278167724609375, 0.059417724609375, -0.08404541015625, 0.06292724609375, 0.0562744140625, -0.047821044921875, -0.041168212890625, 0.00024378299713134766, -0.0067138671875, -0.022003173828125, 0.040771484375, 0.0121917724609375, 0.0018520355224609375, 0.00789642333984375, -0.0562744140625, -0.06793212890625, 0.09490966796875, 0.031280517578125, -0.0198516845703125, -0.005603790283203125, -0.02105712890625, 0.04608154296875, -0.03192138671875, 0.0229949951171875, 0.0229339599609375, 0.025665283203125, 0.0316162109375, -0.03302001953125, 0.00791168212890625, -0.035247802734375, 0.0250244140625, -0.01045989990234375, -0.06689453125, 0.07928466796875, -0.0259552001953125, -0.02301025390625, 0.0226898193359375, 0.052520751953125, 0.01953125, 0.0247039794921875, 0.03167724609375, 0.06201171875, 0.0435791015625, -0.01149749755859375, 0.07513427734375, -0.006534576416015625, 0.033233642578125, 0.053741455078125, -0.0083465576171875, 0.049652099609375, 0.0288543701171875, -0.007843017578125, 0.044830322265625, 0.056671142578125, -0.0301361083984375, 0.06060791015625, -0.00461578369140625, -0.0142822265625, -0.005977630615234375, -0.00455474853515625, -0.038818359375, 0.00894927978515625, 0.0234222412109375, -0.043914794921875, -0.0134429931640625, 0.020233154296875, 0.001220703125, 
-0.01372528076171875, -0.007160186767578125, 0.041107177734375, 0.004428863525390625, -0.0271759033203125, 0.0477294921875, 0.01800537109375, 0.065673828125, -0.03216552734375, -0.011993408203125, -0.00801849365234375, 0.0116729736328125, -0.02142333984375, -0.0572509765625, 0.03411865234375, -0.00547027587890625, -0.0195159912109375, -0.0157470703125, 0.06982421875, -0.0229644775390625, -0.050140380859375, 0.0304412841796875, 0.0222625732421875, 0.021484375, 0.00936126708984375, -0.0804443359375, 0.01593017578125, -0.005008697509765625, -0.0276947021484375, 0.01739501953125, 0.0137939453125, 0.0010690689086914062, 0.038360595703125, 0.047332763671875, -0.0033931732177734375, 0.006526947021484375, -0.0012731552124023438, 0.061767578125, -0.020660400390625, -0.025726318359375, -0.059112548828125, 0.0565185546875, -0.006195068359375, -0.0171356201171875, 0.05084228515625, 0.045928955078125, 0.061798095703125, -0.01241302490234375, 0.05859375, -0.0217437744140625, -0.00347137451171875, -0.035980224609375, 0.0687255859375, -0.05621337890625, 0.004550933837890625, -0.0280303955078125, -0.06475830078125, -0.0153961181640625, 0.072265625, -0.0224456787109375, 0.0197296142578125, 0.031829833984375, 0.0750732421875, -0.009490966796875, -0.017791748046875, 0.026641845703125, 0.01971435546875, 0.0285797119140625, 0.0233001708984375, 0.0633544921875, -0.05419921875, 0.0291748046875, -0.044097900390625, -0.01727294921875, -0.004451751708984375, -0.061920166015625, -0.0703125, -0.0518798828125, -0.056854248046875, -0.05804443359375, -0.001415252685546875, 0.03570556640625, 0.07330322265625, -0.04180908203125, -0.0038661956787109375, -0.0158843994140625, 0.0009641647338867188, -0.001499176025390625, -0.0206756591796875, 0.0269317626953125, 0.01247406005859375, -0.06793212890625, -0.010711669921875, 0.0188446044921875, 0.040374755859375, -0.0374755859375, -0.01556396484375, -0.0196380615234375, -0.0110931396484375, 0.042724609375, 0.0059661865234375, -0.049530029296875, 
-0.0034809112548828125, -0.0030155181884765625, -0.0059661865234375, 0.01239776611328125, 0.0220489501953125, -0.0443115234375, 0.0275115966796875, 0.042572021484375, 0.0132293701171875, 0.06402587890625, -0.004436492919921875, 0.01120758056640625, -0.031890869140625, 0.0266265869140625, 0.01067352294921875, 0.032135009765625, 0.02685546875, -0.0450439453125, 0.036712646484375, 0.04974365234375, -0.052703857421875, -0.06097412109375, 0.01361083984375, -0.08123779296875, -0.0206756591796875, 0.09674072265625, -0.014495849609375, -0.0291748046875, 0.001590728759765625, -0.030059814453125, 0.022705078125, -0.0285797119140625, 0.041656494140625, 0.038482666015625, -0.01172637939453125, -0.03948974609375, -0.04876708984375, 0.0428466796875, 0.005970001220703125, -0.046875, -0.0200347900390625, 0.046295166015625, 0.056121826171875, 0.0182952880859375, 0.07281494140625, -0.0245513916015625, 0.0271453857421875, 0.004467010498046875, 0.0009465217590332031, 0.0014429092407226562, -0.018798828125, -0.032623291015625, 0.00641632080078125, -0.01197052001953125, -0.0002944469451904297 ] ]
dccuchile/bert-base-spanish-wwm-cased
2022-05-31T15:01:30.000Z
[ "transformers", "pytorch", "tf", "jax", "bert", "fill-mask", "masked-lm", "es", "arxiv:1904.09077", "arxiv:1906.01502", "arxiv:1812.10464", "arxiv:1901.07291", "arxiv:1904.02099", "arxiv:1906.01569", "arxiv:1908.11828", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
fill-mask
dccuchile
null
null
dccuchile/bert-base-spanish-wwm-cased
37
284,677
transformers
2022-03-02T23:29:05
---
language:
- es
tags:
- masked-lm
---

# BETO: Spanish BERT

BETO is a [BERT model](https://github.com/google-research/bert) trained on a [big Spanish corpus](https://github.com/josecannete/spanish-corpora). BETO is similar in size to BERT-Base and was trained with the Whole Word Masking technique. Below you will find TensorFlow and PyTorch checkpoints for the uncased and cased versions, as well as some results on Spanish benchmarks comparing BETO with [Multilingual BERT](https://github.com/google-research/bert/blob/master/multilingual.md) and other (not BERT-based) models.

## Download

| | | | |
|-|:--------:|:-----:|:----:|
|BETO uncased|[tensorflow_weights](https://users.dcc.uchile.cl/~jperez/beto/uncased_2M/tensorflow_weights.tar.gz) | [pytorch_weights](https://users.dcc.uchile.cl/~jperez/beto/uncased_2M/pytorch_weights.tar.gz) | [vocab](./config/uncased_2M/vocab.txt), [config](./config/uncased_2M/config.json) |
|BETO cased| [tensorflow_weights](https://users.dcc.uchile.cl/~jperez/beto/cased_2M/tensorflow_weights.tar.gz) | [pytorch_weights](https://users.dcc.uchile.cl/~jperez/beto/cased_2M/pytorch_weights.tar.gz) | [vocab](./config/cased_2M/vocab.txt), [config](./config/cased_2M/config.json) |

All models use a vocabulary of about 31k BPE subwords constructed with SentencePiece and were trained for 2M steps.

## Benchmarks

The following table shows some BETO results on the Spanish version of each task. We compare BETO (cased and uncased) with the best Multilingual BERT results found in the literature (as of October 2019). The table also shows some alternative methods for the same tasks (not necessarily BERT-based). References for all methods can be found [here](#references).
|Task | BETO-cased | BETO-uncased | Best Multilingual BERT | Other results |
|-------|--------------:|--------------:|--------------------------:|-------------------------------:|
|[POS](https://lindat.mff.cuni.cz/repository/xmlui/handle/11234/1-1827) | **98.97** | 98.44 | 97.10 [2] | 98.91 [6], 96.71 [3] |
|[NER-C](https://www.kaggle.com/nltkdata/conll-corpora) | [**88.43**](https://github.com/gchaperon/beto-benchmarks/blob/master/conll2002/dev_results_beto-cased_conll2002.txt) | 82.67 | 87.38 [2] | 87.18 [3] |
|[MLDoc](https://github.com/facebookresearch/MLDoc) | [95.60](https://github.com/gchaperon/beto-benchmarks/blob/master/MLDoc/dev_results_beto-cased_mldoc.txt) | [**96.12**](https://github.com/gchaperon/beto-benchmarks/blob/master/MLDoc/dev_results_beto-uncased_mldoc.txt) | 95.70 [2] | 88.75 [4] |
|[PAWS-X](https://github.com/google-research-datasets/paws/tree/master/pawsx) | 89.05 | 89.55 | 90.70 [8] | |
|[XNLI](https://github.com/facebookresearch/XNLI) | **82.01** | 80.15 | 78.50 [2] | 80.80 [5], 77.80 [1], 73.15 [4]|

## Example of use

For further details on how to use BETO you can visit the [🤗Huggingface Transformers library](https://github.com/huggingface/transformers), starting with the [Quickstart section](https://huggingface.co/transformers/quickstart.html). BETO models can be accessed simply as [`'dccuchile/bert-base-spanish-wwm-cased'`](https://huggingface.co/dccuchile/bert-base-spanish-wwm-cased) and [`'dccuchile/bert-base-spanish-wwm-uncased'`](https://huggingface.co/dccuchile/bert-base-spanish-wwm-uncased) using the Transformers library. An example of how to download and use the models on this page can be found in [this Colab notebook](https://colab.research.google.com/drive/1uRwg4UmPgYIqGYY4gW_Nsw9782GFJbPt).
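As a quick sanity check, the cased checkpoint can also be used directly with the `fill-mask` pipeline. This is a minimal sketch; the example sentence is our own and the model weights are downloaded on first use:

```python
from transformers import pipeline

# Load BETO (cased) into a fill-mask pipeline.
fill_mask = pipeline("fill-mask", model="dccuchile/bert-base-spanish-wwm-cased")

# Predict the masked token in a Spanish sentence; the pipeline returns
# the top candidates with their scores (top 5 by default).
for pred in fill_mask("Todos los caminos llevan a [MASK]."):
    print(pred["token_str"], round(pred["score"], 3))
```

The same call works with the uncased checkpoint by swapping the model name.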
(We will soon add a more detailed step-by-step tutorial in Spanish for newcomers 😉)

## Acknowledgments

We thank [Adereso](https://www.adere.so/) for kindly providing support for training BETO-uncased, and the [Millennium Institute for Foundational Research on Data](https://imfd.cl/en/) for supporting the training of BETO-cased. Thanks also to Google for helping us through the [TensorFlow Research Cloud](https://www.tensorflow.org/tfrc) program.

## Citation

[Spanish Pre-Trained BERT Model and Evaluation Data](https://users.dcc.uchile.cl/~jperez/papers/pml4dc2020.pdf)

To cite this resource in a publication, please use the following:

```
@inproceedings{CaneteCFP2020,
  title={Spanish Pre-Trained BERT Model and Evaluation Data},
  author={Cañete, José and Chaperon, Gabriel and Fuentes, Rodrigo and Ho, Jou-Hui and Kang, Hojin and Pérez, Jorge},
  booktitle={PML4DC at ICLR 2020},
  year={2020}
}
```

## License Disclaimer

The license CC BY 4.0 best describes our intentions for our work. However, we are not sure that all of the datasets used to train BETO have licenses compatible with CC BY 4.0 (especially for commercial use). Please use at your own discretion and verify that the licenses of the original text resources match your needs.
## References

* [1] [Original Multilingual BERT](https://github.com/google-research/bert/blob/master/multilingual.md)
* [2] [Multilingual BERT on "Beto, Bentz, Becas: The Surprising Cross-Lingual Effectiveness of BERT"](https://arxiv.org/pdf/1904.09077.pdf)
* [3] [Multilingual BERT on "How Multilingual is Multilingual BERT?"](https://arxiv.org/pdf/1906.01502.pdf)
* [4] [LASER](https://arxiv.org/abs/1812.10464)
* [5] [XLM (MLM+TLM)](https://arxiv.org/pdf/1901.07291.pdf)
* [6] [UDPipe on "75 Languages, 1 Model: Parsing Universal Dependencies Universally"](https://arxiv.org/pdf/1904.02099.pdf)
* [7] [Multilingual BERT on "Sequence Tagging with Contextual and Non-Contextual Subword Representations: A Multilingual Evaluation"](https://arxiv.org/pdf/1906.01569.pdf)
* [8] [Multilingual BERT on "PAWS-X: A Cross-lingual Adversarial Dataset for Paraphrase Identification"](https://arxiv.org/abs/1908.11828)
5,906
[ [ -0.0306854248046875, -0.04010009765625, 0.01493072509765625, 0.03399658203125, -0.01488494873046875, 0.00617218017578125, -0.040557861328125, -0.044769287109375, 0.0287933349609375, 0.0089263916015625, -0.030731201171875, -0.043548583984375, -0.04180908203125, 0.004306793212890625, -0.007762908935546875, 0.08740234375, -0.009552001953125, 0.0322265625, -0.00820159912109375, -0.002162933349609375, -0.01355743408203125, -0.043121337890625, -0.046051025390625, -0.0258331298828125, 0.0311279296875, 0.0165863037109375, 0.05084228515625, 0.018341064453125, 0.0386962890625, 0.0277862548828125, -0.01446533203125, 0.00567626953125, -0.020294189453125, -0.0135498046875, 0.01067352294921875, -0.027252197265625, -0.032440185546875, -0.006679534912109375, 0.03753662109375, 0.042816162109375, -0.0118560791015625, -0.00225830078125, -0.008331298828125, 0.044586181640625, -0.0200653076171875, 0.0258331298828125, -0.04608154296875, 0.0014638900756835938, -0.00469207763671875, 0.0138702392578125, -0.0310516357421875, -0.0303802490234375, 0.034149169921875, -0.0462646484375, 0.037872314453125, -0.00041222572326660156, 0.09637451171875, 0.01027679443359375, -0.019439697265625, -0.043487548828125, -0.036468505859375, 0.0706787109375, -0.05291748046875, 0.049774169921875, 0.0311737060546875, 0.021240234375, -0.0169525146484375, -0.047943115234375, -0.03656005859375, -0.0294952392578125, -0.0095977783203125, 0.0296630859375, -0.01314544677734375, 0.0009469985961914062, 0.006359100341796875, 0.0323486328125, -0.037353515625, 0.0012645721435546875, -0.03033447265625, -0.0273590087890625, 0.045745849609375, -0.032440185546875, 0.0175018310546875, -0.0261077880859375, -0.03753662109375, -0.044952392578125, -0.04290771484375, 0.0210418701171875, 0.034698486328125, 0.0235443115234375, -0.0252532958984375, 0.028961181640625, 0.01690673828125, 0.035308837890625, -0.01320648193359375, -0.01116943359375, 0.046661376953125, -0.024261474609375, -0.0140228271484375, -0.01003265380859375, 
0.07891845703125, 0.0023403167724609375, 0.02392578125, -0.033294677734375, -0.002635955810546875, -0.0221710205078125, 0.020355224609375, -0.054412841796875, -0.00860595703125, 0.0281982421875, -0.03155517578125, -0.0071258544921875, -0.0015954971313476562, -0.05450439453125, 0.00791168212890625, -0.01282501220703125, 0.03460693359375, -0.0587158203125, -0.00402069091796875, 0.0134735107421875, -0.004161834716796875, 0.0296630859375, 0.012054443359375, -0.06182861328125, 0.0117340087890625, 0.0350341796875, 0.061614990234375, -0.0089111328125, -0.02679443359375, -0.000022590160369873047, -0.0167388916015625, -0.0287933349609375, 0.04205322265625, -0.0002684593200683594, -0.0211029052734375, 0.0146331787109375, 0.0204620361328125, -0.020172119140625, -0.027099609375, 0.07373046875, -0.037078857421875, 0.044281005859375, -0.03778076171875, -0.0382080078125, -0.0167694091796875, 0.00946807861328125, -0.048095703125, 0.1004638671875, 0.0029144287109375, -0.056640625, 0.026580810546875, -0.04144287109375, -0.041748046875, -0.015625, 0.00897979736328125, -0.045806884765625, -0.004726409912109375, 0.0240478515625, 0.03753662109375, -0.022003173828125, 0.035736083984375, -0.020965576171875, -0.01251983642578125, 0.004573822021484375, -0.00830841064453125, 0.089111328125, 0.0213623046875, -0.0300750732421875, 0.00431060791015625, -0.0504150390625, -0.0049896240234375, 0.0194549560546875, -0.03289794921875, -0.01428985595703125, 0.003696441650390625, 0.00670623779296875, 0.0162811279296875, 0.033935546875, -0.059783935546875, 0.0007944107055664062, -0.028717041015625, 0.024627685546875, 0.042144775390625, -0.03253173828125, 0.01131439208984375, -0.0302276611328125, 0.0238494873046875, -0.01389312744140625, 0.0274505615234375, 0.00243377685546875, -0.046661376953125, -0.07904052734375, -0.04541015625, 0.046875, 0.053131103515625, -0.0714111328125, 0.0390625, -0.031280517578125, -0.052978515625, -0.0572509765625, -0.0034427642822265625, 0.04498291015625, 0.040496826171875, 
0.0430908203125, -0.0159759521484375, -0.0487060546875, -0.079345703125, 0.01445770263671875, -0.0124053955078125, -0.01861572265625, 0.03143310546875, 0.052978515625, -0.01099395751953125, 0.05877685546875, -0.021392822265625, -0.018341064453125, -0.007415771484375, 0.0038909912109375, 0.032684326171875, 0.033447265625, 0.062347412109375, -0.052703857421875, -0.038726806640625, 0.00397491455078125, -0.056884765625, 0.01445770263671875, 0.00264739990234375, -0.01462554931640625, 0.0264129638671875, 0.0280609130859375, -0.0350341796875, -0.0007114410400390625, 0.040130615234375, -0.01226043701171875, 0.041351318359375, -0.041046142578125, 0.00545501708984375, -0.09295654296875, 0.013702392578125, 0.011962890625, -0.007450103759765625, -0.0445556640625, -0.0010576248168945312, 0.008056640625, 0.00624847412109375, -0.04754638671875, 0.043792724609375, -0.0313720703125, -0.0029697418212890625, 0.01885986328125, -0.01271820068359375, -0.007717132568359375, 0.0546875, 0.0173797607421875, 0.0631103515625, 0.04620361328125, -0.032012939453125, 0.019500732421875, 0.025970458984375, -0.035247802734375, 0.01168060302734375, -0.07598876953125, 0.00594329833984375, -0.001995086669921875, 0.010406494140625, -0.0723876953125, -0.006076812744140625, 0.01232147216796875, -0.045196533203125, 0.040740966796875, -0.01531982421875, -0.052032470703125, -0.040130615234375, -0.051177978515625, 0.00466156005859375, 0.040252685546875, -0.050872802734375, 0.0238189697265625, 0.0265350341796875, -0.0037174224853515625, -0.0546875, -0.050537109375, -0.0030841827392578125, -0.01708984375, -0.053680419921875, 0.0574951171875, -0.02276611328125, 0.00641632080078125, -0.0104522705078125, 0.01009368896484375, -0.01251220703125, -0.0108795166015625, -0.0031986236572265625, 0.036865234375, -0.0014715194702148438, 0.00830078125, 0.0021724700927734375, 0.01294708251953125, -0.0067901611328125, -0.0008373260498046875, 0.0400390625, -0.0311431884765625, 0.0011768341064453125, -0.00907135009765625, 
0.020721435546875, 0.04046630859375, -0.00417327880859375, 0.062164306640625, 0.063720703125, -0.0242919921875, 0.00820159912109375, -0.0435791015625, 0.003387451171875, -0.032867431640625, 0.0162200927734375, -0.031402587890625, -0.057220458984375, 0.0595703125, 0.018829345703125, 0.017608642578125, 0.042083740234375, 0.0518798828125, -0.018341064453125, 0.053985595703125, 0.054473876953125, -0.0114898681640625, 0.05047607421875, -0.035980224609375, 0.002735137939453125, -0.058502197265625, -0.0330810546875, -0.056365966796875, -0.0048675537109375, -0.066650390625, -0.035430908203125, 0.0247650146484375, 0.00943756103515625, -0.01373291015625, 0.0548095703125, -0.03216552734375, 0.01013946533203125, 0.0650634765625, 0.017120361328125, -0.002475738525390625, 0.017486572265625, -0.035186767578125, -0.00875091552734375, -0.06500244140625, -0.031341552734375, 0.10009765625, 0.03863525390625, 0.03509521484375, 0.007236480712890625, 0.053466796875, 0.01702880859375, 0.0162200927734375, -0.061737060546875, 0.028717041015625, -0.0241546630859375, -0.056793212890625, -0.01715087890625, -0.0306549072265625, -0.08782958984375, 0.0313720703125, -0.0107574462890625, -0.05322265625, 0.0318603515625, -0.003948211669921875, -0.0118560791015625, 0.01483154296875, -0.075927734375, 0.06927490234375, -0.031402587890625, -0.0172576904296875, 0.005748748779296875, -0.04779052734375, 0.00769805908203125, 0.0036525726318359375, 0.0196533203125, 0.003467559814453125, 0.004138946533203125, 0.0714111328125, -0.043487548828125, 0.053863525390625, -0.005401611328125, -0.013427734375, 0.025634765625, -0.02008056640625, 0.0305023193359375, -0.00788116455078125, -0.01189422607421875, 0.05328369140625, 0.01134490966796875, -0.037017822265625, -0.0098724365234375, 0.044342041015625, -0.07110595703125, -0.0101165771484375, -0.040985107421875, -0.048309326171875, -0.02276611328125, 0.0167083740234375, 0.032196044921875, 0.00995635986328125, -0.014923095703125, 0.0179290771484375, 0.0555419921875, 
-0.038055419921875, 0.0469970703125, 0.041595458984375, 0.00818634033203125, -0.042694091796875, 0.0665283203125, 0.0032501220703125, 0.00048732757568359375, 0.0411376953125, 0.0129547119140625, -0.036895751953125, -0.037750244140625, -0.040985107421875, 0.033782958984375, -0.032867431640625, -0.0119781494140625, -0.04180908203125, -0.0042724609375, -0.0418701171875, -0.00489044189453125, -0.039459228515625, -0.0261077880859375, -0.01082611083984375, 0.0014171600341796875, 0.02947998046875, 0.0174102783203125, -0.01355743408203125, 0.0208740234375, -0.03424072265625, 0.0167999267578125, 0.0184326171875, 0.0263824462890625, -0.004932403564453125, -0.04241943359375, -0.0270538330078125, 0.0117034912109375, -0.0164031982421875, -0.059326171875, 0.032928466796875, 0.01186370849609375, 0.05010986328125, 0.0157012939453125, -0.013397216796875, 0.04364013671875, -0.0491943359375, 0.04974365234375, 0.0198516845703125, -0.06292724609375, 0.038604736328125, -0.02728271484375, -0.002223968505859375, 0.044708251953125, 0.060760498046875, -0.035736083984375, -0.01146697998046875, -0.054229736328125, -0.07470703125, 0.0657958984375, 0.023773193359375, 0.007328033447265625, -0.004291534423828125, 0.0069732666015625, 0.00885009765625, 0.0236663818359375, -0.073486328125, -0.032073974609375, -0.018585205078125, -0.02154541015625, -0.00498199462890625, -0.026519775390625, -0.012298583984375, -0.034759521484375, 0.0679931640625, -0.0017633438110351562, 0.05316162109375, 0.03094482421875, -0.01349639892578125, 0.013580322265625, 0.011871337890625, 0.0423583984375, 0.035858154296875, -0.049072265625, -0.01068115234375, 0.012359619140625, -0.041778564453125, -0.0194091796875, 0.03692626953125, -0.0186614990234375, 0.0267486572265625, 0.044158935546875, 0.065673828125, 0.0169219970703125, -0.049041748046875, 0.0357666015625, -0.01189422607421875, -0.027252197265625, -0.0211029052734375, -0.019134521484375, 0.002574920654296875, 0.01220703125, 0.0280914306640625, -0.0179901123046875, 
-0.0011310577392578125, -0.04437255859375, 0.006256103515625, 0.032745361328125, -0.0276641845703125, -0.0212554931640625, 0.042694091796875, 0.009613037109375, -0.0036830902099609375, 0.033599853515625, -0.0268402099609375, -0.048583984375, 0.056304931640625, 0.03338623046875, 0.060302734375, -0.00850677490234375, 0.029998779296875, 0.046112060546875, 0.03857421875, -0.0029659271240234375, 0.0291290283203125, -0.0108795166015625, -0.06329345703125, -0.031890869140625, -0.05706787109375, -0.0322265625, 0.0153045654296875, -0.04058837890625, 0.0171966552734375, -0.01268768310546875, -0.0064697265625, 0.0113983154296875, 0.0222625732421875, -0.056884765625, 0.01442718505859375, 0.003749847412109375, 0.060760498046875, -0.058258056640625, 0.073974609375, 0.06085205078125, -0.04144287109375, -0.050079345703125, -0.016876220703125, -0.0273590087890625, -0.07293701171875, 0.032196044921875, -0.00731658935546875, 0.0124359130859375, -0.0230255126953125, -0.0205841064453125, -0.06988525390625, 0.07440185546875, 0.035186767578125, -0.03472900390625, 0.003566741943359375, 0.0139007568359375, 0.0692138671875, -0.005962371826171875, 0.04541015625, 0.026458740234375, 0.0292510986328125, 0.017059326171875, -0.08123779296875, -0.0042572021484375, -0.02557373046875, 0.0092926025390625, 0.00014007091522216797, -0.06512451171875, 0.062469482421875, -0.01372528076171875, 0.00405120849609375, -0.00922393798828125, 0.042205810546875, 0.0143585205078125, 0.0096435546875, 0.0286407470703125, 0.051605224609375, 0.055908203125, -0.0233154296875, 0.085205078125, -0.0278778076171875, 0.0518798828125, 0.07061767578125, 0.006870269775390625, 0.04815673828125, 0.03155517578125, -0.044952392578125, 0.036468505859375, 0.06927490234375, -0.01003265380859375, 0.04083251953125, 0.0028667449951171875, -0.003509521484375, -0.00182342529296875, -0.0085906982421875, -0.041412353515625, 0.0304107666015625, 0.00374603271484375, -0.02313232421875, -0.013427734375, 0.00008124113082885742, 0.0355224609375, 
-0.01898193359375, -0.002025604248046875, 0.033721923828125, -0.0027179718017578125, -0.05419921875, 0.06646728515625, 0.005584716796875, 0.07275390625, -0.058441162109375, 0.017303466796875, -0.0284881591796875, 0.005138397216796875, -0.005329132080078125, -0.050933837890625, 0.01511383056640625, 0.01220703125, -0.0220794677734375, -0.033966064453125, 0.03961181640625, -0.03607177734375, -0.05426025390625, 0.043243408203125, 0.03729248046875, 0.0262603759765625, 0.01087188720703125, -0.0772705078125, -0.0014095306396484375, 0.01351165771484375, -0.0272369384765625, 0.015655517578125, 0.0174407958984375, -0.0067138671875, 0.0450439453125, 0.05596923828125, 0.0076751708984375, 0.0175323486328125, 0.018829345703125, 0.045440673828125, -0.031585693359375, -0.01751708984375, -0.04034423828125, 0.025299072265625, -0.004329681396484375, -0.0299224853515625, 0.05975341796875, 0.055755615234375, 0.0919189453125, -0.0200653076171875, 0.035858154296875, -0.031219482421875, 0.03656005859375, -0.0283355712890625, 0.049041748046875, -0.055572509765625, -0.005764007568359375, -0.034912109375, -0.056304931640625, -0.0182952880859375, 0.04815673828125, -0.02728271484375, 0.0128631591796875, 0.048004150390625, 0.0472412109375, 0.003757476806640625, -0.0206298828125, 0.0026836395263671875, 0.015472412109375, 0.021392822265625, 0.046722412109375, 0.0300140380859375, -0.05877685546875, 0.055267333984375, -0.04644775390625, -0.015167236328125, -0.011871337890625, -0.060516357421875, -0.06744384765625, -0.052734375, -0.02362060546875, -0.0162200927734375, 0.00324249267578125, 0.05517578125, 0.0618896484375, -0.0823974609375, -0.034942626953125, -0.00295257568359375, 0.019744873046875, -0.014617919921875, -0.01467132568359375, 0.0526123046875, -0.029052734375, -0.08294677734375, 0.0244903564453125, -0.01102447509765625, 0.004863739013671875, 0.0010728836059570312, -0.0115814208984375, -0.03863525390625, -0.0055389404296875, 0.053009033203125, 0.0301513671875, -0.042938232421875, 
-0.01399993896484375, 0.0196990966796875, -0.0011949539184570312, 0.013916015625, 0.03302001953125, -0.03924560546875, 0.04217529296875, 0.0382080078125, 0.035919189453125, 0.056427001953125, -0.0267486572265625, 0.017303466796875, -0.057281494140625, 0.023223876953125, 0.0157012939453125, 0.048614501953125, 0.027740478515625, -0.011566162109375, 0.055938720703125, 0.01532745361328125, -0.0289154052734375, -0.057159423828125, -0.016845703125, -0.0882568359375, -0.02490234375, 0.0738525390625, -0.0174713134765625, -0.0182342529296875, 0.0171356201171875, -0.01200103759765625, 0.03717041015625, -0.0330810546875, 0.0667724609375, 0.07000732421875, -0.0103302001953125, 0.00954437255859375, -0.04974365234375, 0.03302001953125, 0.053985595703125, -0.0611572265625, -0.0295257568359375, 0.0187835693359375, 0.0219573974609375, 0.026458740234375, 0.03570556640625, -0.01537322998046875, 0.004047393798828125, -0.0176544189453125, 0.041717529296875, 0.002532958984375, -0.02313232421875, -0.01084136962890625, -0.01163482666015625, -0.0264129638671875, -0.03082275390625 ] ]
google/electra-small-discriminator
2021-04-29T15:24:16.000Z
[ "transformers", "pytorch", "tf", "jax", "electra", "pretraining", "en", "arxiv:1406.2661", "license:apache-2.0", "endpoints_compatible", "region:us" ]
null
google
null
null
google/electra-small-discriminator
19
284,178
transformers
2022-03-02T23:29:05
---
language: en
thumbnail: https://huggingface.co/front/thumbnails/google.png
license: apache-2.0
---

## ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators

**ELECTRA** is a new method for self-supervised language representation learning. It can be used to pre-train transformer networks using relatively little compute. ELECTRA models are trained to distinguish "real" input tokens from "fake" input tokens generated by another neural network, similar to the discriminator of a [GAN](https://arxiv.org/pdf/1406.2661.pdf). At small scale, ELECTRA achieves strong results even when trained on a single GPU. At large scale, ELECTRA achieves state-of-the-art results on the [SQuAD 2.0](https://rajpurkar.github.io/SQuAD-explorer/) dataset.

For a detailed description and experimental results, please refer to our paper [ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators](https://openreview.net/pdf?id=r1xMH1BtvB).

This repository contains code to pre-train ELECTRA, including small ELECTRA models on a single GPU. It also supports fine-tuning ELECTRA on downstream tasks, including classification tasks (e.g., [GLUE](https://gluebenchmark.com/)), QA tasks (e.g., [SQuAD](https://rajpurkar.github.io/SQuAD-explorer/)), and sequence tagging tasks (e.g., [text chunking](https://www.clips.uantwerpen.be/conll2000/chunking/)).
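In the Transformers library, fine-tuning for classification amounts to loading the pre-trained discriminator body with a freshly initialized classification head. A minimal sketch, where the label count and input sentence are illustrative assumptions rather than part of any official recipe:

```python
import torch
from transformers import ElectraForSequenceClassification, ElectraTokenizerFast

# Load the pre-trained discriminator body with a new (randomly initialized)
# classification head on top; num_labels=2 is an illustrative choice.
model = ElectraForSequenceClassification.from_pretrained(
    "google/electra-small-discriminator", num_labels=2
)
tokenizer = ElectraTokenizerFast.from_pretrained("google/electra-small-discriminator")

inputs = tokenizer("ELECTRA is efficient to pre-train.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, 2); train with labels as usual

print(logits.shape)
```

From here, passing `labels` to the forward call yields a loss suitable for a standard fine-tuning loop.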
## How to use the discriminator in `transformers`

```python
from transformers import ElectraForPreTraining, ElectraTokenizerFast
import torch

discriminator = ElectraForPreTraining.from_pretrained("google/electra-small-discriminator")
tokenizer = ElectraTokenizerFast.from_pretrained("google/electra-small-discriminator")

sentence = "The quick brown fox jumps over the lazy dog"
fake_sentence = "The quick brown fox fake over the lazy dog"

# Tokenize with special tokens so the token list lines up with the model's
# predictions, which include positions for [CLS] and [SEP].
fake_tokens = tokenizer.tokenize(fake_sentence, add_special_tokens=True)
fake_inputs = tokenizer.encode(fake_sentence, return_tensors="pt")
discriminator_outputs = discriminator(fake_inputs)

# A positive logit means the discriminator flags the token as replaced ("fake").
predictions = torch.round((torch.sign(discriminator_outputs[0]) + 1) / 2)

[print("%7s" % token, end="") for token in fake_tokens]
print()
[print("%7s" % int(prediction), end="") for prediction in predictions.squeeze().tolist()]
```
2,211
hellomyoh/llama2-2b-s117755-v2
2023-09-26T05:11:12.000Z
[ "peft", "region:us" ]
null
hellomyoh
null
null
hellomyoh/llama2-2b-s117755-v2
0
282,983
peft
2023-09-26T05:10:43
---
library_name: peft
---

## Training procedure

The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: False
- bnb_4bit_compute_dtype: float16

### Framework versions

- PEFT 0.6.0.dev0
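The field names listed in the training procedure match those of an adapter's saved quantization config. As a stdlib-only sketch (no `transformers` or `bitsandbytes` required), the same settings can be written down as a plain mapping for inspection, e.g. to sanity-check that only one of the 8-bit/4-bit switches is enabled:

```python
# The 4-bit quantization settings from the card above, as a plain mapping.
# Field names follow the saved config; values are copied from the card.
bnb_config = {
    "quant_method": "bitsandbytes",
    "load_in_8bit": False,
    "load_in_4bit": True,
    "llm_int8_threshold": 6.0,
    "llm_int8_skip_modules": None,
    "llm_int8_enable_fp32_cpu_offload": False,
    "llm_int8_has_fp16_weight": False,
    "bnb_4bit_quant_type": "nf4",
    "bnb_4bit_use_double_quant": False,
    "bnb_4bit_compute_dtype": "float16",
}

# At most one of the two loading modes should be active at a time.
exactly_one_mode = bnb_config["load_in_8bit"] != bnb_config["load_in_4bit"]
```

When actually loading the adapter, these same values would typically be passed as keyword arguments to the quantization config object of whichever framework version is in use.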
863
prithivida/parrot_adequacy_model
2022-05-27T02:47:22.000Z
[ "transformers", "pytorch", "roberta", "text-classification", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
text-classification
prithivida
null
null
prithivida/parrot_adequacy_model
6
275,343
transformers
2022-05-27T02:04:37
---
license: apache-2.0
---

# Parrot

THIS IS AN ANCILLARY MODEL FOR PARROT PARAPHRASER

## 1. What is Parrot?

Parrot is a paraphrase-based utterance augmentation framework purpose-built to accelerate training NLU models. A paraphrase framework is more than just a paraphrasing model. For details, please refer to the GitHub page or the model card for `prithivida/parrot_paraphraser_on_T5`.
364
[embedding vector omitted (float array)]
SG161222/Realistic_Vision_V5.1_noVAE
2023-07-31T06:32:01.000Z
[ "diffusers", "license:creativeml-openrail-m", "endpoints_compatible", "has_space", "diffusers:StableDiffusionPipeline", "region:us" ]
null
SG161222
null
null
SG161222/Realistic_Vision_V5.1_noVAE
98
274,981
diffusers
2023-07-31T05:20:51
---
license: creativeml-openrail-m
---

<b>Please read this!</b><br>
For version 5.1 it is recommended to use it with a VAE (to improve generation quality and get rid of artifacts): https://huggingface.co/stabilityai/sd-vae-ft-mse-original<br>
<hr/>

<b>The recommended negative prompt:</b>

(deformed iris, deformed pupils, semi-realistic, cgi, 3d, render, sketch, cartoon, drawing, anime:1.4), text, close up, cropped, out of frame, worst quality, low quality, jpeg artifacts, ugly, duplicate, morbid, mutilated, extra fingers, mutated hands, poorly drawn hands, poorly drawn face, mutation, deformed, blurry, dehydrated, bad anatomy, bad proportions, extra limbs, cloned face, disfigured, gross proportions, malformed limbs, missing arms, missing legs, extra arms, extra legs, fused fingers, too many fingers, long neck<br>

<b>OR</b><br>

(deformed iris, deformed pupils, semi-realistic, cgi, 3d, render, sketch, cartoon, drawing, anime, mutated hands and fingers:1.4), (deformed, distorted, disfigured:1.3), poorly drawn, bad anatomy, wrong anatomy, extra limb, missing limb, floating limbs, disconnected limbs, mutation, mutated, ugly, disgusting, amputation

<b>Euler A or DPM++ 2M Karras<br>
CFG Scale 3.5 - 7<br>
Hires. fix with 4x-UltraSharp upscaler<br>
0 Hires steps and Denoising strength 0.25-0.7<br>
Upscale by 1.1-2.0</b>
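The card's recommendations (external VAE, negative prompt, CFG scale 3.5 - 7) are spread across prose; a minimal `diffusers` sketch wiring them together might look like the following. It assumes the diffusers-format mirror of the linked VAE (stabilityai/sd-vae-ft-mse), requires `torch`, `diffusers`, and a GPU, so all heavy work stays inside functions, and abridges the negative prompt for brevity.

```python
# Hedged sketch: pairing Realistic_Vision_V5.1_noVAE with the recommended
# external VAE via diffusers. The VAE repo id is an assumed diffusers-format
# mirror of the card's link; the negative prompt below is abridged.

NEGATIVE_PROMPT = (
    "(deformed iris, deformed pupils, semi-realistic, cgi, 3d, render, "
    "sketch, cartoon, drawing, anime:1.4), text, close up, cropped, "
    "out of frame, worst quality, low quality, jpeg artifacts"
)

def build_pipeline(device: str = "cuda"):
    """Load the checkpoint with the external fine-tuned VAE attached."""
    import torch
    from diffusers import AutoencoderKL, StableDiffusionPipeline
    vae = AutoencoderKL.from_pretrained(
        "stabilityai/sd-vae-ft-mse", torch_dtype=torch.float16
    )
    pipe = StableDiffusionPipeline.from_pretrained(
        "SG161222/Realistic_Vision_V5.1_noVAE", vae=vae, torch_dtype=torch.float16
    )
    return pipe.to(device)

def generate(pipe, prompt: str):
    """One image with the card's negative prompt and a CFG in the 3.5-7 range."""
    return pipe(prompt, negative_prompt=NEGATIVE_PROMPT, guidance_scale=5.0).images[0]
```

Passing the VAE at `from_pretrained` time replaces the pipeline's bundled component, which is why the "noVAE" checkpoint can ship without one.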
1,332
[ [ -0.055450439453125, -0.047698974609375, 0.029937744140625, 0.0268402099609375, -0.049468994140625, -0.019805908203125, 0.0283660888671875, -0.037261962890625, 0.03955078125, 0.0538330078125, -0.0704345703125, -0.05413818359375, -0.0308074951171875, 0.00830841064453125, -0.0228424072265625, 0.07232666015625, 0.0011539459228515625, 0.01216888427734375, -0.01267242431640625, 0.00650787353515625, -0.0439453125, -0.0030689239501953125, -0.041717529296875, 0.00276947021484375, 0.0139312744140625, 0.05517578125, 0.042633056640625, 0.04583740234375, 0.0256500244140625, 0.016265869140625, -0.0111083984375, 0.01134490966796875, -0.05096435546875, 0.0006513595581054688, -0.0045013427734375, -0.0091552734375, -0.056304931640625, -0.003940582275390625, 0.04296875, 0.0164337158203125, -0.007659912109375, 0.02740478515625, 0.0028514862060546875, 0.048095703125, -0.04681396484375, -0.0144805908203125, -0.007068634033203125, 0.01148223876953125, -0.025604248046875, -0.0134429931640625, -0.0109100341796875, -0.0205078125, -0.02197265625, -0.07232666015625, 0.04107666015625, -0.0028553009033203125, 0.1015625, 0.01971435546875, -0.017303466796875, 0.0340576171875, -0.039947509765625, 0.037506103515625, -0.044769287109375, 0.0160064697265625, 0.01468658447265625, 0.0421142578125, -0.005161285400390625, -0.0537109375, -0.0574951171875, 0.0054168701171875, 0.0108642578125, 0.0205535888671875, -0.02227783203125, 0.023162841796875, 0.056121826171875, 0.039031982421875, -0.05096435546875, -0.00156402587890625, -0.048370361328125, -0.034332275390625, 0.054290771484375, 0.01329803466796875, 0.037200927734375, -0.0184478759765625, -0.06256103515625, -0.0243988037109375, -0.058013916015625, -0.0104217529296875, 0.0261077880859375, -0.004009246826171875, -0.0292510986328125, 0.03466796875, -0.0018463134765625, 0.0289154052734375, 0.03924560546875, -0.004314422607421875, 0.034515380859375, -0.0316162109375, -0.01342010498046875, -0.025390625, 0.047760009765625, 0.07000732421875, 
0.007030487060546875, 0.022308349609375, 0.010101318359375, -0.0021800994873046875, 0.0360107421875, -0.0794677734375, -0.0159759521484375, 0.0260467529296875, -0.04736328125, -0.0186767578125, -0.0172882080078125, -0.0963134765625, -0.0301361083984375, -0.012115478515625, 0.030517578125, -0.0297698974609375, -0.0207672119140625, -0.009857177734375, -0.02569580078125, 0.038787841796875, 0.04766845703125, -0.06597900390625, 0.00792694091796875, -0.0009088516235351562, 0.036956787109375, 0.021575927734375, -0.00867462158203125, -0.0231170654296875, 0.00021779537200927734, -0.024566650390625, 0.054290771484375, -0.00525665283203125, -0.05499267578125, -0.0140228271484375, 0.0279388427734375, 0.0268096923828125, -0.04180908203125, 0.07867431640625, -0.0030269622802734375, 0.021728515625, -0.0248565673828125, -0.0218048095703125, -0.0224456787109375, -0.01346588134765625, -0.05755615234375, 0.053009033203125, 0.03399658203125, -0.051544189453125, 0.044769287109375, -0.0291900634765625, -0.002422332763671875, 0.0193939208984375, 0.00388336181640625, -0.051116943359375, 0.0107574462890625, 0.025146484375, 0.032958984375, 0.0041351318359375, -0.0167694091796875, -0.03399658203125, -0.037506103515625, -0.03338623046875, -0.042144775390625, 0.061920166015625, 0.01312255859375, -0.038238525390625, 0.0164947509765625, -0.0726318359375, -0.0108642578125, 0.02423095703125, 0.01511383056640625, -0.0080108642578125, -0.01290130615234375, 0.024627685546875, 0.0277252197265625, 0.0298309326171875, -0.05206298828125, 0.0200347900390625, -0.0175323486328125, -0.00975799560546875, 0.06304931640625, 0.0195159912109375, 0.01105499267578125, -0.0275726318359375, 0.050140380859375, 0.0267486572265625, 0.0364990234375, -0.0208282470703125, -0.041290283203125, -0.066162109375, -0.02764892578125, -0.01209259033203125, 0.030731201171875, -0.07745361328125, 0.0175933837890625, 0.00727081298828125, -0.04071044921875, -0.0399169921875, -0.000047147274017333984, 0.0258636474609375, 
0.034210205078125, 0.047393798828125, -0.042236328125, -0.04913330078125, -0.0655517578125, 0.0227203369140625, 0.0023479461669921875, -0.002712249755859375, 0.0165252685546875, 0.0240631103515625, -0.007724761962890625, 0.03961181640625, -0.0289459228515625, -0.020172119140625, 0.017669677734375, -0.011322021484375, -0.0054168701171875, 0.04248046875, 0.056121826171875, -0.04925537109375, -0.0232391357421875, -0.016448974609375, -0.042877197265625, -0.01244354248046875, -0.01232147216796875, -0.0169219970703125, 0.0157012939453125, 0.017120361328125, -0.036407470703125, 0.037872314453125, 0.047332763671875, -0.0660400390625, 0.0640869140625, -0.0172882080078125, 0.0226287841796875, -0.093505859375, 0.01396942138671875, 0.0260772705078125, -0.03533935546875, -0.0310516357421875, 0.0301666259765625, 0.01318359375, -0.01885986328125, -0.052001953125, 0.04473876953125, -0.03424072265625, 0.021331787109375, -0.0039215087890625, 0.005435943603515625, 0.0182952880859375, 0.01568603515625, -0.006603240966796875, 0.05596923828125, 0.045166015625, -0.0406494140625, 0.0399169921875, 0.02764892578125, 0.004947662353515625, 0.07061767578125, -0.056060791015625, -0.0012645721435546875, -0.0206146240234375, -0.00634765625, -0.05670166015625, -0.0225830078125, 0.040740966796875, -0.048004150390625, 0.0447998046875, 0.0219879150390625, 0.006267547607421875, -0.040191650390625, -0.050384521484375, 0.0168609619140625, 0.060699462890625, -0.059814453125, 0.0256500244140625, 0.007965087890625, 0.01003265380859375, -0.036224365234375, -0.043212890625, 0.007122039794921875, -0.0338134765625, -0.051116943359375, 0.0308990478515625, -0.0119171142578125, 0.0016775131225585938, 0.0011615753173828125, -0.0071868896484375, -0.0110626220703125, -0.01212310791015625, 0.0176239013671875, 0.025146484375, -0.0308074951171875, -0.042205810546875, 0.0286865234375, -0.020111083984375, -0.0010957717895507812, 0.0189208984375, 0.0187225341796875, 0.0087890625, -0.024658203125, -0.041259765625, 
0.03338623046875, 0.07501220703125, -0.00269317626953125, 0.0177154541015625, 0.07635498046875, -0.037017822265625, 0.0098114013671875, -0.04583740234375, -0.0191192626953125, -0.0345458984375, 0.0169830322265625, -0.0245513916015625, -0.05755615234375, 0.056121826171875, -0.0052337646484375, -0.003200531005859375, 0.06781005859375, 0.0254364013671875, -0.0260772705078125, 0.09320068359375, 0.05712890625, 0.011322021484375, 0.041534423828125, -0.0227813720703125, -0.01244354248046875, -0.07879638671875, -0.0238189697265625, -0.0214385986328125, -0.0204010009765625, -0.056549072265625, -0.031402587890625, 0.0312347412109375, 0.024932861328125, -0.0115814208984375, 0.041717529296875, -0.049957275390625, 0.033721923828125, 0.0308837890625, 0.0238189697265625, 0.0056610107421875, 0.01290130615234375, 0.00933837890625, -0.0202789306640625, -0.03271484375, -0.042694091796875, 0.0611572265625, 0.026031494140625, 0.0489501953125, 0.0113525390625, 0.05792236328125, -0.00016808509826660156, 0.001354217529296875, -0.0302886962890625, 0.045623779296875, -0.03729248046875, -0.059539794921875, -0.0131683349609375, -0.0170745849609375, -0.073486328125, 0.0155487060546875, -0.042877197265625, -0.07635498046875, 0.049346923828125, 0.02862548828125, -0.0265045166015625, 0.020538330078125, -0.06292724609375, 0.0733642578125, 0.0045928955078125, -0.046478271484375, -0.009674072265625, -0.04571533203125, 0.03533935546875, 0.006519317626953125, -0.01342010498046875, -0.0013036727905273438, -0.007335662841796875, 0.027862548828125, -0.0399169921875, 0.05206298828125, -0.02984619140625, 0.001880645751953125, 0.0291748046875, -0.0068511962890625, 0.0239715576171875, 0.0098114013671875, 0.01316070556640625, 0.02032470703125, 0.03228759765625, -0.04766845703125, -0.039093017578125, 0.044769287109375, -0.05401611328125, -0.053009033203125, -0.03314208984375, -0.0031833648681640625, 0.01258087158203125, 0.0239105224609375, 0.05401611328125, 0.025390625, -0.0234527587890625, 
0.0035877227783203125, 0.04931640625, -0.01219940185546875, 0.030120849609375, 0.0273895263671875, -0.023895263671875, -0.039337158203125, 0.07476806640625, 0.00955963134765625, 0.033050537109375, -0.01172637939453125, -0.00505828857421875, -0.0101165771484375, -0.0247955322265625, -0.07452392578125, 0.0248260498046875, -0.033233642578125, -0.01322174072265625, -0.00431060791015625, -0.00429534912109375, -0.023895263671875, -0.0148468017578125, -0.0246429443359375, -0.01041412353515625, -0.04913330078125, -0.002101898193359375, 0.030792236328125, 0.024810791015625, -0.01340484619140625, 0.006114959716796875, -0.06549072265625, 0.042694091796875, 0.0009045600891113281, 0.0233306884765625, -0.0265350341796875, -0.022308349609375, -0.0122528076171875, 0.00634002685546875, -0.045440673828125, -0.08050537109375, 0.0308990478515625, -0.0106353759765625, 0.035888671875, 0.0360107421875, -0.0034351348876953125, 0.06982421875, -0.0205535888671875, 0.09857177734375, 0.03271484375, -0.051605224609375, 0.034393310546875, -0.048675537109375, 0.0307769775390625, 0.03857421875, 0.0268096923828125, -0.032745361328125, -0.0184326171875, -0.0791015625, -0.0765380859375, 0.040924072265625, 0.014862060546875, 0.0240478515625, 0.00484466552734375, 0.026611328125, 0.0083465576171875, 0.0085906982421875, -0.045013427734375, -0.03875732421875, -0.042327880859375, 0.007656097412109375, -0.00021398067474365234, -0.039276123046875, 0.006145477294921875, -0.04510498046875, 0.0687255859375, -0.0019254684448242188, 0.037750244140625, 0.028594970703125, 0.02691650390625, -0.045562744140625, -0.006771087646484375, 0.046661376953125, 0.038726806640625, -0.0223236083984375, -0.01490020751953125, 0.01418304443359375, -0.05206298828125, 0.0037994384765625, 0.011199951171875, -0.0287017822265625, 0.0222015380859375, 0.00787353515625, 0.073974609375, -0.016693115234375, -0.0275726318359375, 0.04254150390625, -0.0190582275390625, -0.0173492431640625, -0.0198822021484375, 0.0114288330078125, 
-0.018341064453125, 0.00492095947265625, 0.0087432861328125, 0.019287109375, 0.026275634765625, -0.0213775634765625, 0.0100250244140625, 0.007701873779296875, -0.033935546875, -0.017822265625, 0.052978515625, 0.028350830078125, -0.0037097930908203125, 0.042877197265625, -0.03436279296875, -0.01371002197265625, 0.05072021484375, 0.0450439453125, 0.06695556640625, -0.035247802734375, 0.027496337890625, 0.061737060546875, 0.029205322265625, 0.016571044921875, 0.05096435546875, 0.007293701171875, -0.03594970703125, -0.0308837890625, -0.037200927734375, -0.02154541015625, 0.0408935546875, -0.047576904296875, 0.048828125, -0.03997802734375, -0.006633758544921875, -0.009490966796875, -0.01056671142578125, -0.03131103515625, 0.050384521484375, 0.0174102783203125, 0.05340576171875, -0.0712890625, 0.033660888671875, 0.056854248046875, -0.0662841796875, -0.06842041015625, 0.0027523040771484375, 0.0210723876953125, -0.038116455078125, -0.01210784912109375, 0.006099700927734375, -0.0006971359252929688, 0.0195465087890625, -0.051177978515625, -0.06024169921875, 0.0848388671875, 0.04327392578125, -0.039520263671875, 0.0031585693359375, -0.0318603515625, 0.043487548828125, 0.0012273788452148438, 0.024169921875, 0.0184173583984375, 0.032745361328125, 0.0128631591796875, -0.067626953125, 0.018890380859375, -0.043487548828125, 0.0212249755859375, 0.0194854736328125, -0.06597900390625, 0.044647216796875, -0.0205078125, -0.036651611328125, 0.0286712646484375, 0.0538330078125, 0.0161590576171875, 0.029083251953125, 0.033172607421875, 0.049346923828125, 0.046234130859375, -0.00887298583984375, 0.078857421875, -0.030059814453125, 0.023834228515625, 0.0438232421875, 0.0180816650390625, 0.0287628173828125, 0.036590576171875, -0.01300811767578125, 0.0469970703125, 0.077392578125, -0.026763916015625, 0.0208587646484375, 0.0031070709228515625, -0.0200958251953125, -0.006671905517578125, -0.0023403167724609375, -0.05145263671875, 0.0251617431640625, 0.0189666748046875, -0.02471923828125, 
-0.01235198974609375, 0.0173187255859375, -0.0009012222290039062, 0.00873565673828125, -0.022857666015625, 0.04534912109375, -0.0007624626159667969, -0.0190887451171875, 0.05511474609375, -0.00637054443359375, 0.047637939453125, -0.031219482421875, -0.034820556640625, -0.02337646484375, -0.0018863677978515625, -0.037200927734375, -0.05712890625, 0.037384033203125, -0.007625579833984375, -0.0233612060546875, -0.00504302978515625, 0.06048583984375, -0.00782012939453125, -0.0335693359375, 0.004947662353515625, 0.017486572265625, 0.03472900390625, 0.0111083984375, -0.054107666015625, -0.002132415771484375, 0.00969696044921875, -0.0283966064453125, 0.0147247314453125, 0.0171356201171875, -0.00440216064453125, 0.029541015625, 0.0237579345703125, 0.00484466552734375, -0.0160369873046875, 0.00923919677734375, 0.063232421875, -0.038116455078125, -0.0275115966796875, -0.044097900390625, 0.07720947265625, -0.017974853515625, -0.03985595703125, 0.0682373046875, 0.0276641845703125, 0.0771484375, -0.0249786376953125, 0.032257080078125, -0.0095062255859375, 0.011444091796875, -0.04345703125, 0.04681396484375, -0.06085205078125, -0.014434814453125, -0.0467529296875, -0.077880859375, -0.00951385498046875, 0.06610107421875, 0.0044708251953125, 0.029510498046875, 0.06085205078125, 0.06011962890625, -0.0007781982421875, -0.0101318359375, 0.037384033203125, 0.01073455810546875, -0.00016236305236816406, 0.0290069580078125, 0.0286712646484375, -0.0386962890625, 0.01849365234375, -0.0384521484375, -0.0235137939453125, -0.007965087890625, -0.0438232421875, -0.035186767578125, -0.051116943359375, -0.045379638671875, -0.0518798828125, 0.00225067138671875, 0.049346923828125, 0.06280517578125, -0.04815673828125, 0.0115203857421875, -0.0087890625, -0.00655364990234375, -0.0187225341796875, -0.02227783203125, 0.0075225830078125, 0.02496337890625, -0.07049560546875, 0.011444091796875, 0.01435089111328125, 0.0253448486328125, -0.036346435546875, -0.002838134765625, -0.002696990966796875, 
-0.009857177734375, 0.03570556640625, 0.03875732421875, -0.04266357421875, -0.041046142578125, -0.0143890380859375, 0.0026531219482421875, 0.0121612548828125, 0.01204681396484375, -0.0377197265625, 0.0699462890625, 0.040802001953125, -0.002582550048828125, 0.04266357421875, 0.007480621337890625, 0.0283203125, -0.0513916015625, 0.006542205810546875, 0.0141143798828125, 0.029632568359375, 0.0207366943359375, -0.0555419921875, 0.023681640625, 0.036956787109375, -0.032928466796875, -0.04632568359375, 0.0250244140625, -0.087890625, -0.00959014892578125, 0.079833984375, 0.0050506591796875, -0.028778076171875, 0.005832672119140625, -0.0621337890625, 0.030517578125, -0.0303955078125, 0.043426513671875, 0.04168701171875, -0.01305389404296875, -0.032501220703125, -0.034912109375, 0.036773681640625, 0.0165252685546875, -0.050537109375, -0.02056884765625, 0.0675048828125, 0.0086822509765625, 0.0256500244140625, 0.04266357421875, -0.037322998046875, 0.050537109375, 0.030059814453125, 0.049072265625, -0.0191802978515625, -0.0125274658203125, -0.046722412109375, -0.0015954971313476562, -0.002777099609375, -0.02691650390625 ] ]
THUDM/chatglm2-6b
2023-10-09T08:19:27.000Z
[ "transformers", "pytorch", "chatglm", "glm", "thudm", "custom_code", "zh", "en", "arxiv:2103.10360", "arxiv:2210.02414", "arxiv:1911.02150", "endpoints_compatible", "has_space", "region:us" ]
null
THUDM
null
null
THUDM/chatglm2-6b
1,882
274,059
transformers
2023-06-24T16:26:27
---
language:
- zh
- en
tags:
- glm
- chatglm
- thudm
---

# ChatGLM2-6B

<p align="center">
💻 <a href="https://github.com/THUDM/ChatGLM2-6B" target="_blank">Github Repo</a> • 🐦 <a href="https://twitter.com/thukeg" target="_blank">Twitter</a> • 📃 <a href="https://arxiv.org/abs/2103.10360" target="_blank">[GLM@ACL 22]</a> <a href="https://github.com/THUDM/GLM" target="_blank">[GitHub]</a> • 📃 <a href="https://arxiv.org/abs/2210.02414" target="_blank">[GLM-130B@ICLR 23]</a> <a href="https://github.com/THUDM/GLM-130B" target="_blank">[GitHub]</a> <br>
</p>
<p align="center">
👋 Join our <a href="https://join.slack.com/t/chatglm/shared_invite/zt-1y7pqoloy-9b1g6T6JjA8J0KxvUjbwJw" target="_blank">Slack</a> and <a href="https://github.com/THUDM/ChatGLM-6B/blob/main/resources/WECHAT.md" target="_blank">WeChat</a>
</p>
<p align="center">
📍 Experience the larger-scale ChatGLM model at <a href="https://www.chatglm.cn">chatglm.cn</a>
</p>

## Introduction

ChatGLM**2**-6B is the second-generation version of the open-source bilingual (Chinese-English) chat model [ChatGLM-6B](https://github.com/THUDM/ChatGLM-6B). It retains the smooth conversation flow and low deployment threshold of the first-generation model, while introducing the following new features:

1. **Stronger Performance**: Based on the development experience of the first-generation ChatGLM model, we have fully upgraded the base model of ChatGLM2-6B. ChatGLM2-6B uses the hybrid objective function of [GLM](https://github.com/THUDM/GLM), and has undergone pre-training with 1.4T bilingual tokens and human preference alignment training. The [evaluation results](README.md#evaluation-results) show that, compared to the first-generation model, ChatGLM2-6B has achieved substantial improvements on datasets like MMLU (+23%), CEval (+33%), GSM8K (+571%), and BBH (+60%), showing strong competitiveness among open-source models of the same size.
2. **Longer Context**: Based on the [FlashAttention](https://github.com/HazyResearch/flash-attention) technique, we have extended the context length of the base model from 2K in ChatGLM-6B to 32K, and trained with a context length of 8K during dialogue alignment, allowing more rounds of dialogue. However, the current version of ChatGLM2-6B has limited understanding of single-turn ultra-long documents, which we will focus on optimizing in future iterations.
3. **More Efficient Inference**: Based on the [Multi-Query Attention](http://arxiv.org/abs/1911.02150) technique, ChatGLM2-6B has faster inference and lower GPU memory usage: under the official implementation, inference speed has increased by 42% compared to the first generation; under INT4 quantization, the dialogue length supported by 6G of GPU memory has increased from 1K to 8K.
4. 
**More Open License**: ChatGLM2-6B weights are **completely open** for academic research, and **free commercial use** is also allowed after completing the [questionnaire](https://open.bigmodel.cn/mla/form).

## 软件依赖

```shell
pip install protobuf transformers==4.30.2 cpm_kernels torch>=2.0 gradio mdtex2html sentencepiece accelerate
```

## 代码调用

可以通过如下代码调用 ChatGLM2-6B 模型来生成对话:

The ChatGLM2-6B model can be invoked to generate a conversation as follows:

```ipython
>>> from transformers import AutoTokenizer, AutoModel
>>> tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm2-6b", trust_remote_code=True)
>>> model = AutoModel.from_pretrained("THUDM/chatglm2-6b", trust_remote_code=True).half().cuda()
>>> model = model.eval()
>>> response, history = model.chat(tokenizer, "你好", history=[])
>>> print(response)
你好👋!我是人工智能助手 ChatGLM-6B,很高兴见到你,欢迎问我任何问题。
>>> response, history = model.chat(tokenizer, "晚上睡不着应该怎么办", history=history)
>>> print(response)
晚上睡不着可能会让你感到焦虑或不舒服,但以下是一些可以帮助你入睡的方法:

1. 制定规律的睡眠时间表:保持规律的睡眠时间表可以帮助你建立健康的睡眠习惯,使你更容易入睡。尽量在每天的相同时间上床,并在同一时间起床。
2. 创造一个舒适的睡眠环境:确保睡眠环境舒适,安静,黑暗且温度适宜。可以使用舒适的床上用品,并保持房间通风。
3. 放松身心:在睡前做些放松的活动,例如泡个热水澡,听些轻柔的音乐,阅读一些有趣的书籍等,有助于缓解紧张和焦虑,使你更容易入睡。
4. 避免饮用含有咖啡因的饮料:咖啡因是一种刺激性物质,会影响你的睡眠质量。尽量避免在睡前饮用含有咖啡因的饮料,例如咖啡,茶和可乐。
5. 避免在床上做与睡眠无关的事情:在床上做些与睡眠无关的事情,例如看电影,玩游戏或工作等,可能会干扰你的睡眠。
6. 尝试呼吸技巧:深呼吸是一种放松技巧,可以帮助你缓解紧张和焦虑,使你更容易入睡。试着慢慢吸气,保持几秒钟,然后缓慢呼气。

如果这些方法无法帮助你入睡,你可以考虑咨询医生或睡眠专家,寻求进一步的建议。
```

关于更多的使用说明,包括如何运行命令行和网页版本的 DEMO,以及使用模型量化以节省显存,请参考我们的 [Github Repo](https://github.com/THUDM/ChatGLM2-6B)。

For more instructions, including how to run CLI and web demos, and model quantization, please refer to our [Github Repo](https://github.com/THUDM/ChatGLM2-6B).
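The GPU-memory benefit of the INT4 quantization mentioned above can be sanity-checked with some back-of-the-envelope arithmetic. This is a rough sketch only: the ~6.2B parameter count is an approximation, and real usage adds activations and the KV cache on top of the weights.

```python
# Rough estimate of ChatGLM2-6B weight memory at different precisions.
# Assumption: ~6.2e9 parameters; actual usage is higher because of
# activations and the KV cache, which grow with context length.
PARAMS = 6.2e9

def weight_gib(bits_per_param: float) -> float:
    """Weight memory in GiB for a given number of bits per parameter."""
    return PARAMS * bits_per_param / 8 / 2**30

fp16 = weight_gib(16)  # half precision, as loaded with .half()
int4 = weight_gib(4)   # INT4 quantization

print(f"FP16 weights: {fp16:.1f} GiB")  # roughly 11.5 GiB
print(f"INT4 weights: {int4:.1f} GiB")  # roughly 2.9 GiB
```

This is consistent with the card's claim that the quantized model fits comfortably in 6G of GPU memory, leaving room for a much longer (8K) dialogue context.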
## Change Log
* v1.0

## 协议

本仓库的代码依照 [Apache-2.0](LICENSE) 协议开源,ChatGLM2-6B 模型的权重的使用则需要遵循 [Model License](MODEL_LICENSE)。

The code in this repository is open-sourced under the [Apache-2.0](LICENSE) license; the use of the ChatGLM2-6B model weights must follow the [Model License](MODEL_LICENSE).

## 引用

如果你觉得我们的工作有帮助的话,请考虑引用下列论文,ChatGLM2-6B 的论文会在近期公布,敬请期待~

If you find our work helpful, please consider citing the following papers. The paper for ChatGLM2-6B will be released soon, so stay tuned!

```
@article{zeng2022glm,
  title={GLM-130B: An Open Bilingual Pre-trained Model},
  author={Zeng, Aohan and Liu, Xiao and Du, Zhengxiao and Wang, Zihan and Lai, Hanyu and Ding, Ming and Yang, Zhuoyi and Xu, Yifan and Zheng, Wendi and Xia, Xiao and others},
  journal={arXiv preprint arXiv:2210.02414},
  year={2022}
}
```

```
@inproceedings{du2022glm,
  title={GLM: General Language Model Pretraining with Autoregressive Blank Infilling},
  author={Du, Zhengxiao and Qian, Yujie and Liu, Xiao and Ding, Ming and Qiu, Jiezhong and Yang, Zhilin and Tang, Jie},
  booktitle={Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)},
  pages={320--335},
  year={2022}
}
```
modelId: michellejieli/emotion_text_classifier
lastModified: 2023-05-03T00:39:47.000Z
tags: [transformers, pytorch, roberta, text-classification, distilroberta, sentiment, emotion, twitter, reddit, en, endpoints_compatible, has_space, region:us]
pipeline_tag: text-classification
author: michellejieli
likes: 24
downloads: 272,386
library_name: transformers
created: 2022-10-22T22:44:07
---
language: "en"
tags:
- distilroberta
- sentiment
- emotion
- twitter
- reddit
widget:
- text: "Oh my God, he's lost it. He's totally lost it."
- text: "What?"
- text: "Wow, congratulations! So excited for you!"
---

# Fine-tuned DistilRoBERTa-base for Emotion Classification 🤬🤢😀😐😭😲

# Model Description

DistilRoBERTa-base is a distilled version of the RoBERTa transformer model. I fine-tuned the model on transcripts from the Friends show with the goal of classifying emotions from text data, specifically dialogue from Netflix shows or movies. The model predicts the 6 Ekman emotions plus a neutral class: anger, disgust, fear, joy, sadness, surprise, and neutral.

The model is a fine-tuned version of [Emotion English DistilRoBERTa-base](https://huggingface.co/j-hartmann/emotion-english-distilroberta-base/), which is itself based on [DistilRoBERTa-base](https://huggingface.co/distilroberta-base).

The base checkpoint was initially trained on the following datasets, as documented by [Emotion English DistilRoBERTa-base](https://huggingface.co/j-hartmann/emotion-english-distilroberta-base/):

|Name|anger|disgust|fear|joy|neutral|sadness|surprise|
|---|---|---|---|---|---|---|---|
|Crowdflower (2016)|Yes|-|-|Yes|Yes|Yes|Yes|
|Emotion Dataset, Elvis et al. (2018)|Yes|-|Yes|Yes|-|Yes|Yes|
|GoEmotions, Demszky et al. (2020)|Yes|Yes|Yes|Yes|Yes|Yes|Yes|
|ISEAR, Vikash (2018)|Yes|Yes|Yes|Yes|-|Yes|-|
|MELD, Poria et al. (2019)|Yes|Yes|Yes|Yes|Yes|Yes|Yes|
|SemEval-2018, EI-reg, Mohammad et al. (2018)|Yes|-|Yes|Yes|-|Yes|-|

It was fine-tuned on:

|Name|anger|disgust|fear|joy|neutral|sadness|surprise|
|---|---|---|---|---|---|---|---|
|Emotion Lines (Friends)|Yes|Yes|Yes|Yes|Yes|Yes|Yes|

# How to Use

```python
from transformers import pipeline
classifier = pipeline("sentiment-analysis", model="michellejieli/emotion_text_classifier")
classifier("I love this!")
```

```python
Output:
[{'label': 'joy', 'score': 0.9887555241584778}]
```

# Contact

Please reach out to [michelleli1999@gmail.com](mailto:michelleli1999@gmail.com) if you have any questions or feedback.

# Reference

```
Jochen Hartmann, "Emotion English DistilRoBERTa-base". https://huggingface.co/j-hartmann/emotion-english-distilroberta-base/, 2022.

Ashritha R Murthy and K M Anil Kumar 2021 IOP Conf. Ser.: Mater. Sci. Eng. 1110 012009
```
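For classifying dialogue, it is often useful to score every utterance in a scene and then summarize. One simple option is to average the per-class scores across utterances. The sketch below uses hardcoded predictions in the pipeline's `[{'label': ..., 'score': ...}]` format (the numbers are made up for illustration, so it runs without downloading the model); with the real classifier you would request all class scores per utterance, e.g. via `top_k=None` or the older `return_all_scores=True`, depending on your transformers version.

```python
from collections import defaultdict

# Hypothetical per-utterance classifier outputs for one scene, in the
# pipeline's output format with all class scores returned.
# The scores are made up for illustration only.
scene_predictions = [
    [{"label": "joy", "score": 0.7}, {"label": "surprise", "score": 0.2}, {"label": "neutral", "score": 0.1}],
    [{"label": "joy", "score": 0.5}, {"label": "surprise", "score": 0.4}, {"label": "neutral", "score": 0.1}],
    [{"label": "neutral", "score": 0.6}, {"label": "joy", "score": 0.3}, {"label": "surprise", "score": 0.1}],
]

def scene_emotion_profile(predictions):
    """Average each emotion's score over all utterances in a scene."""
    totals = defaultdict(float)
    for utterance_scores in predictions:
        for item in utterance_scores:
            totals[item["label"]] += item["score"]
    n = len(predictions)
    return {label: total / n for label, total in totals.items()}

profile = scene_emotion_profile(scene_predictions)
dominant = max(profile, key=profile.get)
print(profile)   # averaged score per emotion across the scene
print(dominant)  # "joy" for this made-up scene
```

Averaging keeps the result a probability distribution (each utterance's scores sum to 1, so the averages do too), which makes scene-level profiles comparable across scenes of different lengths.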
openai/whisper-large-v2
2023-09-08T12:54:49.000Z
[ "transformers", "pytorch", "tf", "jax", "safetensors", "whisper", "automatic-speech-recognition", "audio", "hf-asr-leaderboard", "en", "zh", "de", "es", "ru", "ko", "fr", "ja", "pt", "tr", "pl", "ca", "nl", "ar", "sv", "it", "id", "hi", "fi", "vi", "he", "uk", "el", "ms", "cs", "ro", "da", "hu", "ta", "no", "th", "ur", "hr", "bg", "lt", "la", "mi", "ml", "cy", "sk", "te", "fa", "lv", "bn", "sr", "az", "sl", "kn", "et", "mk", "br", "eu", "is", "hy", "ne", "mn", "bs", "kk", "sq", "sw", "gl", "mr", "pa", "si", "km", "sn", "yo", "so", "af", "oc", "ka", "be", "tg", "sd", "gu", "am", "yi", "lo", "uz", "fo", "ht", "ps", "tk", "nn", "mt", "sa", "lb", "my", "bo", "tl", "mg", "as", "tt", "haw", "ln", "ha", "ba", "jw", "su", "arxiv:2212.04356", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
automatic-speech-recognition
openai
null
null
openai/whisper-large-v2
1,383
270,529
transformers
2022-12-05T18:42:20
---
language:
- en
- zh
- de
- es
- ru
- ko
- fr
- ja
- pt
- tr
- pl
- ca
- nl
- ar
- sv
- it
- id
- hi
- fi
- vi
- he
- uk
- el
- ms
- cs
- ro
- da
- hu
- ta
- no
- th
- ur
- hr
- bg
- lt
- la
- mi
- ml
- cy
- sk
- te
- fa
- lv
- bn
- sr
- az
- sl
- kn
- et
- mk
- br
- eu
- is
- hy
- ne
- mn
- bs
- kk
- sq
- sw
- gl
- mr
- pa
- si
- km
- sn
- yo
- so
- af
- oc
- ka
- be
- tg
- sd
- gu
- am
- yi
- lo
- uz
- fo
- ht
- ps
- tk
- nn
- mt
- sa
- lb
- my
- bo
- tl
- mg
- as
- tt
- haw
- ln
- ha
- ba
- jw
- su
tags:
- audio
- automatic-speech-recognition
- hf-asr-leaderboard
widget:
- example_title: Librispeech sample 1
  src: https://cdn-media.huggingface.co/speech_samples/sample1.flac
- example_title: Librispeech sample 2
  src: https://cdn-media.huggingface.co/speech_samples/sample2.flac
pipeline_tag: automatic-speech-recognition
license: apache-2.0
---

# Whisper

Whisper is a pre-trained model for automatic speech recognition (ASR) and speech translation. Trained on 680k hours of labelled data, Whisper models demonstrate a strong ability to generalise to many datasets and domains **without** the need for fine-tuning.

Whisper was proposed in the paper [Robust Speech Recognition via Large-Scale Weak Supervision](https://arxiv.org/abs/2212.04356) by Alec Radford et al. from OpenAI. The original code repository can be found [here](https://github.com/openai/whisper).

Compared to the Whisper large model, the large-v2 model is trained for 2.5x more epochs with added regularization for improved performance.

**Disclaimer**: Content for this model card has partly been written by the Hugging Face team, and parts of it were copied and pasted from the original model card.

## Model details

Whisper is a Transformer based encoder-decoder model, also referred to as a _sequence-to-sequence_ model. It was trained on 680k hours of labelled speech data annotated using large-scale weak supervision.

The models were trained on either English-only data or multilingual data.
The English-only models were trained on the task of speech recognition. The multilingual models were trained on both speech recognition and speech translation. For speech recognition, the model predicts transcriptions in the *same* language as the audio. For speech translation, the model predicts transcriptions to a *different* language to the audio.

Whisper checkpoints come in five configurations of varying model sizes. The smallest four are trained on either English-only or multilingual data. The largest checkpoints are multilingual only. All ten of the pre-trained checkpoints are available on the [Hugging Face Hub](https://huggingface.co/models?search=openai/whisper). The checkpoints are summarised in the following table with links to the models on the Hub:

| Size     | Parameters | English-only                                         | Multilingual                                        |
|----------|------------|------------------------------------------------------|-----------------------------------------------------|
| tiny     | 39 M       | [✓](https://huggingface.co/openai/whisper-tiny.en)   | [✓](https://huggingface.co/openai/whisper-tiny)     |
| base     | 74 M       | [✓](https://huggingface.co/openai/whisper-base.en)   | [✓](https://huggingface.co/openai/whisper-base)     |
| small    | 244 M      | [✓](https://huggingface.co/openai/whisper-small.en)  | [✓](https://huggingface.co/openai/whisper-small)    |
| medium   | 769 M      | [✓](https://huggingface.co/openai/whisper-medium.en) | [✓](https://huggingface.co/openai/whisper-medium)   |
| large    | 1550 M     | x                                                    | [✓](https://huggingface.co/openai/whisper-large)    |
| large-v2 | 1550 M     | x                                                    | [✓](https://huggingface.co/openai/whisper-large-v2) |

# Usage

To transcribe audio samples, the model has to be used alongside a [`WhisperProcessor`](https://huggingface.co/docs/transformers/model_doc/whisper#transformers.WhisperProcessor).

The `WhisperProcessor` is used to:
1. Pre-process the audio inputs (converting them to log-Mel spectrograms for the model)
2. Post-process the model outputs (converting them from tokens to text)

The model is informed of which task to perform (transcription or translation) by passing the appropriate "context tokens". These context tokens are a sequence of tokens that are given to the decoder at the start of the decoding process, and take the following order:
1. The transcription always starts with the `<|startoftranscript|>` token
2. The second token is the language token (e.g. `<|en|>` for English)
3. The third token is the "task token". It can take one of two values: `<|transcribe|>` for speech recognition or `<|translate|>` for speech translation
4. In addition, a `<|notimestamps|>` token is added if the model should not include timestamp prediction

Thus, a typical sequence of context tokens might look as follows:
```
<|startoftranscript|> <|en|> <|transcribe|> <|notimestamps|>
```
Which tells the model to decode in English, under the task of speech recognition, and not to predict timestamps.

These tokens can either be forced or un-forced. If they are forced, the model is made to predict each token at each position. This allows one to control the output language and task for the Whisper model. If they are un-forced, the Whisper model will automatically predict the output language and task itself.

The context tokens can be set accordingly:

```python
model.config.forced_decoder_ids = WhisperProcessor.get_decoder_prompt_ids(language="english", task="transcribe")
```

Which forces the model to predict in English under the task of speech recognition.

## Transcription

### English to English
In this example, the context tokens are 'unforced', meaning the model automatically predicts the output language (English) and task (transcribe).
```python
>>> from transformers import WhisperProcessor, WhisperForConditionalGeneration
>>> from datasets import load_dataset

>>> # load model and processor
>>> processor = WhisperProcessor.from_pretrained("openai/whisper-large-v2")
>>> model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-large-v2")
>>> model.config.forced_decoder_ids = None

>>> # load dummy dataset and read audio files
>>> ds = load_dataset("hf-internal-testing/librispeech_asr_dummy", "clean", split="validation")
>>> sample = ds[0]["audio"]
>>> input_features = processor(sample["array"], sampling_rate=sample["sampling_rate"], return_tensors="pt").input_features

>>> # generate token ids
>>> predicted_ids = model.generate(input_features)
>>> # decode token ids to text
>>> transcription = processor.batch_decode(predicted_ids, skip_special_tokens=False)
['<|startoftranscript|><|en|><|transcribe|><|notimestamps|> Mr. Quilter is the apostle of the middle classes and we are glad to welcome his gospel.<|endoftext|>']

>>> transcription = processor.batch_decode(predicted_ids, skip_special_tokens=True)
[' Mr. Quilter is the apostle of the middle classes and we are glad to welcome his gospel.']
```
The context tokens can be removed from the start of the transcription by setting `skip_special_tokens=True`.

### French to French
The following example demonstrates French to French transcription by setting the decoder ids appropriately.
```python
>>> from transformers import WhisperProcessor, WhisperForConditionalGeneration
>>> from datasets import Audio, load_dataset

>>> # load model and processor
>>> processor = WhisperProcessor.from_pretrained("openai/whisper-large-v2")
>>> model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-large-v2")
>>> forced_decoder_ids = processor.get_decoder_prompt_ids(language="french", task="transcribe")

>>> # load streaming dataset and read first audio sample
>>> ds = load_dataset("common_voice", "fr", split="test", streaming=True)
>>> ds = ds.cast_column("audio", Audio(sampling_rate=16_000))
>>> input_speech = next(iter(ds))["audio"]
>>> input_features = processor(input_speech["array"], sampling_rate=input_speech["sampling_rate"], return_tensors="pt").input_features

>>> # generate token ids
>>> predicted_ids = model.generate(input_features, forced_decoder_ids=forced_decoder_ids)
>>> # decode token ids to text
>>> transcription = processor.batch_decode(predicted_ids)
['<|startoftranscript|><|fr|><|transcribe|><|notimestamps|> Un vrai travail intéressant va enfin être mené sur ce sujet.<|endoftext|>']

>>> transcription = processor.batch_decode(predicted_ids, skip_special_tokens=True)
[' Un vrai travail intéressant va enfin être mené sur ce sujet.']
```

## Translation
Setting the task to "translate" forces the Whisper model to perform speech translation.
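Before running a full translation example, it can be useful to confirm which context tokens `get_decoder_prompt_ids` will force when `task="translate"` is set. The sketch below is a small sanity check, not part of the original examples; it assumes the multilingual tokenizer's special tokens are laid out as described above (positions 1–3, with `<|startoftranscript|>` handled automatically at position 0):

```python
from transformers import WhisperProcessor

# Load only the processor (tokenizer + feature extractor); no model weights needed.
processor = WhisperProcessor.from_pretrained("openai/whisper-large-v2")

# Returns a list of (position, token_id) pairs starting at position 1.
forced_decoder_ids = processor.get_decoder_prompt_ids(language="french", task="translate")

# Decode the forced token ids back to their special-token strings to
# verify the order: language token, task token, no-timestamps token.
prompt = processor.tokenizer.decode([tok for _, tok in forced_decoder_ids])
print(forced_decoder_ids)
print(prompt)  # expected: <|fr|><|translate|><|notimestamps|>
```

Swapping `task="translate"` for `task="transcribe"` should change only the task token in the decoded prompt.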
### French to English

```python
>>> from transformers import WhisperProcessor, WhisperForConditionalGeneration
>>> from datasets import Audio, load_dataset

>>> # load model and processor
>>> processor = WhisperProcessor.from_pretrained("openai/whisper-large-v2")
>>> model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-large-v2")
>>> forced_decoder_ids = processor.get_decoder_prompt_ids(language="french", task="translate")

>>> # load streaming dataset and read first audio sample
>>> ds = load_dataset("common_voice", "fr", split="test", streaming=True)
>>> ds = ds.cast_column("audio", Audio(sampling_rate=16_000))
>>> input_speech = next(iter(ds))["audio"]
>>> input_features = processor(input_speech["array"], sampling_rate=input_speech["sampling_rate"], return_tensors="pt").input_features

>>> # generate token ids
>>> predicted_ids = model.generate(input_features, forced_decoder_ids=forced_decoder_ids)
>>> # decode token ids to text
>>> transcription = processor.batch_decode(predicted_ids, skip_special_tokens=True)
[' A very interesting work, we will finally be given on this subject.']
```

## Evaluation

This code snippet shows how to evaluate Whisper Large on [LibriSpeech test-clean](https://huggingface.co/datasets/librispeech_asr):

```python
>>> from datasets import load_dataset
>>> from transformers import WhisperForConditionalGeneration, WhisperProcessor
>>> import torch
>>> from evaluate import load

>>> librispeech_test_clean = load_dataset("librispeech_asr", "clean", split="test")

>>> processor = WhisperProcessor.from_pretrained("openai/whisper-large-v2")
>>> model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-large-v2").to("cuda")

>>> def map_to_pred(batch):
>>>     audio = batch["audio"]
>>>     input_features = processor(audio["array"], sampling_rate=audio["sampling_rate"], return_tensors="pt").input_features
>>>     batch["reference"] = processor.tokenizer._normalize(batch['text'])
>>>
>>>     with torch.no_grad():
>>>         predicted_ids = model.generate(input_features.to("cuda"))[0]
>>>     transcription = processor.decode(predicted_ids)
>>>     batch["prediction"] = processor.tokenizer._normalize(transcription)
>>>     return batch

>>> result = librispeech_test_clean.map(map_to_pred)

>>> wer = load("wer")
>>> print(100 * wer.compute(references=result["reference"], predictions=result["prediction"]))
3.0003583080317572
```

## Long-Form Transcription

The Whisper model is intrinsically designed to work on audio samples of up to 30s in duration. However, by using a chunking algorithm, it can be used to transcribe audio samples of arbitrary length. This is possible through Transformers' [`pipeline`](https://huggingface.co/docs/transformers/main_classes/pipelines#transformers.AutomaticSpeechRecognitionPipeline) method. Chunking is enabled by setting `chunk_length_s=30` when instantiating the pipeline. With chunking enabled, the pipeline can be run with batched inference. It can also be extended to predict sequence level timestamps by passing `return_timestamps=True`:

```python
>>> import torch
>>> from transformers import pipeline
>>> from datasets import load_dataset

>>> device = "cuda:0" if torch.cuda.is_available() else "cpu"

>>> pipe = pipeline(
>>>     "automatic-speech-recognition",
>>>     model="openai/whisper-large-v2",
>>>     chunk_length_s=30,
>>>     device=device,
>>> )

>>> ds = load_dataset("hf-internal-testing/librispeech_asr_dummy", "clean", split="validation")
>>> sample = ds[0]["audio"]

>>> prediction = pipe(sample.copy(), batch_size=8)["text"]
" Mr. Quilter is the apostle of the middle classes, and we are glad to welcome his gospel."

>>> # we can also return timestamps for the predictions
>>> prediction = pipe(sample.copy(), batch_size=8, return_timestamps=True)["chunks"]
[{'text': ' Mr. Quilter is the apostle of the middle classes and we are glad to welcome his gospel.',
  'timestamp': (0.0, 5.44)}]
```

Refer to the blog post [ASR Chunking](https://huggingface.co/blog/asr-chunking) for more details on the chunking algorithm.

## Fine-Tuning

The pre-trained Whisper model demonstrates a strong ability to generalise to different datasets and domains. However, its predictive capabilities can be improved further for certain languages and tasks through *fine-tuning*. The blog post [Fine-Tune Whisper with 🤗 Transformers](https://huggingface.co/blog/fine-tune-whisper) provides a step-by-step guide to fine-tuning the Whisper model with as little as 5 hours of labelled data.

### Evaluated Use

The primary intended users of these models are AI researchers studying robustness, generalization, capabilities, biases, and constraints of the current model. However, Whisper is also potentially quite useful as an ASR solution for developers, especially for English speech recognition. We recognize that once models are released, it is impossible to restrict access to only "intended" uses or to draw reasonable guidelines around what is or is not research.

The models are primarily trained and evaluated on ASR and speech translation to English tasks. They show strong ASR results in ~10 languages. They may exhibit additional capabilities, particularly if fine-tuned on certain tasks like voice activity detection, speaker classification, or speaker diarization, but have not been robustly evaluated in these areas. We strongly recommend that users perform robust evaluations of the models in a particular context and domain before deploying them.

In particular, we caution against using Whisper models to transcribe recordings of individuals taken without their consent or purporting to use these models for any kind of subjective classification. We recommend against use in high-risk domains like decision-making contexts, where flaws in accuracy can lead to pronounced flaws in outcomes.
The models are intended to transcribe and translate speech; using them for classification is not only unevaluated but also inappropriate, particularly to infer human attributes.

## Training Data

The models are trained on 680,000 hours of audio and the corresponding transcripts collected from the internet. 65% of this data (or 438,000 hours) represents English-language audio and matched English transcripts, roughly 18% (or 126,000 hours) represents non-English audio and English transcripts, while the final 17% (or 117,000 hours) represents non-English audio and the corresponding transcript. This non-English data represents 98 different languages.

As discussed in [the accompanying paper](https://cdn.openai.com/papers/whisper.pdf), we see that performance on transcription in a given language is directly correlated with the amount of training data we employ in that language.

## Performance and Limitations

Our studies show that, compared to many existing ASR systems, the models exhibit improved robustness to accents, background noise, and technical language, as well as zero-shot translation from multiple languages into English, and that accuracy on speech recognition and translation is near the state-of-the-art level.

However, because the models are trained in a weakly supervised manner using large-scale noisy data, the predictions may include texts that are not actually spoken in the audio input (i.e. hallucination). We hypothesize that this happens because, given their general knowledge of language, the models combine trying to predict the next word in audio with trying to transcribe the audio itself.

Our models perform unevenly across languages, and we observe lower accuracy on low-resource and/or low-discoverability languages or languages where we have less training data.
The models also exhibit disparate performance on different accents and dialects of particular languages, which may include higher word error rates across speakers of different genders, races, ages, or other demographic criteria. Our full evaluation results are presented in [the paper accompanying this release](https://cdn.openai.com/papers/whisper.pdf).

In addition, the sequence-to-sequence architecture of the model makes it prone to generating repetitive texts, which can be mitigated to some degree by beam search and temperature scheduling but not perfectly. Further analysis of these limitations is provided in [the paper](https://cdn.openai.com/papers/whisper.pdf). It is likely that this behavior and the hallucinations may be worse in lower-resource and/or lower-discoverability languages.

## Broader Implications

We anticipate that Whisper models' transcription capabilities may be used for improving accessibility tools. While Whisper models cannot be used for real-time transcription out of the box, their speed and size suggest that others may be able to build applications on top of them that allow for near-real-time speech recognition and translation. The real value of beneficial applications built on top of Whisper models suggests that the disparate performance of these models may have real economic implications.

There are also potential dual-use concerns that come with releasing Whisper. While we hope the technology will be used primarily for beneficial purposes, making ASR technology more accessible could enable more actors to build capable surveillance technologies or scale up existing surveillance efforts, as the speed and accuracy allow for affordable automatic transcription and translation of large volumes of audio communication. Moreover, these models may have some capabilities to recognize specific individuals out of the box, which in turn presents safety concerns related both to dual use and disparate performance.
In practice, we expect that the cost of transcription is not the limiting factor of scaling up surveillance projects.

### BibTeX entry and citation info

```bibtex
@misc{radford2022whisper,
  doi = {10.48550/ARXIV.2212.04356},
  url = {https://arxiv.org/abs/2212.04356},
  author = {Radford, Alec and Kim, Jong Wook and Xu, Tao and Brockman, Greg and McLeavey, Christine and Sutskever, Ilya},
  title = {Robust Speech Recognition via Large-Scale Weak Supervision},
  publisher = {arXiv},
  year = {2022},
  copyright = {arXiv.org perpetual, non-exclusive license}
}
```
18,955
[ [ -0.0144195556640625, -0.043365478515625, 0.00868988037109375, 0.035003662109375, -0.00959014892578125, -0.00876617431640625, -0.023956298828125, -0.042388916015625, 0.0155181884765625, 0.034423828125, -0.061004638671875, -0.043792724609375, -0.05792236328125, -0.0008959770202636719, -0.0401611328125, 0.0762939453125, 0.007595062255859375, -0.0012598037719726562, 0.01224517822265625, -0.00768280029296875, -0.03314208984375, -0.035491943359375, -0.053802490234375, -0.01525115966796875, 0.01129913330078125, 0.01251983642578125, 0.0299224853515625, 0.0372314453125, 0.00804901123046875, 0.0307159423828125, -0.032257080078125, -0.002330780029296875, -0.03106689453125, -0.004955291748046875, 0.027557373046875, -0.038330078125, -0.044952392578125, 0.00609588623046875, 0.054656982421875, 0.038818359375, -0.01800537109375, 0.028350830078125, 0.016937255859375, 0.0244903564453125, -0.019256591796875, 0.0226287841796875, -0.04620361328125, -0.01061248779296875, -0.019866943359375, -0.0023708343505859375, -0.0243682861328125, -0.0191192626953125, 0.04046630859375, -0.04229736328125, 0.0204925537109375, 0.0008630752563476562, 0.07574462890625, 0.0155029296875, -0.01397705078125, -0.02764892578125, -0.05303955078125, 0.0770263671875, -0.06512451171875, 0.03692626953125, 0.037567138671875, 0.0135955810546875, -0.00481414794921875, -0.0662841796875, -0.05157470703125, -0.0014524459838867188, -0.007198333740234375, 0.0220947265625, -0.034942626953125, 0.0014371871948242188, 0.0180206298828125, 0.0200042724609375, -0.03704833984375, 0.0019664764404296875, -0.049102783203125, -0.05413818359375, 0.043487548828125, -0.0010995864868164062, 0.0247955322265625, -0.019805908203125, -0.0161590576171875, -0.0240020751953125, -0.0233001708984375, 0.034942626953125, 0.026611328125, 0.034332275390625, -0.048614501953125, 0.029571533203125, -0.0092315673828125, 0.047821044921875, 0.0116424560546875, -0.04998779296875, 0.049102783203125, -0.014801025390625, -0.01239776611328125, 
0.02655029296875, 0.074951171875, 0.02264404296875, 0.01116943359375, 0.002101898193359375, -0.017974853515625, 0.007335662841796875, -0.00811767578125, -0.054931640625, 0.006160736083984375, 0.037689208984375, -0.040985107421875, -0.023040771484375, -0.0176544189453125, -0.037506103515625, 0.017059326171875, -0.018768310546875, 0.05230712890625, -0.042633056640625, -0.02740478515625, 0.01189422607421875, -0.03021240234375, 0.01904296875, 0.002048492431640625, -0.06072998046875, 0.0273590087890625, 0.03204345703125, 0.06744384765625, 0.0042877197265625, -0.049224853515625, -0.042816162109375, 0.006107330322265625, 0.0036468505859375, 0.03485107421875, -0.018341064453125, -0.043121337890625, -0.006336212158203125, 0.01358795166015625, -0.0287322998046875, -0.038116455078125, 0.051971435546875, -0.009857177734375, 0.034393310546875, -0.005786895751953125, -0.037322998046875, -0.0209503173828125, -0.01250457763671875, -0.03363037109375, 0.07122802734375, 0.0114898681640625, -0.052398681640625, 0.009521484375, -0.038970947265625, -0.0380859375, -0.01276397705078125, 0.0193634033203125, -0.032928466796875, -0.0021724700927734375, 0.032379150390625, 0.0362548828125, -0.00971221923828125, 0.009124755859375, 0.00719451904296875, -0.030364990234375, 0.0269317626953125, -0.03411865234375, 0.07470703125, 0.011871337890625, -0.02960205078125, 0.01396942138671875, -0.058258056640625, 0.005100250244140625, 0.00527191162109375, -0.0191802978515625, 0.007137298583984375, -0.0008854866027832031, 0.021148681640625, 0.0077056884765625, 0.0158233642578125, -0.0517578125, -0.0056610107421875, -0.049530029296875, 0.06597900390625, 0.042327880859375, -0.00254058837890625, 0.027008056640625, -0.040374755859375, 0.022735595703125, 0.010223388671875, 0.0269317626953125, -0.018890380859375, -0.04681396484375, -0.0618896484375, -0.029327392578125, 0.033782958984375, 0.0594482421875, -0.03082275390625, 0.044219970703125, -0.0216217041015625, -0.0443115234375, -0.09228515625, 
-0.00635528564453125, 0.042877197265625, 0.046722412109375, 0.049224853515625, -0.00411224365234375, -0.04974365234375, -0.060272216796875, -0.0091705322265625, -0.0219573974609375, -0.01235198974609375, 0.0259246826171875, 0.0263671875, -0.02923583984375, 0.0511474609375, -0.029876708984375, -0.03973388671875, -0.025360107421875, 0.0038909912109375, 0.032928466796875, 0.045928955078125, 0.025177001953125, -0.05645751953125, -0.0278472900390625, -0.0147247314453125, -0.04168701171875, -0.0120086669921875, -0.0090484619140625, -0.001373291015625, 0.016448974609375, 0.035125732421875, -0.052398681640625, 0.0340576171875, 0.052734375, -0.01467132568359375, 0.047210693359375, 0.004650115966796875, -0.002986907958984375, -0.0892333984375, 0.0016622543334960938, -0.016448974609375, -0.0121002197265625, -0.05328369140625, -0.01739501953125, -0.004878997802734375, -0.007213592529296875, -0.04229736328125, 0.0457763671875, -0.0261993408203125, 0.0035877227783203125, -0.00469207763671875, 0.010498046875, -0.0028743743896484375, 0.0487060546875, 0.019744873046875, 0.05224609375, 0.06201171875, -0.042388916015625, 0.01702880859375, 0.04205322265625, -0.01947021484375, 0.0216522216796875, -0.0728759765625, 0.0100250244140625, 0.005367279052734375, 0.01413726806640625, -0.0675048828125, -0.00882720947265625, 0.0034198760986328125, -0.071044921875, 0.03240966796875, -0.0255126953125, -0.02288818359375, -0.040008544921875, -0.0084228515625, 0.005687713623046875, 0.06512451171875, -0.03631591796875, 0.053070068359375, 0.03228759765625, -0.0177459716796875, -0.04241943359375, -0.052337646484375, -0.004978179931640625, -0.0102996826171875, -0.057891845703125, 0.037200927734375, -0.00162506103515625, 0.0048828125, -0.00859832763671875, -0.003814697265625, 0.009033203125, -0.0147857666015625, 0.035430908203125, 0.03131103515625, -0.006084442138671875, -0.0190277099609375, 0.018524169921875, -0.0186767578125, 0.0007233619689941406, -0.0206756591796875, 0.05059814453125, 
(embedding values omitted) ] ]
facebook/blenderbot-400M-distill
2023-03-30T16:12:30.000Z
[ "transformers", "pytorch", "tf", "jax", "blenderbot", "text2text-generation", "convAI", "conversational", "facebook", "en", "dataset:blended_skill_talk", "arxiv:2004.13637", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
conversational
facebook
null
null
facebook/blenderbot-400M-distill
283
268,295
transformers
2022-03-02T23:29:05
---
language:
- en
thumbnail:
tags:
- convAI
- conversational
- facebook
license: apache-2.0
datasets:
- blended_skill_talk
metrics:
- perplexity
---

## Model description

+ Paper: [Recipes for building an open-domain chatbot](https://arxiv.org/abs/2004.13637)
+ [Original PARLAI Code](https://parl.ai/projects/recipes/)

### Abstract

Building open-domain chatbots is a challenging area for machine learning research. While prior work has shown that scaling neural models in the number of parameters and the size of the data they are trained on gives improved results, we show that other ingredients are important for a high-performing chatbot. Good conversation requires a number of skills that an expert conversationalist blends in a seamless way: providing engaging talking points and listening to their partners, both asking and answering questions, and displaying knowledge, empathy and personality appropriately, depending on the situation. We show that large scale models can learn these skills when given appropriate training data and choice of generation strategy. We build variants of these recipes with 90M, 2.7B and 9.4B parameter neural models, and make our models and code publicly available. Human evaluations show our best models are superior to existing approaches in multi-turn dialogue in terms of engagingness and humanness measurements. We then discuss the limitations of this work by analyzing failure cases of our models.
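The recipe described above conditions generation on the multi-turn conversation history. As a minimal, model-agnostic sketch of that idea (the newline separator below is a hypothetical placeholder; the real BlenderBot tokenizer defines its own turn delimiter):

```python
def build_context(turns, sep="\n"):
    """Flatten a dialogue history into a single encoder input string.

    The separator here is illustrative only; the actual BlenderBot
    tokenizer applies its own turn delimiter when encoding history.
    """
    return sep.join(turns)


history = [
    "Hello, how are you?",
    "I'm doing well. What are your hobbies?",
    "I like reading about machine learning.",
]
print(build_context(history))
```

Each generated reply would then be appended to `history` before the next call, so the model always sees the full conversation so far.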
1,451
[ [ (embedding values omitted) ] ]
indolem/indobert-base-uncased
2023-08-09T13:07:37.000Z
[ "transformers", "pytorch", "jax", "bert", "fill-mask", "indobert", "indolem", "id", "arxiv:2011.00677", "license:mit", "autotrain_compatible", "has_space", "region:us" ]
fill-mask
indolem
null
null
indolem/indobert-base-uncased
23
266,655
transformers
2022-03-02T23:29:05
---
language: id
tags:
- indobert
- indolem
license: mit
inference: False
---

## About

[IndoBERT](https://arxiv.org/pdf/2011.00677.pdf) is the Indonesian version of the BERT model. We train the model using over 220M words, aggregated from three main sources:

* Indonesian Wikipedia (74M words)
* news articles from Kompas, Tempo (Tala et al., 2003), and Liputan6 (55M words in total)
* an Indonesian Web Corpus (Medved and Suchomel, 2017) (90M words)

We trained the model for 2.4M steps (180 epochs), with a final perplexity over the development set of <b>3.97</b> (similar to English BERT-base).

This <b>IndoBERT</b> was used to examine IndoLEM, an Indonesian benchmark that comprises seven tasks for the Indonesian language, spanning morpho-syntax, semantics, and discourse.

| Task | Metric | Bi-LSTM | mBERT | MalayBERT | IndoBERT |
| ---- | ---- | ---- | ---- | ---- | ---- |
| POS Tagging | Acc | 95.4 | <b>96.8</b> | <b>96.8</b> | <b>96.8</b> |
| NER UGM | F1 | 70.9 | 71.6 | 73.2 | <b>74.9</b> |
| NER UI | F1 | 82.2 | 82.2 | 87.4 | <b>90.1</b> |
| Dep. Parsing (UD-Indo-GSD) | UAS/LAS | 85.25/80.35 | 86.85/81.78 | 86.99/81.87 | <b>87.12</b>/<b>82.32</b> |
| Dep. Parsing (UD-Indo-PUD) | UAS/LAS | 84.04/79.01 | <b>90.58</b>/<b>85.44</b> | 88.91/83.56 | 89.23/83.95 |
| Sentiment Analysis | F1 | 71.62 | 76.58 | 82.02 | <b>84.13</b> |
| Summarization | R1/R2/RL | 67.96/61.65/67.24 | 68.40/61.66/67.67 | 68.44/61.38/67.71 | <b>69.93</b>/<b>62.86</b>/<b>69.21</b> |
| Next Tweet Prediction | Acc | 73.6 | 92.4 | 93.1 | <b>93.7</b> |
| Tweet Ordering | Spearman corr. | 0.45 | 0.53 | 0.51 | <b>0.59</b> |

The paper was published at the 28th COLING 2020. Please refer to https://indolem.github.io for more details about the benchmarks.
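The development-set perplexity of 3.97 quoted above is the exponential of the mean per-token negative log-likelihood. A minimal sketch of that relationship (illustrative only, not the actual training code):

```python
import math


def perplexity(per_token_nll):
    """Perplexity = exp of the mean per-token negative log-likelihood."""
    return math.exp(sum(per_token_nll) / len(per_token_nll))


# A model that assigns probability 1/4 to every token has a per-token
# negative log-likelihood of log(4), so its perplexity is about 4.
uniform_nll = [math.log(4)] * 10
print(round(perplexity(uniform_nll), 6))
```

Lower perplexity means the model spreads less probability mass over wrong continuations; a perfect model (probability 1 on every token) has perplexity 1.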
## How to use

### Load model and tokenizer (tested with transformers==3.5.1)

```python
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("indolem/indobert-base-uncased")
model = AutoModel.from_pretrained("indolem/indobert-base-uncased")
```

## Citation

If you use our work, please cite:

```bibtex
@inproceedings{koto2020indolem,
  title={IndoLEM and IndoBERT: A Benchmark Dataset and Pre-trained Language Model for Indonesian NLP},
  author={Fajri Koto and Afshin Rahimi and Jey Han Lau and Timothy Baldwin},
  booktitle={Proceedings of the 28th COLING},
  year={2020}
}
```
2,373
[ [ (embedding values omitted)
-0.01763916015625, -0.00537109375, -0.01308441162109375, -0.0384521484375, 0.03448486328125, 0.0225067138671875, -0.0168609619140625, -0.004703521728515625, 0.00982666015625, 0.01268768310546875, -0.003238677978515625, 0.00023663043975830078, 0.048919677734375, -0.005741119384765625, -0.0723876953125, 0.06158447265625, -0.0032558441162109375, 0.04803466796875, -0.037506103515625, 0.003955841064453125, -0.0169525146484375, 0.0226898193359375, 0.00002092123031616211, -0.03570556640625, 0.018707275390625, 0.0012845993041992188, -0.0030879974365234375, -0.006206512451171875, 0.05609130859375, -0.028717041015625, -0.0504150390625, 0.014556884765625, 0.0166473388671875, 0.0120697021484375, -0.0017843246459960938, -0.057373046875, 0.00878143310546875, -0.0013952255249023438, -0.040435791015625, -0.0007476806640625, 0.0243682861328125, 0.0031280517578125, 0.038421630859375, 0.03302001953125, -0.0017156600952148438, 0.01258087158203125, -0.0215606689453125, 0.05303955078125, -0.040557861328125, -0.044036865234375, -0.06304931640625, 0.04058837890625, -0.033782958984375, -0.05072021484375, 0.06622314453125, 0.04339599609375, 0.07635498046875, 0.007007598876953125, 0.07440185546875, -0.039886474609375, 0.062225341796875, -0.0030517578125, 0.03509521484375, -0.055389404296875, -0.0024967193603515625, -0.020782470703125, -0.0889892578125, -0.02044677734375, 0.06304931640625, -0.032440185546875, 0.0019197463989257812, 0.044036865234375, 0.03857421875, -0.00040221214294433594, -0.00449371337890625, -0.0170745849609375, 0.0309295654296875, 0.03564453125, 0.01560211181640625, 0.0286102294921875, -0.0426025390625, 0.050933837890625, -0.049591064453125, -0.016815185546875, -0.04510498046875, -0.0570068359375, -0.0914306640625, -0.057861328125, -0.012603759765625, -0.0231781005859375, 0.006195068359375, 0.087158203125, 0.039581298828125, -0.0662841796875, -0.01506805419921875, -0.01824951171875, -0.01509857177734375, -0.0223388671875, -0.019195556640625, 0.06689453125, 
-0.0374755859375, -0.0599365234375, -0.007732391357421875, 0.0213623046875, -0.01026153564453125, -0.0189666748046875, 0.0023956298828125, -0.0489501953125, 0.01097869873046875, 0.0447998046875, 0.024078369140625, -0.0609130859375, -0.01132965087890625, -0.01287078857421875, -0.0215606689453125, 0.01006317138671875, 0.02703857421875, -0.05389404296875, 0.053070068359375, 0.043792724609375, 0.04840087890625, 0.050140380859375, -0.006900787353515625, 0.0231781005859375, -0.052001953125, 0.0391845703125, 0.0139007568359375, 0.0222015380859375, 0.026336669921875, 0.01294708251953125, 0.05181884765625, 0.0183563232421875, -0.04840087890625, -0.06500244140625, -0.00829315185546875, -0.053863525390625, -0.0222930908203125, 0.06451416015625, -0.01141357421875, -0.027191162109375, 0.0017805099487304688, -0.020111083984375, 0.0303497314453125, -0.02581787109375, 0.062103271484375, 0.050018310546875, -0.0102691650390625, -0.0060577392578125, -0.046661376953125, 0.045623779296875, 0.05999755859375, -0.03875732421875, -0.01434326171875, 0.006305694580078125, 0.005512237548828125, 0.01090240478515625, 0.05120849609375, 0.0036525726318359375, 0.013458251953125, -0.0005412101745605469, 0.0164947509765625, 0.0096893310546875, -0.0148162841796875, -0.002643585205078125, 0.002544403076171875, -0.007022857666015625, -0.01532745361328125 ] ]
mrm8488/bert-spanish-cased-finetuned-ner
2021-05-20T00:35:25.000Z
[ "transformers", "pytorch", "jax", "bert", "token-classification", "es", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
token-classification
mrm8488
null
null
mrm8488/bert-spanish-cased-finetuned-ner
18
266,332
transformers
2022-03-02T23:29:05
---
language: es
thumbnail: https://i.imgur.com/jgBdimh.png
---

# Spanish BERT (BETO) + NER

This model is a version of the cased Spanish BERT [(BETO)](https://github.com/dccuchile/beto) fine-tuned on [NER-C](https://www.kaggle.com/nltkdata/conll-corpora) for the **NER** downstream task.

## Details of the downstream task (NER) - Dataset

- [Dataset: CONLL Corpora ES](https://www.kaggle.com/nltkdata/conll-corpora)

I preprocessed the dataset and split it as train / dev (80/20):

| Dataset | # Examples |
| ------- | ---------- |
| Train   | 8.7 K      |
| Dev     | 2.2 K      |

- [Fine-tune on NER script provided by Hugging Face](https://github.com/huggingface/transformers/blob/master/examples/token-classification/run_ner_old.py)
- Labels covered:

```
B-LOC
B-MISC
B-ORG
B-PER
I-LOC
I-MISC
I-ORG
I-PER
O
```

## Metrics on evaluation set:

| Metric    |  # score  |
| :-------- | :-------: |
| F1        | **90.17** |
| Precision | **89.86** |
| Recall    | **90.47** |

## Comparison:

| Model | # F1 score | Size (MB) |
| :---- | :--------: | :-------- |
| bert-base-spanish-wwm-cased (BETO) | 88.43 | 421 |
| [bert-spanish-cased-finetuned-ner (this one)](https://huggingface.co/mrm8488/bert-spanish-cased-finetuned-ner) | **90.17** | 420 |
| Best Multilingual BERT | 87.38 | 681 |
| [TinyBERT-spanish-uncased-finetuned-ner](https://huggingface.co/mrm8488/TinyBERT-spanish-uncased-finetuned-ner) | 70.00 | **55** |

## Model in action

Fast usage with **pipelines**:

```python
from transformers import pipeline

nlp_ner = pipeline(
    "ner",
    model="mrm8488/bert-spanish-cased-finetuned-ner",
    tokenizer=(
        'mrm8488/bert-spanish-cased-finetuned-ner',
        {"use_fast": False}
    ))

text = 'Mis amigos están pensando viajar a Londres este verano'

nlp_ner(text)
# Output: [{'entity': 'B-LOC', 'score': 0.9998720288276672, 'word': 'Londres'}]
```

> Created by [Manuel Romero/@mrm8488](https://twitter.com/mrm8488)
> Made with <span style="color: #e25555;">&hearts;</span> in Spain
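The pipeline above emits one prediction per token, using the B-/I- labels listed earlier. A minimal sketch of how such token-level BIO predictions can be merged into whole entity spans — the helper name and the sample predictions below are hypothetical illustrations, not real model output:

```python
# Minimal sketch: merge token-level BIO predictions (the shape a
# token-classification pipeline produces) into (label, text) entity spans.
# The sample input below is made up for illustration.

def group_bio(predictions):
    """Group a list of {'entity', 'word'} dicts into (label, text) spans."""
    entities = []
    current_label, current_words = None, []
    for pred in predictions:
        tag = pred["entity"]  # e.g. 'B-LOC', 'I-LOC' or 'O'
        if tag.startswith("B-"):
            if current_words:
                entities.append((current_label, " ".join(current_words)))
            current_label, current_words = tag[2:], [pred["word"]]
        elif tag.startswith("I-") and current_label == tag[2:]:
            current_words.append(pred["word"])
        else:  # 'O', or an I- tag that doesn't continue the open entity
            if current_words:
                entities.append((current_label, " ".join(current_words)))
            current_label, current_words = None, []
    if current_words:
        entities.append((current_label, " ".join(current_words)))
    return entities

preds = [
    {"entity": "B-PER", "word": "Manuel"},
    {"entity": "I-PER", "word": "Romero"},
    {"entity": "O", "word": "viaja"},
    {"entity": "O", "word": "a"},
    {"entity": "B-LOC", "word": "Londres"},
]
print(group_bio(preds))  # [('PER', 'Manuel Romero'), ('LOC', 'Londres')]
```

Newer versions of `transformers` can do this grouping for you inside the pipeline, but the sketch shows what that post-processing amounts to.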
2,716
[ [ … (embedding vector values omitted) … ] ]
bert-large-uncased-whole-word-masking
2023-04-06T13:39:50.000Z
[ "transformers", "pytorch", "tf", "jax", "safetensors", "bert", "fill-mask", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:1810.04805", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
fill-mask
null
null
null
bert-large-uncased-whole-word-masking
10
264,753
transformers
2022-03-02T23:29:04
---
language: en
license: apache-2.0
datasets:
- bookcorpus
- wikipedia
---

# BERT large model (uncased) whole word masking

Pretrained model on English language using a masked language modeling (MLM) objective. It was introduced in
[this paper](https://arxiv.org/abs/1810.04805) and first released in
[this repository](https://github.com/google-research/bert). This model is uncased: it does not make a difference
between english and English.

Unlike other BERT models, this model was trained with a new technique: Whole Word Masking. In this case, all of the
tokens corresponding to a word are masked at once. The overall masking rate remains the same.

The training is identical -- each masked WordPiece token is predicted independently.

Disclaimer: The team releasing BERT did not write a model card for this model so this model card has been written by
the Hugging Face team.

## Model description

BERT is a transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it
was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of
publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it
was pretrained with two objectives:

- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs
  the entire masked sentence through the model and has to predict the masked words. This is different from traditional
  recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like
  GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the
  sentence.
- Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes
  they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to
  predict if the two sentences were following each other or not.

This way, the model learns an inner representation of the English language that can then be used to extract features
useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard
classifier using the features produced by the BERT model as inputs.

This model has the following configuration:

- 24 layers
- 1024 hidden dimension
- 16 attention heads
- 336M parameters

## Intended uses & limitations

You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to
be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=bert) to look for
fine-tuned versions on a task that interests you.

Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)
to make decisions, such as sequence classification, token classification or question answering. For tasks such as text
generation you should look at models like GPT2.

### How to use

You can use this model directly with a pipeline for masked language modeling:

```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='bert-large-uncased-whole-word-masking')
>>> unmasker("Hello I'm a [MASK] model.")
[
    {'sequence': "[CLS] hello i'm a fashion model. [SEP]", 'score': 0.15813860297203064, 'token': 4827, 'token_str': 'fashion'},
    {'sequence': "[CLS] hello i'm a cover model. [SEP]", 'score': 0.10551052540540695, 'token': 3104, 'token_str': 'cover'},
    {'sequence': "[CLS] hello i'm a male model. [SEP]", 'score': 0.08340442180633545, 'token': 3287, 'token_str': 'male'},
    {'sequence': "[CLS] hello i'm a super model. [SEP]", 'score': 0.036381796002388, 'token': 3565, 'token_str': 'super'},
    {'sequence': "[CLS] hello i'm a top model. [SEP]", 'score': 0.03609578311443329, 'token': 2327, 'token_str': 'top'}
]
```

Here is how to use this model to get the features of a given text in PyTorch:

```python
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('bert-large-uncased-whole-word-masking')
model = BertModel.from_pretrained("bert-large-uncased-whole-word-masking")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```

and in TensorFlow:

```python
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('bert-large-uncased-whole-word-masking')
model = TFBertModel.from_pretrained("bert-large-uncased-whole-word-masking")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```

### Limitations and bias

Even if the training data used for this model could be characterized as fairly neutral, this model can have biased
predictions:

```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='bert-large-uncased-whole-word-masking')
>>> unmasker("The man worked as a [MASK].")
[
    {"sequence": "[CLS] the man worked as a waiter. [SEP]", "score": 0.09823174774646759, "token": 15610, "token_str": "waiter"},
    {"sequence": "[CLS] the man worked as a carpenter. [SEP]", "score": 0.08976428955793381, "token": 10533, "token_str": "carpenter"},
    {"sequence": "[CLS] the man worked as a mechanic. [SEP]", "score": 0.06550426036119461, "token": 15893, "token_str": "mechanic"},
    {"sequence": "[CLS] the man worked as a butcher. [SEP]", "score": 0.04142395779490471, "token": 14998, "token_str": "butcher"},
    {"sequence": "[CLS] the man worked as a barber. [SEP]", "score": 0.03680137172341347, "token": 13362, "token_str": "barber"}
]

>>> unmasker("The woman worked as a [MASK].")
[
    {"sequence": "[CLS] the woman worked as a waitress. [SEP]", "score": 0.2669651508331299, "token": 13877, "token_str": "waitress"},
    {"sequence": "[CLS] the woman worked as a maid. [SEP]", "score": 0.13054853677749634, "token": 10850, "token_str": "maid"},
    {"sequence": "[CLS] the woman worked as a nurse. [SEP]", "score": 0.07987703382968903, "token": 6821, "token_str": "nurse"},
    {"sequence": "[CLS] the woman worked as a prostitute. [SEP]", "score": 0.058545831590890884, "token": 19215, "token_str": "prostitute"},
    {"sequence": "[CLS] the woman worked as a cleaner. [SEP]", "score": 0.03834161534905434, "token": 20133, "token_str": "cleaner"}
]
```

This bias will also affect all fine-tuned versions of this model.

## Training data

The BERT model was pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038
unpublished books, and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables
and headers).

## Training procedure

### Preprocessing

The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are
then of the form:

```
[CLS] Sentence A [SEP] Sentence B [SEP]
```

With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus, and in
the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a
consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two
"sentences" has a combined length of less than 512 tokens.

The details of the masking procedure for each sentence are the following:

- 15% of the tokens are masked.
- In 80% of the cases, the masked tokens are replaced by `[MASK]`.
- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.
- In the 10% remaining cases, the masked tokens are left as is.
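The 80/10/10 masking procedure described above can be sketched in plain Python. This is a simplified illustration operating on a list of tokens, not the actual pretraining code; in particular, it selects tokens independently and ignores the whole-word grouping that this checkpoint's Whole Word Masking applies:

```python
import random

MASK_PROB = 0.15  # fraction of tokens selected for prediction

def mask_tokens(tokens, vocab, rng=random):
    """Apply the BERT-style 80/10/10 masking scheme to a token list.

    Returns (masked, labels): labels[i] holds the original token for
    positions selected for prediction, and None everywhere else.
    """
    masked, labels = [], []
    for token in tokens:
        if rng.random() < MASK_PROB:
            labels.append(token)
            roll = rng.random()
            if roll < 0.8:                # 80%: replace with [MASK]
                masked.append("[MASK]")
            elif roll < 0.9:              # 10%: replace with a random token
                masked.append(rng.choice(vocab))
            else:                         # 10%: keep the original token
                masked.append(token)
        else:
            labels.append(None)
            masked.append(token)
    return masked, labels

tokens = "the quick brown fox jumps over the lazy dog".split()
vocab = ["cat", "tree", "run", "blue"]
masked, labels = mask_tokens(tokens, vocab)
print(masked)
print(labels)
```

The loss is then computed only at the positions where `labels` is not None, which is what "each masked WordPiece token is predicted independently" refers to.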
### Pretraining The model was trained on 4 cloud TPUs in Pod configuration (16 TPU chips total) for one million steps with a batch size of 256. The sequence length was limited to 128 tokens for 90% of the steps and 512 for the remaining 10%. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ## Evaluation results When fine-tuned on downstream tasks, this model achieves the following results: Model | SQUAD 1.1 F1/EM | Multi NLI Accuracy ---------------------------------------- | :-------------: | :----------------: BERT-Large, Uncased (Whole Word Masking) | 92.8/86.7 | 87.07 ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-1810-04805, author = {Jacob Devlin and Ming{-}Wei Chang and Kenton Lee and Kristina Toutanova}, title = {{BERT:} Pre-training of Deep Bidirectional Transformers for Language Understanding}, journal = {CoRR}, volume = {abs/1810.04805}, year = {2018}, url = {http://arxiv.org/abs/1810.04805}, archivePrefix = {arXiv}, eprint = {1810.04805}, timestamp = {Tue, 30 Oct 2018 20:39:56 +0100}, biburl = {https://dblp.org/rec/journals/corr/abs-1810-04805.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ```
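The learning-rate schedule described in the pretraining section (warmup for 10,000 steps to a peak of 1e-4, then linear decay over the remaining steps) can be sketched as follows. This is an illustrative helper, not the original training code; the step counts and peak rate mirror the numbers quoted above:

```python
def bert_lr(step, peak_lr=1e-4, warmup_steps=10_000, total_steps=1_000_000):
    """Linear warmup to peak_lr over warmup_steps, then linear decay to 0
    at total_steps."""
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    # linear decay from peak_lr at warmup_steps down to 0 at total_steps
    return peak_lr * (total_steps - step) / (total_steps - warmup_steps)

print(bert_lr(5_000))      # mid-warmup: half the peak rate (5e-5)
print(bert_lr(10_000))     # peak rate (1e-4)
print(bert_lr(1_000_000))  # fully decayed (0.0)
```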
9,763
jbetker/wav2vec2-large-robust-ft-libritts-voxpopuli
2022-02-25T19:07:57.000Z
[ "transformers", "pytorch", "wav2vec2", "automatic-speech-recognition", "endpoints_compatible", "has_space", "region:us" ]
automatic-speech-recognition
jbetker
null
null
jbetker/wav2vec2-large-robust-ft-libritts-voxpopuli
7
264,486
transformers
2022-03-02T23:29:05
This checkpoint is a wav2vec2-large model that is useful for generating transcriptions with punctuation. It is intended for use in building transcriptions for TTS models, where punctuation is very important for prosody. This model was created by fine-tuning the `facebook/wav2vec2-large-robust-ft-libri-960h` checkpoint on the [libritts](https://research.google/tools/datasets/libri-tts/) and [voxpopuli](https://github.com/facebookresearch/voxpopuli) datasets with a new vocabulary that includes punctuation. The model gets a respectable WER of 4.45% on the LibriSpeech validation set; the baseline, `facebook/wav2vec2-large-robust-ft-libri-960h`, got 4.3%. Since the model was fine-tuned on clean audio, it is not well suited for noisy audio like CommonVoice (though I may upload a checkpoint for that soon too). It still does pretty well, though. The vocabulary is also uploaded to the model hub, as `jbetker/tacotron_symbols`. Check out my speech transcription script repo, [ocotillo](https://github.com/neonbjb/ocotillo), for usage examples.
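The WER figures quoted above are the word-level edit distance between hypothesis and reference, divided by the reference length. A minimal sketch of the computation (illustrative only — the published numbers come from standard evaluation tooling, which also normalizes the text before scoring):

```python
def wer(reference, hypothesis):
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edits to turn the first i reference words
    # into the first j hypothesis words
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = dp[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            dp[i][j] = min(sub, dp[i - 1][j] + 1, dp[i][j - 1] + 1)
    return dp[-1][-1] / len(ref)

print(wer("the cat sat down", "the cat sat"))  # one deletion -> 0.25
```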
1,085
deepset/bert-base-cased-squad2
2023-05-05T07:00:52.000Z
[ "transformers", "pytorch", "jax", "safetensors", "bert", "question-answering", "en", "dataset:squad_v2", "license:cc-by-4.0", "model-index", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
question-answering
deepset
null
null
deepset/bert-base-cased-squad2
17
263,105
transformers
2022-03-02T23:29:05
--- language: en license: cc-by-4.0 datasets: - squad_v2 model-index: - name: deepset/bert-base-cased-squad2 results: - task: type: question-answering name: Question Answering dataset: name: squad_v2 type: squad_v2 config: squad_v2 split: validation metrics: - type: exact_match value: 71.1517 name: Exact Match verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZGZlNmQ1YzIzMWUzNTg4YmI4NWVhYThiMzE2ZGZmNWUzNDM3NWI0ZGJkNzliNGUxNTY2MDA5MWVkYjAwYWZiMCIsInZlcnNpb24iOjF9.iUvVdy5c4hoXkwlThJankQqG9QXzNilvfF1_4P0oL8X-jkY5Q6YSsZx6G6cpgXogqFpn7JlE_lP6_OT0VIamCg - type: f1 value: 74.6714 name: F1 verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMWE5OGNjODhmY2Y0NWIyZDIzMmQ2NmRjZGYyYTYzOWMxZDUzYzg4YjBhNTRiNTY4NTc0M2IxNjI5NWI5ZDM0NCIsInZlcnNpb24iOjF9.IqU9rbzUcKmDEoLkwCUZTKSH0ZFhtqgnhOaEDKKnaRMGBJLj98D5V4VirYT6jLh8FlR0FiwvMTMjReBcfTisAQ --- This is a BERT base cased model trained on SQuAD v2
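The card above states only that this is a BERT base cased model trained on SQuAD v2. As a hedged usage sketch (not part of the original card), the checkpoint can be driven through the standard `transformers` question-answering pipeline; the question and context strings below are illustrative:

```python
from transformers import pipeline

# Illustrative example: the pipeline returns the extracted answer span
# from the context plus a confidence score.
qa = pipeline("question-answering", model="deepset/bert-base-cased-squad2")
result = qa(
    question="On which dataset was the model trained?",
    context="This BERT base cased model was trained on SQuAD v2.",
)
print(result["answer"], result["score"])
```

The returned dictionary also carries `start`/`end` character offsets of the answer span within the context.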
1,045
[ [ (embedding vector values omitted) ] ]
naver/splade-cocondenser-ensembledistil
2022-05-11T08:05:37.000Z
[ "transformers", "pytorch", "bert", "fill-mask", "splade", "query-expansion", "document-expansion", "bag-of-words", "passage-retrieval", "knowledge-distillation", "en", "dataset:ms_marco", "arxiv:2205.04733", "license:cc-by-nc-sa-4.0", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
fill-mask
naver
null
null
naver/splade-cocondenser-ensembledistil
17
260,595
transformers
2022-05-09T13:18:41
--- license: cc-by-nc-sa-4.0 language: "en" tags: - splade - query-expansion - document-expansion - bag-of-words - passage-retrieval - knowledge-distillation datasets: - ms_marco --- ## SPLADE CoCondenser EnsembleDistil SPLADE model for passage retrieval. For additional details, please visit: * paper: https://arxiv.org/abs/2205.04733 * code: https://github.com/naver/splade | | MRR@10 (MS MARCO dev) | R@1000 (MS MARCO dev) | | --- | --- | --- | | `splade-cocondenser-ensembledistil` | 38.3 | 98.3 | ## Citation If you use our checkpoint, please cite our work: ``` @misc{https://doi.org/10.48550/arxiv.2205.04733, doi = {10.48550/ARXIV.2205.04733}, url = {https://arxiv.org/abs/2205.04733}, author = {Formal, Thibault and Lassance, Carlos and Piwowarski, Benjamin and Clinchant, Stéphane}, keywords = {Information Retrieval (cs.IR), Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences}, title = {From Distillation to Hard Negative Sampling: Making Sparse Neural IR Models More Effective}, publisher = {arXiv}, year = {2022}, copyright = {Creative Commons Attribution Non Commercial Share Alike 4.0 International} } ```
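The card reports benchmark numbers but no inference snippet. The following is a minimal sketch of the standard SPLADE representation recipe described in the paper (log-saturated ReLU over the masked-LM logits, max-pooled over token positions); the example passage and top-k size are illustrative assumptions, not taken from the card:

```python
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

model_id = "naver/splade-cocondenser-ensembledistil"
tok = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

with torch.no_grad():
    inputs = tok("passage retrieval with sparse representations", return_tensors="pt")
    logits = model(**inputs).logits  # (1, seq_len, vocab_size)
    # SPLADE aggregation: log(1 + ReLU(logits)), masked and max-pooled over positions,
    # yielding one non-negative weight per vocabulary term.
    weights = torch.log1p(torch.relu(logits)) * inputs["attention_mask"].unsqueeze(-1)
    sparse_rep = weights.max(dim=1).values.squeeze(0)  # (vocab_size,)

# Inspect the highest-weighted expansion terms (top-5 is arbitrary here).
top = sparse_rep.topk(5)
print(tok.convert_ids_to_tokens(top.indices.tolist()))
```

Documents and queries are encoded the same way, and relevance is scored as the dot product of their sparse vocabulary-sized vectors.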
1,203
[ [ (embedding vector values omitted) ] ]
damo-vilab/text-to-video-ms-1.7b
2023-05-15T22:44:20.000Z
[ "diffusers", "text-to-video", "license:cc-by-nc-4.0", "has_space", "diffusers:TextToVideoSDPipeline", "region:us" ]
text-to-video
damo-vilab
null
null
damo-vilab/text-to-video-ms-1.7b
338
258,876
diffusers
2023-03-22T13:23:33
---
license: cc-by-nc-4.0
tags:
- text-to-video
duplicated_from: diffusers/text-to-video-ms-1.7b
---

# Text-to-video-synthesis Model in Open Domain

This model is a multi-stage text-to-video generation diffusion model, which takes a description text as input and returns a video that matches the description. Only English input is supported.

**We Are Hiring!** (Based in Beijing / Hangzhou, China.)

If you're looking for an exciting challenge and the opportunity to work with cutting-edge technologies in AIGC and large-scale pretraining, then we are the place for you. We are looking for talented, motivated and creative individuals to join our team. If you are interested, please send your CV to us.

EMAIL: yingya.zyy@alibaba-inc.com

## Model description

The text-to-video generation diffusion model consists of three sub-networks: a text feature extraction model, a text-feature-to-video latent-space diffusion model, and a video latent-space to video visual-space model. The overall model has about 1.7 billion parameters. Currently, it only supports English input. The diffusion model adopts a UNet3D structure and implements video generation through an iterative denoising process starting from a pure Gaussian-noise video.

This model is meant for research purposes. Please look at the [model limitations and biases](#model-limitations-and-biases) and [misuse, malicious use and excessive use](#misuse-malicious-use-and-excessive-use) sections.

## Model Details

- **Developed by:** [ModelScope](https://modelscope.cn/)
- **Model type:** Diffusion-based text-to-video generation model
- **Language(s):** English
- **License:** [CC-BY-NC-ND](https://creativecommons.org/licenses/by-nc-nd/4.0/)
- **Resources for more information:** [ModelScope GitHub Repository](https://github.com/modelscope/modelscope), [Summary](https://modelscope.cn/models/damo/text-to-video-synthesis/summary).
- **Cite as:**

## Use cases

This model has a wide range of applications and can generate videos from arbitrary English text descriptions.

## Usage

Let's first install the required libraries:

```bash
$ pip install diffusers transformers accelerate torch
```

Now, generate a video:

```python
import torch
from diffusers import DiffusionPipeline, DPMSolverMultistepScheduler
from diffusers.utils import export_to_video

pipe = DiffusionPipeline.from_pretrained("damo-vilab/text-to-video-ms-1.7b", torch_dtype=torch.float16, variant="fp16")
pipe.scheduler = DPMSolverMultistepScheduler.from_config(pipe.scheduler.config)
pipe.enable_model_cpu_offload()

prompt = "Spiderman is surfing"
video_frames = pipe(prompt, num_inference_steps=25).frames
video_path = export_to_video(video_frames)
```

Here are some results:

<table>
<tr>
<td><center>
An astronaut riding a horse.
<br>
<img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/diffusers/astr.gif" alt="An astronaut riding a horse." style="width: 300px;" />
</center></td>
<td><center>
Darth Vader surfing in waves.
<br>
<img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/diffusers/vader.gif" alt="Darth vader surfing in waves." style="width: 300px;" />
</center></td>
</tr>
</table>

## Long Video Generation

You can optimize for memory usage by enabling attention and VAE slicing and using Torch 2.0. This should allow you to generate videos up to 25 seconds long on less than 16GB of GPU VRAM.
```bash
$ pip install git+https://github.com/huggingface/diffusers transformers accelerate
```

```py
import torch
from diffusers import DiffusionPipeline, DPMSolverMultistepScheduler
from diffusers.utils import export_to_video

# load pipeline
pipe = DiffusionPipeline.from_pretrained("damo-vilab/text-to-video-ms-1.7b", torch_dtype=torch.float16, variant="fp16")
pipe.scheduler = DPMSolverMultistepScheduler.from_config(pipe.scheduler.config)

# optimize for GPU memory
pipe.enable_model_cpu_offload()
pipe.enable_vae_slicing()

# generate
prompt = "Spiderman is surfing. Darth Vader is also surfing and following Spiderman"
video_frames = pipe(prompt, num_inference_steps=25, num_frames=200).frames

# convert to video
video_path = export_to_video(video_frames)
```

## View results

The code above prints the save path of the output video. The output mp4 file can be played with [VLC media player](https://www.videolan.org/vlc/); some other media players may not display it correctly.

## Model limitations and biases

* The model is trained on public datasets such as WebVid, and the generated results may reflect deviations in the distribution of the training data.
* The model cannot achieve perfect film- and television-quality generation.
* The model cannot generate clear text.
* The model is mainly trained on an English corpus and does not support other languages at the moment.
* The model's performance needs to be improved on complex compositional generation tasks.

## Misuse, Malicious Use and Excessive Use

* The model was not trained to realistically represent people or events, so using it to generate such content is beyond the model's capabilities.
* It is prohibited to generate content that is demeaning or harmful to people or their environment, culture, religion, etc.
* Prohibited for pornographic, violent and bloody content generation.
* Prohibited for generating false or misleading information.

## Training data

The training data includes [LAION5B](https://huggingface.co/datasets/laion/laion2B-en), [ImageNet](https://www.image-net.org/), [WebVid](https://m-bain.github.io/webvid-dataset/) and other public datasets. After pre-training, images and videos are filtered by aesthetic score, watermark score, and deduplication.

_(Part of this model card has been taken from [here](https://huggingface.co/damo-vilab/modelscope-damo-text-to-video-synthesis))_

## Citation

```bibtex
@InProceedings{VideoFusion,
    author    = {Luo, Zhengxiong and Chen, Dayou and Zhang, Yingya and Huang, Yan and Wang, Liang and Shen, Yujun and Zhao, Deli and Zhou, Jingren and Tan, Tieniu},
    title     = {VideoFusion: Decomposed Diffusion Models for High-Quality Video Generation},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2023}
}
```
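The card above describes generation as an iterative denoising process starting from a pure Gaussian-noise video. A toy, dependency-free sketch of such a loop is shown below; it is illustrative only and does not reflect the actual diffusers/UNet3D internals (the "target" latent and step factor are invented stand-ins for the learned noise prediction).

```python
import random

def toy_denoise(steps=25, size=4, seed=0):
    """Caricature of a denoising loop: start from Gaussian noise and
    repeatedly move toward a stand-in 'clean' latent."""
    rng = random.Random(seed)
    target = [1.0] * size                               # stand-in for the clean video latent
    x = [rng.gauss(0.0, 1.0) for _ in range(size)]      # pure Gaussian-noise start
    for _ in range(steps):
        # each step removes a fraction of the estimated noise (x - target);
        # a real model predicts this noise with a UNet conditioned on text
        x = [xi - 0.2 * (xi - ti) for xi, ti in zip(x, target)]
    return x

latents = toy_denoise()
```

After 25 steps the residual noise has shrunk by roughly `0.8**25 ≈ 0.004`, so the latents end up very close to the target; the real pipeline instead uses a learned noise predictor and a scheduler such as `DPMSolverMultistepScheduler`.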
6,580
[ [ -0.039520263671875, -0.06671142578125, 0.019561767578125, 0.01493072509765625, -0.0287322998046875, -0.0167999267578125, -0.006618499755859375, -0.00417327880859375, 0.0030517578125, 0.0313720703125, -0.0494384765625, -0.036712646484375, -0.06292724609375, -0.0083770751953125, -0.036865234375, 0.08135986328125, -0.018585205078125, 0.00360107421875, -0.010772705078125, 0.01056671142578125, -0.0159912109375, -0.0144195556640625, -0.0357666015625, -0.01178741455078125, -0.0025539398193359375, 0.01229095458984375, 0.0300140380859375, 0.0560302734375, 0.036834716796875, 0.0279693603515625, -0.00250244140625, 0.0186920166015625, -0.03265380859375, -0.0095062255859375, 0.003864288330078125, -0.006771087646484375, -0.040740966796875, -0.001888275146484375, 0.061370849609375, 0.01398468017578125, -0.003253936767578125, 0.0252685546875, 0.004421234130859375, 0.040130615234375, -0.04620361328125, 0.0013208389282226562, -0.0257720947265625, 0.00408172607421875, 0.00005054473876953125, -0.0070343017578125, -0.01016998291015625, -0.0161590576171875, 0.0079193115234375, -0.04742431640625, 0.04278564453125, -0.01015472412109375, 0.093017578125, 0.0275726318359375, -0.02294921875, 0.01007843017578125, -0.061279296875, 0.04266357421875, -0.06201171875, 0.0267791748046875, 0.027008056640625, 0.0156402587890625, 0.01187896728515625, -0.061553955078125, -0.055755615234375, -0.00447845458984375, 0.01433563232421875, 0.023590087890625, -0.016815185546875, 0.005489349365234375, 0.0276336669921875, 0.029754638671875, -0.05438232421875, -0.0088958740234375, -0.041656494140625, -0.0047149658203125, 0.0592041015625, 0.01091766357421875, 0.032958984375, -0.029937744140625, -0.028717041015625, -0.035736083984375, -0.018829345703125, 0.006443023681640625, 0.042083740234375, -0.01251983642578125, -0.0499267578125, 0.0400390625, -0.01078033447265625, 0.031341552734375, 0.0164337158203125, -0.021453857421875, 0.0184783935546875, -0.0178680419921875, -0.045074462890625, -0.01641845703125, 
0.0711669921875, 0.04766845703125, 0.00627899169921875, 0.029388427734375, 0.0049896240234375, 0.0139007568359375, 0.0110015869140625, -0.08905029296875, -0.02496337890625, 0.0203094482421875, -0.036651611328125, -0.021881103515625, 0.009979248046875, -0.070068359375, -0.0033321380615234375, 0.0018072128295898438, 0.0411376953125, -0.028411865234375, -0.040618896484375, 0.01016998291015625, -0.037384033203125, 0.011749267578125, 0.018218994140625, -0.06475830078125, 0.01971435546875, 0.01291656494140625, 0.06744384765625, -0.00833892822265625, -0.03466796875, -0.032928466796875, 0.019561767578125, -0.00664520263671875, 0.036865234375, -0.00958251953125, -0.0168914794921875, 0.0074005126953125, 0.0223388671875, 0.006504058837890625, -0.038726806640625, 0.0298919677734375, -0.029937744140625, 0.0253753662109375, 0.005397796630859375, -0.038421630859375, -0.01555633544921875, 0.01398468017578125, -0.03948974609375, 0.06817626953125, 0.0025043487548828125, -0.0718994140625, 0.01494598388671875, -0.03857421875, -0.020050048828125, -0.0179443359375, 0.0021266937255859375, -0.0496826171875, -0.0232391357421875, 0.01788330078125, 0.033782958984375, -0.0170135498046875, 0.032928466796875, -0.01422882080078125, -0.0176239013671875, 0.01464080810546875, -0.042236328125, 0.061737060546875, 0.0273284912109375, -0.038909912109375, 0.006500244140625, -0.05963134765625, -0.0207061767578125, 0.00864410400390625, -0.01430511474609375, 0.00724029541015625, -0.00429534912109375, 0.0183868408203125, 0.0264434814453125, 0.00046944618225097656, -0.038330078125, -0.002666473388671875, -0.035858154296875, 0.0399169921875, 0.057159423828125, 0.0029144287109375, 0.037689208984375, -0.0099945068359375, 0.05548095703125, 0.01300048828125, 0.018280029296875, -0.01154327392578125, -0.041351318359375, -0.062469482421875, -0.00693511962890625, -0.01015472412109375, 0.037689208984375, -0.05572509765625, 0.0172882080078125, -0.0037021636962890625, -0.03973388671875, -0.03094482421875, 
0.005268096923828125, 0.027618408203125, 0.04742431640625, 0.027984619140625, -0.0372314453125, -0.0496826171875, -0.05633544921875, 0.016815185546875, -0.018798828125, -0.0146942138671875, 0.02423095703125, 0.050323486328125, -0.01013946533203125, 0.0469970703125, -0.056427001953125, -0.0230560302734375, -0.01123809814453125, -0.002651214599609375, 0.01354217529296875, 0.035064697265625, 0.0394287109375, -0.071533203125, -0.040863037109375, -0.00672149658203125, -0.0792236328125, 0.0027294158935546875, 0.0007658004760742188, -0.005458831787109375, 0.00801849365234375, 0.0294189453125, -0.056121826171875, 0.0333251953125, 0.054473876953125, -0.01494598388671875, 0.048675537109375, -0.035430908203125, 0.0295257568359375, -0.087646484375, 0.0086822509765625, 0.0175323486328125, -0.01430511474609375, -0.032196044921875, 0.0003228187561035156, -0.02960205078125, -0.02410888671875, -0.059051513671875, 0.047515869140625, -0.0273284912109375, 0.01050567626953125, -0.026947021484375, 0.01898193359375, 0.006633758544921875, 0.04840087890625, 0.0117950439453125, 0.03643798828125, 0.06427001953125, -0.06512451171875, 0.042327880859375, 0.0382080078125, -0.0184783935546875, 0.040679931640625, -0.05718994140625, -0.0166015625, -0.0181884765625, 0.0111846923828125, -0.07684326171875, -0.042510986328125, 0.0232696533203125, -0.051788330078125, 0.01666259765625, -0.0195159912109375, -0.0160980224609375, -0.0300140380859375, -0.02496337890625, 0.0286102294921875, 0.0626220703125, -0.0287322998046875, 0.0249786376953125, 0.03436279296875, 0.0014104843139648438, -0.046875, -0.056976318359375, -0.0100555419921875, -0.037689208984375, -0.0733642578125, 0.048828125, -0.01529693603515625, -0.00255584716796875, -0.0012178421020507812, 0.0098724365234375, -0.0190887451171875, -0.02874755859375, 0.026947021484375, 0.0235443115234375, 0.0041656494140625, -0.02703857421875, 0.0018463134765625, 0.00662994384765625, 0.0089874267578125, -0.013916015625, 0.0245819091796875, 0.0207672119140625, 
-0.004486083984375, -0.04559326171875, 0.03533935546875, 0.0259246826171875, -0.006298065185546875, 0.039154052734375, 0.096923828125, -0.02783203125, -0.00872039794921875, -0.0309600830078125, -0.0186614990234375, -0.042266845703125, 0.048126220703125, 0.0011510848999023438, -0.054840087890625, 0.03216552734375, 0.00858306884765625, -0.00614166259765625, 0.05023193359375, 0.060028076171875, -0.0163726806640625, 0.08837890625, 0.05206298828125, 0.0245819091796875, 0.055145263671875, -0.06463623046875, -0.0156402587890625, -0.056243896484375, -0.0199127197265625, -0.004070281982421875, -0.00603485107421875, -0.054168701171875, -0.039459228515625, 0.047271728515625, -0.0008373260498046875, -0.04376220703125, 0.0245361328125, -0.0469970703125, 0.01190185546875, 0.0231781005859375, 0.00794219970703125, 0.0038356781005859375, 0.01393890380859375, 0.0046234130859375, -0.01270294189453125, -0.043121337890625, -0.0307769775390625, 0.06231689453125, 0.023834228515625, 0.056549072265625, 0.010040283203125, 0.0283050537109375, 0.0257415771484375, -0.007068634033203125, -0.0206146240234375, 0.056427001953125, -0.0232391357421875, -0.046356201171875, -0.0101318359375, -0.023712158203125, -0.062744140625, 0.00411224365234375, -0.01959228515625, -0.0477294921875, -0.00826263427734375, 0.01183319091796875, -0.0080108642578125, 0.040130615234375, -0.05712890625, 0.0704345703125, -0.0010709762573242188, -0.058380126953125, -0.0110015869140625, -0.06500244140625, 0.034210205078125, 0.0257415771484375, 0.0159149169921875, 0.01143646240234375, 0.01444244384765625, 0.06280517578125, -0.04296875, 0.0548095703125, -0.0222625732421875, 0.01041412353515625, 0.0479736328125, -0.00650787353515625, 0.015960693359375, 0.0126190185546875, 0.0158843994140625, 0.00897216796875, -0.0083465576171875, -0.0157623291015625, -0.026458740234375, 0.0548095703125, -0.072509765625, -0.042938232421875, -0.0246429443359375, -0.0258331298828125, 0.022857666015625, 0.0083465576171875, 0.053466796875, 
0.0172576904296875, -0.0206756591796875, -0.00946044921875, 0.045867919921875, -0.03607177734375, 0.059234619140625, 0.00911712646484375, -0.0168304443359375, -0.057830810546875, 0.067626953125, -0.00008034706115722656, 0.0311126708984375, 0.0208587646484375, 0.0028476715087890625, -0.0111846923828125, -0.0254058837890625, -0.05303955078125, 0.026611328125, -0.03973388671875, -0.011016845703125, -0.045135498046875, -0.036865234375, -0.045928955078125, -0.0030841827392578125, -0.03411865234375, -0.00782012939453125, -0.036834716796875, -0.01291656494140625, 0.0394287109375, 0.036529541015625, -0.004573822021484375, 0.033660888671875, -0.052215576171875, 0.03619384765625, 0.0278778076171875, 0.025115966796875, -0.0057525634765625, -0.06365966796875, -0.041259765625, -0.00308990478515625, -0.07220458984375, -0.06103515625, 0.06103515625, 0.0163421630859375, 0.0197601318359375, 0.03192138671875, -0.024993896484375, 0.06243896484375, -0.034149169921875, 0.0859375, 0.0214691162109375, -0.055816650390625, 0.05108642578125, -0.040374755859375, 0.022430419921875, 0.0012178421020507812, 0.0401611328125, -0.0252227783203125, -0.0162811279296875, -0.056549072265625, -0.06622314453125, 0.058380126953125, 0.042388916015625, 0.01300048828125, 0.0156402587890625, 0.020050048828125, -0.006610870361328125, 0.01068115234375, -0.06304931640625, -0.0235748291015625, -0.038421630859375, 0.0156097412109375, -0.0026149749755859375, 0.0014820098876953125, 0.0009098052978515625, -0.033721923828125, 0.06927490234375, 0.0031642913818359375, 0.034759521484375, 0.03863525390625, 0.0021610260009765625, -0.0242767333984375, -0.0035247802734375, 0.032867431640625, 0.02154541015625, -0.031158447265625, -0.0093841552734375, -0.00022900104522705078, -0.05047607421875, 0.0233612060546875, 0.0168914794921875, -0.025482177734375, 0.0177154541015625, 0.009307861328125, 0.06982421875, -0.01363372802734375, -0.050262451171875, 0.047515869140625, 0.005191802978515625, -0.0265655517578125, 
-0.040130615234375, 0.00640869140625, 0.005950927734375, 0.0245819091796875, 0.0125579833984375, 0.00868988037109375, 0.01505279541015625, -0.031951904296875, 0.01605224609375, 0.0279083251953125, -0.00728607177734375, -0.03643798828125, 0.10595703125, 0.00926971435546875, -0.02593994140625, 0.042236328125, -0.0162353515625, -0.019134521484375, 0.0526123046875, 0.036834716796875, 0.07037353515625, -0.01898193359375, 0.03472900390625, 0.056365966796875, 0.003253936767578125, -0.019317626953125, 0.007579803466796875, 0.00885772705078125, -0.040435791015625, -0.03717041015625, -0.04278564453125, -0.0176544189453125, 0.0160369873046875, -0.041229248046875, 0.057342529296875, -0.0262451171875, -0.022613525390625, 0.0018930435180664062, -0.0088348388671875, -0.049530029296875, 0.03631591796875, 0.0243682861328125, 0.04937744140625, -0.078857421875, 0.06298828125, 0.0419921875, -0.06256103515625, -0.0518798828125, -0.0233154296875, -0.00923919677734375, -0.028533935546875, 0.031829833984375, 0.00860595703125, 0.0031986236572265625, -0.010223388671875, -0.045684814453125, -0.04437255859375, 0.091552734375, 0.0513916015625, -0.018035888671875, -0.0308685302734375, 0.0096282958984375, 0.048583984375, -0.037689208984375, 0.03448486328125, 0.0224761962890625, 0.025634765625, 0.020904541015625, -0.06048583984375, -0.0142669677734375, -0.0308685302734375, 0.005115509033203125, -0.004638671875, -0.0753173828125, 0.07861328125, -0.0233612060546875, -0.00496673583984375, 0.0034351348876953125, 0.045166015625, 0.0246734619140625, 0.041534423828125, 0.02581787109375, 0.053558349609375, 0.007778167724609375, 0.01500701904296875, 0.07196044921875, 0.0047607421875, 0.03863525390625, 0.05780029296875, 0.0069580078125, 0.0462646484375, 0.03485107421875, -0.027496337890625, 0.060821533203125, 0.048828125, -0.020355224609375, 0.039794921875, -0.000048279762268066406, -0.00640869140625, -0.01293182373046875, -0.005809783935546875, -0.025238037109375, 0.029296875, 0.004608154296875, 
-0.0465087890625, -0.0005674362182617188, 0.0168609619140625, -0.0027217864990234375, 0.00489044189453125, -0.0115509033203125, 0.039825439453125, 0.0011854171752929688, -0.05279541015625, 0.056121826171875, 0.0010938644409179688, 0.05712890625, -0.0523681640625, -0.01354217529296875, -0.0090789794921875, 0.0133209228515625, -0.0298919677734375, -0.054931640625, 0.0194549560546875, 0.01971435546875, -0.017242431640625, -0.020294189453125, 0.04119873046875, -0.0457763671875, -0.037384033203125, 0.020263671875, 0.0184326171875, 0.040679931640625, -0.00554656982421875, -0.051666259765625, 0.006866455078125, 0.01087188720703125, -0.0229949951171875, 0.0303802490234375, 0.0147247314453125, 0.0138092041015625, 0.042633056640625, 0.029815673828125, 0.0088958740234375, 0.0009660720825195312, -0.002361297607421875, 0.059417724609375, -0.041534423828125, -0.032684326171875, -0.0693359375, 0.0633544921875, -0.006282806396484375, -0.02960205078125, 0.064208984375, 0.049530029296875, 0.077392578125, -0.0284881591796875, 0.0672607421875, -0.021148681640625, -0.001262664794921875, -0.03466796875, 0.039825439453125, -0.060760498046875, 0.007205963134765625, -0.051727294921875, -0.06854248046875, 0.0036602020263671875, 0.039154052734375, 0.0104522705078125, 0.00447845458984375, 0.0438232421875, 0.06903076171875, -0.02313232421875, -0.0192413330078125, 0.021484375, 0.02349853515625, 0.0270233154296875, 0.0386962890625, 0.03485107421875, -0.07147216796875, 0.063232421875, -0.0285797119140625, -0.01708984375, -0.005123138427734375, -0.06365966796875, -0.05523681640625, -0.065673828125, -0.047271728515625, -0.04132080078125, -0.0082550048828125, 0.043426513671875, 0.07275390625, -0.024383544921875, -0.037322998046875, -0.00426483154296875, -0.0002543926239013672, -0.0229949951171875, -0.0218353271484375, 0.02801513671875, 0.03045654296875, -0.0831298828125, 0.00887298583984375, 0.01763916015625, 0.01534271240234375, -0.041259765625, -0.0154571533203125, -0.033935546875, 
-0.0030803680419921875, 0.040374755859375, 0.020965576171875, -0.044189453125, -0.01248931884765625, 0.004016876220703125, 0.01432037353515625, 0.01528167724609375, 0.040985107421875, -0.045928955078125, 0.042816162109375, 0.046722412109375, 0.0093841552734375, 0.088134765625, -0.0010986328125, 0.029449462890625, -0.0601806640625, 0.02593994140625, -0.007427215576171875, 0.0246429443359375, 0.0338134765625, -0.0286102294921875, 0.03582763671875, 0.03826904296875, -0.06475830078125, -0.05682373046875, 0.0101470947265625, -0.1048583984375, -0.00569915771484375, 0.1041259765625, -0.0109405517578125, -0.0174560546875, 0.02471923828125, -0.0268096923828125, 0.050811767578125, -0.02044677734375, 0.052154541015625, 0.03594970703125, -0.011199951171875, -0.02008056640625, -0.052154541015625, 0.033935546875, 0.00872802734375, -0.04522705078125, -0.0086822509765625, 0.04132080078125, 0.053985595703125, 0.0025424957275390625, 0.056549072265625, -0.00394439697265625, 0.0221710205078125, 0.0180206298828125, 0.016357421875, 0.00152587890625, -0.007671356201171875, -0.0284881591796875, 0.0103759765625, -0.0252685546875, -0.020294189453125 ] ]
hfl/chinese-bert-wwm-ext
2021-05-19T19:06:39.000Z
[ "transformers", "pytorch", "tf", "jax", "bert", "fill-mask", "zh", "arxiv:1906.08101", "arxiv:2004.13922", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
fill-mask
hfl
null
null
hfl/chinese-bert-wwm-ext
122
258,307
transformers
2022-03-02T23:29:05
---
language:
- zh
license: "apache-2.0"
---

## Chinese BERT with Whole Word Masking

For further accelerating Chinese natural language processing, we provide **Chinese pre-trained BERT with Whole Word Masking**.

**[Pre-Training with Whole Word Masking for Chinese BERT](https://arxiv.org/abs/1906.08101)**
Yiming Cui, Wanxiang Che, Ting Liu, Bing Qin, Ziqing Yang, Shijin Wang, Guoping Hu

This repository is developed based on: https://github.com/google-research/bert

You may also be interested in:
- Chinese BERT series: https://github.com/ymcui/Chinese-BERT-wwm
- Chinese MacBERT: https://github.com/ymcui/MacBERT
- Chinese ELECTRA: https://github.com/ymcui/Chinese-ELECTRA
- Chinese XLNet: https://github.com/ymcui/Chinese-XLNet
- Knowledge Distillation Toolkit - TextBrewer: https://github.com/airaria/TextBrewer

More resources by HFL: https://github.com/ymcui/HFL-Anthology

## Citation

If you find the technical report or resource useful, please cite the following technical report in your paper.

- Primary: https://arxiv.org/abs/2004.13922

```
@inproceedings{cui-etal-2020-revisiting,
    title = "Revisiting Pre-Trained Models for {C}hinese Natural Language Processing",
    author = "Cui, Yiming and Che, Wanxiang and Liu, Ting and Qin, Bing and Wang, Shijin and Hu, Guoping",
    booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: Findings",
    month = nov,
    year = "2020",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://www.aclweb.org/anthology/2020.findings-emnlp.58",
    pages = "657--668",
}
```

- Secondary: https://arxiv.org/abs/1906.08101

```
@article{chinese-bert-wwm,
    title={Pre-Training with Whole Word Masking for Chinese BERT},
    author={Cui, Yiming and Che, Wanxiang and Liu, Ting and Qin, Bing and Yang, Ziqing and Wang, Shijin and Hu, Guoping},
    journal={arXiv preprint arXiv:1906.08101},
    year={2019}
}
```
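The card above names Whole Word Masking but does not illustrate the rule. A minimal pure-Python sketch follows, assuming the sentence is already segmented into Chinese words (the real pipeline uses a word segmenter such as LTP); the function name and masking rate are illustrative, not from the paper's code.

```python
import random

def whole_word_mask(words, mask_rate=0.15, seed=0):
    """Whole Word Masking: when a word is selected for masking,
    every character (WordPiece sub-token) of that word is masked,
    instead of masking individual characters independently."""
    rng = random.Random(seed)
    out = []
    for word in words:
        if rng.random() < mask_rate:
            out.extend(["[MASK]"] * len(word))  # mask the whole word
        else:
            out.extend(list(word))              # keep each character
    return out

tokens = whole_word_mask(["使用", "语言", "模型", "来", "预测"], mask_rate=0.5)
```

The key property is that `[MASK]` tokens always cover complete words, which is what distinguishes this checkpoint's pre-training from character-level masking in the original Chinese BERT.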
1,993
[ [ -0.0305633544921875, -0.054412841796875, 0.0236053466796875, 0.042236328125, -0.0304107666015625, -0.0091400146484375, -0.042999267578125, -0.053985595703125, 0.0261383056640625, 0.034332275390625, -0.03448486328125, -0.034637451171875, -0.04266357421875, -0.0005483627319335938, -0.0067596435546875, 0.061859130859375, 0.0009307861328125, 0.0146026611328125, 0.020355224609375, 0.00716400146484375, -0.005001068115234375, -0.0540771484375, -0.052276611328125, -0.032623291015625, 0.0347900390625, -0.000024080276489257812, 0.0239410400390625, 0.03857421875, 0.020111083984375, 0.0260772705078125, -0.00678253173828125, 0.014495849609375, -0.0161285400390625, -0.002552032470703125, 0.01959228515625, -0.0039215087890625, -0.03753662109375, 0.007537841796875, 0.054046630859375, 0.043609619140625, 0.00832366943359375, -0.00656890869140625, 0.00888824462890625, 0.04833984375, -0.0445556640625, 0.0024318695068359375, -0.055877685546875, 0.00806427001953125, -0.03582763671875, 0.0041046142578125, -0.031951904296875, -0.0230712890625, 0.0203857421875, -0.05401611328125, 0.01418304443359375, 0.003009796142578125, 0.10888671875, -0.00988006591796875, -0.0003707408905029297, -0.0020999908447265625, -0.0303955078125, 0.051544189453125, -0.082763671875, 0.033294677734375, 0.03857421875, -0.00966644287109375, -0.0161285400390625, -0.080810546875, -0.0631103515625, -0.02520751953125, -0.00809478759765625, 0.0176849365234375, -0.0005807876586914062, 0.0297088623046875, 0.01454925537109375, 0.030548095703125, -0.04864501953125, 0.0183868408203125, -0.0184478759765625, -0.044525146484375, 0.04595947265625, -0.024017333984375, 0.022979736328125, -0.005329132080078125, -0.0254669189453125, -0.038055419921875, -0.026702880859375, 0.01611328125, 0.026092529296875, 0.016265869140625, -0.000244140625, 0.01175689697265625, -0.003749847412109375, 0.05047607421875, -0.00597381591796875, 0.00914764404296875, 0.044830322265625, -0.034393310546875, -0.0256500244140625, 0.01003265380859375, 
0.07379150390625, -0.005840301513671875, 0.0183258056640625, -0.0022068023681640625, -0.0234527587890625, -0.0229034423828125, 0.001354217529296875, -0.05718994140625, -0.031585693359375, 0.010467529296875, -0.036407470703125, -0.0021820068359375, 0.016204833984375, -0.043212890625, -0.0203704833984375, -0.0162200927734375, 0.041534423828125, -0.038970947265625, -0.0229034423828125, 0.016876220703125, -0.017303466796875, 0.033905029296875, 0.0030422210693359375, -0.057373046875, 0.00800323486328125, 0.037811279296875, 0.05194091796875, 0.0043487548828125, -0.0309600830078125, -0.027862548828125, -0.006473541259765625, -0.01145172119140625, 0.053985595703125, -0.021514892578125, -0.0018606185913085938, 0.016082763671875, 0.00852203369140625, -0.006359100341796875, -0.0161285400390625, 0.0628662109375, -0.033233642578125, 0.027557373046875, -0.0202484130859375, -0.02838134765625, -0.0217132568359375, 0.0143280029296875, -0.04559326171875, 0.0877685546875, -0.0255889892578125, -0.05902099609375, 0.01377105712890625, -0.06402587890625, -0.049957275390625, 0.0014171600341796875, 0.01654052734375, -0.03875732421875, -0.0174560546875, 0.0302886962890625, 0.026519775390625, -0.00746917724609375, 0.0118865966796875, -0.01080322265625, -0.02764892578125, 0.0018796920776367188, -0.02587890625, 0.0888671875, 0.0204925537109375, -0.033782958984375, 0.0251617431640625, -0.06488037109375, 0.0096588134765625, 0.01093292236328125, -0.0164947509765625, -0.0215301513671875, -0.004276275634765625, 0.0178985595703125, 0.0197906494140625, 0.047332763671875, -0.059326171875, -0.00557708740234375, -0.048858642578125, 0.032470703125, 0.06109619140625, -0.022369384765625, 0.0178680419921875, -0.019378662109375, 0.013214111328125, -0.0005006790161132812, 0.004627227783203125, -0.029266357421875, -0.039276123046875, -0.07220458984375, -0.0218048095703125, 0.04547119140625, 0.046783447265625, -0.059112548828125, 0.06719970703125, -0.0177154541015625, -0.042572021484375, -0.056488037109375, 
-0.00577545166015625, 0.035125732421875, 0.031463623046875, 0.038848876953125, -0.0232696533203125, -0.0667724609375, -0.055633544921875, -0.01458740234375, -0.0185089111328125, -0.009735107421875, -0.0016927719116210938, 0.0152435302734375, -0.01154327392578125, 0.05718994140625, -0.045379638671875, -0.036529541015625, -0.0157470703125, 0.04022216796875, 0.0161285400390625, 0.037017822265625, 0.03515625, -0.050323486328125, -0.04541015625, -0.01035308837890625, -0.0222320556640625, -0.0183258056640625, -0.015380859375, -0.03448486328125, 0.03387451171875, 0.036529541015625, -0.0261077880859375, 0.034393310546875, 0.0271148681640625, -0.00809478759765625, 0.04779052734375, -0.031646728515625, -0.004276275634765625, -0.0740966796875, 0.014404296875, 0.007213592529296875, -0.0013141632080078125, -0.06060791015625, -0.0032062530517578125, 0.0016736984252929688, 0.01251983642578125, -0.028045654296875, 0.03814697265625, -0.049652099609375, 0.0191802978515625, -0.0192718505859375, 0.0252532958984375, 0.005077362060546875, 0.06646728515625, 0.0280303955078125, 0.04449462890625, 0.037994384765625, -0.05377197265625, 0.01175689697265625, 0.00679779052734375, -0.0269927978515625, -0.0200042724609375, -0.057373046875, 0.0150146484375, -0.0033664703369140625, 0.0310821533203125, -0.0865478515625, 0.01009368896484375, 0.03338623046875, -0.051361083984375, 0.0286712646484375, 0.0279541015625, -0.056854248046875, -0.029815673828125, -0.06134033203125, 0.0109100341796875, 0.035552978515625, -0.036163330078125, 0.0188751220703125, 0.021209716796875, 0.00029730796813964844, -0.045684814453125, -0.06390380859375, 0.01593017578125, 0.0157623291015625, -0.051788330078125, 0.061187744140625, -0.02490234375, 0.01258087158203125, 0.0003006458282470703, 0.0076751708984375, -0.0309600830078125, 0.005451202392578125, -0.011688232421875, 0.026947021484375, -0.0204925537109375, 0.01507568359375, -0.005283355712890625, 0.01039886474609375, 0.0021114349365234375, -0.0251312255859375, 
(embedding vector values elided) ] ]
prithivida/parrot_paraphraser_on_T5
2021-05-18T07:53:27.000Z
[ "transformers", "pytorch", "t5", "text2text-generation", "autotrain_compatible", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
text2text-generation
prithivida
null
null
prithivida/parrot_paraphraser_on_T5
119
256,559
transformers
2022-03-02T23:29:05
# Parrot

## 1. What is Parrot?

Parrot is a paraphrase-based utterance augmentation framework purpose-built to accelerate training of NLU models. A paraphrase framework is more than just a paraphrasing model. For more details on the library and usage, please refer to the [github page](https://github.com/PrithivirajDamodaran/Parrot).

### Installation

```
pip install git+https://github.com/PrithivirajDamodaran/Parrot_Paraphraser.git
```

### Quickstart

```python
from parrot import Parrot
import torch
import warnings
warnings.filterwarnings("ignore")

'''
uncomment to get reproducible paraphrase generations
def random_state(seed):
    torch.manual_seed(seed)
    if torch.cuda.is_available():
        torch.cuda.manual_seed_all(seed)

random_state(1234)
'''

# Init models (make sure you init ONLY once if you integrate this into your code)
parrot = Parrot(model_tag="prithivida/parrot_paraphraser_on_T5", use_gpu=False)

phrases = ["Can you recommed some upscale restaurants in Newyork?",
           "What are the famous places we should not miss in Russia?"]

for phrase in phrases:
    print("-" * 100)
    print("Input_phrase: ", phrase)
    print("-" * 100)
    para_phrases = parrot.augment(input_phrase=phrase)
    for para_phrase in para_phrases:
        print(para_phrase)
```

```
----------------------------------------------------------------------
Input_phrase: Can you recommed some upscale restaurants in Newyork?
----------------------------------------------------------------------
list some excellent restaurants to visit in new york city?
what upscale restaurants do you recommend in new york?
i want to try some upscale restaurants in new york?
recommend some upscale restaurants in newyork?
can you recommend some high end restaurants in newyork?
can you recommend some upscale restaurants in new york?
can you recommend some upscale restaurants in newyork?

----------------------------------------------------------------------
Input_phrase: What are the famous places we should not miss in Russia
----------------------------------------------------------------------
what should we not miss when visiting russia?
recommend some of the best places to visit in russia?
list some of the best places to visit in russia?
can you list the top places to visit in russia?
show the places that we should not miss in russia?
list some famous places which we should not miss in russia?
```

### Knobs

```python
para_phrases = parrot.augment(input_phrase=phrase,
                              diversity_ranker="levenshtein",
                              do_diverse=False,
                              max_return_phrases=10,
                              max_length=32,
                              adequacy_threshold=0.99,
                              fluency_threshold=0.90)
```

## 2. Why Parrot?

**Huggingface** lists [12 paraphrase models,](https://huggingface.co/models?pipeline_tag=text2text-generation&search=paraphrase) **RapidAPI** lists 7 freemium and commercial paraphrasers like [QuillBot](https://rapidapi.com/search/paraphrase?section=apis&page=1), Rasa has discussed an experimental paraphraser for augmenting text data [here](https://forum.rasa.com/t/paraphrasing-for-nlu-data-augmentation-experimental/27744), Sentence-transformers offers a [paraphrase mining utility](https://www.sbert.net/examples/applications/paraphrase-mining/README.html), and [NLPAug](https://github.com/makcedward/nlpaug) offers word-level augmentation with a [PPDB](http://paraphrase.org/#/download) (a multi-million paraphrase database). While these attempts at paraphrasing are great, there are still some gaps, and paraphrasing is NOT yet a mainstream option for text augmentation in building NLU models. Parrot is a humble attempt to fill some of these gaps.

**What is a good paraphrase?** Almost all conditioned text generation models are validated on 2 factors: (1) whether the generated text conveys the same meaning as the original context (Adequacy), and (2) whether the text is fluent / grammatically correct English (Fluency).
For instance, Neural Machine Translation outputs are tested for Adequacy and Fluency. But [a good paraphrase](https://www.aclweb.org/anthology/D10-1090.pdf) should be adequate and fluent while being as different as possible from the original in surface lexical form. With respect to this definition, the **3 key metrics** that measure the quality of paraphrases are:

- **Adequacy** (Is the meaning preserved adequately?)
- **Fluency** (Is the paraphrase fluent English?)
- **Diversity (Lexical / Phrasal / Syntactical)** (How much has the paraphrase changed the original sentence?)

*Parrot offers knobs to control Adequacy, Fluency and Diversity as per your needs.*

**What makes a paraphraser a good augmentor?** For training an NLU model we don't just need a lot of utterances; we need utterances annotated with intents and slots/entities. The typical flow would be:

- Given an **input utterance + input annotations**, a good augmentor spits out N **output paraphrases** while preserving the intent and slots.
- The output paraphrases are then converted into annotated data using the input annotations from step 1.
- The annotated data created from the output paraphrases then makes up the training dataset for your NLU model.

But being generative models, paraphrasers don't in general guarantee preservation of slots/entities. So the ability to generate high-quality paraphrases in a constrained fashion, without trading off intents and slots for lexical dissimilarity, is what makes a paraphraser a good augmentor. *More on this in section 3 below.*

## 3. Scope

In the space of conversational engines, knowledge bots are the ones to which **we ask questions**, like *"when was the Berlin wall torn down?"*; transactional bots are the ones to which **we give commands**, like *"Turn on the music please"*; and voice assistants are the ones that can both answer questions and act on our commands. Parrot mainly focuses on augmenting texts typed into or spoken to conversational interfaces, for building robust NLU models.
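As a concrete illustration of the Diversity metric from section 2, here is a minimal, self-contained sketch using only the Python standard library. This is purely for exposition: Parrot's own ranker uses Levenshtein distance (the `diversity_ranker="levenshtein"` knob) together with its adequacy and fluency models, so the helper below is an assumption, not Parrot code.

```python
from difflib import SequenceMatcher

def surface_diversity(source: str, paraphrase: str) -> float:
    """Score how different a paraphrase is from its source in surface
    form: 0.0 means identical wording, values near 1.0 mean almost
    nothing was kept verbatim."""
    similarity = SequenceMatcher(None, source.lower(), paraphrase.lower()).ratio()
    return 1.0 - similarity

src = "can you recommend some upscale restaurants in new york?"
candidates = [
    "can you recommend some upscale restaurants in new york?",     # unchanged
    "list some excellent restaurants to visit in new york city?",  # reworded
]
scores = [surface_diversity(src, c) for c in candidates]
print(scores[0] < scores[1])  # True: the reworded candidate is more diverse
```

A good augmentor wants this score high while Adequacy and Fluency stay high, which is the tension the `adequacy_threshold`, `fluency_threshold` and `do_diverse` knobs expose.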
(*People usually neither type nor yell long paragraphs at conversational interfaces. Hence the pre-trained model is trained on text samples with a maximum length of 32.*)

*While Parrot predominantly aims to be a text augmentor for building good NLU models, it can also be used as a pure-play paraphraser.*
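The constrained, slot-preserving augmentation described in section 2 can be sketched as a simple post-filter over a paraphraser's outputs. Everything below is hypothetical glue code for illustration; the helper and the sample slot values are not part of the Parrot API.

```python
def keep_slot_preserving(paraphrases, slot_values):
    """Keep only paraphrases in which every annotated slot value
    still appears verbatim (case-insensitive)."""
    kept = []
    for p in paraphrases:
        text = p.lower()
        if all(v.lower() in text for v in slot_values):
            kept.append(p)
    return kept

# Imagine these came back from a paraphraser for the utterance
# "book a table in New York for tonight" with slot city="new york".
candidates = [
    "book a table in new york for tonight",
    "reserve a table for tonight",               # lost the city slot
    "can you book a new york table for tonight?",
]
kept = keep_slot_preserving(candidates, ["new york"])
print(len(kept))  # 2: the middle candidate is dropped
```

A real pipeline would match slots more robustly (lemmatization, entity linking), but the principle is the same: lexical diversity must not come at the cost of the annotations.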
6,361
[ [ (embedding vector values elided) ] ]
laion/CLIP-ViT-bigG-14-laion2B-39B-b160k
2023-04-18T18:35:30.000Z
[ "open_clip", "pytorch", "clip", "zero-shot-image-classification", "arxiv:1910.04867", "license:mit", "has_space", "region:us" ]
zero-shot-image-classification
laion
null
null
laion/CLIP-ViT-bigG-14-laion2B-39B-b160k
130
256,043
open_clip
2023-01-23T07:12:35
---
license: mit
widget:
- src: >-
    https://huggingface.co/datasets/mishig/sample_images/resolve/main/cat-dog-music.png
  candidate_labels: playing music, playing sports
  example_title: Cat & Dog
library_name: open_clip
pipeline_tag: zero-shot-image-classification
---

# Model Card for CLIP ViT-bigG/14 - LAION-2B

# Table of Contents

1. [Model Details](#model-details)
2. [Uses](#uses)
3. [Training Details](#training-details)
4. [Evaluation](#evaluation)
5. [Acknowledgements](#acknowledgements)
6. [Citation](#citation)
7. [How To Get Started With the Model](#how-to-get-started-with-the-model)

# Model Details

## Model Description

A CLIP ViT-bigG/14 model trained with the LAION-2B English subset of LAION-5B (https://laion.ai/blog/laion-5b/) using OpenCLIP (https://github.com/mlfoundations/open_clip).

Model training done by Mitchell Wortsman on the [stability.ai](https://stability.ai/) cluster.

The license for this model is MIT.

# Uses

As per the original [OpenAI CLIP model card](https://github.com/openai/CLIP/blob/d50d76daa670286dd6cacf3bcd80b5e4823fc8e1/model-card.md), this model is intended as a research output for research communities. We hope that this model will enable researchers to better understand and explore zero-shot, arbitrary image classification. We also hope it can be used for interdisciplinary studies of the potential impact of such models.

The OpenAI CLIP paper includes a discussion of potential downstream impacts to provide an example for this sort of analysis. Additionally, the LAION-5B blog (https://laion.ai/blog/laion-5b/) and upcoming paper include additional discussion as it relates specifically to the training dataset.

## Direct Use

Zero-shot image classification, image and text retrieval, among others.

## Downstream Use

Image classification and other image task fine-tuning, linear probe image classification, image generation guiding and conditioning, among others.
## Out-of-Scope Use

As per the OpenAI models,

**Any** deployed use case of the model - whether commercial or not - is currently out of scope. Non-deployed use cases, such as image search in a constrained environment, are also not recommended unless there is thorough in-domain testing of the model with a specific, fixed class taxonomy. This is because our safety assessment demonstrated a high need for task-specific testing, especially given the variability of CLIP's performance with different class taxonomies. This makes untested and unconstrained deployment of the model in any use case currently potentially harmful.

Certain use cases which would fall under the domain of surveillance and facial recognition are always out of scope regardless of the performance of the model. This is because the use of artificial intelligence for tasks such as these can currently be premature, given the lack of testing norms and checks to ensure its fair use.

Since the model has not been purposefully trained in or evaluated on any languages other than English, its use should be limited to English language use cases.

Further to the above notice, the LAION-5B dataset used in training of these models has additional considerations; see below.

# Training Details

## Training Data

This model was trained with the 2 billion sample English subset of LAION-5B (https://laion.ai/blog/laion-5b/). Fine-tuning was also partially done on LAION-A, a 900M subset of LAION-2B filtered with aesthetic V2 4.5+ and phash-deduplicated.

**IMPORTANT NOTE:** The motivation behind dataset creation is to democratize research and experimentation around large-scale multi-modal model training and handling of uncurated, large-scale datasets crawled from the publicly available internet. Our recommendation is therefore to use the dataset for research purposes. Be aware that this large-scale dataset is uncurated.
Keep in mind that the uncurated nature of the dataset means that collected links may lead to strongly discomforting and disturbing content for a human viewer. Therefore, please use the demo links with caution and at your own risk. It is possible to extract a "safe" subset by filtering out samples based on the safety tags (using a customized trained NSFW classifier that we built). While this strongly reduces the chance of encountering potentially harmful content when viewing, we cannot entirely exclude the possibility of harmful content still being present in safe mode, so the warning holds there as well.

We think that providing the dataset openly to broad research and other interested communities will allow for transparent investigation of the benefits that come along with training large-scale models, as well as of pitfalls and dangers that may stay unreported or unnoticed when working with closed large datasets that remain restricted to a small community. While we provide our dataset openly, we do not recommend using it for creating ready-to-go industrial products, as the basic research about general properties and safety of such large-scale models, which we would like to encourage with this release, is still in progress.

## Training Procedure

The training procedure will soon be discussed in a blog post on laion.ai.

# Evaluation

Evaluation done with code in the [LAION CLIP Benchmark suite](https://github.com/LAION-AI/CLIP_benchmark).

## Testing Data, Factors & Metrics

### Testing Data

The testing is performed with VTAB+ (a combination of VTAB (https://arxiv.org/abs/1910.04867) with additional robustness datasets) for classification, and with COCO and Flickr for retrieval.

**TODO** - more detail

## Results

The model achieves an 80.1% zero-shot top-1 accuracy on ImageNet-1k.
An initial round of benchmarks has been performed on a wider range of datasets and will soon be visible at https://github.com/LAION-AI/CLIP_benchmark/blob/main/benchmark/results.ipynb

**TODO** - create table for just this model's metrics.

# Acknowledgements

Acknowledging [stability.ai](https://stability.ai/) for the compute used to train this model.

# Citation

**BibTeX:**

LAION-5B
```bibtex
@inproceedings{schuhmann2022laionb,
  title={{LAION}-5B: An open large-scale dataset for training next generation image-text models},
  author={Christoph Schuhmann and Romain Beaumont and Richard Vencu and Cade W Gordon and Ross Wightman and Mehdi Cherti and Theo Coombes and Aarush Katta and Clayton Mullis and Mitchell Wortsman and Patrick Schramowski and Srivatsa R Kundurthy and Katherine Crowson and Ludwig Schmidt and Robert Kaczmarczyk and Jenia Jitsev},
  booktitle={Thirty-sixth Conference on Neural Information Processing Systems Datasets and Benchmarks Track},
  year={2022},
  url={https://openreview.net/forum?id=M3Y74vmsMcY}
}
```

OpenAI CLIP paper
```
@inproceedings{Radford2021LearningTV,
  title={Learning Transferable Visual Models From Natural Language Supervision},
  author={Alec Radford and Jong Wook Kim and Chris Hallacy and A. Ramesh and Gabriel Goh and Sandhini Agarwal and Girish Sastry and Amanda Askell and Pamela Mishkin and Jack Clark and Gretchen Krueger and Ilya Sutskever},
  booktitle={ICML},
  year={2021}
}
```

OpenCLIP software
```
@software{ilharco_gabriel_2021_5143773,
  author = {Ilharco, Gabriel and Wortsman, Mitchell and Wightman, Ross and Gordon, Cade and Carlini, Nicholas and Taori, Rohan and Dave, Achal and Shankar, Vaishaal and Namkoong, Hongseok and Miller, John and Hajishirzi, Hannaneh and Farhadi, Ali and Schmidt, Ludwig},
  title = {OpenCLIP},
  month = jul,
  year = 2021,
  note = {If you use this software, please cite it as below.},
  publisher = {Zenodo},
  version = {0.1},
  doi = {10.5281/zenodo.5143773},
  url = {https://doi.org/10.5281/zenodo.5143773}
}
```

Scaling OpenCLIP paper
```
@article{cherti2022reproducible,
  title={Reproducible scaling laws for contrastive language-image learning},
  author={Cherti, Mehdi and Beaumont, Romain and Wightman, Ross and Wortsman, Mitchell and Ilharco, Gabriel and Gordon, Cade and Schuhmann, Christoph and Schmidt, Ludwig and Jitsev, Jenia},
  journal={arXiv preprint arXiv:2212.07143},
  year={2022}
}
```

# How to Get Started with the Model

Use the code below to get started with the model.

**TODO** - Hugging Face transformers, OpenCLIP, and timm getting started snippets
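Pending the official snippets, a hypothetical OpenCLIP-flavoured sketch is shown below. The model and checkpoint names in the comments are placeholders (assumptions, not confirmed by this card); only the prompt-building and probability helpers are generic.

```python
import math

def build_prompts(labels, template="a photo of a {}"):
    # CLIP-style prompt templating: one text prompt per class label.
    return [template.format(label) for label in labels]

def softmax(logits):
    # Turn image-text similarity logits into class probabilities.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

if __name__ == "__main__":
    # Hypothetical OpenCLIP usage (architecture/checkpoint names are illustrative placeholders):
    # import open_clip, torch
    # model, _, preprocess = open_clip.create_model_and_transforms(
    #     "ViT-H-14", pretrained="laion2b_s32b_b79k")
    # tokenizer = open_clip.get_tokenizer("ViT-H-14")
    # text = tokenizer(build_prompts(["dog", "cat"]))
    print(build_prompts(["dog", "cat"]))  # ['a photo of a dog', 'a photo of a cat']
    print(softmax([2.0, 0.0]))
```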
- Model ID: PlanTL-GOB-ES/roberta-base-bne
- Last modified: 2023-01-31T13:59:59.000Z
- Tags: transformers, pytorch, roberta, fill-mask, national library of spain, spanish, bne, roberta-base-bne, es, dataset:bne, arxiv:1907.11692, license:apache-2.0, autotrain_compatible, endpoints_compatible, has_space, region:us
- Pipeline tag: fill-mask
- Author: PlanTL-GOB-ES
- Likes: 20
- Downloads: 255,041
- Library: transformers
- Created: 2022-03-02T23:29:04
---
language:
- es
license: apache-2.0
tags:
- "national library of spain"
- "spanish"
- "bne"
- "roberta-base-bne"
datasets:
- "bne"
metrics:
- "ppl"
widget:
- text: "Por la ventanilla del coche vi la Giralda y pensé que bonita que es la ciudad de <mask>."
- text: "Más vale <mask> que lamentar."
- text: "Caminante no hay camino, se hace camino al <mask>."
- text: "Tengo una pelota roja y otra amarilla. Si le doy la roja a Jose, sólo me queda la <mask>."
- text: "Tengo una pelota roja y otra amarilla. Si le doy la amarilla a Jose, sólo me queda la <mask>."
- text: "El <mask> es el pico más alto de España."
---

# RoBERTa base trained with data from the National Library of Spain (BNE)

## Table of Contents
<details>
<summary>Click to expand</summary>

- [Overview](#overview)
- [Model description](#model-description)
- [Intended uses and limitations](#intended-uses-and-limitations)
- [How to use](#how-to-use)
- [Limitations and bias](#limitations-and-bias)
- [Training](#training)
  - [Training data](#training-data)
  - [Training procedure](#training-procedure)
- [Evaluation](#evaluation)
- [Additional information](#additional-information)
  - [Author](#author)
  - [Contact information](#contact-information)
  - [Copyright](#copyright)
  - [Licensing information](#licensing-information)
  - [Funding](#funding)
  - [Citation Information](#citation-information)
  - [Disclaimer](#disclaimer)

</details>

## Overview
- **Architecture:** roberta-base
- **Language:** Spanish
- **Task:** fill-mask
- **Data:** BNE

## Model description
The **roberta-base-bne** is a transformer-based masked language model for the Spanish language.
It is based on the [RoBERTa](https://arxiv.org/abs/1907.11692) base model and has been pre-trained using the largest Spanish corpus known to date, with a total of 570GB of clean and deduplicated text processed for this work, compiled from the web crawls performed by the [National Library of Spain (Biblioteca Nacional de España)](http://www.bne.es/en/Inicio/index.html) from 2009 to 2019.

## Intended uses and limitations

The **roberta-base-bne** model is ready to use only for masked language modeling, i.e. the Fill Mask task (try the inference API or read the next section). However, it is intended to be fine-tuned on non-generative downstream tasks such as Question Answering, Text Classification, or Named Entity Recognition. You can use the raw model for fill mask or fine-tune it on a downstream task.

## How to use

Here is how to use this model:

```python
>>> from transformers import pipeline
>>> from pprint import pprint
>>> unmasker = pipeline('fill-mask', model='PlanTL-GOB-ES/roberta-base-bne')
>>> pprint(unmasker("Gracias a los datos de la BNE se ha podido <mask> este modelo del lenguaje."))
[{'score': 0.08422081917524338,
  'token': 3832,
  'token_str': ' desarrollar',
  'sequence': 'Gracias a los datos de la BNE se ha podido desarrollar este modelo del lenguaje.'},
 {'score': 0.06348305940628052,
  'token': 3078,
  'token_str': ' crear',
  'sequence': 'Gracias a los datos de la BNE se ha podido crear este modelo del lenguaje.'},
 {'score': 0.06148449331521988,
  'token': 2171,
  'token_str': ' realizar',
  'sequence': 'Gracias a los datos de la BNE se ha podido realizar este modelo del lenguaje.'},
 {'score': 0.056218471378088,
  'token': 10880,
  'token_str': ' elaborar',
  'sequence': 'Gracias a los datos de la BNE se ha podido elaborar este modelo del lenguaje.'},
 {'score': 0.05133328214287758,
  'token': 31915,
  'token_str': ' validar',
  'sequence': 'Gracias a los datos de la BNE se ha podido validar este modelo del lenguaje.'}]
```

Here is how to use this model to get the features of a given text in PyTorch:

```python
>>> from transformers import RobertaTokenizer, RobertaModel
>>> tokenizer = RobertaTokenizer.from_pretrained('PlanTL-GOB-ES/roberta-base-bne')
>>> model = RobertaModel.from_pretrained('PlanTL-GOB-ES/roberta-base-bne')
>>> text = "Gracias a los datos de la BNE se ha podido desarrollar este modelo del lenguaje."
>>> encoded_input = tokenizer(text, return_tensors='pt')
>>> output = model(**encoded_input)
>>> print(output.last_hidden_state.shape)
torch.Size([1, 19, 768])
```

## Limitations and bias

At the time of submission, no measures have been taken to estimate the bias and toxicity embedded in the model. However, we are well aware that our models may be biased since the corpora have been collected using crawling techniques on multiple web sources. We intend to conduct research in these areas in the future, and if completed, this model card will be updated. Nevertheless, here is an example of how the model can produce biased predictions:

```python
>>> from transformers import pipeline, set_seed
>>> from pprint import pprint
>>> unmasker = pipeline('fill-mask', model='PlanTL-GOB-ES/roberta-base-bne')
>>> set_seed(42)
>>> pprint(unmasker("Antonio está pensando en <mask>."))
[{'score': 0.07950365543365479,
  'sequence': 'Antonio está pensando en ti.',
  'token': 486,
  'token_str': ' ti'},
 {'score': 0.03375273942947388,
  'sequence': 'Antonio está pensando en irse.',
  'token': 13134,
  'token_str': ' irse'},
 {'score': 0.031026942655444145,
  'sequence': 'Antonio está pensando en casarse.',
  'token': 24852,
  'token_str': ' casarse'},
 {'score': 0.030703715980052948,
  'sequence': 'Antonio está pensando en todo.',
  'token': 665,
  'token_str': ' todo'},
 {'score': 0.02838558703660965,
  'sequence': 'Antonio está pensando en ello.',
  'token': 1577,
  'token_str': ' ello'}]
>>> set_seed(42)
>>> pprint(unmasker("Mohammed está pensando en <mask>."))
[{'score': 0.05433618649840355,
  'sequence': 'Mohammed está pensando en morir.',
  'token': 9459,
  'token_str': ' morir'},
 {'score': 0.0400255024433136,
  'sequence': 'Mohammed está pensando en irse.',
  'token': 13134,
  'token_str': ' irse'},
 {'score': 0.03705748915672302,
  'sequence': 'Mohammed está pensando en todo.',
  'token': 665,
  'token_str': ' todo'},
 {'score': 0.03658654913306236,
  'sequence': 'Mohammed está pensando en quedarse.',
  'token': 9331,
  'token_str': ' quedarse'},
 {'score': 0.03329474478960037,
  'sequence': 'Mohammed está pensando en ello.',
  'token': 1577,
  'token_str': ' ello'}]
```

## Training

### Training data

The [National Library of Spain (Biblioteca Nacional de España)](http://www.bne.es/en/Inicio/index.html) crawls all .es domains once a year. The training corpus consists of 59TB of WARC files from these crawls, carried out from 2009 to 2019. To obtain a high-quality training corpus, it was preprocessed with a pipeline of operations including, among others, sentence splitting, language detection, filtering of badly formed sentences, and deduplication of repetitive content. Document boundaries were kept during the process. This resulted in 2TB of clean Spanish corpus. Further global deduplication across the corpus was applied, resulting in 570GB of text.

Some statistics of the corpus:

| Corpora | Number of documents | Number of tokens | Size (GB) |
|---------|---------------------|------------------|-----------|
| BNE     | 201,080,084         | 135,733,450,668  | 570GB     |

### Training procedure

The training corpus has been tokenized using a byte version of Byte-Pair Encoding (BPE), as used in the original [RoBERTa](https://arxiv.org/abs/1907.11692) model, with a vocabulary size of 50,262 tokens. The **roberta-base-bne** pre-training consists of masked language model training, following the approach employed for RoBERTa base. Training lasted a total of 48 hours on 16 computing nodes, each with 4 NVIDIA V100 GPUs of 16GB VRAM.
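The corpus-cleaning stages described above can be sketched with trivial stand-ins for each step (the marker-word language check and the length filter below are toy assumptions; the real pipeline uses proper language identification and quality heuristics):

```python
import hashlib

def looks_spanish(sentence):
    # Stand-in for a real language detector: look for common Spanish function words.
    spanish_markers = {"el", "la", "de", "que", "en", "los"}
    return any(w in spanish_markers for w in sentence.lower().split())

def well_formed(sentence):
    # Stand-in for a badly-formed-sentence filter: require a minimal length.
    return len(sentence.split()) >= 3

def deduplicate(sentences):
    # Global deduplication via content hashes, keeping first occurrences.
    seen, out = set(), []
    for s in sentences:
        h = hashlib.sha1(s.encode("utf-8")).hexdigest()
        if h not in seen:
            seen.add(h)
            out.append(s)
    return out

def clean_corpus(sentences):
    kept = [s for s in sentences if looks_spanish(s) and well_formed(s)]
    return deduplicate(kept)

corpus = [
    "el modelo fue entrenado con datos de la BNE",
    "el modelo fue entrenado con datos de la BNE",  # duplicate, dropped
    "hello world",                                   # not Spanish, dropped
    "la BNE rastrea los dominios .es",
]
print(clean_corpus(corpus))
```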
## Evaluation

When fine-tuned on downstream tasks, this model achieves the following results:

| Dataset      | Metric   | [**RoBERTa-base**](https://huggingface.co/PlanTL-GOB-ES/roberta-base-bne) |
|--------------|----------|--------|
| MLDoc        | F1       | 0.9664 |
| CoNLL-NERC   | F1       | 0.8851 |
| CAPITEL-NERC | F1       | 0.8960 |
| PAWS-X       | F1       | 0.9020 |
| UD-POS       | F1       | 0.9907 |
| CAPITEL-POS  | F1       | 0.9846 |
| SQAC         | F1       | 0.7923 |
| STS          | Combined | 0.8533 |
| XNLI         | Accuracy | 0.8016 |

For more evaluation details visit our [GitHub repository](https://github.com/PlanTL-GOB-ES/lm-spanish) or [paper](http://journal.sepln.org/sepln/ojs/ojs/index.php/pln/article/view/6405).

## Additional information

### Author
Text Mining Unit (TeMU) from Barcelona Supercomputing Center (<bsc-temu@bsc.es>).

### Contact information
For further information, send an email to <plantl-gob-es@bsc.es>.

### Copyright
Copyright by the [Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA)](https://portal.mineco.gob.es/en-us/digitalizacionIA/Pages/sedia.aspx).

### Licensing information
This work is licensed under an [Apache License, Version 2.0](https://www.apache.org/licenses/LICENSE-2.0).

### Funding
This work was funded by the [Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA)](https://portal.mineco.gob.es/en-us/digitalizacionIA/Pages/sedia.aspx) within the framework of the Plan-TL.
### Citation information

If you use this model, please cite our [paper](http://journal.sepln.org/sepln/ojs/ojs/index.php/pln/article/view/6405):

```
@article{,
  title = {MarIA: Spanish Language Models},
  author = {Asier Gutiérrez Fandiño and Jordi Armengol Estapé and Marc Pàmies and Joan Llop Palao and Joaquin Silveira Ocampo and Casimiro Pio Carrino and Carme Armentano Oller and Carlos Rodriguez Penagos and Aitor Gonzalez Agirre and Marta Villegas},
  doi = {10.26342/2022-68-3},
  issn = {1135-5948},
  journal = {Procesamiento del Lenguaje Natural},
  publisher = {Sociedad Española para el Procesamiento del Lenguaje Natural},
  url = {https://upcommons.upc.edu/handle/2117/367156#.YyMTB4X9A-0.mendeley},
  volume = {68},
  year = {2022},
}
```

### Disclaimer
<details>
<summary>Click to expand</summary>

The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.

When third parties deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of Artificial Intelligence.

In no event shall the owner of the models (SEDIA) nor the creator (BSC) be liable for any results arising from the use made by third parties of these models.

Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.
Cuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de Inteligencia Artificial.

En ningún caso el propietario de los modelos (SEDIA) ni el creador (BSC) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos.

</details>
0.047698974609375, 0.0286102294921875, -0.0063629150390625, 0.00930023193359375, 0.05218505859375, -0.024017333984375, 0.036590576171875, 0.0294952392578125, -0.005123138427734375, -0.047821044921875, 0.0579833984375, 0.002681732177734375, 0.0128173828125, 0.03070068359375, 0.0134429931640625, -0.033966064453125, -0.0628662109375, -0.033355712890625, 0.0273590087890625, -0.0433349609375, -0.0283966064453125, -0.06597900390625, -0.01462554931640625, -0.048065185546875, 0.006946563720703125, -0.017730712890625, -0.033050537109375, -0.0229644775390625, -0.01241302490234375, 0.03851318359375, 0.02783203125, -0.0031795501708984375, 0.022491455078125, -0.045318603515625, 0.0161285400390625, -0.0044708251953125, 0.01323699951171875, -0.01177215576171875, -0.0682373046875, -0.0192718505859375, 0.00406646728515625, -0.0198822021484375, -0.0823974609375, 0.06610107421875, 0.0011224746704101562, 0.030029296875, 0.0174407958984375, -0.019500732421875, 0.040313720703125, -0.0291900634765625, 0.057373046875, 0.002437591552734375, -0.0684814453125, 0.05499267578125, -0.034576416015625, 0.009979248046875, 0.03564453125, 0.032928466796875, -0.025115966796875, -0.03704833984375, -0.07525634765625, -0.07537841796875, 0.0645751953125, 0.0272979736328125, -0.00415802001953125, -0.0073089599609375, 0.005519866943359375, -0.0017108917236328125, 0.01383209228515625, -0.07177734375, -0.03594970703125, -0.02850341796875, -0.030914306640625, 0.00579833984375, -0.004970550537109375, 0.00037860870361328125, -0.0198516845703125, 0.0748291015625, 0.0017948150634765625, 0.0230712890625, 0.0236968994140625, -0.0222625732421875, 0.01139068603515625, 0.01210784912109375, 0.042022705078125, 0.0283355712890625, -0.033538818359375, 0.0023174285888671875, 0.0165863037109375, -0.05657958984375, -0.0038089752197265625, 0.01523590087890625, -0.032012939453125, 0.0163116455078125, 0.028778076171875, 0.06842041015625, 0.00977325439453125, -0.053741455078125, 0.040130615234375, 0.0176239013671875, 
-0.020538330078125, -0.0307464599609375, -0.0127410888671875, -0.0032024383544921875, 0.0043487548828125, 0.0231781005859375, 0.01508331298828125, -0.0056915283203125, -0.03924560546875, 0.008697509765625, 0.0290985107421875, -0.01371002197265625, -0.017333984375, 0.056396484375, -0.01171875, -0.02880859375, 0.0266265869140625, -0.028106689453125, -0.057342529296875, 0.06475830078125, 0.038726806640625, 0.058868408203125, -0.007678985595703125, 0.026031494140625, 0.059326171875, 0.0224609375, -0.0277099609375, 0.020782470703125, 0.0216217041015625, -0.0511474609375, -0.02789306640625, -0.06378173828125, -0.00705718994140625, 0.0265655517578125, -0.038726806640625, 0.0285186767578125, -0.0360107421875, -0.0195770263671875, 0.00015234947204589844, 0.01210784912109375, -0.06182861328125, 0.0185394287109375, -0.0030364990234375, 0.058990478515625, -0.0789794921875, 0.07110595703125, 0.03436279296875, -0.0657958984375, -0.057952880859375, -0.0209503173828125, -0.01383209228515625, -0.06396484375, 0.054595947265625, 0.00664520263671875, 0.01209259033203125, -0.0000655055046081543, -0.025238037109375, -0.08172607421875, 0.082763671875, 0.01947021484375, -0.0400390625, -0.00370025634765625, 0.006793975830078125, 0.057952880859375, -0.0204620361328125, 0.04296875, 0.031951904296875, 0.033203125, -0.002899169921875, -0.06463623046875, 0.01129913330078125, -0.032928466796875, -0.0022907257080078125, 0.006061553955078125, -0.049407958984375, 0.07421875, 0.004688262939453125, -0.009918212890625, 0.005321502685546875, 0.046051025390625, 0.01141357421875, 0.003261566162109375, 0.017425537109375, 0.05859375, 0.0684814453125, -0.0241546630859375, 0.066650390625, -0.02789306640625, 0.044158935546875, 0.07574462890625, 0.00927734375, 0.058685302734375, 0.0252838134765625, -0.02996826171875, 0.05950927734375, 0.035919189453125, -0.038238525390625, 0.035614013671875, 0.01031494140625, -0.007904052734375, 0.01309967041015625, 0.0010805130004882812, -0.0247955322265625, 
0.034332275390625, 0.0156402587890625, -0.042999267578125, 0.002471923828125, 0.008056640625, 0.0214996337890625, 0.0118560791015625, -0.0074005126953125, 0.03631591796875, -0.0175628662109375, -0.056793212890625, 0.052581787109375, 0.0181121826171875, 0.07171630859375, -0.03326416015625, 0.0127105712890625, -0.01129913330078125, 0.006641387939453125, -0.020538330078125, -0.0455322265625, 0.0189208984375, 0.0227813720703125, -0.032684326171875, -0.0220184326171875, 0.037261962890625, -0.0283966064453125, -0.054168701171875, 0.0147705078125, 0.030975341796875, 0.0316162109375, 0.002414703369140625, -0.059722900390625, 0.0012111663818359375, 0.018707275390625, -0.0173797607421875, 0.01371002197265625, 0.0230712890625, -0.01230621337890625, 0.040863037109375, 0.06695556640625, 0.02301025390625, 0.0273895263671875, 0.0126190185546875, 0.049896240234375, -0.052520751953125, -0.0207366943359375, -0.06494140625, 0.054290771484375, -0.0127105712890625, -0.03228759765625, 0.05694580078125, 0.059295654296875, 0.072998046875, -0.019500732421875, 0.05462646484375, -0.021453857421875, 0.036163330078125, -0.04217529296875, 0.042205810546875, -0.00878143310546875, 0.018218994140625, -0.01922607421875, -0.061065673828125, -0.0261993408203125, 0.0574951171875, -0.0200653076171875, -0.0011119842529296875, 0.052093505859375, 0.0775146484375, 0.0058746337890625, -0.02093505859375, -0.0057525634765625, 0.02703857421875, 0.022216796875, 0.0517578125, 0.0279998779296875, -0.061492919921875, 0.04803466796875, -0.042083740234375, -0.013946533203125, -0.0151519775390625, -0.067626953125, -0.07666015625, -0.0450439453125, -0.0267333984375, -0.04620361328125, 0.00270843505859375, 0.06787109375, 0.04278564453125, -0.082275390625, -0.0141448974609375, -0.0241546630859375, 0.0052642822265625, -0.01367950439453125, -0.0185546875, 0.055419921875, -0.01058197021484375, -0.082275390625, 0.0181427001953125, 0.00228118896484375, 0.0116424560546875, -0.0161285400390625, -0.005825042724609375, 
-0.03515625, -0.00949859619140625, 0.0249176025390625, 0.03326416015625, -0.05560302734375, -0.0091400146484375, 0.015899658203125, -0.0145721435546875, 0.01462554931640625, 0.024261474609375, -0.061279296875, 0.0218505859375, 0.04888916015625, 0.022308349609375, 0.060455322265625, -0.005130767822265625, 0.0191497802734375, -0.056060791015625, 0.031158447265625, 0.018951416015625, 0.035491943359375, 0.026458740234375, -0.0205841064453125, 0.048797607421875, 0.03253173828125, -0.030059814453125, -0.062469482421875, -0.00434112548828125, -0.09228515625, -0.0147552490234375, 0.08282470703125, -0.01059722900390625, -0.029388427734375, -0.006954193115234375, -0.0265655517578125, 0.03741455078125, -0.0216827392578125, 0.057342529296875, 0.04931640625, 0.0076446533203125, 0.00446319580078125, -0.0399169921875, 0.035552978515625, 0.032073974609375, -0.04901123046875, -0.0189056396484375, 0.014801025390625, 0.033843994140625, 0.02392578125, 0.046844482421875, -0.0208740234375, 0.0008840560913085938, -0.0009965896606445312, 0.019927978515625, -0.00597381591796875, -0.0126800537109375, -0.0302734375, 0.0157928466796875, -0.01493072509765625, -0.014251708984375 ] ]
NousResearch/Llama-2-7b-chat-hf
2023-07-18T20:57:56.000Z
[ "transformers", "pytorch", "safetensors", "llama", "text-generation", "facebook", "meta", "llama-2", "en", "has_space", "text-generation-inference", "region:us" ]
text-generation
NousResearch
null
null
NousResearch/Llama-2-7b-chat-hf
46
253,823
transformers
2023-07-18T19:45:53
--- extra_gated_heading: Access Llama 2 on Hugging Face extra_gated_description: >- This is a form to enable access to Llama 2 on Hugging Face after you have been granted access from Meta. Please visit the [Meta website](https://ai.meta.com/resources/models-and-libraries/llama-downloads) and accept our license terms and acceptable use policy before submitting this form. Requests will be processed in 1-2 days. extra_gated_button_content: Submit extra_gated_fields: I agree to share my name, email address and username with Meta and confirm that I have already been granted download access on the Meta website: checkbox language: - en pipeline_tag: text-generation inference: false tags: - facebook - meta - pytorch - llama - llama-2 --- # **Llama 2** Llama 2 is a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. This is the repository for the 7B fine-tuned model, optimized for dialogue use cases and converted for the Hugging Face Transformers format. Links to other models can be found in the index at the bottom. ## Model Details *Note: Use of this model is governed by the Meta license. In order to download the model weights and tokenizer, please visit the [website](https://ai.meta.com/resources/models-and-libraries/llama-downloads/) and accept our License before requesting access here.* Meta developed and publicly released the Llama 2 family of large language models (LLMs), a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. Our fine-tuned LLMs, called Llama-2-Chat, are optimized for dialogue use cases. Llama-2-Chat models outperform open-source chat models on most benchmarks we tested, and in our human evaluations for helpfulness and safety, are on par with some popular closed-source models like ChatGPT and PaLM. 
**Model Developers** Meta **Variations** Llama 2 comes in a range of parameter sizes — 7B, 13B, and 70B — as well as pretrained and fine-tuned variations. **Input** Models input text only. **Output** Models generate text only. **Model Architecture** Llama 2 is an auto-regressive language model that uses an optimized transformer architecture. The tuned versions use supervised fine-tuning (SFT) and reinforcement learning with human feedback (RLHF) to align to human preferences for helpfulness and safety. ||Training Data|Params|Content Length|GQA|Tokens|LR| |---|---|---|---|---|---|---| |Llama 2|*A new mix of publicly available online data*|7B|4k|&#10007;|2.0T|3.0 x 10<sup>-4</sup>| |Llama 2|*A new mix of publicly available online data*|13B|4k|&#10007;|2.0T|3.0 x 10<sup>-4</sup>| |Llama 2|*A new mix of publicly available online data*|70B|4k|&#10004;|2.0T|1.5 x 10<sup>-4</sup>| *Llama 2 family of models.* Token counts refer to pretraining data only. All models are trained with a global batch size of 4M tokens. Bigger models (70B) use Grouped-Query Attention (GQA) for improved inference scalability. **Model Dates** Llama 2 was trained between January 2023 and July 2023. **Status** This is a static model trained on an offline dataset. Future versions of the tuned models will be released as we improve model safety with community feedback. **License** A custom commercial license is available at: [https://ai.meta.com/resources/models-and-libraries/llama-downloads/](https://ai.meta.com/resources/models-and-libraries/llama-downloads/) ## Intended Use **Intended Use Cases** Llama 2 is intended for commercial and research use in English. Tuned models are intended for assistant-like chat, whereas pretrained models can be adapted for a variety of natural language generation tasks. 
To get the expected features and performance for the chat versions, a specific prompt format must be followed, including the `INST` and `<<SYS>>` tags, `BOS` and `EOS` tokens, and the whitespaces and line breaks in between (we recommend calling `strip()` on inputs to avoid double spaces). See our reference code on GitHub for details: [`chat_completion`](https://github.com/facebookresearch/llama/blob/main/llama/generation.py#L212). **Out-of-scope Uses** Use in any manner that violates applicable laws or regulations (including trade compliance laws). Use in languages other than English. Use in any other way that is prohibited by the Acceptable Use Policy and Licensing Agreement for Llama 2. ## Hardware and Software **Training Factors** We used custom training libraries, Meta's Research Super Cluster, and production clusters for pretraining. Fine-tuning, annotation, and evaluation were also performed on third-party cloud compute. **Carbon Footprint** Pretraining utilized a cumulative 3.3M GPU hours of computation on hardware of type A100-80GB (TDP of 350-400W). Estimated total emissions were 539 tCO2eq, 100% of which were offset by Meta’s sustainability program. ||Time (GPU hours)|Power Consumption (W)|Carbon Emitted (tCO<sub>2</sub>eq)| |---|---|---|---| |Llama 2 7B|184320|400|31.22| |Llama 2 13B|368640|400|62.44| |Llama 2 70B|1720320|400|291.42| |Total|3311616||539.00| **CO<sub>2</sub> emissions during pretraining.** Time: total GPU time required for training each model; the totals also include the unreleased 34B variant. Power Consumption: peak power capacity per GPU device for the GPUs used adjusted for power usage efficiency. 100% of the emissions are directly offset by Meta's sustainability program, and because we are openly releasing these models, the pretraining costs do not need to be incurred by others. ## Training Data **Overview** Llama 2 was pretrained on 2 trillion tokens of data from publicly available sources. 
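The per-model CO<sub>2</sub> figures in the table above can be reproduced from GPU hours and device power. The sketch below assumes the 1.1 power-usage effectiveness (PUE) and 0.385 kgCO<sub>2</sub>eq/kWh grid carbon-intensity factor reported in the Llama 2 paper; neither constant is stated in this card.

```python
# Estimate pretraining emissions from GPU hours and per-device power.
# Assumed constants (from the Llama 2 paper, not stated in this card):
PUE = 1.1                # power-usage effectiveness of the data center
KG_CO2_PER_KWH = 0.385   # assumed grid carbon-intensity factor

def emissions_tco2eq(gpu_hours: float, tdp_watts: float) -> float:
    """GPU-hours x device power -> energy drawn (kWh) -> tonnes CO2eq."""
    energy_kwh = gpu_hours * tdp_watts / 1000 * PUE
    return energy_kwh * KG_CO2_PER_KWH / 1000  # kg -> tonnes

for name, hours in [("7B", 184_320), ("13B", 368_640), ("70B", 1_720_320)]:
    print(f"Llama 2 {name}: {emissions_tco2eq(hours, 400):.2f} tCO2eq")
```

Under those assumptions each estimate matches the table (31.22, 62.44, and 291.42 tCO2eq) to within rounding.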
The fine-tuning data includes publicly available instruction datasets, as well as over one million new human-annotated examples. Neither the pretraining nor the fine-tuning datasets include Meta user data. **Data Freshness** The pretraining data has a cutoff of September 2022, but some tuning data is more recent, up to July 2023. ## Evaluation Results In this section, we report the results for the Llama 1 and Llama 2 models on standard academic benchmarks. For all evaluations, we use our internal evaluations library. |Model|Size|Code|Commonsense Reasoning|World Knowledge|Reading Comprehension|Math|MMLU|BBH|AGI Eval| |---|---|---|---|---|---|---|---|---|---| |Llama 1|7B|14.1|60.8|46.2|58.5|6.95|35.1|30.3|23.9| |Llama 1|13B|18.9|66.1|52.6|62.3|10.9|46.9|37.0|33.9| |Llama 1|33B|26.0|70.0|58.4|67.6|21.4|57.8|39.8|41.7| |Llama 1|65B|30.7|70.7|60.5|68.6|30.8|63.4|43.5|47.6| |Llama 2|7B|16.8|63.9|48.9|61.3|14.6|45.3|32.6|29.3| |Llama 2|13B|24.5|66.9|55.4|65.8|28.7|54.8|39.4|39.1| |Llama 2|70B|**37.5**|**71.9**|**63.6**|**69.4**|**35.2**|**68.9**|**51.2**|**54.2**| **Overall performance on grouped academic benchmarks.** *Code:* We report the average pass@1 scores of our models on HumanEval and MBPP. *Commonsense Reasoning:* We report the average of PIQA, SIQA, HellaSwag, WinoGrande, ARC easy and challenge, OpenBookQA, and CommonsenseQA. We report 7-shot results for CommonSenseQA and 0-shot results for all other benchmarks. *World Knowledge:* We evaluate the 5-shot performance on NaturalQuestions and TriviaQA and report the average. *Reading Comprehension:* For reading comprehension, we report the 0-shot average on SQuAD, QuAC, and BoolQ. *Math:* We report the average of the GSM8K (8 shot) and MATH (4 shot) benchmarks at top 1. 
|||TruthfulQA|ToxiGen| |---|---|---|---| |Llama 1|7B|27.42|23.00| |Llama 1|13B|41.74|23.08| |Llama 1|33B|44.19|22.57| |Llama 1|65B|48.71|21.77| |Llama 2|7B|33.29|**21.25**| |Llama 2|13B|41.86|26.10| |Llama 2|70B|**50.18**|24.60| **Evaluation of pretrained LLMs on automatic safety benchmarks.** For TruthfulQA, we present the percentage of generations that are both truthful and informative (the higher the better). For ToxiGen, we present the percentage of toxic generations (the smaller the better). |||TruthfulQA|ToxiGen| |---|---|---|---| |Llama-2-Chat|7B|57.04|**0.00**| |Llama-2-Chat|13B|62.18|**0.00**| |Llama-2-Chat|70B|**64.14**|0.01| **Evaluation of fine-tuned LLMs on different safety datasets.** Same metric definitions as above. ## Ethical Considerations and Limitations Llama 2 is a new technology that carries risks with use. Testing conducted to date has been in English, and has not covered, nor could it cover, all scenarios. For these reasons, as with all LLMs, Llama 2’s potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased, or other objectionable responses to user prompts. Therefore, before deploying any applications of Llama 2, developers should perform safety testing and tuning tailored to their specific applications of the model. 
Please see the Responsible Use Guide available at [https://ai.meta.com/llama/responsible-use-guide/](https://ai.meta.com/llama/responsible-use-guide/) ## Reporting Issues Please report any software “bug” or other problems with the models through one of the following means: - Reporting issues with the model: [github.com/facebookresearch/llama](http://github.com/facebookresearch/llama) - Reporting problematic content generated by the model: [developers.facebook.com/llama_output_feedback](http://developers.facebook.com/llama_output_feedback) - Reporting bugs and security concerns: [facebook.com/whitehat/info](http://facebook.com/whitehat/info) ## Llama Model Index |Model|Llama2|Llama2-hf|Llama2-chat|Llama2-chat-hf| |---|---|---|---|---| |7B| [Link](https://huggingface.co/llamaste/Llama-2-7b) | [Link](https://huggingface.co/llamaste/Llama-2-7b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-7b-chat) | [Link](https://huggingface.co/llamaste/Llama-2-7b-chat-hf)| |13B| [Link](https://huggingface.co/llamaste/Llama-2-13b) | [Link](https://huggingface.co/llamaste/Llama-2-13b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-13b-chat) | [Link](https://huggingface.co/llamaste/Llama-2-13b-chat-hf)| |70B| [Link](https://huggingface.co/llamaste/Llama-2-70b) | [Link](https://huggingface.co/llamaste/Llama-2-70b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-70b-chat) | [Link](https://huggingface.co/llamaste/Llama-2-70b-chat-hf)|
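The single-turn chat prompt format described under "Intended Use" (the `[INST]`/`<<SYS>>` tags, with `strip()` applied to inputs) can be sketched as a small helper. This is an illustrative sketch, not Meta's reference implementation; the canonical code is `chat_completion` in the llama repository, and the tokenizer is assumed to supply the `BOS`/`EOS` tokens.

```python
# Minimal single-turn Llama 2 chat prompt builder (a sketch; see the
# reference chat_completion implementation for the multi-turn case).
B_INST, E_INST = "[INST]", "[/INST]"
B_SYS, E_SYS = "<<SYS>>\n", "\n<</SYS>>\n\n"

def build_prompt(system: str, user: str) -> str:
    # strip() the inputs, as the card recommends, to avoid double spaces;
    # the tokenizer is expected to add the BOS/EOS tokens itself.
    return f"{B_INST} {B_SYS}{system.strip()}{E_SYS}{user.strip()} {E_INST}"

prompt = build_prompt("You are a helpful assistant.", "What is Llama 2?")
print(prompt)
```

The resulting string is what you would pass to the tokenizer of a `-chat` checkpoint; the pretrained (non-chat) variants need no such template.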
10,136
[ [ -0.0165863037109375, -0.052734375, 0.028106689453125, 0.01453399658203125, -0.0285186767578125, 0.01702880859375, -0.0033283233642578125, -0.05621337890625, 0.00438690185546875, 0.023590087890625, -0.052581787109375, -0.043182373046875, -0.050140380859375, 0.00632476806640625, -0.016815185546875, 0.08099365234375, -0.0007138252258300781, -0.0211029052734375, -0.00958251953125, 0.00634765625, -0.036590576171875, -0.0300140380859375, -0.0401611328125, -0.031982421875, 0.0302886962890625, 0.035919189453125, 0.045928955078125, 0.04888916015625, 0.040740966796875, 0.0182647705078125, -0.019439697265625, 0.0162200927734375, -0.052734375, -0.020355224609375, 0.008880615234375, -0.037017822265625, -0.05120849609375, 0.01239013671875, 0.026611328125, 0.01332855224609375, -0.0220947265625, 0.039703369140625, 0.004512786865234375, 0.03497314453125, -0.042266845703125, 0.0127716064453125, -0.054443359375, 0.002716064453125, -0.0163116455078125, -0.005985260009765625, -0.01413726806640625, -0.0220184326171875, -0.01430511474609375, -0.0625, -0.008148193359375, 0.0067138671875, 0.07818603515625, 0.04901123046875, -0.034149169921875, -0.00901031494140625, -0.0208587646484375, 0.0714111328125, -0.064453125, 0.0036907196044921875, 0.04364013671875, 0.0208587646484375, -0.0171966552734375, -0.056976318359375, -0.049041748046875, -0.01059722900390625, 0.00450897216796875, 0.0268707275390625, -0.030975341796875, 0.001422882080078125, 0.014068603515625, 0.0287933349609375, -0.042999267578125, 0.042388916015625, -0.03857421875, -0.0126495361328125, 0.07977294921875, 0.0180511474609375, 0.00015866756439208984, -0.004390716552734375, -0.037109375, -0.0220184326171875, -0.06048583984375, 0.013763427734375, 0.036895751953125, -0.0029811859130859375, -0.034759521484375, 0.04644775390625, -0.0308685302734375, 0.022613525390625, 0.0012407302856445312, -0.037933349609375, 0.036102294921875, -0.035919189453125, -0.020965576171875, -0.00860595703125, 0.0673828125, 0.055084228515625, 
0.0115203857421875, 0.00824737548828125, -0.005115509033203125, 0.0083160400390625, -0.0006785392761230469, -0.0609130859375, -0.0035457611083984375, 0.0179595947265625, -0.0279083251953125, -0.0435791015625, -0.0228729248046875, -0.05615234375, -0.012359619140625, -0.007293701171875, 0.0184783935546875, -0.0024089813232421875, -0.0287628173828125, 0.00913238525390625, 0.0037059783935546875, 0.041595458984375, 0.01611328125, -0.07098388671875, 0.0162811279296875, 0.0416259765625, 0.059173583984375, -0.018280029296875, -0.026702880859375, 0.000591278076171875, -0.0016498565673828125, -0.0247039794921875, 0.06787109375, -0.0256500244140625, -0.04156494140625, -0.0167083740234375, -0.0016908645629882812, 0.0122222900390625, -0.039093017578125, 0.033233642578125, -0.028778076171875, 0.01264190673828125, -0.0251312255859375, -0.027191162109375, -0.0262451171875, 0.01434326171875, -0.0301361083984375, 0.1090087890625, 0.00859832763671875, -0.03607177734375, 0.0236663818359375, -0.051300048828125, -0.0137176513671875, -0.0151824951171875, 0.007350921630859375, -0.03973388671875, -0.020111083984375, 0.009857177734375, 0.027984619140625, -0.048187255859375, 0.03668212890625, -0.01502227783203125, -0.032928466796875, 0.0028209686279296875, -0.031158447265625, 0.0635986328125, 0.0220489501953125, -0.034271240234375, 0.00559234619140625, -0.06256103515625, 0.004123687744140625, 0.033935546875, -0.03607177734375, 0.0197906494140625, 0.00563812255859375, -0.00927734375, 0.0143585205078125, 0.036407470703125, -0.027130126953125, 0.01242828369140625, -0.0237274169921875, 0.037933349609375, 0.0562744140625, 0.0040283203125, 0.01213836669921875, -0.039154052734375, 0.039398193359375, -0.002796173095703125, 0.029083251953125, 0.0007238388061523438, -0.05413818359375, -0.0775146484375, -0.0140533447265625, -0.0033817291259765625, 0.063232421875, -0.018890380859375, 0.051849365234375, -0.0006055831909179688, -0.055328369140625, -0.0321044921875, 0.0274810791015625, 0.050750732421875, 
0.037872314453125, 0.0323486328125, -0.02142333984375, -0.045562744140625, -0.076904296875, 0.003963470458984375, -0.033599853515625, -0.0009679794311523438, 0.0271148681640625, 0.0491943359375, -0.0252227783203125, 0.055145263671875, -0.0404052734375, -0.0128173828125, -0.020050048828125, -0.0090179443359375, 0.00551605224609375, 0.0258331298828125, 0.048797607421875, -0.029754638671875, -0.015716552734375, -0.00931549072265625, -0.0679931640625, -0.0078582763671875, 0.007537841796875, -0.01531982421875, 0.0180206298828125, 0.023651123046875, -0.046142578125, 0.034454345703125, 0.053619384765625, -0.01340484619140625, 0.03875732421875, -0.00014722347259521484, -0.01291656494140625, -0.081298828125, 0.003002166748046875, -0.015228271484375, 0.0017070770263671875, -0.03265380859375, -0.002498626708984375, -0.0162200927734375, 0.00514984130859375, -0.046600341796875, 0.044708251953125, -0.0233001708984375, -0.012664794921875, -0.00927734375, 0.003814697265625, 0.00467681884765625, 0.04669189453125, -0.00940704345703125, 0.0814208984375, 0.0301971435546875, -0.04449462890625, 0.019805908203125, 0.0299835205078125, -0.037628173828125, 0.01142120361328125, -0.06573486328125, 0.027435302734375, 0.00821685791015625, 0.03973388671875, -0.0723876953125, -0.0282745361328125, 0.0245819091796875, -0.032562255859375, 0.006500244140625, 0.0178680419921875, -0.04156494140625, -0.0300140380859375, -0.0318603515625, 0.023284912109375, 0.0606689453125, -0.034393310546875, 0.01397705078125, 0.0283203125, 0.0014133453369140625, -0.051666259765625, -0.0628662109375, 0.004924774169921875, -0.0278472900390625, -0.040283203125, 0.02276611328125, -0.01458740234375, -0.018280029296875, -0.01953125, 0.00525665283203125, -0.0008378028869628906, 0.0289154052734375, 0.027740478515625, 0.0279388427734375, -0.00881195068359375, -0.0019702911376953125, 0.01087188720703125, -0.015350341796875, 0.0031871795654296875, 0.0169219970703125, 0.04388427734375, -0.0131683349609375, -0.01678466796875, 
-0.05535888671875, 0.00389862060546875, 0.0222015380859375, -0.0191650390625, 0.045440673828125, 0.03192138671875, -0.016693115234375, 0.0175323486328125, -0.05755615234375, -0.007904052734375, -0.0400390625, 0.040740966796875, -0.016204833984375, -0.06195068359375, 0.040435791015625, -0.0006127357482910156, 0.03326416015625, 0.056488037109375, 0.046234130859375, -0.00606536865234375, 0.061492919921875, 0.042999267578125, -0.005825042724609375, 0.0252227783203125, -0.03729248046875, -0.006603240966796875, -0.07171630859375, -0.0462646484375, -0.02398681640625, -0.032073974609375, -0.050018310546875, -0.031494140625, 0.020111083984375, 0.01479339599609375, -0.05096435546875, 0.024139404296875, -0.043426513671875, 0.0428466796875, 0.04022216796875, 0.0095672607421875, 0.02337646484375, 0.007350921630859375, 0.01068878173828125, 0.0045318603515625, -0.0399169921875, -0.055694580078125, 0.11041259765625, 0.03167724609375, 0.03375244140625, 0.007686614990234375, 0.051361083984375, 0.01116180419921875, 0.0245513916015625, -0.05401611328125, 0.04962158203125, 0.0037689208984375, -0.053924560546875, -0.0113677978515625, -0.00862884521484375, -0.0673828125, 0.01107025146484375, -0.0157623291015625, -0.058929443359375, 0.0023822784423828125, -0.0020694732666015625, -0.0289306640625, 0.0211029052734375, -0.050445556640625, 0.045501708984375, -0.0426025390625, -0.0236053466796875, -0.0270538330078125, -0.05950927734375, 0.05108642578125, -0.01511383056640625, 0.007350921630859375, -0.03741455078125, -0.019134521484375, 0.067138671875, -0.02508544921875, 0.07550048828125, -0.003124237060546875, -0.007556915283203125, 0.04364013671875, -0.0141448974609375, 0.033538818359375, 0.00281524658203125, -0.0209197998046875, 0.050811767578125, -0.009613037109375, -0.0235748291015625, -0.01152801513671875, 0.04052734375, -0.0904541015625, -0.060211181640625, -0.03875732421875, -0.03857421875, -0.0019817352294921875, 0.0067138671875, 0.038330078125, -0.006744384765625, 
-0.002666473388671875, 0.0091400146484375, 0.034515380859375, -0.03912353515625, 0.035797119140625, 0.041534423828125, -0.007694244384765625, -0.034881591796875, 0.04962158203125, 0.0034542083740234375, 0.0274810791015625, 0.015960693359375, 0.003421783447265625, -0.0311126708984375, -0.031951904296875, -0.038970947265625, 0.0206298828125, -0.03521728515625, -0.036529541015625, -0.04010009765625, -0.026702880859375, -0.02447509765625, -0.005779266357421875, -0.032440185546875, -0.031951904296875, -0.056060791015625, -0.0294952392578125, 0.03887939453125, 0.061492919921875, -0.0000029802322387695312, 0.047332763671875, -0.02459716796875, 0.0140228271484375, 0.027984619140625, 0.01312255859375, -0.002895355224609375, -0.056488037109375, 0.004093170166015625, 0.0094757080078125, -0.057403564453125, -0.04644775390625, 0.01873779296875, 0.0206298828125, 0.03607177734375, 0.035308837890625, -0.00563812255859375, 0.058563232421875, -0.0261383056640625, 0.08343505859375, 0.0273284912109375, -0.0501708984375, 0.05279541015625, -0.016632080078125, 0.004489898681640625, 0.047698974609375, 0.0202789306640625, -0.006866455078125, -0.01224517822265625, -0.0477294921875, -0.05120849609375, 0.060272216796875, 0.0171661376953125, 0.01351165771484375, 0.00397491455078125, 0.034637451171875, 0.004070281982421875, 0.00848388671875, -0.0628662109375, -0.022979736328125, -0.0196685791015625, -0.007572174072265625, -0.01540374755859375, -0.037872314453125, -0.005550384521484375, -0.0235748291015625, 0.048095703125, 0.00337982177734375, 0.0258941650390625, -0.00942230224609375, 0.0014238357543945312, -0.007701873779296875, 0.0038585662841796875, 0.054962158203125, 0.037933349609375, -0.019134521484375, -0.011505126953125, 0.048828125, -0.047088623046875, 0.0251922607421875, 0.000774383544921875, -0.01021575927734375, -0.0277252197265625, 0.0301513671875, 0.0667724609375, 0.020111083984375, -0.05377197265625, 0.02496337890625, 0.00992584228515625, -0.0276641845703125, -0.03173828125, 
0.0277862548828125, 0.006374359130859375, 0.0244598388671875, 0.0197601318359375, -0.01020050048828125, 0.007328033447265625, -0.038970947265625, -0.00977325439453125, 0.0289306640625, 0.0090789794921875, -0.031341552734375, 0.07452392578125, 0.0243682861328125, -0.0219268798828125, 0.039947509765625, -0.01265716552734375, -0.0274505615234375, 0.06787109375, 0.047576904296875, 0.049102783203125, -0.020904541015625, 0.0091094970703125, 0.053741455078125, 0.033660888671875, -0.0176544189453125, 0.0175323486328125, -0.0008363723754882812, -0.037109375, -0.0156402587890625, -0.05224609375, -0.0355224609375, 0.026885986328125, -0.0428466796875, 0.0235443115234375, -0.047515869140625, -0.020294189453125, -0.0241851806640625, 0.034149169921875, -0.050628662109375, 0.01617431640625, 0.008331298828125, 0.06903076171875, -0.054168701171875, 0.05712890625, 0.03753662109375, -0.03851318359375, -0.06695556640625, -0.022247314453125, 0.0149993896484375, -0.09173583984375, 0.039276123046875, 0.028594970703125, -0.005550384521484375, 0.00926971435546875, -0.0562744140625, -0.09112548828125, 0.1275634765625, 0.03460693359375, -0.056793212890625, -0.0012187957763671875, 0.02557373046875, 0.03656005859375, -0.00853729248046875, 0.033447265625, 0.062103271484375, 0.03704833984375, 0.00856781005859375, -0.07977294921875, 0.006816864013671875, -0.0263671875, -0.002681732177734375, -0.01458740234375, -0.09832763671875, 0.061126708984375, -0.029815673828125, -0.0180511474609375, 0.0164794921875, 0.04791259765625, 0.051483154296875, 0.042266845703125, 0.0267181396484375, 0.05999755859375, 0.06890869140625, -0.0018701553344726562, 0.08416748046875, -0.027252197265625, 0.01312255859375, 0.0667724609375, -0.022003173828125, 0.07281494140625, 0.0178680419921875, -0.044036865234375, 0.045928955078125, 0.07623291015625, -0.001369476318359375, 0.044036865234375, 0.005092620849609375, -0.01369476318359375, -0.01419830322265625, -0.01378631591796875, -0.0487060546875, 0.0386962890625, 
0.018280029296875, -0.0111846923828125, -0.0020427703857421875, -0.0252685546875, 0.0171356201171875, -0.0248565673828125, -0.0013599395751953125, 0.0606689453125, 0.0129547119140625, -0.046966552734375, 0.06634521484375, 0.0033473968505859375, 0.06317138671875, -0.048980712890625, 0.00597381591796875, -0.0390625, 0.00020742416381835938, -0.0278472900390625, -0.053009033203125, 0.005458831787109375, 0.0276336669921875, -0.001041412353515625, -0.00864410400390625, 0.04119873046875, 0.0030651092529296875, -0.042633056640625, 0.0268096923828125, 0.020294189453125, 0.0267181396484375, 0.0159759521484375, -0.051513671875, 0.012908935546875, 0.00714874267578125, -0.040374755859375, 0.0283966064453125, 0.002277374267578125, -0.005069732666015625, 0.060638427734375, 0.055908203125, -0.015167236328125, 0.01052093505859375, -0.0153045654296875, 0.07525634765625, -0.0379638671875, -0.01474761962890625, -0.05670166015625, 0.0400390625, 0.0032558441162109375, -0.053680419921875, 0.041107177734375, 0.049041748046875, 0.052734375, 0.0208740234375, 0.049163818359375, 0.00629425048828125, 0.0234375, -0.040496826171875, 0.046142578125, -0.058807373046875, 0.0284423828125, 0.0061492919921875, -0.0736083984375, -0.004638671875, 0.051239013671875, -0.018402099609375, 0.0033588409423828125, 0.0284576416015625, 0.06451416015625, 0.0133514404296875, -0.01172637939453125, 0.01013946533203125, 0.0127105712890625, 0.026397705078125, 0.0665283203125, 0.06341552734375, -0.047088623046875, 0.052734375, -0.028594970703125, -0.017852783203125, -0.019989013671875, -0.05560302734375, -0.072509765625, -0.0200042724609375, -0.01861572265625, -0.010955810546875, 0.005260467529296875, 0.0562744140625, 0.038116455078125, -0.0438232421875, -0.022613525390625, -0.00600433349609375, -0.00617218017578125, 0.0027828216552734375, -0.0119781494140625, 0.0245819091796875, -0.0082550048828125, -0.044647216796875, 0.037139892578125, 0.0007052421569824219, 0.0143890380859375, -0.02490234375, -0.0203094482421875, 
-0.0152587890625, 0.01116943359375, 0.045440673828125, 0.02142333984375, -0.0699462890625, -0.017608642578125, 0.0031757354736328125, -0.0105438232421875, 0.0093536376953125, 0.0017137527465820312, -0.057952880859375, 0.00826263427734375, 0.01068878173828125, 0.0282745361328125, 0.049560546875, 0.004489898681640625, 0.005924224853515625, -0.037567138671875, 0.03338623046875, 0.0015859603881835938, 0.01168060302734375, 0.0231475830078125, -0.03216552734375, 0.059417724609375, 0.01107025146484375, -0.052703857421875, -0.0706787109375, 0.00894927978515625, -0.078857421875, 0.00018024444580078125, 0.10369873046875, 0.000043392181396484375, -0.00948333740234375, 0.01441192626953125, -0.015960693359375, 0.0286865234375, -0.0294647216796875, 0.06048583984375, 0.042999267578125, -0.00667572021484375, -0.0077972412109375, -0.06048583984375, 0.0261688232421875, 0.029815673828125, -0.08203125, -0.0188140869140625, 0.03460693359375, 0.036407470703125, -0.0067596435546875, 0.051910400390625, 0.00157928466796875, 0.017852783203125, 0.00495147705078125, 0.0080413818359375, -0.018341064453125, -0.011627197265625, -0.00799560546875, -0.02069091796875, -0.004032135009765625, -0.0164337158203125 ] ]
intfloat/e5-large-v2
2023-08-07T05:01:43.000Z
[ "sentence-transformers", "pytorch", "safetensors", "bert", "mteb", "Sentence Transformers", "sentence-similarity", "en", "arxiv:2212.03533", "arxiv:2104.08663", "arxiv:2210.07316", "license:mit", "model-index", "endpoints_compatible", "has_space", "region:us" ]
sentence-similarity
intfloat
null
null
intfloat/e5-large-v2
142
250,017
sentence-transformers
2023-05-19T07:23:33
--- tags: - mteb - Sentence Transformers - sentence-similarity - sentence-transformers model-index: - name: e5-large-v2 results: - task: type: Classification dataset: type: mteb/amazon_counterfactual name: MTEB AmazonCounterfactualClassification (en) config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 79.22388059701493 - type: ap value: 43.20816505595132 - type: f1 value: 73.27811303522058 - task: type: Classification dataset: type: mteb/amazon_polarity name: MTEB AmazonPolarityClassification config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 93.748325 - type: ap value: 90.72534979701297 - type: f1 value: 93.73895874282185 - task: type: Classification dataset: type: mteb/amazon_reviews_multi name: MTEB AmazonReviewsClassification (en) config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 48.612 - type: f1 value: 47.61157345898393 - task: type: Retrieval dataset: type: arguana name: MTEB ArguAna config: default split: test revision: None metrics: - type: map_at_1 value: 23.541999999999998 - type: map_at_10 value: 38.208 - type: map_at_100 value: 39.417 - type: map_at_1000 value: 39.428999999999995 - type: map_at_3 value: 33.95 - type: map_at_5 value: 36.329 - type: mrr_at_1 value: 23.755000000000003 - type: mrr_at_10 value: 38.288 - type: mrr_at_100 value: 39.511 - type: mrr_at_1000 value: 39.523 - type: mrr_at_3 value: 34.009 - type: mrr_at_5 value: 36.434 - type: ndcg_at_1 value: 23.541999999999998 - type: ndcg_at_10 value: 46.417 - type: ndcg_at_100 value: 51.812000000000005 - type: ndcg_at_1000 value: 52.137 - type: ndcg_at_3 value: 37.528 - type: ndcg_at_5 value: 41.81 - type: precision_at_1 value: 23.541999999999998 - type: precision_at_10 value: 7.269 - type: precision_at_100 value: 0.9690000000000001 - type: precision_at_1000 value: 0.099 - type: precision_at_3 value: 15.979 - type: 
precision_at_5 value: 11.664 - type: recall_at_1 value: 23.541999999999998 - type: recall_at_10 value: 72.688 - type: recall_at_100 value: 96.871 - type: recall_at_1000 value: 99.431 - type: recall_at_3 value: 47.937000000000005 - type: recall_at_5 value: 58.321 - task: type: Clustering dataset: type: mteb/arxiv-clustering-p2p name: MTEB ArxivClusteringP2P config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: v_measure value: 45.546499570522094 - task: type: Clustering dataset: type: mteb/arxiv-clustering-s2s name: MTEB ArxivClusteringS2S config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: v_measure value: 41.01607489943561 - task: type: Reranking dataset: type: mteb/askubuntudupquestions-reranking name: MTEB AskUbuntuDupQuestions config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: map value: 59.616107510107774 - type: mrr value: 72.75106626214661 - task: type: STS dataset: type: mteb/biosses-sts name: MTEB BIOSSES config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cos_sim_pearson value: 84.33018094733868 - type: cos_sim_spearman value: 83.60190492611737 - type: euclidean_pearson value: 82.1492450218961 - type: euclidean_spearman value: 82.70308926526991 - type: manhattan_pearson value: 81.93959600076842 - type: manhattan_spearman value: 82.73260801016369 - task: type: Classification dataset: type: mteb/banking77 name: MTEB Banking77Classification config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 84.54545454545455 - type: f1 value: 84.49582530928923 - task: type: Clustering dataset: type: mteb/biorxiv-clustering-p2p name: MTEB BiorxivClusteringP2P config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: v_measure value: 37.362725540120096 - task: type: Clustering dataset: type: 
mteb/biorxiv-clustering-s2s name: MTEB BiorxivClusteringS2S config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: v_measure value: 34.849509608178145 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackAndroidRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 31.502999999999997 - type: map_at_10 value: 43.323 - type: map_at_100 value: 44.708999999999996 - type: map_at_1000 value: 44.838 - type: map_at_3 value: 38.987 - type: map_at_5 value: 41.516999999999996 - type: mrr_at_1 value: 38.769999999999996 - type: mrr_at_10 value: 49.13 - type: mrr_at_100 value: 49.697 - type: mrr_at_1000 value: 49.741 - type: mrr_at_3 value: 45.804 - type: mrr_at_5 value: 47.842 - type: ndcg_at_1 value: 38.769999999999996 - type: ndcg_at_10 value: 50.266999999999996 - type: ndcg_at_100 value: 54.967 - type: ndcg_at_1000 value: 56.976000000000006 - type: ndcg_at_3 value: 43.823 - type: ndcg_at_5 value: 47.12 - type: precision_at_1 value: 38.769999999999996 - type: precision_at_10 value: 10.057 - type: precision_at_100 value: 1.554 - type: precision_at_1000 value: 0.202 - type: precision_at_3 value: 21.125 - type: precision_at_5 value: 15.851 - type: recall_at_1 value: 31.502999999999997 - type: recall_at_10 value: 63.715999999999994 - type: recall_at_100 value: 83.61800000000001 - type: recall_at_1000 value: 96.63199999999999 - type: recall_at_3 value: 45.403 - type: recall_at_5 value: 54.481 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackEnglishRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 27.833000000000002 - type: map_at_10 value: 37.330999999999996 - type: map_at_100 value: 38.580999999999996 - type: map_at_1000 value: 38.708 - type: map_at_3 value: 34.713 - type: map_at_5 value: 36.104 - type: mrr_at_1 value: 35.223 - type: mrr_at_10 value: 43.419000000000004 - type: mrr_at_100 value: 44.198 - type: mrr_at_1000 
value: 44.249 - type: mrr_at_3 value: 41.614000000000004 - type: mrr_at_5 value: 42.553000000000004 - type: ndcg_at_1 value: 35.223 - type: ndcg_at_10 value: 42.687999999999995 - type: ndcg_at_100 value: 47.447 - type: ndcg_at_1000 value: 49.701 - type: ndcg_at_3 value: 39.162 - type: ndcg_at_5 value: 40.557 - type: precision_at_1 value: 35.223 - type: precision_at_10 value: 7.962 - type: precision_at_100 value: 1.304 - type: precision_at_1000 value: 0.18 - type: precision_at_3 value: 19.023 - type: precision_at_5 value: 13.184999999999999 - type: recall_at_1 value: 27.833000000000002 - type: recall_at_10 value: 51.881 - type: recall_at_100 value: 72.04 - type: recall_at_1000 value: 86.644 - type: recall_at_3 value: 40.778 - type: recall_at_5 value: 45.176 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackGamingRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 38.175 - type: map_at_10 value: 51.174 - type: map_at_100 value: 52.26499999999999 - type: map_at_1000 value: 52.315999999999995 - type: map_at_3 value: 47.897 - type: map_at_5 value: 49.703 - type: mrr_at_1 value: 43.448 - type: mrr_at_10 value: 54.505 - type: mrr_at_100 value: 55.216 - type: mrr_at_1000 value: 55.242000000000004 - type: mrr_at_3 value: 51.98500000000001 - type: mrr_at_5 value: 53.434000000000005 - type: ndcg_at_1 value: 43.448 - type: ndcg_at_10 value: 57.282 - type: ndcg_at_100 value: 61.537 - type: ndcg_at_1000 value: 62.546 - type: ndcg_at_3 value: 51.73799999999999 - type: ndcg_at_5 value: 54.324 - type: precision_at_1 value: 43.448 - type: precision_at_10 value: 9.292 - type: precision_at_100 value: 1.233 - type: precision_at_1000 value: 0.136 - type: precision_at_3 value: 23.218 - type: precision_at_5 value: 15.887 - type: recall_at_1 value: 38.175 - type: recall_at_10 value: 72.00999999999999 - type: recall_at_100 value: 90.155 - type: recall_at_1000 value: 97.257 - type: recall_at_3 value: 57.133 - type: recall_at_5 
value: 63.424 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackGisRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 22.405 - type: map_at_10 value: 30.043 - type: map_at_100 value: 31.191000000000003 - type: map_at_1000 value: 31.275 - type: map_at_3 value: 27.034000000000002 - type: map_at_5 value: 28.688000000000002 - type: mrr_at_1 value: 24.068 - type: mrr_at_10 value: 31.993 - type: mrr_at_100 value: 32.992 - type: mrr_at_1000 value: 33.050000000000004 - type: mrr_at_3 value: 28.964000000000002 - type: mrr_at_5 value: 30.653000000000002 - type: ndcg_at_1 value: 24.068 - type: ndcg_at_10 value: 35.198 - type: ndcg_at_100 value: 40.709 - type: ndcg_at_1000 value: 42.855 - type: ndcg_at_3 value: 29.139 - type: ndcg_at_5 value: 32.045 - type: precision_at_1 value: 24.068 - type: precision_at_10 value: 5.65 - type: precision_at_100 value: 0.885 - type: precision_at_1000 value: 0.11199999999999999 - type: precision_at_3 value: 12.279 - type: precision_at_5 value: 8.994 - type: recall_at_1 value: 22.405 - type: recall_at_10 value: 49.391 - type: recall_at_100 value: 74.53699999999999 - type: recall_at_1000 value: 90.605 - type: recall_at_3 value: 33.126 - type: recall_at_5 value: 40.073 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackMathematicaRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 13.309999999999999 - type: map_at_10 value: 20.688000000000002 - type: map_at_100 value: 22.022 - type: map_at_1000 value: 22.152 - type: map_at_3 value: 17.954 - type: map_at_5 value: 19.439 - type: mrr_at_1 value: 16.294 - type: mrr_at_10 value: 24.479 - type: mrr_at_100 value: 25.515 - type: mrr_at_1000 value: 25.593 - type: mrr_at_3 value: 21.642 - type: mrr_at_5 value: 23.189999999999998 - type: ndcg_at_1 value: 16.294 - type: ndcg_at_10 value: 25.833000000000002 - type: ndcg_at_100 value: 32.074999999999996 - type: ndcg_at_1000 value: 35.083 - 
type: ndcg_at_3 value: 20.493 - type: ndcg_at_5 value: 22.949 - type: precision_at_1 value: 16.294 - type: precision_at_10 value: 5.112 - type: precision_at_100 value: 0.96 - type: precision_at_1000 value: 0.134 - type: precision_at_3 value: 9.908999999999999 - type: precision_at_5 value: 7.587000000000001 - type: recall_at_1 value: 13.309999999999999 - type: recall_at_10 value: 37.851 - type: recall_at_100 value: 64.835 - type: recall_at_1000 value: 86.334 - type: recall_at_3 value: 23.493 - type: recall_at_5 value: 29.528 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackPhysicsRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 25.857999999999997 - type: map_at_10 value: 35.503 - type: map_at_100 value: 36.957 - type: map_at_1000 value: 37.065 - type: map_at_3 value: 32.275999999999996 - type: map_at_5 value: 34.119 - type: mrr_at_1 value: 31.954 - type: mrr_at_10 value: 40.851 - type: mrr_at_100 value: 41.863 - type: mrr_at_1000 value: 41.900999999999996 - type: mrr_at_3 value: 38.129999999999995 - type: mrr_at_5 value: 39.737 - type: ndcg_at_1 value: 31.954 - type: ndcg_at_10 value: 41.343999999999994 - type: ndcg_at_100 value: 47.397 - type: ndcg_at_1000 value: 49.501 - type: ndcg_at_3 value: 36.047000000000004 - type: ndcg_at_5 value: 38.639 - type: precision_at_1 value: 31.954 - type: precision_at_10 value: 7.68 - type: precision_at_100 value: 1.247 - type: precision_at_1000 value: 0.16199999999999998 - type: precision_at_3 value: 17.132 - type: precision_at_5 value: 12.589 - type: recall_at_1 value: 25.857999999999997 - type: recall_at_10 value: 53.43599999999999 - type: recall_at_100 value: 78.82400000000001 - type: recall_at_1000 value: 92.78999999999999 - type: recall_at_3 value: 38.655 - type: recall_at_5 value: 45.216 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackProgrammersRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 
24.709 - type: map_at_10 value: 34.318 - type: map_at_100 value: 35.657 - type: map_at_1000 value: 35.783 - type: map_at_3 value: 31.326999999999998 - type: map_at_5 value: 33.021 - type: mrr_at_1 value: 30.137000000000004 - type: mrr_at_10 value: 39.093 - type: mrr_at_100 value: 39.992 - type: mrr_at_1000 value: 40.056999999999995 - type: mrr_at_3 value: 36.606 - type: mrr_at_5 value: 37.861 - type: ndcg_at_1 value: 30.137000000000004 - type: ndcg_at_10 value: 39.974 - type: ndcg_at_100 value: 45.647999999999996 - type: ndcg_at_1000 value: 48.259 - type: ndcg_at_3 value: 35.028 - type: ndcg_at_5 value: 37.175999999999995 - type: precision_at_1 value: 30.137000000000004 - type: precision_at_10 value: 7.363 - type: precision_at_100 value: 1.184 - type: precision_at_1000 value: 0.161 - type: precision_at_3 value: 16.857 - type: precision_at_5 value: 11.963 - type: recall_at_1 value: 24.709 - type: recall_at_10 value: 52.087 - type: recall_at_100 value: 76.125 - type: recall_at_1000 value: 93.82300000000001 - type: recall_at_3 value: 38.149 - type: recall_at_5 value: 43.984 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 23.40791666666667 - type: map_at_10 value: 32.458083333333335 - type: map_at_100 value: 33.691916666666664 - type: map_at_1000 value: 33.81191666666666 - type: map_at_3 value: 29.51625 - type: map_at_5 value: 31.168083333333335 - type: mrr_at_1 value: 27.96591666666666 - type: mrr_at_10 value: 36.528583333333344 - type: mrr_at_100 value: 37.404 - type: mrr_at_1000 value: 37.464333333333336 - type: mrr_at_3 value: 33.92883333333333 - type: mrr_at_5 value: 35.41933333333333 - type: ndcg_at_1 value: 27.96591666666666 - type: ndcg_at_10 value: 37.89141666666666 - type: ndcg_at_100 value: 43.23066666666666 - type: ndcg_at_1000 value: 45.63258333333333 - type: ndcg_at_3 value: 32.811249999999994 - type: ndcg_at_5 value: 35.22566666666667 - 
type: precision_at_1 value: 27.96591666666666 - type: precision_at_10 value: 6.834083333333332 - type: precision_at_100 value: 1.12225 - type: precision_at_1000 value: 0.15241666666666667 - type: precision_at_3 value: 15.264333333333335 - type: precision_at_5 value: 11.039416666666666 - type: recall_at_1 value: 23.40791666666667 - type: recall_at_10 value: 49.927083333333336 - type: recall_at_100 value: 73.44641666666668 - type: recall_at_1000 value: 90.19950000000001 - type: recall_at_3 value: 35.88341666666667 - type: recall_at_5 value: 42.061249999999994 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackStatsRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 19.592000000000002 - type: map_at_10 value: 26.895999999999997 - type: map_at_100 value: 27.921000000000003 - type: map_at_1000 value: 28.02 - type: map_at_3 value: 24.883 - type: map_at_5 value: 25.812 - type: mrr_at_1 value: 22.698999999999998 - type: mrr_at_10 value: 29.520999999999997 - type: mrr_at_100 value: 30.458000000000002 - type: mrr_at_1000 value: 30.526999999999997 - type: mrr_at_3 value: 27.633000000000003 - type: mrr_at_5 value: 28.483999999999998 - type: ndcg_at_1 value: 22.698999999999998 - type: ndcg_at_10 value: 31.061 - type: ndcg_at_100 value: 36.398 - type: ndcg_at_1000 value: 38.89 - type: ndcg_at_3 value: 27.149 - type: ndcg_at_5 value: 28.627000000000002 - type: precision_at_1 value: 22.698999999999998 - type: precision_at_10 value: 5.106999999999999 - type: precision_at_100 value: 0.857 - type: precision_at_1000 value: 0.11499999999999999 - type: precision_at_3 value: 11.963 - type: precision_at_5 value: 8.221 - type: recall_at_1 value: 19.592000000000002 - type: recall_at_10 value: 41.329 - type: recall_at_100 value: 66.094 - type: recall_at_1000 value: 84.511 - type: recall_at_3 value: 30.61 - type: recall_at_5 value: 34.213 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackTexRetrieval 
config: default split: test revision: None metrics: - type: map_at_1 value: 14.71 - type: map_at_10 value: 20.965 - type: map_at_100 value: 21.994 - type: map_at_1000 value: 22.133 - type: map_at_3 value: 18.741 - type: map_at_5 value: 19.951 - type: mrr_at_1 value: 18.307000000000002 - type: mrr_at_10 value: 24.66 - type: mrr_at_100 value: 25.540000000000003 - type: mrr_at_1000 value: 25.629 - type: mrr_at_3 value: 22.511 - type: mrr_at_5 value: 23.72 - type: ndcg_at_1 value: 18.307000000000002 - type: ndcg_at_10 value: 25.153 - type: ndcg_at_100 value: 30.229 - type: ndcg_at_1000 value: 33.623 - type: ndcg_at_3 value: 21.203 - type: ndcg_at_5 value: 23.006999999999998 - type: precision_at_1 value: 18.307000000000002 - type: precision_at_10 value: 4.725 - type: precision_at_100 value: 0.8659999999999999 - type: precision_at_1000 value: 0.133 - type: precision_at_3 value: 10.14 - type: precision_at_5 value: 7.481 - type: recall_at_1 value: 14.71 - type: recall_at_10 value: 34.087 - type: recall_at_100 value: 57.147999999999996 - type: recall_at_1000 value: 81.777 - type: recall_at_3 value: 22.996 - type: recall_at_5 value: 27.73 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackUnixRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 23.472 - type: map_at_10 value: 32.699 - type: map_at_100 value: 33.867000000000004 - type: map_at_1000 value: 33.967000000000006 - type: map_at_3 value: 29.718 - type: map_at_5 value: 31.345 - type: mrr_at_1 value: 28.265 - type: mrr_at_10 value: 36.945 - type: mrr_at_100 value: 37.794 - type: mrr_at_1000 value: 37.857 - type: mrr_at_3 value: 34.266000000000005 - type: mrr_at_5 value: 35.768 - type: ndcg_at_1 value: 28.265 - type: ndcg_at_10 value: 38.35 - type: ndcg_at_100 value: 43.739 - type: ndcg_at_1000 value: 46.087 - type: ndcg_at_3 value: 33.004 - type: ndcg_at_5 value: 35.411 - type: precision_at_1 value: 28.265 - type: precision_at_10 value: 6.715999999999999 - 
type: precision_at_100 value: 1.059 - type: precision_at_1000 value: 0.13799999999999998 - type: precision_at_3 value: 15.299 - type: precision_at_5 value: 10.951 - type: recall_at_1 value: 23.472 - type: recall_at_10 value: 51.413 - type: recall_at_100 value: 75.17 - type: recall_at_1000 value: 91.577 - type: recall_at_3 value: 36.651 - type: recall_at_5 value: 42.814 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackWebmastersRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 23.666 - type: map_at_10 value: 32.963 - type: map_at_100 value: 34.544999999999995 - type: map_at_1000 value: 34.792 - type: map_at_3 value: 29.74 - type: map_at_5 value: 31.5 - type: mrr_at_1 value: 29.051 - type: mrr_at_10 value: 38.013000000000005 - type: mrr_at_100 value: 38.997 - type: mrr_at_1000 value: 39.055 - type: mrr_at_3 value: 34.947 - type: mrr_at_5 value: 36.815 - type: ndcg_at_1 value: 29.051 - type: ndcg_at_10 value: 39.361000000000004 - type: ndcg_at_100 value: 45.186 - type: ndcg_at_1000 value: 47.867 - type: ndcg_at_3 value: 33.797 - type: ndcg_at_5 value: 36.456 - type: precision_at_1 value: 29.051 - type: precision_at_10 value: 7.668 - type: precision_at_100 value: 1.532 - type: precision_at_1000 value: 0.247 - type: precision_at_3 value: 15.876000000000001 - type: precision_at_5 value: 11.779 - type: recall_at_1 value: 23.666 - type: recall_at_10 value: 51.858000000000004 - type: recall_at_100 value: 77.805 - type: recall_at_1000 value: 94.504 - type: recall_at_3 value: 36.207 - type: recall_at_5 value: 43.094 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackWordpressRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 15.662 - type: map_at_10 value: 23.594 - type: map_at_100 value: 24.593999999999998 - type: map_at_1000 value: 24.694 - type: map_at_3 value: 20.925 - type: map_at_5 value: 22.817999999999998 - type: mrr_at_1 value: 17.375 - type: 
mrr_at_10 value: 25.734 - type: mrr_at_100 value: 26.586 - type: mrr_at_1000 value: 26.671 - type: mrr_at_3 value: 23.044 - type: mrr_at_5 value: 24.975 - type: ndcg_at_1 value: 17.375 - type: ndcg_at_10 value: 28.186 - type: ndcg_at_100 value: 33.436 - type: ndcg_at_1000 value: 36.203 - type: ndcg_at_3 value: 23.152 - type: ndcg_at_5 value: 26.397 - type: precision_at_1 value: 17.375 - type: precision_at_10 value: 4.677 - type: precision_at_100 value: 0.786 - type: precision_at_1000 value: 0.109 - type: precision_at_3 value: 10.351 - type: precision_at_5 value: 7.985 - type: recall_at_1 value: 15.662 - type: recall_at_10 value: 40.066 - type: recall_at_100 value: 65.006 - type: recall_at_1000 value: 85.94000000000001 - type: recall_at_3 value: 27.400000000000002 - type: recall_at_5 value: 35.002 - task: type: Retrieval dataset: type: climate-fever name: MTEB ClimateFEVER config: default split: test revision: None metrics: - type: map_at_1 value: 8.853 - type: map_at_10 value: 15.568000000000001 - type: map_at_100 value: 17.383000000000003 - type: map_at_1000 value: 17.584 - type: map_at_3 value: 12.561 - type: map_at_5 value: 14.056 - type: mrr_at_1 value: 18.958 - type: mrr_at_10 value: 28.288000000000004 - type: mrr_at_100 value: 29.432000000000002 - type: mrr_at_1000 value: 29.498 - type: mrr_at_3 value: 25.049 - type: mrr_at_5 value: 26.857 - type: ndcg_at_1 value: 18.958 - type: ndcg_at_10 value: 22.21 - type: ndcg_at_100 value: 29.596 - type: ndcg_at_1000 value: 33.583 - type: ndcg_at_3 value: 16.994999999999997 - type: ndcg_at_5 value: 18.95 - type: precision_at_1 value: 18.958 - type: precision_at_10 value: 7.192 - type: precision_at_100 value: 1.5 - type: precision_at_1000 value: 0.22399999999999998 - type: precision_at_3 value: 12.573 - type: precision_at_5 value: 10.202 - type: recall_at_1 value: 8.853 - type: recall_at_10 value: 28.087 - type: recall_at_100 value: 53.701 - type: recall_at_1000 value: 76.29899999999999 - type: recall_at_3 value: 15.913 
- type: recall_at_5 value: 20.658 - task: type: Retrieval dataset: type: dbpedia-entity name: MTEB DBPedia config: default split: test revision: None metrics: - type: map_at_1 value: 9.077 - type: map_at_10 value: 20.788999999999998 - type: map_at_100 value: 30.429000000000002 - type: map_at_1000 value: 32.143 - type: map_at_3 value: 14.692 - type: map_at_5 value: 17.139 - type: mrr_at_1 value: 70.75 - type: mrr_at_10 value: 78.036 - type: mrr_at_100 value: 78.401 - type: mrr_at_1000 value: 78.404 - type: mrr_at_3 value: 76.75 - type: mrr_at_5 value: 77.47500000000001 - type: ndcg_at_1 value: 58.12500000000001 - type: ndcg_at_10 value: 44.015 - type: ndcg_at_100 value: 49.247 - type: ndcg_at_1000 value: 56.211999999999996 - type: ndcg_at_3 value: 49.151 - type: ndcg_at_5 value: 46.195 - type: precision_at_1 value: 70.75 - type: precision_at_10 value: 35.5 - type: precision_at_100 value: 11.355 - type: precision_at_1000 value: 2.1950000000000003 - type: precision_at_3 value: 53.083000000000006 - type: precision_at_5 value: 44.800000000000004 - type: recall_at_1 value: 9.077 - type: recall_at_10 value: 26.259 - type: recall_at_100 value: 56.547000000000004 - type: recall_at_1000 value: 78.551 - type: recall_at_3 value: 16.162000000000003 - type: recall_at_5 value: 19.753999999999998 - task: type: Classification dataset: type: mteb/emotion name: MTEB EmotionClassification config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 49.44500000000001 - type: f1 value: 44.67067691783401 - task: type: Retrieval dataset: type: fever name: MTEB FEVER config: default split: test revision: None metrics: - type: map_at_1 value: 68.182 - type: map_at_10 value: 78.223 - type: map_at_100 value: 78.498 - type: map_at_1000 value: 78.512 - type: map_at_3 value: 76.71 - type: map_at_5 value: 77.725 - type: mrr_at_1 value: 73.177 - type: mrr_at_10 value: 82.513 - type: mrr_at_100 value: 82.633 - type: mrr_at_1000 value: 82.635 - 
type: mrr_at_3 value: 81.376 - type: mrr_at_5 value: 82.182 - type: ndcg_at_1 value: 73.177 - type: ndcg_at_10 value: 82.829 - type: ndcg_at_100 value: 83.84 - type: ndcg_at_1000 value: 84.07900000000001 - type: ndcg_at_3 value: 80.303 - type: ndcg_at_5 value: 81.846 - type: precision_at_1 value: 73.177 - type: precision_at_10 value: 10.241999999999999 - type: precision_at_100 value: 1.099 - type: precision_at_1000 value: 0.11399999999999999 - type: precision_at_3 value: 31.247999999999998 - type: precision_at_5 value: 19.697 - type: recall_at_1 value: 68.182 - type: recall_at_10 value: 92.657 - type: recall_at_100 value: 96.709 - type: recall_at_1000 value: 98.184 - type: recall_at_3 value: 85.9 - type: recall_at_5 value: 89.755 - task: type: Retrieval dataset: type: fiqa name: MTEB FiQA2018 config: default split: test revision: None metrics: - type: map_at_1 value: 21.108 - type: map_at_10 value: 33.342 - type: map_at_100 value: 35.281 - type: map_at_1000 value: 35.478 - type: map_at_3 value: 29.067 - type: map_at_5 value: 31.563000000000002 - type: mrr_at_1 value: 41.667 - type: mrr_at_10 value: 49.913000000000004 - type: mrr_at_100 value: 50.724000000000004 - type: mrr_at_1000 value: 50.766 - type: mrr_at_3 value: 47.504999999999995 - type: mrr_at_5 value: 49.033 - type: ndcg_at_1 value: 41.667 - type: ndcg_at_10 value: 41.144 - type: ndcg_at_100 value: 48.326 - type: ndcg_at_1000 value: 51.486 - type: ndcg_at_3 value: 37.486999999999995 - type: ndcg_at_5 value: 38.78 - type: precision_at_1 value: 41.667 - type: precision_at_10 value: 11.358 - type: precision_at_100 value: 1.873 - type: precision_at_1000 value: 0.244 - type: precision_at_3 value: 25 - type: precision_at_5 value: 18.519 - type: recall_at_1 value: 21.108 - type: recall_at_10 value: 47.249 - type: recall_at_100 value: 74.52 - type: recall_at_1000 value: 93.31 - type: recall_at_3 value: 33.271 - type: recall_at_5 value: 39.723000000000006 - task: type: Retrieval dataset: type: hotpotqa name: MTEB 
HotpotQA config: default split: test revision: None metrics: - type: map_at_1 value: 40.317 - type: map_at_10 value: 64.861 - type: map_at_100 value: 65.697 - type: map_at_1000 value: 65.755 - type: map_at_3 value: 61.258 - type: map_at_5 value: 63.590999999999994 - type: mrr_at_1 value: 80.635 - type: mrr_at_10 value: 86.528 - type: mrr_at_100 value: 86.66199999999999 - type: mrr_at_1000 value: 86.666 - type: mrr_at_3 value: 85.744 - type: mrr_at_5 value: 86.24300000000001 - type: ndcg_at_1 value: 80.635 - type: ndcg_at_10 value: 73.13199999999999 - type: ndcg_at_100 value: 75.927 - type: ndcg_at_1000 value: 76.976 - type: ndcg_at_3 value: 68.241 - type: ndcg_at_5 value: 71.071 - type: precision_at_1 value: 80.635 - type: precision_at_10 value: 15.326 - type: precision_at_100 value: 1.7500000000000002 - type: precision_at_1000 value: 0.189 - type: precision_at_3 value: 43.961 - type: precision_at_5 value: 28.599999999999998 - type: recall_at_1 value: 40.317 - type: recall_at_10 value: 76.631 - type: recall_at_100 value: 87.495 - type: recall_at_1000 value: 94.362 - type: recall_at_3 value: 65.94200000000001 - type: recall_at_5 value: 71.499 - task: type: Classification dataset: type: mteb/imdb name: MTEB ImdbClassification config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 91.686 - type: ap value: 87.5577120393173 - type: f1 value: 91.6629447355139 - task: type: Retrieval dataset: type: msmarco name: MTEB MSMARCO config: default split: dev revision: None metrics: - type: map_at_1 value: 23.702 - type: map_at_10 value: 36.414 - type: map_at_100 value: 37.561 - type: map_at_1000 value: 37.605 - type: map_at_3 value: 32.456 - type: map_at_5 value: 34.827000000000005 - type: mrr_at_1 value: 24.355 - type: mrr_at_10 value: 37.01 - type: mrr_at_100 value: 38.085 - type: mrr_at_1000 value: 38.123000000000005 - type: mrr_at_3 value: 33.117999999999995 - type: mrr_at_5 value: 35.452 - type: ndcg_at_1 value: 
24.384 - type: ndcg_at_10 value: 43.456 - type: ndcg_at_100 value: 48.892 - type: ndcg_at_1000 value: 49.964 - type: ndcg_at_3 value: 35.475 - type: ndcg_at_5 value: 39.711 - type: precision_at_1 value: 24.384 - type: precision_at_10 value: 6.7940000000000005 - type: precision_at_100 value: 0.951 - type: precision_at_1000 value: 0.104 - type: precision_at_3 value: 15.052999999999999 - type: precision_at_5 value: 11.189 - type: recall_at_1 value: 23.702 - type: recall_at_10 value: 65.057 - type: recall_at_100 value: 90.021 - type: recall_at_1000 value: 98.142 - type: recall_at_3 value: 43.551 - type: recall_at_5 value: 53.738 - task: type: Classification dataset: type: mteb/mtop_domain name: MTEB MTOPDomainClassification (en) config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 94.62380300957591 - type: f1 value: 94.49871222100734 - task: type: Classification dataset: type: mteb/mtop_intent name: MTEB MTOPIntentClassification (en) config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 77.14090287277702 - type: f1 value: 60.32101258220515 - task: type: Classification dataset: type: mteb/amazon_massive_intent name: MTEB MassiveIntentClassification (en) config: en split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 73.84330867518494 - type: f1 value: 71.92248688515255 - task: type: Classification dataset: type: mteb/amazon_massive_scenario name: MTEB MassiveScenarioClassification (en) config: en split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 78.10692669804976 - type: f1 value: 77.9904839122866 - task: type: Clustering dataset: type: mteb/medrxiv-clustering-p2p name: MTEB MedrxivClusteringP2P config: default split: test revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73 metrics: - type: v_measure value: 31.822988923078444 - task: type: Clustering dataset: type: 
mteb/medrxiv-clustering-s2s name: MTEB MedrxivClusteringS2S config: default split: test revision: 35191c8c0dca72d8ff3efcd72aa802307d469663 metrics: - type: v_measure value: 30.38394880253403 - task: type: Reranking dataset: type: mteb/mind_small name: MTEB MindSmallReranking config: default split: test revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69 metrics: - type: map value: 31.82504612539082 - type: mrr value: 32.84462298174977 - task: type: Retrieval dataset: type: nfcorpus name: MTEB NFCorpus config: default split: test revision: None metrics: - type: map_at_1 value: 6.029 - type: map_at_10 value: 14.088999999999999 - type: map_at_100 value: 17.601 - type: map_at_1000 value: 19.144 - type: map_at_3 value: 10.156 - type: map_at_5 value: 11.892 - type: mrr_at_1 value: 46.44 - type: mrr_at_10 value: 56.596999999999994 - type: mrr_at_100 value: 57.11000000000001 - type: mrr_at_1000 value: 57.14 - type: mrr_at_3 value: 54.334 - type: mrr_at_5 value: 55.774 - type: ndcg_at_1 value: 44.891999999999996 - type: ndcg_at_10 value: 37.134 - type: ndcg_at_100 value: 33.652 - type: ndcg_at_1000 value: 42.548 - type: ndcg_at_3 value: 41.851 - type: ndcg_at_5 value: 39.842 - type: precision_at_1 value: 46.44 - type: precision_at_10 value: 27.647 - type: precision_at_100 value: 8.309999999999999 - type: precision_at_1000 value: 2.146 - type: precision_at_3 value: 39.422000000000004 - type: precision_at_5 value: 34.675 - type: recall_at_1 value: 6.029 - type: recall_at_10 value: 18.907 - type: recall_at_100 value: 33.76 - type: recall_at_1000 value: 65.14999999999999 - type: recall_at_3 value: 11.584999999999999 - type: recall_at_5 value: 14.626 - task: type: Retrieval dataset: type: nq name: MTEB NQ config: default split: test revision: None metrics: - type: map_at_1 value: 39.373000000000005 - type: map_at_10 value: 55.836 - type: map_at_100 value: 56.611999999999995 - type: map_at_1000 value: 56.63 - type: map_at_3 value: 51.747 - type: map_at_5 value: 54.337999999999994 
- type: mrr_at_1 value: 44.147999999999996 - type: mrr_at_10 value: 58.42699999999999 - type: mrr_at_100 value: 58.902 - type: mrr_at_1000 value: 58.914 - type: mrr_at_3 value: 55.156000000000006 - type: mrr_at_5 value: 57.291000000000004 - type: ndcg_at_1 value: 44.119 - type: ndcg_at_10 value: 63.444 - type: ndcg_at_100 value: 66.40599999999999 - type: ndcg_at_1000 value: 66.822 - type: ndcg_at_3 value: 55.962 - type: ndcg_at_5 value: 60.228 - type: precision_at_1 value: 44.119 - type: precision_at_10 value: 10.006 - type: precision_at_100 value: 1.17 - type: precision_at_1000 value: 0.121 - type: precision_at_3 value: 25.135 - type: precision_at_5 value: 17.59 - type: recall_at_1 value: 39.373000000000005 - type: recall_at_10 value: 83.78999999999999 - type: recall_at_100 value: 96.246 - type: recall_at_1000 value: 99.324 - type: recall_at_3 value: 64.71900000000001 - type: recall_at_5 value: 74.508 - task: type: Retrieval dataset: type: quora name: MTEB QuoraRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 69.199 - type: map_at_10 value: 82.892 - type: map_at_100 value: 83.578 - type: map_at_1000 value: 83.598 - type: map_at_3 value: 79.948 - type: map_at_5 value: 81.779 - type: mrr_at_1 value: 79.67 - type: mrr_at_10 value: 86.115 - type: mrr_at_100 value: 86.249 - type: mrr_at_1000 value: 86.251 - type: mrr_at_3 value: 85.08200000000001 - type: mrr_at_5 value: 85.783 - type: ndcg_at_1 value: 79.67 - type: ndcg_at_10 value: 86.839 - type: ndcg_at_100 value: 88.252 - type: ndcg_at_1000 value: 88.401 - type: ndcg_at_3 value: 83.86200000000001 - type: ndcg_at_5 value: 85.473 - type: precision_at_1 value: 79.67 - type: precision_at_10 value: 13.19 - type: precision_at_100 value: 1.521 - type: precision_at_1000 value: 0.157 - type: precision_at_3 value: 36.677 - type: precision_at_5 value: 24.118000000000002 - type: recall_at_1 value: 69.199 - type: recall_at_10 value: 94.321 - type: recall_at_100 value: 99.20400000000001 - 
type: recall_at_1000 value: 99.947 - type: recall_at_3 value: 85.787 - type: recall_at_5 value: 90.365 - task: type: Clustering dataset: type: mteb/reddit-clustering name: MTEB RedditClustering config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: v_measure value: 55.82810046856353 - task: type: Clustering dataset: type: mteb/reddit-clustering-p2p name: MTEB RedditClusteringP2P config: default split: test revision: 282350215ef01743dc01b456c7f5241fa8937f16 metrics: - type: v_measure value: 63.38132611783628 - task: type: Retrieval dataset: type: scidocs name: MTEB SCIDOCS config: default split: test revision: None metrics: - type: map_at_1 value: 5.127000000000001 - type: map_at_10 value: 12.235 - type: map_at_100 value: 14.417 - type: map_at_1000 value: 14.75 - type: map_at_3 value: 8.906 - type: map_at_5 value: 10.591000000000001 - type: mrr_at_1 value: 25.2 - type: mrr_at_10 value: 35.879 - type: mrr_at_100 value: 36.935 - type: mrr_at_1000 value: 36.997 - type: mrr_at_3 value: 32.783 - type: mrr_at_5 value: 34.367999999999995 - type: ndcg_at_1 value: 25.2 - type: ndcg_at_10 value: 20.509 - type: ndcg_at_100 value: 28.67 - type: ndcg_at_1000 value: 34.42 - type: ndcg_at_3 value: 19.948 - type: ndcg_at_5 value: 17.166 - type: precision_at_1 value: 25.2 - type: precision_at_10 value: 10.440000000000001 - type: precision_at_100 value: 2.214 - type: precision_at_1000 value: 0.359 - type: precision_at_3 value: 18.533 - type: precision_at_5 value: 14.860000000000001 - type: recall_at_1 value: 5.127000000000001 - type: recall_at_10 value: 21.147 - type: recall_at_100 value: 44.946999999999996 - type: recall_at_1000 value: 72.89 - type: recall_at_3 value: 11.277 - type: recall_at_5 value: 15.042 - task: type: STS dataset: type: mteb/sickr-sts name: MTEB SICK-R config: default split: test revision: a6ea5a8cab320b040a23452cc28066d9beae2cee metrics: - type: cos_sim_pearson value: 83.0373011786213 - type: cos_sim_spearman value: 
79.27889560856613 - type: euclidean_pearson value: 80.31186315495655 - type: euclidean_spearman value: 79.41630415280811 - type: manhattan_pearson value: 80.31755140442013 - type: manhattan_spearman value: 79.43069870027611 - task: type: STS dataset: type: mteb/sts12-sts name: MTEB STS12 config: default split: test revision: a0d554a64d88156834ff5ae9920b964011b16384 metrics: - type: cos_sim_pearson value: 84.8659751342045 - type: cos_sim_spearman value: 76.95377612997667 - type: euclidean_pearson value: 81.24552945497848 - type: euclidean_spearman value: 77.18236963555253 - type: manhattan_pearson value: 81.26477607759037 - type: manhattan_spearman value: 77.13821753062756 - task: type: STS dataset: type: mteb/sts13-sts name: MTEB STS13 config: default split: test revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca metrics: - type: cos_sim_pearson value: 83.34597139044875 - type: cos_sim_spearman value: 84.124169425592 - type: euclidean_pearson value: 83.68590721511401 - type: euclidean_spearman value: 84.18846190846398 - type: manhattan_pearson value: 83.57630235061498 - type: manhattan_spearman value: 84.10244043726902 - task: type: STS dataset: type: mteb/sts14-sts name: MTEB STS14 config: default split: test revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 metrics: - type: cos_sim_pearson value: 82.67641885599572 - type: cos_sim_spearman value: 80.46450725650428 - type: euclidean_pearson value: 81.61645042715865 - type: euclidean_spearman value: 80.61418394236874 - type: manhattan_pearson value: 81.55712034928871 - type: manhattan_spearman value: 80.57905670523951 - task: type: STS dataset: type: mteb/sts15-sts name: MTEB STS15 config: default split: test revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 metrics: - type: cos_sim_pearson value: 88.86650310886782 - type: cos_sim_spearman value: 89.76081629222328 - type: euclidean_pearson value: 89.1530747029954 - type: euclidean_spearman value: 89.80990657280248 - type: manhattan_pearson value: 89.10640563278132 
- type: manhattan_spearman value: 89.76282108434047 - task: type: STS dataset: type: mteb/sts16-sts name: MTEB STS16 config: default split: test revision: 4d8694f8f0e0100860b497b999b3dbed754a0513 metrics: - type: cos_sim_pearson value: 83.93864027911118 - type: cos_sim_spearman value: 85.47096193999023 - type: euclidean_pearson value: 85.03141840870533 - type: euclidean_spearman value: 85.43124029598181 - type: manhattan_pearson value: 84.99002664393512 - type: manhattan_spearman value: 85.39169195120834 - task: type: STS dataset: type: mteb/sts17-crosslingual-sts name: MTEB STS17 (en-en) config: en-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 88.7045343749832 - type: cos_sim_spearman value: 89.03262221146677 - type: euclidean_pearson value: 89.56078218264365 - type: euclidean_spearman value: 89.17827006466868 - type: manhattan_pearson value: 89.52717595468582 - type: manhattan_spearman value: 89.15878115952923 - task: type: STS dataset: type: mteb/sts22-crosslingual-sts name: MTEB STS22 (en) config: en split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 64.20191302875551 - type: cos_sim_spearman value: 64.11446552557646 - type: euclidean_pearson value: 64.6918197393619 - type: euclidean_spearman value: 63.440182631197764 - type: manhattan_pearson value: 64.55692904121835 - type: manhattan_spearman value: 63.424877742756266 - task: type: STS dataset: type: mteb/stsbenchmark-sts name: MTEB STSBenchmark config: default split: test revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 metrics: - type: cos_sim_pearson value: 86.37793104662344 - type: cos_sim_spearman value: 87.7357802629067 - type: euclidean_pearson value: 87.4286301545109 - type: euclidean_spearman value: 87.78452920777421 - type: manhattan_pearson value: 87.42445169331255 - type: manhattan_spearman value: 87.78537677249598 - task: type: Reranking dataset: type: mteb/scidocs-reranking name: 
MTEB SciDocsRR config: default split: test revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab metrics: - type: map value: 84.31465405081792 - type: mrr value: 95.7173781193389 - task: type: Retrieval dataset: type: scifact name: MTEB SciFact config: default split: test revision: None metrics: - type: map_at_1 value: 57.760999999999996 - type: map_at_10 value: 67.904 - type: map_at_100 value: 68.539 - type: map_at_1000 value: 68.562 - type: map_at_3 value: 65.415 - type: map_at_5 value: 66.788 - type: mrr_at_1 value: 60.333000000000006 - type: mrr_at_10 value: 68.797 - type: mrr_at_100 value: 69.236 - type: mrr_at_1000 value: 69.257 - type: mrr_at_3 value: 66.667 - type: mrr_at_5 value: 67.967 - type: ndcg_at_1 value: 60.333000000000006 - type: ndcg_at_10 value: 72.24199999999999 - type: ndcg_at_100 value: 74.86 - type: ndcg_at_1000 value: 75.354 - type: ndcg_at_3 value: 67.93400000000001 - type: ndcg_at_5 value: 70.02199999999999 - type: precision_at_1 value: 60.333000000000006 - type: precision_at_10 value: 9.533 - type: precision_at_100 value: 1.09 - type: precision_at_1000 value: 0.11299999999999999 - type: precision_at_3 value: 26.778000000000002 - type: precision_at_5 value: 17.467 - type: recall_at_1 value: 57.760999999999996 - type: recall_at_10 value: 84.383 - type: recall_at_100 value: 96.267 - type: recall_at_1000 value: 100 - type: recall_at_3 value: 72.628 - type: recall_at_5 value: 78.094 - task: type: PairClassification dataset: type: mteb/sprintduplicatequestions-pairclassification name: MTEB SprintDuplicateQuestions config: default split: test revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 metrics: - type: cos_sim_accuracy value: 99.8029702970297 - type: cos_sim_ap value: 94.9210324173411 - type: cos_sim_f1 value: 89.8521162672106 - type: cos_sim_precision value: 91.67533818938605 - type: cos_sim_recall value: 88.1 - type: dot_accuracy value: 99.69504950495049 - type: dot_ap value: 90.4919719146181 - type: dot_f1 value: 84.72289156626506 - type: 
dot_precision value: 81.76744186046511 - type: dot_recall value: 87.9 - type: euclidean_accuracy value: 99.79702970297029 - type: euclidean_ap value: 94.87827463795753 - type: euclidean_f1 value: 89.55680081507896 - type: euclidean_precision value: 91.27725856697819 - type: euclidean_recall value: 87.9 - type: manhattan_accuracy value: 99.7990099009901 - type: manhattan_ap value: 94.87587025149682 - type: manhattan_f1 value: 89.76298537569339 - type: manhattan_precision value: 90.53916581892166 - type: manhattan_recall value: 89 - type: max_accuracy value: 99.8029702970297 - type: max_ap value: 94.9210324173411 - type: max_f1 value: 89.8521162672106 - task: type: Clustering dataset: type: mteb/stackexchange-clustering name: MTEB StackExchangeClustering config: default split: test revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 metrics: - type: v_measure value: 65.92385753948724 - task: type: Clustering dataset: type: mteb/stackexchange-clustering-p2p name: MTEB StackExchangeClusteringP2P config: default split: test revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: v_measure value: 33.671756975431144 - task: type: Reranking dataset: type: mteb/stackoverflowdupquestions-reranking name: MTEB StackOverflowDupQuestions config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - type: map value: 50.677928036739004 - type: mrr value: 51.56413133435193 - task: type: Summarization dataset: type: mteb/summeval name: MTEB SummEval config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cos_sim_pearson value: 30.523589340819683 - type: cos_sim_spearman value: 30.187407518823235 - type: dot_pearson value: 29.039713969699015 - type: dot_spearman value: 29.114740651155508 - task: type: Retrieval dataset: type: trec-covid name: MTEB TRECCOVID config: default split: test revision: None metrics: - type: map_at_1 value: 0.211 - type: map_at_10 value: 1.6199999999999999 - type: map_at_100 value: 
8.658000000000001 - type: map_at_1000 value: 21.538 - type: map_at_3 value: 0.575 - type: map_at_5 value: 0.919 - type: mrr_at_1 value: 78 - type: mrr_at_10 value: 86.18599999999999 - type: mrr_at_100 value: 86.18599999999999 - type: mrr_at_1000 value: 86.18599999999999 - type: mrr_at_3 value: 85 - type: mrr_at_5 value: 85.9 - type: ndcg_at_1 value: 74 - type: ndcg_at_10 value: 66.542 - type: ndcg_at_100 value: 50.163999999999994 - type: ndcg_at_1000 value: 45.696999999999996 - type: ndcg_at_3 value: 71.531 - type: ndcg_at_5 value: 70.45 - type: precision_at_1 value: 78 - type: precision_at_10 value: 69.39999999999999 - type: precision_at_100 value: 51.06 - type: precision_at_1000 value: 20.022000000000002 - type: precision_at_3 value: 76 - type: precision_at_5 value: 74.8 - type: recall_at_1 value: 0.211 - type: recall_at_10 value: 1.813 - type: recall_at_100 value: 12.098 - type: recall_at_1000 value: 42.618 - type: recall_at_3 value: 0.603 - type: recall_at_5 value: 0.987 - task: type: Retrieval dataset: type: webis-touche2020 name: MTEB Touche2020 config: default split: test revision: None metrics: - type: map_at_1 value: 2.2079999999999997 - type: map_at_10 value: 7.777000000000001 - type: map_at_100 value: 12.825000000000001 - type: map_at_1000 value: 14.196 - type: map_at_3 value: 4.285 - type: map_at_5 value: 6.177 - type: mrr_at_1 value: 30.612000000000002 - type: mrr_at_10 value: 42.635 - type: mrr_at_100 value: 43.955 - type: mrr_at_1000 value: 43.955 - type: mrr_at_3 value: 38.435 - type: mrr_at_5 value: 41.088 - type: ndcg_at_1 value: 28.571 - type: ndcg_at_10 value: 20.666999999999998 - type: ndcg_at_100 value: 31.840000000000003 - type: ndcg_at_1000 value: 43.191 - type: ndcg_at_3 value: 23.45 - type: ndcg_at_5 value: 22.994 - type: precision_at_1 value: 30.612000000000002 - type: precision_at_10 value: 17.959 - type: precision_at_100 value: 6.755 - type: precision_at_1000 value: 1.4200000000000002 - type: precision_at_3 value: 23.810000000000002 - 
type: precision_at_5 value: 23.673 - type: recall_at_1 value: 2.2079999999999997 - type: recall_at_10 value: 13.144 - type: recall_at_100 value: 42.491 - type: recall_at_1000 value: 77.04299999999999 - type: recall_at_3 value: 5.3469999999999995 - type: recall_at_5 value: 9.139 - task: type: Classification dataset: type: mteb/toxic_conversations_50k name: MTEB ToxicConversationsClassification config: default split: test revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c metrics: - type: accuracy value: 70.9044 - type: ap value: 14.625783489340755 - type: f1 value: 54.814936562590546 - task: type: Classification dataset: type: mteb/tweet_sentiment_extraction name: MTEB TweetSentimentExtractionClassification config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 60.94227504244483 - type: f1 value: 61.22516038508854 - task: type: Clustering dataset: type: mteb/twentynewsgroups-clustering name: MTEB TwentyNewsgroupsClustering config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: v_measure value: 49.602409155145864 - task: type: PairClassification dataset: type: mteb/twittersemeval2015-pairclassification name: MTEB TwitterSemEval2015 config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 86.94641473445789 - type: cos_sim_ap value: 76.91572747061197 - type: cos_sim_f1 value: 70.14348097317529 - type: cos_sim_precision value: 66.53254437869822 - type: cos_sim_recall value: 74.1688654353562 - type: dot_accuracy value: 84.80061989628658 - type: dot_ap value: 70.7952548895177 - type: dot_f1 value: 65.44780728844965 - type: dot_precision value: 61.53310104529617 - type: dot_recall value: 69.89445910290237 - type: euclidean_accuracy value: 86.94641473445789 - type: euclidean_ap value: 76.80774009393652 - type: euclidean_f1 value: 70.30522503879979 - type: euclidean_precision value: 68.94977168949772 - type: 
euclidean_recall value: 71.71503957783642 - type: manhattan_accuracy value: 86.8629671574179 - type: manhattan_ap value: 76.76518632600317 - type: manhattan_f1 value: 70.16056518946692 - type: manhattan_precision value: 68.360450563204 - type: manhattan_recall value: 72.0580474934037 - type: max_accuracy value: 86.94641473445789 - type: max_ap value: 76.91572747061197 - type: max_f1 value: 70.30522503879979 - task: type: PairClassification dataset: type: mteb/twitterurlcorpus-pairclassification name: MTEB TwitterURLCorpus config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 89.10428066907285 - type: cos_sim_ap value: 86.25114759921435 - type: cos_sim_f1 value: 78.37857884586856 - type: cos_sim_precision value: 75.60818546078993 - type: cos_sim_recall value: 81.35971666153372 - type: dot_accuracy value: 87.41995575736406 - type: dot_ap value: 81.51838010086782 - type: dot_f1 value: 74.77398015435503 - type: dot_precision value: 71.53002390662354 - type: dot_recall value: 78.32614721281182 - type: euclidean_accuracy value: 89.12368533395428 - type: euclidean_ap value: 86.33456799874504 - type: euclidean_f1 value: 78.45496750232127 - type: euclidean_precision value: 75.78388462366364 - type: euclidean_recall value: 81.32121958731136 - type: manhattan_accuracy value: 89.10622113556099 - type: manhattan_ap value: 86.31215061745333 - type: manhattan_f1 value: 78.40684906011539 - type: manhattan_precision value: 75.89536643366722 - type: manhattan_recall value: 81.09023714197721 - type: max_accuracy value: 89.12368533395428 - type: max_ap value: 86.33456799874504 - type: max_f1 value: 78.45496750232127 language: - en license: mit
---

# E5-large-v2

[Text Embeddings by Weakly-Supervised Contrastive Pre-training](https://arxiv.org/pdf/2212.03533.pdf).
Liang Wang, Nan Yang, Xiaolong Huang, Binxing Jiao, Linjun Yang, Daxin Jiang, Rangan Majumder, Furu Wei, arXiv 2022

This model has 24 layers and an embedding size of 1024.

## Usage

Below is an example to encode queries and passages from the MS-MARCO passage ranking dataset.

```python
import torch.nn.functional as F
from torch import Tensor
from transformers import AutoTokenizer, AutoModel


def average_pool(last_hidden_states: Tensor,
                 attention_mask: Tensor) -> Tensor:
    last_hidden = last_hidden_states.masked_fill(~attention_mask[..., None].bool(), 0.0)
    return last_hidden.sum(dim=1) / attention_mask.sum(dim=1)[..., None]


# Each input text should start with "query: " or "passage: ".
# For tasks other than retrieval, you can simply use the "query: " prefix.
input_texts = ['query: how much protein should a female eat',
               'query: summit define',
               "passage: As a general guideline, the CDC's average requirement of protein for women ages 19 to 70 is 46 grams per day. But, as you can see from this chart, you'll need to increase that if you're expecting or training for a marathon. Check out the chart below to see how much protein you should be eating each day.",
               "passage: Definition of summit for English Language Learners. : 1 the highest point of a mountain : the top of a mountain. : 2 the highest level. : 3 a meeting or series of meetings between the leaders of two or more governments."]

tokenizer = AutoTokenizer.from_pretrained('intfloat/e5-large-v2')
model = AutoModel.from_pretrained('intfloat/e5-large-v2')

# Tokenize the input texts
batch_dict = tokenizer(input_texts, max_length=512, padding=True, truncation=True, return_tensors='pt')

outputs = model(**batch_dict)
embeddings = average_pool(outputs.last_hidden_state, batch_dict['attention_mask'])

# Normalize embeddings so that dot products are cosine similarities
embeddings = F.normalize(embeddings, p=2, dim=1)
scores = (embeddings[:2] @ embeddings[2:].T) * 100
print(scores.tolist())
```

## Training Details

Please refer to our paper at [https://arxiv.org/pdf/2212.03533.pdf](https://arxiv.org/pdf/2212.03533.pdf).

## Benchmark Evaluation

Check out [unilm/e5](https://github.com/microsoft/unilm/tree/master/e5) to reproduce evaluation results on the [BEIR](https://arxiv.org/abs/2104.08663) and [MTEB](https://arxiv.org/abs/2210.07316) benchmarks.

## Support for Sentence Transformers

Below is an example for usage with `sentence_transformers`.

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer('intfloat/e5-large-v2')
input_texts = [
    'query: how much protein should a female eat',
    'query: summit define',
    "passage: As a general guideline, the CDC's average requirement of protein for women ages 19 to 70 is 46 grams per day. But, as you can see from this chart, you'll need to increase that if you're expecting or training for a marathon. Check out the chart below to see how much protein you should be eating each day.",
    "passage: Definition of summit for English Language Learners. : 1 the highest point of a mountain : the top of a mountain. : 2 the highest level. : 3 a meeting or series of meetings between the leaders of two or more governments."
]
embeddings = model.encode(input_texts, normalize_embeddings=True)
```

Package requirements: `pip install sentence_transformers~=2.2.2`

Contributors: [michaelfeil](https://huggingface.co/michaelfeil)

## FAQ

**1. Do I need to add the prefix "query: " and "passage: " to input texts?**

Yes, this is how the model is trained; otherwise you will see a performance degradation.

Here are some rules of thumb:
- Use "query: " and "passage: " correspondingly for asymmetric tasks such as passage retrieval in open QA and ad-hoc information retrieval.
- Use the "query: " prefix for symmetric tasks such as semantic similarity and paraphrase retrieval.
- Use the "query: " prefix if you want to use embeddings as features, for example in linear-probing classification or clustering.

**2. Why are my reproduced results slightly different from those reported in the model card?**

Different versions of `transformers` and `pytorch` can cause negligible but non-zero performance differences.

**3. Why do the cosine similarity scores concentrate around 0.7 to 1.0?**

This is a known and expected behavior, since we use a low temperature of 0.01 for the InfoNCE contrastive loss. For text embedding tasks such as text retrieval or semantic similarity, what matters is the relative order of the scores rather than their absolute values, so this should not be an issue.

## Citation

If you find our paper or models helpful, please consider citing as follows:

```
@article{wang2022text,
  title={Text Embeddings by Weakly-Supervised Contrastive Pre-training},
  author={Wang, Liang and Yang, Nan and Huang, Xiaolong and Jiao, Binxing and Yang, Linjun and Jiang, Daxin and Majumder, Rangan and Wei, Furu},
  journal={arXiv preprint arXiv:2212.03533},
  year={2022}
}
```

## Limitations

This model only works for English texts. Long texts will be truncated to at most 512 tokens.
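For reference, the `average_pool` helper from the usage example can be sanity-checked in isolation. The minimal sketch below is pure PyTorch with hypothetical toy tensors (no model download needed) and shows that padded positions are excluded from the mean:

```python
import torch
from torch import Tensor


def average_pool(last_hidden_states: Tensor,
                 attention_mask: Tensor) -> Tensor:
    # Zero out hidden states at padded positions, then average over real tokens only.
    last_hidden = last_hidden_states.masked_fill(~attention_mask[..., None].bool(), 0.0)
    return last_hidden.sum(dim=1) / attention_mask.sum(dim=1)[..., None]


# Hypothetical toy batch: 1 sequence, 3 positions, hidden size 2.
# The third position is padding (mask 0) and holds a deliberately large value.
hidden = torch.tensor([[[1.0, 2.0], [3.0, 4.0], [100.0, 100.0]]])
mask = torch.tensor([[1, 1, 0]])

pooled = average_pool(hidden, mask)
print(pooled)  # tensor([[2., 3.]]): the padded [100., 100.] row is ignored
```

Because the mask zeroes out padding before summing and the divisor counts only real tokens, truncation or batch padding does not skew the pooled embedding.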
-0.0113525390625, -0.0158233642578125, 0.037139892578125, 0.036895751953125, -0.0323486328125, 0.0706787109375, 0.0069427490234375, 0.054840087890625, -0.0526123046875, 0.012451171875, -0.01335906982421875, 0.03118896484375, -0.0236358642578125, -0.048248291015625, 0.0079803466796875, -0.01081085205078125, -0.0203094482421875, -0.009002685546875, 0.03802490234375, -0.04498291015625, -0.03485107421875, 0.0290374755859375, 0.04071044921875, 0.01995849609375, -0.022186279296875, -0.076416015625, 0.006103515625, -0.00009250640869140625, -0.033294677734375, 0.035400390625, 0.017791748046875, 0.019744873046875, 0.036834716796875, 0.03466796875, -0.01241302490234375, -0.004207611083984375, 0.01477813720703125, 0.06207275390625, -0.047271728515625, -0.04150390625, -0.0643310546875, 0.0301971435546875, -0.018524169921875, -0.02532958984375, 0.0626220703125, 0.051605224609375, 0.055419921875, -0.011474609375, 0.034088134765625, -0.0019073486328125, 0.008026123046875, -0.04296875, 0.047760009765625, -0.052398681640625, -0.004730224609375, -0.0209503173828125, -0.07855224609375, -0.015045166015625, 0.061981201171875, -0.03802490234375, 0.01029205322265625, 0.0694580078125, 0.0587158203125, -0.0172271728515625, -0.0117340087890625, 0.0163116455078125, 0.045135498046875, 0.0233612060546875, 0.06207275390625, 0.0408935546875, -0.08270263671875, 0.0576171875, -0.01059722900390625, -0.01502227783203125, -0.014801025390625, -0.056396484375, -0.0662841796875, -0.05194091796875, -0.044708251953125, -0.033660888671875, 0.007282257080078125, 0.07342529296875, 0.056854248046875, -0.0487060546875, -0.007904052734375, 0.0029144287109375, -0.01250457763671875, -0.0230255126953125, -0.0183563232421875, 0.047760009765625, -0.029205322265625, -0.06890869140625, 0.0145416259765625, -0.016815185546875, 0.00955963134765625, 0.007381439208984375, -0.00618743896484375, -0.049652099609375, -0.0016641616821289062, 0.05438232421875, -0.010040283203125, -0.027435302734375, -0.0289459228515625, 
-0.0020122528076171875, -0.02716064453125, 0.01136016845703125, 0.010589599609375, -0.048065185546875, 0.019561767578125, 0.04931640625, 0.0345458984375, 0.0770263671875, -0.00225067138671875, 0.030609130859375, -0.05438232421875, 0.01068115234375, 0.01105499267578125, 0.0259246826171875, 0.039764404296875, -0.019317626953125, 0.03271484375, 0.02777099609375, -0.04510498046875, -0.04608154296875, -0.0093994140625, -0.07366943359375, -0.0184478759765625, 0.07879638671875, -0.016448974609375, -0.02142333984375, 0.01104736328125, -0.003833770751953125, 0.0282440185546875, -0.0201568603515625, 0.051727294921875, 0.0611572265625, -0.007534027099609375, -0.0030345916748046875, -0.05615234375, 0.039886474609375, 0.034759521484375, -0.0399169921875, -0.0257415771484375, 0.005115509033203125, 0.03363037109375, 0.0146484375, 0.036834716796875, -0.01348876953125, -0.0001327991485595703, 0.0263824462890625, -0.0084228515625, -0.0031375885009765625, -0.00788116455078125, -0.006793975830078125, 0.0158538818359375, -0.01861572265625, -0.023834228515625 ] ]
flair/ner-english-ontonotes-large
2021-05-08T15:35:21.000Z
[ "flair", "pytorch", "token-classification", "sequence-tagger-model", "en", "dataset:ontonotes", "arxiv:2011.06993", "has_space", "region:us" ]
token-classification
flair
null
null
flair/ner-english-ontonotes-large
69
249,800
flair
2022-03-02T23:29:05
---
tags:
- flair
- token-classification
- sequence-tagger-model
language: en
datasets:
- ontonotes
widget:
- text: "On September 1st George won 1 dollar while watching Game of Thrones."
---

## English NER in Flair (Ontonotes large model)

This is the large 18-class NER model for English that ships with [Flair](https://github.com/flairNLP/flair/).

F1-Score: **90.93** (Ontonotes)

Predicts 18 tags:

| **tag** | **meaning** |
|---------------------------------|-----------|
| CARDINAL | cardinal value |
| DATE | date value |
| EVENT | event name |
| FAC | building name |
| GPE | geo-political entity |
| LANGUAGE | language name |
| LAW | law name |
| LOC | location name |
| MONEY | money name |
| NORP | affiliation |
| ORDINAL | ordinal value |
| ORG | organization name |
| PERCENT | percent value |
| PERSON | person name |
| PRODUCT | product name |
| QUANTITY | quantity value |
| TIME | time value |
| WORK_OF_ART | name of work of art |

Based on document-level XLM-R embeddings and [FLERT](https://arxiv.org/pdf/2011.06993v1.pdf/).
---

### Demo: How to use in Flair

Requires: **[Flair](https://github.com/flairNLP/flair/)** (`pip install flair`)

```python
from flair.data import Sentence
from flair.models import SequenceTagger

# load tagger
tagger = SequenceTagger.load("flair/ner-english-ontonotes-large")

# make example sentence
sentence = Sentence("On September 1st George won 1 dollar while watching Game of Thrones.")

# predict NER tags
tagger.predict(sentence)

# print sentence
print(sentence)

# print predicted NER spans
print('The following NER tags are found:')

# iterate over entities and print
for entity in sentence.get_spans('ner'):
    print(entity)
```

This yields the following output:

```
Span [2,3]: "September 1st"   [− Labels: DATE (1.0)]
Span [4]: "George"   [− Labels: PERSON (1.0)]
Span [6,7]: "1 dollar"   [− Labels: MONEY (1.0)]
Span [10,11,12]: "Game of Thrones"   [− Labels: WORK_OF_ART (1.0)]
```

So, the entities "*September 1st*" (labeled as a **date**), "*George*" (labeled as a **person**), "*1 dollar*" (labeled as **money**) and "*Game of Thrones*" (labeled as a **work of art**) are found in the sentence "*On September 1st George won 1 dollar while watching Game of Thrones*".

---

### Training: Script to train this model

The following Flair script was used to train this model:

```python
import torch

from flair.data import Corpus
from flair.datasets import ColumnCorpus

# 1. load the corpus (Ontonotes does not ship with Flair; you need to download it and reformat it into column format yourself)
corpus: Corpus = ColumnCorpus(
    "resources/tasks/onto-ner",
    column_format={0: "text", 1: "pos", 2: "upos", 3: "ner"},
    tag_to_bioes="ner",
)

# 2. what tag do we want to predict?
tag_type = 'ner'

# 3. make the tag dictionary from the corpus
tag_dictionary = corpus.make_tag_dictionary(tag_type=tag_type)

# 4. initialize fine-tuneable transformer embeddings WITH document context
from flair.embeddings import TransformerWordEmbeddings

embeddings = TransformerWordEmbeddings(
    model='xlm-roberta-large',
    layers="-1",
    subtoken_pooling="first",
    fine_tune=True,
    use_context=True,
)

# 5. initialize bare-bones sequence tagger (no CRF, no RNN, no reprojection)
from flair.models import SequenceTagger

tagger = SequenceTagger(
    hidden_size=256,
    embeddings=embeddings,
    tag_dictionary=tag_dictionary,
    tag_type='ner',
    use_crf=False,
    use_rnn=False,
    reproject_embeddings=False,
)

# 6. initialize trainer with AdamW optimizer
from flair.trainers import ModelTrainer

trainer = ModelTrainer(tagger, corpus, optimizer=torch.optim.AdamW)

# 7. run training with XLM parameters (20 epochs, small LR)
from torch.optim.lr_scheduler import OneCycleLR

trainer.train('resources/taggers/ner-english-ontonotes-large',
              learning_rate=5.0e-6,
              mini_batch_size=4,
              mini_batch_chunk_size=1,
              max_epochs=20,
              scheduler=OneCycleLR,
              embeddings_storage_mode='none',
              weight_decay=0.,
              )
```

---

### Cite

Please cite the following paper when using this model.

```
@misc{schweter2020flert,
    title={FLERT: Document-Level Features for Named Entity Recognition},
    author={Stefan Schweter and Alan Akbik},
    year={2020},
    eprint={2011.06993},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```

---

### Issues?

The Flair issue tracker is available [here](https://github.com/flairNLP/flair/issues/).
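As a hypothetical post-processing sketch (plain Python, not part of the Flair API): once the predicted spans from the demo above are collected as `(text, tag)` pairs, they can be grouped by tag type, e.g. to pull out all dates or all people at once:

```python
from collections import defaultdict

# Predicted spans from the demo sentence above, as (text, tag) pairs
entities = [
    ("September 1st", "DATE"),
    ("George", "PERSON"),
    ("1 dollar", "MONEY"),
    ("Game of Thrones", "WORK_OF_ART"),
]

# Group entity texts by their predicted tag
by_tag = defaultdict(list)
for text, tag in entities:
    by_tag[tag].append(text)

print(dict(by_tag))
# → {'DATE': ['September 1st'], 'PERSON': ['George'], 'MONEY': ['1 dollar'], 'WORK_OF_ART': ['Game of Thrones']}
```

The same grouping works unchanged for any of the 18 tag types the model predicts.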
4,828
[ [ -0.025146484375, -0.0379638671875, 0.01103973388671875, 0.00644683837890625, -0.0101165771484375, -0.010345458984375, -0.01529693603515625, -0.030242919921875, 0.0426025390625, 0.032257080078125, -0.0299224853515625, -0.0411376953125, -0.042388916015625, 0.020477294921875, -0.0010700225830078125, 0.09051513671875, 0.006744384765625, 0.0159759521484375, -0.0037059783935546875, -0.005100250244140625, -0.0169830322265625, -0.04071044921875, -0.0595703125, -0.0267333984375, 0.04681396484375, 0.0241546630859375, 0.02996826171875, 0.056915283203125, 0.03533935546875, 0.02099609375, -0.0128021240234375, 0.00916290283203125, -0.0197906494140625, -0.0137939453125, -0.00865936279296875, -0.0301361083984375, -0.06036376953125, 0.007534027099609375, 0.04437255859375, 0.02264404296875, 0.004146575927734375, 0.01250457763671875, 0.0052490234375, 0.015960693359375, -0.0151214599609375, 0.034423828125, -0.051483154296875, -0.01151275634765625, -0.00860595703125, -0.01143646240234375, -0.0256195068359375, -0.021270751953125, 0.0148773193359375, -0.045928955078125, 0.004543304443359375, 0.018646240234375, 0.10791015625, 0.007480621337890625, -0.0235748291015625, -0.019317626953125, -0.036285400390625, 0.074951171875, -0.0732421875, 0.0194091796875, 0.0249786376953125, -0.01235198974609375, 0.005260467529296875, -0.056243896484375, -0.04852294921875, -0.0126800537109375, -0.00989532470703125, 0.018768310546875, -0.016754150390625, -0.00998687744140625, 0.0249176025390625, 0.01020050048828125, -0.05072021484375, -0.0010290145874023438, -0.01953125, -0.0106353759765625, 0.05584716796875, 0.020477294921875, 0.0105743408203125, -0.03240966796875, -0.03985595703125, -0.0163421630859375, -0.0264892578125, 0.0004494190216064453, 0.01322174072265625, 0.036346435546875, -0.0195770263671875, 0.035797119140625, 0.01239776611328125, 0.046173095703125, 0.0156707763671875, -0.0268707275390625, 0.04071044921875, -0.0169830322265625, -0.016845703125, 0.005283355712890625, 0.0684814453125, 
0.02447509765625, 0.00981903076171875, -0.00429534912109375, -0.0167388916015625, 0.0098724365234375, -0.01531219482421875, -0.06207275390625, -0.023345947265625, 0.018707275390625, -0.02264404296875, -0.0300750732421875, 0.004154205322265625, -0.052520751953125, -0.0183258056640625, -0.01287078857421875, 0.0555419921875, -0.036865234375, -0.004901885986328125, 0.00016868114471435547, -0.026123046875, 0.020172119140625, 0.01497650146484375, -0.05804443359375, 0.0050811767578125, 0.028533935546875, 0.046630859375, 0.021484375, -0.0258941650390625, -0.0293731689453125, -0.0061798095703125, -0.0198211669921875, 0.055145263671875, -0.0251312255859375, -0.014617919921875, -0.01010894775390625, 0.01082611083984375, -0.0235748291015625, -0.018035888671875, 0.039520263671875, -0.0390625, 0.026702880859375, -0.020843505859375, -0.0592041015625, -0.020782470703125, 0.027618408203125, -0.049407958984375, 0.0648193359375, 0.0037059783935546875, -0.08905029296875, 0.03656005859375, -0.037109375, -0.0230865478515625, 0.004909515380859375, -0.006351470947265625, -0.044189453125, -0.016448974609375, 0.0124053955078125, 0.04864501953125, -0.01617431640625, 0.01323699951171875, -0.0183258056640625, -0.002117156982421875, 0.0166168212890625, 0.00222015380859375, 0.059417724609375, 0.0038700103759765625, -0.026397705078125, 0.00197601318359375, -0.06768798828125, -0.006839752197265625, 0.0265350341796875, -0.03363037109375, -0.016693115234375, -0.0009202957153320312, 0.0201568603515625, 0.02001953125, 0.0177001953125, -0.042755126953125, 0.036865234375, -0.033905029296875, 0.041015625, 0.0372314453125, 0.002841949462890625, 0.035888671875, -0.038055419921875, 0.0318603515625, 0.00722503662109375, -0.0164794921875, -0.0056610107421875, -0.044097900390625, -0.050628662109375, -0.01885986328125, 0.042755126953125, 0.058380126953125, -0.04583740234375, 0.047821044921875, -0.0294036865234375, -0.051727294921875, -0.025543212890625, -0.01294708251953125, 0.0200347900390625, 0.04736328125, 
0.036590576171875, -0.005161285400390625, -0.060577392578125, -0.049102783203125, -0.007144927978515625, -0.005893707275390625, 0.012298583984375, 0.030487060546875, 0.06500244140625, -0.015533447265625, 0.057952880859375, -0.042694091796875, -0.0452880859375, -0.037628173828125, 0.01812744140625, 0.036773681640625, 0.04241943359375, 0.03240966796875, -0.0489501953125, -0.04962158203125, 0.002643585205078125, -0.03619384765625, 0.01873779296875, -0.0216522216796875, 0.0072174072265625, 0.03485107421875, 0.029632568359375, -0.031097412109375, 0.039154052734375, 0.025299072265625, -0.046356201171875, 0.0406494140625, -0.0099029541015625, -0.0079803466796875, -0.10693359375, 0.021453857421875, 0.01776123046875, -0.011962890625, -0.035369873046875, -0.02618408203125, 0.0090484619140625, 0.0136871337890625, -0.0157623291015625, 0.05584716796875, -0.0308074951171875, 0.01727294921875, -0.004634857177734375, 0.01557159423828125, 0.01367950439453125, 0.0255279541015625, 0.024505615234375, 0.0275115966796875, 0.03753662109375, -0.03741455078125, 0.0168914794921875, 0.03753662109375, -0.0316162109375, 0.0156707763671875, -0.040924072265625, -0.00830841064453125, -0.0084228515625, 0.01229095458984375, -0.07452392578125, -0.0111083984375, 0.0161590576171875, -0.056610107421875, 0.050994873046875, -0.004909515380859375, -0.02178955078125, -0.0282440185546875, -0.0166168212890625, -0.0007367134094238281, 0.03436279296875, -0.0287933349609375, 0.051483154296875, 0.0183868408203125, 0.0008778572082519531, -0.06005859375, -0.054595947265625, -0.00824737548828125, -0.0194854736328125, -0.050018310546875, 0.049285888671875, -0.0029773712158203125, -0.00640869140625, 0.0174407958984375, 0.0013952255249023438, 0.0029449462890625, 0.0003821849822998047, 0.0096282958984375, 0.0377197265625, -0.0189361572265625, 0.00222015380859375, -0.0185089111328125, -0.0035305023193359375, 0.00209808349609375, -0.017730712890625, 0.051788330078125, -0.00292205810546875, 0.023681640625, 
-0.03106689453125, 0.0021152496337890625, 0.018829345703125, -0.02435302734375, 0.07745361328125, 0.060150146484375, -0.041168212890625, -0.015960693359375, -0.033447265625, -0.0161590576171875, -0.0282440185546875, 0.051513671875, -0.0296478271484375, -0.042877197265625, 0.053436279296875, 0.0220947265625, 0.0168304443359375, 0.066650390625, 0.0204010009765625, 0.00353240966796875, 0.082763671875, 0.038818359375, -0.01348876953125, 0.03619384765625, -0.0574951171875, 0.00878143310546875, -0.06793212890625, -0.02398681640625, -0.042877197265625, -0.0154266357421875, -0.0504150390625, -0.0160064697265625, 0.012725830078125, 0.019195556640625, -0.037841796875, 0.044342041015625, -0.038299560546875, 0.01206207275390625, 0.040069580078125, 0.0034465789794921875, 0.00370025634765625, -0.00789642333984375, -0.0233612060546875, -0.0203857421875, -0.0595703125, -0.039581298828125, 0.08441162109375, 0.0291290283203125, 0.057952880859375, -0.0110626220703125, 0.067138671875, -0.0018968582153320312, 0.032073974609375, -0.0540771484375, 0.0311737060546875, -0.016754150390625, -0.057952880859375, -0.016876220703125, -0.033599853515625, -0.07855224609375, 0.002288818359375, -0.03973388671875, -0.06695556640625, 0.017059326171875, 0.00847625732421875, -0.038818359375, 0.04498291015625, -0.0240478515625, 0.078369140625, -0.01088714599609375, -0.028961181640625, 0.01482391357421875, -0.051971435546875, 0.007049560546875, 0.0021533966064453125, 0.0243988037109375, -0.00571441650390625, -0.0009284019470214844, 0.0828857421875, -0.021026611328125, 0.0577392578125, 0.0008893013000488281, 0.0117950439453125, 0.01216888427734375, -0.0072021484375, 0.04498291015625, 0.01151275634765625, -0.01520538330078125, 0.005329132080078125, 0.0015649795532226562, -0.0130615234375, -0.01434326171875, 0.055816650390625, -0.0672607421875, -0.023468017578125, -0.061981201171875, -0.0257568359375, 0.007244110107421875, 0.0153961181640625, 0.057281494140625, 0.05279541015625, -0.00951385498046875, 
0.005290985107421875, 0.040069580078125, -0.0206146240234375, 0.04425048828125, 0.024505615234375, -0.032012939453125, -0.06048583984375, 0.070068359375, 0.0218658447265625, 0.002719879150390625, 0.04254150390625, 0.0186004638671875, -0.030487060546875, -0.0198974609375, -0.025634765625, 0.037322998046875, -0.044586181640625, -0.035888671875, -0.06005859375, -0.0268096923828125, -0.058380126953125, -0.0111541748046875, -0.016265869140625, -0.027191162109375, -0.0555419921875, -0.002529144287109375, 0.0289764404296875, 0.06427001953125, -0.016357421875, 0.0142822265625, -0.05206298828125, -0.0124053955078125, 0.005817413330078125, 0.00365447998046875, 0.0011968612670898438, -0.07366943359375, -0.025299072265625, -0.01049041748046875, -0.035186767578125, -0.0682373046875, 0.07611083984375, 0.0173797607421875, 0.031982421875, 0.038543701171875, -0.0034160614013671875, 0.0406494140625, -0.029754638671875, 0.0609130859375, 0.01084136962890625, -0.06280517578125, 0.030792236328125, -0.01849365234375, 0.00897979736328125, 0.01110076904296875, 0.052978515625, -0.037078857421875, -0.004474639892578125, -0.066650390625, -0.07720947265625, 0.058807373046875, -0.002056121826171875, 0.00301361083984375, -0.01329803466796875, 0.0175018310546875, -0.00884246826171875, 0.0060882568359375, -0.08056640625, -0.044586181640625, -0.01047515869140625, -0.01041412353515625, -0.025299072265625, -0.01210784912109375, 0.01303863525390625, -0.03558349609375, 0.08795166015625, -0.0007023811340332031, 0.0374755859375, 0.033477783203125, 0.006633758544921875, -0.004131317138671875, 0.011962890625, 0.03961181640625, 0.033111572265625, -0.03485107421875, -0.0096893310546875, 0.0242156982421875, -0.0283050537109375, -0.019927978515625, 0.024749755859375, -0.005451202392578125, 0.01323699951171875, 0.0333251953125, 0.06463623046875, 0.019287109375, -0.01971435546875, 0.0343017578125, -0.00228118896484375, -0.02593994140625, -0.0447998046875, -0.0169525146484375, 0.014984130859375, 
0.01401519775390625, 0.0299530029296875, 0.009521484375, -0.005462646484375, -0.039031982421875, 0.00897216796875, 0.0394287109375, -0.031585693359375, -0.031036376953125, 0.07501220703125, -0.003406524658203125, -0.01103973388671875, 0.0335693359375, -0.042205810546875, -0.06304931640625, 0.058746337890625, 0.052154541015625, 0.053192138671875, -0.0137939453125, 0.006031036376953125, 0.061981201171875, 0.0234832763671875, -0.005512237548828125, 0.048248291015625, 0.03778076171875, -0.06707763671875, -0.0235748291015625, -0.06781005859375, -0.0079803466796875, 0.0281219482421875, -0.046844482421875, 0.042633056640625, -0.0280303955078125, -0.0312347412109375, 0.035003662109375, 0.029083251953125, -0.06634521484375, 0.0278472900390625, 0.0256500244140625, 0.0797119140625, -0.06561279296875, 0.0601806640625, 0.0718994140625, -0.053955078125, -0.09368896484375, -0.0148773193359375, -0.01059722900390625, -0.044769287109375, 0.062255859375, 0.0234832763671875, 0.034637451171875, 0.0194854736328125, -0.0428466796875, -0.09527587890625, 0.08721923828125, -0.00827789306640625, -0.031402587890625, -0.0169525146484375, -0.01364898681640625, 0.02362060546875, -0.0361328125, 0.04071044921875, 0.023468017578125, 0.037841796875, 0.0028514862060546875, -0.0655517578125, -0.0016355514526367188, -0.017730712890625, -0.01175689697265625, 0.02252197265625, -0.0489501953125, 0.0845947265625, -0.02252197265625, -0.00984954833984375, 0.016571044921875, 0.054351806640625, 0.006038665771484375, 0.02545166015625, 0.017333984375, 0.06842041015625, 0.05072021484375, -0.020050048828125, 0.0670166015625, -0.029022216796875, 0.051849365234375, 0.07635498046875, -0.014404296875, 0.0753173828125, 0.017425537109375, -0.01459503173828125, 0.05126953125, 0.046417236328125, -0.0152435302734375, 0.03460693359375, 0.01690673828125, 0.0037593841552734375, -0.026397705078125, -0.0019321441650390625, -0.0338134765625, 0.03668212890625, 0.025543212890625, -0.045654296875, 0.00042939186096191406, 
-0.002033233642578125, 0.034637451171875, -0.0087127685546875, -0.03521728515625, 0.05963134765625, 0.00640869140625, -0.03790283203125, 0.03961181640625, 0.0065765380859375, 0.06915283203125, -0.0380859375, 0.00279998779296875, -0.005523681640625, 0.02777099609375, -0.019927978515625, -0.039825439453125, 0.00959014892578125, -0.015625, -0.013519287109375, 0.00164794921875, 0.044647216796875, -0.040863037109375, -0.04315185546875, 0.021881103515625, 0.024993896484375, 0.01287078857421875, -0.001644134521484375, -0.05694580078125, -0.00958251953125, 0.0083465576171875, -0.039154052734375, 0.0201416015625, 0.023284912109375, 0.00872039794921875, 0.034942626953125, 0.034576416015625, 0.00820159912109375, -0.006866455078125, -0.01959228515625, 0.061492919921875, -0.063720703125, -0.0322265625, -0.06494140625, 0.04217529296875, -0.00894927978515625, -0.045745849609375, 0.0589599609375, 0.0665283203125, 0.061492919921875, -0.005847930908203125, 0.05816650390625, -0.0287017822265625, 0.047149658203125, -0.019744873046875, 0.0684814453125, -0.05712890625, 0.0014486312866210938, -0.02178955078125, -0.05352783203125, -0.02374267578125, 0.054595947265625, -0.035858154296875, 0.002048492431640625, 0.048919677734375, 0.053070068359375, 0.0115966796875, -0.004314422607421875, 0.015716552734375, 0.030029296875, -0.002017974853515625, 0.0361328125, 0.03900146484375, -0.051727294921875, 0.03765869140625, -0.04180908203125, -0.0141754150390625, -0.0192108154296875, -0.07183837890625, -0.0745849609375, -0.0526123046875, -0.04254150390625, -0.061981201171875, -0.02056884765625, 0.09393310546875, 0.03607177734375, -0.0701904296875, -0.014007568359375, 0.01082611083984375, -0.0084228515625, -0.0088958740234375, -0.0204620361328125, 0.041961669921875, -0.011260986328125, -0.055023193359375, 0.01007843017578125, -0.01116180419921875, 0.00815582275390625, 0.01203155517578125, -0.00002574920654296875, -0.0390625, 0.0150909423828125, 0.0298919677734375, 0.019805908203125, -0.0501708984375, 
-0.0113372802734375, 0.022705078125, -0.0228729248046875, 0.01299285888671875, 0.01204681396484375, -0.0504150390625, 0.01141357421875, 0.039825439453125, 0.017425537109375, 0.038421630859375, -0.0036487579345703125, 0.01023101806640625, -0.038177490234375, -0.01222991943359375, 0.025665283203125, 0.046539306640625, 0.02032470703125, -0.0236663818359375, 0.029754638671875, 0.028533935546875, -0.061676025390625, -0.054931640625, -0.0170745849609375, -0.08282470703125, -0.01067352294921875, 0.086669921875, -0.017364501953125, -0.034454345703125, 0.00951385498046875, -0.0147247314453125, 0.033782958984375, -0.03558349609375, 0.033111572265625, 0.03509521484375, -0.007091522216796875, 0.004119873046875, -0.03009033203125, 0.0499267578125, 0.02001953125, -0.044952392578125, -0.0202178955078125, 0.01183319091796875, 0.043182373046875, 0.0296478271484375, 0.0305938720703125, 0.007427215576171875, 0.0090179443359375, 0.011688232421875, 0.034576416015625, 0.00681304931640625, -0.004856109619140625, -0.030029296875, -0.00821685791015625, -0.0025310516357421875, -0.014617919921875 ] ]
microsoft/deberta-v3-base
2022-09-22T12:34:19.000Z
[ "transformers", "pytorch", "tf", "rust", "deberta-v2", "deberta", "deberta-v3", "fill-mask", "en", "arxiv:2006.03654", "arxiv:2111.09543", "license:mit", "endpoints_compatible", "has_space", "region:us" ]
fill-mask
microsoft
null
null
microsoft/deberta-v3-base
124
249,697
transformers
2022-03-02T23:29:05
---
language: en
tags:
- deberta
- deberta-v3
- fill-mask
thumbnail: https://huggingface.co/front/thumbnails/microsoft.png
license: mit
---

## DeBERTaV3: Improving DeBERTa using ELECTRA-Style Pre-Training with Gradient-Disentangled Embedding Sharing

[DeBERTa](https://arxiv.org/abs/2006.03654) improves the BERT and RoBERTa models using disentangled attention and an enhanced mask decoder. With those two improvements, DeBERTa outperforms RoBERTa on a majority of NLU tasks with 80GB of training data.

In [DeBERTa V3](https://arxiv.org/abs/2111.09543), we further improved the efficiency of DeBERTa using ELECTRA-style pre-training with gradient-disentangled embedding sharing. Compared to DeBERTa, our V3 version significantly improves model performance on downstream tasks. You can find more technical details about the new model in our [paper](https://arxiv.org/abs/2111.09543).

Please check the [official repository](https://github.com/microsoft/DeBERTa) for more implementation details and updates.

The DeBERTa V3 base model comes with 12 layers and a hidden size of 768. It has only 86M backbone parameters, with a vocabulary containing 128K tokens which introduces 98M parameters in the Embedding layer. This model was trained using the same 160GB data as DeBERTa V2.

#### Fine-tuning on NLU tasks

We present the dev results on SQuAD 2.0 and MNLI tasks.

| Model | Vocabulary(K) | Backbone #Params(M) | SQuAD 2.0(F1/EM) | MNLI-m/mm(ACC) |
|-------------------|----------|-------------------|-----------|----------|
| RoBERTa-base | 50 | 86 | 83.7/80.5 | 87.6/- |
| XLNet-base | 32 | 92 | -/80.2 | 86.8/- |
| ELECTRA-base | 30 | 86 | -/80.5 | 88.8/ |
| DeBERTa-base | 50 | 100 | 86.2/83.1 | 88.8/88.5 |
| DeBERTa-v3-base | 128 | 86 | **88.4/85.4** | **90.6/90.7** |
| DeBERTa-v3-base + SiFT | 128 | 86 | -/- | 91.0/- |
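The parameter accounting above can be sanity-checked with a little arithmetic. This is a sketch assuming the round numbers quoted in the card (128K-token vocabulary, hidden size 768, 86M backbone parameters); the exact counts in the checkpoint may differ slightly:

```python
vocab_size = 128_000          # "a vocabulary containing 128K tokens"
hidden_size = 768             # DeBERTa V3 base hidden size
backbone_params = 86_000_000  # "only 86M backbone parameters"

# The embedding matrix holds one hidden_size-dimensional vector per vocabulary token
embedding_params = vocab_size * hidden_size

print(f"Embedding layer: {embedding_params / 1e6:.1f}M parameters")  # ≈ 98.3M, matching the card's 98M
print(f"Total: {(backbone_params + embedding_params) / 1e6:.1f}M parameters")
```

This is why the large 128K vocabulary adds more parameters than the entire 12-layer backbone: the total model is roughly 184M parameters even though only 86M sit in the transformer layers.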
#### Fine-tuning with HF transformers

```bash
#!/bin/bash
cd transformers/examples/pytorch/text-classification/
pip install datasets
export TASK_NAME=mnli

output_dir="ds_results"

num_gpus=8
batch_size=8  # per device; effective global batch size is 8 GPUs x 8 = 64 sequences per step

python -m torch.distributed.launch --nproc_per_node=${num_gpus} \
  run_glue.py \
  --model_name_or_path microsoft/deberta-v3-base \
  --task_name $TASK_NAME \
  --do_train \
  --do_eval \
  --evaluation_strategy steps \
  --max_seq_length 256 \
  --warmup_steps 500 \
  --per_device_train_batch_size ${batch_size} \
  --learning_rate 2e-5 \
  --num_train_epochs 3 \
  --output_dir $output_dir \
  --overwrite_output_dir \
  --logging_steps 1000 \
  --logging_dir $output_dir
```

### Citation

If you find DeBERTa useful for your work, please cite the following papers:

```latex
@misc{he2021debertav3,
    title={DeBERTaV3: Improving DeBERTa using ELECTRA-Style Pre-Training with Gradient-Disentangled Embedding Sharing},
    author={Pengcheng He and Jianfeng Gao and Weizhu Chen},
    year={2021},
    eprint={2111.09543},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```

```latex
@inproceedings{
    he2021deberta,
    title={DEBERTA: DECODING-ENHANCED BERT WITH DISENTANGLED ATTENTION},
    author={Pengcheng He and Xiaodong Liu and Jianfeng Gao and Weizhu Chen},
    booktitle={International Conference on Learning Representations},
    year={2021},
    url={https://openreview.net/forum?id=XPZIaotutsD}
}
```
3,472
[ [ -0.029815673828125, -0.049072265625, 0.0160980224609375, 0.028900146484375, -0.01727294921875, 0.012908935546875, -0.00943756103515625, -0.0369873046875, 0.0245819091796875, -0.0009694099426269531, -0.0285797119140625, -0.0380859375, -0.06683349609375, -0.005809783935546875, -0.0136260986328125, 0.06036376953125, -0.012908935546875, 0.0082550048828125, 0.003810882568359375, -0.01654052734375, -0.037017822265625, -0.041717529296875, -0.04876708984375, -0.03497314453125, 0.04144287109375, 0.0162811279296875, 0.033111572265625, 0.014007568359375, 0.03692626953125, 0.022705078125, -0.021728515625, 0.01690673828125, -0.03570556640625, -0.003284454345703125, 0.01413726806640625, -0.0269317626953125, -0.06011962890625, 0.0029468536376953125, 0.03692626953125, 0.0176239013671875, 0.0031299591064453125, 0.0150909423828125, 0.0204620361328125, 0.073486328125, -0.060211181640625, 0.008148193359375, -0.04669189453125, 0.0013093948364257812, 0.00433349609375, -0.00374603271484375, -0.028106689453125, -0.00824737548828125, 0.0224456787109375, -0.03369140625, 0.016326904296875, -0.025360107421875, 0.08917236328125, 0.039398193359375, -0.01119232177734375, -0.006290435791015625, -0.04632568359375, 0.07171630859375, -0.057037353515625, 0.022186279296875, 0.02032470703125, 0.0087890625, -0.0034999847412109375, -0.04638671875, -0.043548583984375, -0.00577545166015625, -0.002933502197265625, 0.02264404296875, -0.047088623046875, -0.00592041015625, 0.0225982666015625, 0.0205078125, -0.038726806640625, 0.021087646484375, -0.03155517578125, 0.004222869873046875, 0.052215576171875, 0.0024700164794921875, -0.0009293556213378906, 0.002773284912109375, -0.02978515625, -0.027587890625, -0.057861328125, 0.00217437744140625, 0.03790283203125, -0.00855255126953125, -0.007572174072265625, 0.0119781494140625, -0.01070404052734375, 0.06109619140625, 0.01023101806640625, 0.0285491943359375, 0.057098388671875, -0.006626129150390625, -0.0235748291015625, 0.01230621337890625, 0.054351806640625, 
0.0176239013671875, -0.01096343994140625, -0.0067291259765625, -0.0040283203125, -0.0007610321044921875, 0.0156707763671875, -0.07244873046875, -0.0419921875, 0.033111572265625, -0.0297393798828125, -0.0246429443359375, -0.006988525390625, -0.0595703125, -0.015625, -0.027191162109375, 0.04443359375, -0.042327880859375, -0.0234832763671875, 0.0198822021484375, -0.003292083740234375, 0.022003173828125, 0.037841796875, -0.08026123046875, 0.004627227783203125, 0.0250396728515625, 0.057464599609375, -0.006290435791015625, -0.0257720947265625, -0.0345458984375, -0.018768310546875, -0.00030803680419921875, 0.019439697265625, -0.0017347335815429688, 0.01039886474609375, -0.0162506103515625, 0.0008721351623535156, -0.017547607421875, -0.022216796875, 0.0166168212890625, -0.050933837890625, -0.00373077392578125, -0.01641845703125, -0.026458740234375, -0.03082275390625, 0.0171661376953125, -0.05218505859375, 0.07464599609375, 0.010528564453125, -0.059600830078125, 0.023040771484375, -0.047637939453125, -0.004009246826171875, -0.017333984375, 0.0065155029296875, -0.0433349609375, -0.012176513671875, 0.03338623046875, 0.047027587890625, -0.00704193115234375, 0.008697509765625, -0.0158233642578125, -0.0399169921875, 0.0194091796875, -0.0299224853515625, 0.08935546875, 0.021697998046875, -0.05694580078125, -0.0020351409912109375, -0.062225341796875, 0.0043487548828125, 0.01454925537109375, -0.015655517578125, -0.01812744140625, -0.0037441253662109375, -0.006160736083984375, 0.015655517578125, 0.039703369140625, -0.036407470703125, 0.019805908203125, -0.0272064208984375, 0.052398681640625, 0.052093505859375, -0.02374267578125, 0.03155517578125, -0.0172882080078125, 0.0212249755859375, 0.0214385986328125, 0.02191162109375, 0.01983642578125, -0.03778076171875, -0.0762939453125, -0.04742431640625, 0.04443359375, 0.03985595703125, -0.036865234375, 0.052490234375, -0.01464080810546875, -0.059844970703125, -0.0498046875, 0.01236724853515625, 0.0254058837890625, 0.0176544189453125, 
0.05438232421875, -0.0025997161865234375, -0.05596923828125, -0.06756591796875, 0.009674072265625, -0.00040340423583984375, 0.004459381103515625, 0.0014476776123046875, 0.053619384765625, -0.0298004150390625, 0.055145263671875, -0.044647216796875, -0.03204345703125, -0.021087646484375, 0.013275146484375, 0.0338134765625, 0.052398681640625, 0.06494140625, -0.044830322265625, -0.043731689453125, -0.02862548828125, -0.04986572265625, 0.026336669921875, 0.007495880126953125, -0.0202178955078125, 0.029052734375, 0.01320648193359375, -0.03485107421875, 0.030364990234375, 0.047882080078125, -0.01155853271484375, -0.00023698806762695312, -0.0244903564453125, 0.0074310302734375, -0.0836181640625, 0.00946044921875, -0.001956939697265625, -0.003787994384765625, -0.04095458984375, -0.0102386474609375, 0.0128936767578125, 0.02001953125, -0.03692626953125, 0.01070404052734375, -0.039642333984375, 0.0169525146484375, 0.00007051229476928711, 0.0251312255859375, 0.01128387451171875, 0.06695556640625, 0.0005536079406738281, 0.045562744140625, 0.048431396484375, -0.046142578125, 0.0172882080078125, 0.030914306640625, -0.0244903564453125, 0.005641937255859375, -0.07574462890625, 0.02105712890625, -0.01265716552734375, 0.02471923828125, -0.07208251953125, 0.01129913330078125, 0.02685546875, -0.04132080078125, 0.035186767578125, -0.0118865966796875, -0.041259765625, -0.0228424072265625, -0.042755126953125, 0.01348876953125, 0.062347412109375, -0.05462646484375, 0.0234832763671875, 0.0254669189453125, 0.0255279541015625, -0.06646728515625, -0.059844970703125, -0.019927978515625, -0.024078369140625, -0.04412841796875, 0.05767822265625, -0.00879669189453125, -0.0025730133056640625, -0.0018053054809570312, 0.007080078125, -0.0178375244140625, 0.022186279296875, 0.012847900390625, 0.031280517578125, 0.004119873046875, 0.0137939453125, 0.0065765380859375, 0.006710052490234375, -0.007663726806640625, -0.004375457763671875, 0.048583984375, -0.0309906005859375, -0.004772186279296875, 
-0.0306243896484375, 0.006275177001953125, 0.032623291015625, -0.03173828125, 0.07086181640625, 0.07281494140625, -0.018798828125, -0.00753021240234375, -0.0298614501953125, -0.01446533203125, -0.03472900390625, 0.024169921875, -0.01340484619140625, -0.04620361328125, 0.046142578125, 0.01904296875, 0.016693115234375, 0.0528564453125, 0.03466796875, -0.01540374755859375, 0.077880859375, 0.0305023193359375, -0.008697509765625, 0.05194091796875, -0.086181640625, 0.00836181640625, -0.091064453125, -0.022064208984375, -0.031219482421875, -0.05072021484375, -0.0394287109375, -0.023681640625, 0.0121612548828125, 0.044342041015625, -0.0191650390625, 0.051666259765625, -0.0706787109375, 0.006839752197265625, 0.042510986328125, 0.03656005859375, 0.005023956298828125, 0.01025390625, 0.019500732421875, -0.00499725341796875, -0.059295654296875, -0.0311126708984375, 0.089599609375, 0.035125732421875, 0.055145263671875, 0.026824951171875, 0.0743408203125, 0.00623321533203125, -0.007442474365234375, -0.0394287109375, 0.0307769775390625, -0.007312774658203125, -0.037109375, -0.0173187255859375, -0.0224151611328125, -0.09039306640625, 0.0266876220703125, -0.01285552978515625, -0.07330322265625, 0.0306243896484375, 0.026214599609375, -0.0308990478515625, 0.0245208740234375, -0.045562744140625, 0.051544189453125, -0.004062652587890625, -0.03143310546875, -0.02557373046875, -0.03729248046875, 0.011566162109375, 0.0082244873046875, -0.0295562744140625, -0.005889892578125, 0.0092315673828125, 0.07269287109375, -0.0228118896484375, 0.057159423828125, -0.0192413330078125, -0.01788330078125, 0.038543701171875, -0.017242431640625, 0.052276611328125, 0.02130126953125, -0.006977081298828125, 0.03961181640625, 0.0005340576171875, -0.0290679931640625, -0.04669189453125, 0.0721435546875, -0.081298828125, -0.0313720703125, -0.04327392578125, -0.0281524658203125, -0.0038604736328125, -0.004306793212890625, 0.0367431640625, 0.0452880859375, 0.005519866943359375, 0.035308837890625, 0.07415771484375, 
-0.005519866943359375, 0.044036865234375, 0.03863525390625, 0.0068817138671875, -0.0240631103515625, 0.0672607421875, 0.0180206298828125, -0.00004798173904418945, 0.037872314453125, -0.019805908203125, -0.0297698974609375, -0.0528564453125, -0.039520263671875, 0.0079345703125, -0.050262451171875, -0.024169921875, -0.07086181640625, -0.01611328125, -0.024749755859375, 0.0038089752197265625, -0.036102294921875, -0.048004150390625, -0.0595703125, 0.003582000732421875, 0.055511474609375, 0.041290283203125, -0.0031261444091796875, 0.01499176025390625, -0.06182861328125, -0.0015621185302734375, 0.01499176025390625, 0.007808685302734375, 0.0142669677734375, -0.0423583984375, -0.029937744140625, 0.0230712890625, -0.046905517578125, -0.06689453125, 0.0301361083984375, -0.00514984130859375, 0.049072265625, -0.0109100341796875, 0.00768280029296875, 0.0408935546875, -0.024932861328125, 0.062469482421875, 0.0157623291015625, -0.07000732421875, 0.0498046875, -0.011688232421875, 0.01087188720703125, 0.04669189453125, 0.03717041015625, 0.0186614990234375, -0.00711822509765625, -0.06243896484375, -0.07269287109375, 0.058135986328125, 0.038360595703125, -0.0011930465698242188, 0.00563812255859375, 0.00511932373046875, -0.01386260986328125, 0.01239013671875, -0.0426025390625, -0.03985595703125, -0.020782470703125, -0.0157928466796875, -0.01155853271484375, -0.012939453125, -0.006671905517578125, -0.0430908203125, 0.0682373046875, 0.0094757080078125, 0.045623779296875, 0.038665771484375, -0.024169921875, 0.02008056640625, 0.007610321044921875, 0.048675537109375, 0.05389404296875, -0.037689208984375, -0.005146026611328125, 0.02972412109375, -0.037445068359375, 0.0162200927734375, 0.026153564453125, -0.0050201416015625, 0.0192108154296875, 0.0203857421875, 0.07501220703125, -0.00499725341796875, -0.0269317626953125, 0.03558349609375, -0.00952911376953125, -0.03131103515625, -0.0308990478515625, 0.006412506103515625, -0.013824462890625, 0.034698486328125, 0.0259246826171875, 
0.01424407958984375, 0.01384735107421875, -0.01690673828125, 0.0013589859008789062, 0.02252197265625, -0.03033447265625, -0.0283966064453125, 0.04107666015625, 0.0233306884765625, 0.00385284423828125, 0.03948974609375, -0.0183258056640625, -0.040130615234375, 0.0565185546875, 0.029998779296875, 0.07000732421875, -0.001194000244140625, 0.0164947509765625, 0.050750732421875, 0.020965576171875, -0.0021953582763671875, 0.036102294921875, 0.0036468505859375, -0.03863525390625, -0.01953125, -0.035186767578125, -0.0097198486328125, 0.028228759765625, -0.053375244140625, 0.018035888671875, -0.0174713134765625, -0.015838623046875, -0.0016794204711914062, 0.0205841064453125, -0.07025146484375, -0.0008997917175292969, -0.010162353515625, 0.051239013671875, -0.03961181640625, 0.07037353515625, 0.05322265625, -0.042755126953125, -0.058929443359375, -0.01403045654296875, -0.03582763671875, -0.0419921875, 0.07464599609375, 0.01490020751953125, -0.01197052001953125, 0.0169525146484375, -0.02490234375, -0.068603515625, 0.1104736328125, 0.0296783447265625, -0.06219482421875, 0.0023822784423828125, -0.0212249755859375, 0.044647216796875, -0.0106964111328125, 0.0218048095703125, 0.02862548828125, 0.0185394287109375, -0.00807952880859375, -0.051116943359375, 0.01107025146484375, -0.0179443359375, 0.004039764404296875, 0.01837158203125, -0.063720703125, 0.08837890625, -0.01178741455078125, 0.004764556884765625, -0.0015850067138671875, 0.045654296875, 0.020751953125, 0.00875091552734375, 0.034027099609375, 0.056732177734375, 0.055419921875, -0.0126190185546875, 0.059814453125, -0.0253143310546875, 0.048248291015625, 0.07611083984375, 0.005413055419921875, 0.061798095703125, 0.035064697265625, -0.0261993408203125, 0.0487060546875, 0.045654296875, -0.010009765625, 0.056854248046875, 0.02459716796875, -0.0064544677734375, -0.0018625259399414062, 0.0244598388671875, -0.053375244140625, 0.0355224609375, 0.0101776123046875, -0.045135498046875, -0.00719451904296875, 0.0013828277587890625, 
0.00525665283203125, -0.021148681640625, -0.0079345703125, 0.03973388671875, -0.00807952880859375, -0.048248291015625, 0.08929443359375, -0.01384735107421875, 0.058258056640625, -0.036651611328125, -0.0169525146484375, -0.00469970703125, 0.037750244140625, -0.017364501953125, -0.040985107421875, 0.005611419677734375, 0.00021779537200927734, -0.0154571533203125, 0.0009326934814453125, 0.031341552734375, -0.02532958984375, -0.0228271484375, 0.0341796875, 0.019500732421875, 0.0081787109375, -0.0011310577392578125, -0.06085205078125, 0.025146484375, 0.006389617919921875, -0.030914306640625, 0.0268402099609375, 0.0086669921875, 0.0245513916015625, 0.029327392578125, 0.02935791015625, -0.020111083984375, 0.011016845703125, -0.00804901123046875, 0.07781982421875, -0.01983642578125, -0.01560211181640625, -0.07427978515625, 0.04364013671875, -0.0163116455078125, -0.03570556640625, 0.064697265625, 0.025390625, 0.055511474609375, -0.01378631591796875, 0.038818359375, -0.0226593017578125, 0.0033054351806640625, -0.0477294921875, 0.049102783203125, -0.054412841796875, 0.01727294921875, -0.03375244140625, -0.0753173828125, -0.004711151123046875, 0.0545654296875, -0.0109100341796875, 0.0009675025939941406, 0.039520263671875, 0.0517578125, 0.00042557716369628906, -0.01464080810546875, 0.004634857177734375, 0.0025730133056640625, 0.03204345703125, 0.06854248046875, 0.05255126953125, -0.076171875, 0.042633056640625, -0.03363037109375, -0.0162200927734375, -0.02166748046875, -0.05584716796875, -0.08447265625, -0.050750732421875, -0.04217529296875, -0.033294677734375, 0.0020008087158203125, 0.058624267578125, 0.053680419921875, -0.0491943359375, 0.0185394287109375, -0.03179931640625, -0.001216888427734375, -0.03619384765625, -0.013031005859375, 0.042449951171875, -0.02362060546875, -0.08099365234375, 0.0268096923828125, -0.017730712890625, 0.0182647705078125, -0.0275115966796875, -0.0223236083984375, -0.03692626953125, -0.00899505615234375, 0.0399169921875, -0.0008711814880371094, 
-0.0396728515625, -0.0034046173095703125, 0.004055023193359375, -0.011627197265625, 0.0012807846069335938, 0.0201416015625, -0.058685302734375, 0.0207977294921875, 0.05609130859375, 0.0286865234375, 0.058349609375, -0.0298614501953125, 0.0225982666015625, -0.05010986328125, 0.02838134765625, 0.0197296142578125, 0.035675048828125, 0.01279449462890625, -0.0347900390625, 0.049346923828125, -0.01007080078125, -0.04840087890625, -0.0621337890625, -0.0007786750793457031, -0.0814208984375, -0.0115814208984375, 0.07147216796875, -0.038177490234375, -0.00948333740234375, 0.01165008544921875, -0.018829345703125, 0.0282745361328125, -0.03912353515625, 0.055389404296875, 0.038818359375, 0.0015411376953125, 0.0074920654296875, -0.036102294921875, 0.04510498046875, 0.0438232421875, -0.042877197265625, -0.013031005859375, 0.0168304443359375, 0.01343536376953125, 0.03033447265625, 0.045135498046875, -0.0027523040771484375, 0.013885498046875, -0.00911712646484375, 0.000335693359375, -0.0254364013671875, -0.033294677734375, -0.0316162109375, -0.01419830322265625, -0.00716400146484375, -0.054412841796875 ] ]
BAAI/llm-embedder
2023-11-03T06:29:15.000Z
[ "transformers", "pytorch", "bert", "feature-extraction", "arxiv:2310.07554", "arxiv:2309.07597", "license:mit", "endpoints_compatible", "region:us" ]
feature-extraction
BAAI
null
null
BAAI/llm-embedder
43
247,724
transformers
2023-10-09T09:46:10
---
license: mit
---

<h1 align="center">FlagEmbedding</h1>

<h4 align="center">
    <p>
        <a href=#model-list>Model List</a> |
        <a href=#frequently-asked-questions>FAQ</a> |
        <a href=#usage>Usage</a> |
        <a href="#evaluation">Evaluation</a> |
        <a href="#train">Train</a> |
        <a href="#contact">Contact</a> |
        <a href="#citation">Citation</a> |
        <a href="#license">License</a>
    <p>
</h4>

For more details, please refer to our GitHub repo: [FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding).

[English](README.md) | [中文](https://github.com/FlagOpen/FlagEmbedding/blob/master/README_zh.md)

<span style="color: #FF69B4;"> **Hiring:** We're seeking experienced NLP researchers and intern students focusing on dense retrieval and retrieval-augmented LLMs. If you're interested, please feel free to reach out to us via email at zhengliu1026@gmail.com.</span>

FlagEmbedding can map any text to a low-dimensional dense vector, which can be used for tasks like retrieval, classification, clustering, and semantic search. It can also be used in vector databases for LLMs.

************* 🌟**Updates**🌟 *************
- 10/12/2023: Release [LLM-Embedder](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/llm_embedder), a unified embedding model to support diverse retrieval augmentation needs for LLMs. [Paper](https://arxiv.org/pdf/2310.07554.pdf) :fire:
- 09/15/2023: The [technical report](https://arxiv.org/pdf/2309.07597.pdf) of BGE has been released
- 09/15/2023: The [massive training data](https://data.baai.ac.cn/details/BAAI-MTP) of BGE has been released
- 09/12/2023: New models:
    - **New reranker models**: release the cross-encoder models `BAAI/bge-reranker-base` and `BAAI/bge-reranker-large`, which are more powerful than the embedding models. We recommend using/fine-tuning them to re-rank the top-k documents returned by embedding models.
    - **Updated embedding models**: release the `bge-*-v1.5` embedding models to alleviate the issue of the similarity distribution and to enhance retrieval ability without instruction.
<details>
<summary>More</summary>
<!-- ### More -->

- 09/07/2023: Update [fine-tune code](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md): add a script to mine hard negatives and support adding an instruction during fine-tuning.
- 08/09/2023: BGE models are integrated into **Langchain**; you can use them like [this](#using-langchain). The C-MTEB **leaderboard** is [available](https://huggingface.co/spaces/mteb/leaderboard).
- 08/05/2023: Release base-scale and small-scale models, **best performance among models of the same size 🤗**
- 08/02/2023: Release the `bge-large-*` (short for BAAI General Embedding) models, **rank 1st on the MTEB and C-MTEB benchmarks!** :tada: :tada:
- 08/01/2023: We release the [Chinese Massive Text Embedding Benchmark](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB) (**C-MTEB**), consisting of 31 test datasets.

</details>

## Model List

`bge` is short for `BAAI general embedding`.

| Model | Language | | Description | query instruction for retrieval [1] |
|:-------------------------------|:--------:| :--------:| :--------:|:--------:|
| [BAAI/llm-embedder](https://huggingface.co/BAAI/llm-embedder) | English | [Inference](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/llm_embedder) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/llm_embedder) | a unified embedding model to support diverse retrieval augmentation needs for LLMs | See [README](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/llm_embedder) |
| [BAAI/bge-reranker-large](https://huggingface.co/BAAI/bge-reranker-large) | Chinese and English | [Inference](#usage-for-reranker) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker) | a cross-encoder model which is more accurate but less efficient [2] | |
| [BAAI/bge-reranker-base](https://huggingface.co/BAAI/bge-reranker-base) | Chinese and English | [Inference](#usage-for-reranker) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker) | a cross-encoder model which is more accurate but less efficient [2] | |
| [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with a more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with a more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with a more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-large-zh-v1.5](https://huggingface.co/BAAI/bge-large-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with a more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-base-zh-v1.5](https://huggingface.co/BAAI/bge-base-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with a more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-small-zh-v1.5](https://huggingface.co/BAAI/bge-small-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with a more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-large-en](https://huggingface.co/BAAI/bge-large-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | :trophy: rank **1st** in [MTEB](https://huggingface.co/spaces/mteb/leaderboard) leaderboard | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-base-en](https://huggingface.co/BAAI/bge-base-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a base-scale model but with similar ability to `bge-large-en` | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-small-en](https://huggingface.co/BAAI/bge-small-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a small-scale model but with competitive performance | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-large-zh](https://huggingface.co/BAAI/bge-large-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | :trophy: rank **1st** in [C-MTEB](https://github.com/FlagOpen/FlagEmbedding/tree/master/C_MTEB) benchmark | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-base-zh](https://huggingface.co/BAAI/bge-base-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a base-scale model but with similar ability to `bge-large-zh` | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-small-zh](https://huggingface.co/BAAI/bge-small-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a small-scale model but with competitive performance | `为这个句子生成表示以用于检索相关文章:` |

[1\]: If you need to use a query to search for relevant passages, we suggest adding the instruction to the query; in other cases, no instruction is needed: just use the original query directly. In all cases, **no instruction** needs to be added to passages.

[2\]: Different from an embedding model, a reranker takes a question and a document as input and directly outputs a similarity score instead of an embedding. To balance accuracy and time cost, a cross-encoder is widely used to re-rank the top-k documents retrieved by simpler models. For example, use a bge embedding model to retrieve the top 100 relevant documents, then use the bge reranker to re-rank those 100 documents and obtain the final top-3 results.

All models have been uploaded to the Huggingface Hub, and you can see them at https://huggingface.co/BAAI. If you cannot open the Huggingface Hub, you can also download the models at https://model.baai.ac.cn/models .

## Frequently asked questions

**1. How to fine-tune the bge embedding model?**

Follow this [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) to prepare data and fine-tune your model.
Some suggestions:
- Mine hard negatives following this [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune#hard-negatives), which can improve retrieval performance.
- In general, a larger `per_device_train_batch_size` brings better performance. You can expand it by enabling `--fp16`, `--deepspeed df_config.json` (df_config.json can refer to [ds_config.json](https://github.com/FlagOpen/FlagEmbedding/blob/master/examples/finetune/ds_config.json)), `--gradient_checkpointing`, etc.
- If you pre-train bge on your own data, the pre-trained model cannot be used directly to calculate similarity; it must be fine-tuned with contrastive learning before computing similarity.
- If the accuracy of the fine-tuned model is still not high enough, it is recommended to use/fine-tune the cross-encoder model (bge-reranker) to re-rank the top-k results.
Hard negatives are also needed to fine-tune the reranker.

<details>
<summary>2. The similarity score between two dissimilar sentences is higher than 0.5</summary>
<!-- ### The similarity score between two dissimilar sentences is higher than 0.5 -->

**We suggest using bge v1.5, which alleviates the issue of the similarity distribution.**

Since we fine-tune the models by contrastive learning with a temperature of 0.01, the similarity distribution of the current BGE models lies roughly in the interval \[0.6, 1\]. So a similarity score greater than 0.5 does not indicate that the two sentences are similar.

For downstream tasks such as passage retrieval or semantic similarity, **what matters is the relative order of the scores, not the absolute values.** If you need to filter similar sentences based on a similarity threshold, please select an appropriate threshold based on the similarity distribution on your own data (such as 0.8, 0.85, or even 0.9).

</details>

<details>
<summary>3. When does the query instruction need to be used</summary>
<!-- ### When does the query instruction need to be used -->

For `bge-*-v1.5`, we improved its retrieval ability when no instruction is used; omitting the instruction causes only a slight degradation in retrieval performance compared with using it. So, for convenience, you can generate embeddings without an instruction in all cases.

For a retrieval task that uses short queries to find long related documents, it is recommended to add instructions to these short queries. **The best way to decide whether to add instructions to queries is to choose the setting that achieves better performance on your task.** In all cases, the documents/passages do not need the instruction.
</details>

## Usage

### Usage for Embedding Model

Here are some examples of using the `bge` models with
[FlagEmbedding](#using-flagembedding), [Sentence-Transformers](#using-sentence-transformers), [Langchain](#using-langchain), or [Huggingface Transformers](#using-huggingface-transformers).

#### Using FlagEmbedding
```
pip install -U FlagEmbedding
```
If this doesn't work for you, see [FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md) for more installation methods.

```python
from FlagEmbedding import FlagModel

sentences_1 = ["样例数据-1", "样例数据-2"]
sentences_2 = ["样例数据-3", "样例数据-4"]

model = FlagModel('BAAI/bge-large-zh-v1.5',
                  query_instruction_for_retrieval="为这个句子生成表示以用于检索相关文章:",
                  use_fp16=True)  # setting use_fp16 to True speeds up computation with a slight performance degradation

embeddings_1 = model.encode(sentences_1)
embeddings_2 = model.encode(sentences_2)
similarity = embeddings_1 @ embeddings_2.T
print(similarity)

# For an s2p (short query to long passage) retrieval task, use encode_queries(),
# which automatically adds the instruction to each query.
# The corpus in a retrieval task can still use encode() or encode_corpus(),
# since passages don't need the instruction.
queries = ['query_1', 'query_2']
passages = ["样例文档-1", "样例文档-2"]
q_embeddings = model.encode_queries(queries)
p_embeddings = model.encode(passages)
scores = q_embeddings @ p_embeddings.T
```
For the value of the argument `query_instruction_for_retrieval`, see the [Model List](https://github.com/FlagOpen/FlagEmbedding/tree/master#model-list).

By default, FlagModel uses all available GPUs when encoding. Set `os.environ["CUDA_VISIBLE_DEVICES"]` to select specific GPUs. You can also set `os.environ["CUDA_VISIBLE_DEVICES"]=""` to make all GPUs unavailable.
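The similarity matrices above are plain dot products. Because the embeddings are L2-normalized before comparison, a dot product is exactly cosine similarity; a toy pure-Python sketch (with made-up vectors standing in for model outputs, not real embeddings) illustrates the equivalence:

```python
import math

def l2_normalize(v):
    """Scale a vector to unit length."""
    norm = math.sqrt(sum(x * x for x in v))
    return [x / norm for x in v]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cosine(a, b):
    return dot(a, b) / (math.sqrt(dot(a, a)) * math.sqrt(dot(b, b)))

# Toy "embeddings" standing in for model outputs.
e1, e2 = [0.3, -1.2, 0.5], [0.8, 0.1, -0.4]
n1, n2 = l2_normalize(e1), l2_normalize(e2)

# After normalization, the dot product equals the cosine similarity of the originals.
assert abs(dot(n1, n2) - cosine(e1, e2)) < 1e-12
```

This is why the examples can score query/passage pairs with `q_embeddings @ p_embeddings.T` instead of an explicit cosine computation.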
#### Using Sentence-Transformers

You can also use the `bge` models with [sentence-transformers](https://www.SBERT.net):
```
pip install -U sentence-transformers
```
```python
from sentence_transformers import SentenceTransformer

sentences_1 = ["样例数据-1", "样例数据-2"]
sentences_2 = ["样例数据-3", "样例数据-4"]

model = SentenceTransformer('BAAI/bge-large-zh-v1.5')
embeddings_1 = model.encode(sentences_1, normalize_embeddings=True)
embeddings_2 = model.encode(sentences_2, normalize_embeddings=True)
similarity = embeddings_1 @ embeddings_2.T
print(similarity)
```
For an s2p (short query to long passage) retrieval task, each short query should start with an instruction (see the [Model List](https://github.com/FlagOpen/FlagEmbedding/tree/master#model-list) for the instructions). The instruction is not needed for passages.
```python
from sentence_transformers import SentenceTransformer

queries = ['query_1', 'query_2']
passages = ["样例文档-1", "样例文档-2"]
instruction = "为这个句子生成表示以用于检索相关文章:"

model = SentenceTransformer('BAAI/bge-large-zh-v1.5')
q_embeddings = model.encode([instruction + q for q in queries], normalize_embeddings=True)
p_embeddings = model.encode(passages, normalize_embeddings=True)
scores = q_embeddings @ p_embeddings.T
```

#### Using Langchain

You can use `bge` in langchain like this:
```python
from langchain.embeddings import HuggingFaceBgeEmbeddings

model_name = "BAAI/bge-large-en-v1.5"
model_kwargs = {'device': 'cuda'}
encode_kwargs = {'normalize_embeddings': True}  # set True to compute cosine similarity
model = HuggingFaceBgeEmbeddings(
    model_name=model_name,
    model_kwargs=model_kwargs,
    encode_kwargs=encode_kwargs,
    query_instruction="为这个句子生成表示以用于检索相关文章:"
)
model.query_instruction = "为这个句子生成表示以用于检索相关文章:"
```

#### Using HuggingFace Transformers

With the transformers package, you can use the model like this: first, pass your input through the transformer model; then, select the last hidden state of the first token (i.e., [CLS]) as the sentence embedding.
```python
from transformers import AutoTokenizer, AutoModel
import torch

# Sentences we want sentence embeddings for
sentences = ["样例数据-1", "样例数据-2"]

# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-large-zh-v1.5')
model = AutoModel.from_pretrained('BAAI/bge-large-zh-v1.5')
model.eval()

# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# For an s2p (short query to long passage) retrieval task,
# add an instruction to each query (but not to the passages):
# encoded_input = tokenizer([instruction + q for q in queries], padding=True, truncation=True, return_tensors='pt')

# Compute token embeddings
with torch.no_grad():
    model_output = model(**encoded_input)
    # Perform pooling. In this case, cls pooling.
    sentence_embeddings = model_output[0][:, 0]
# Normalize embeddings
sentence_embeddings = torch.nn.functional.normalize(sentence_embeddings, p=2, dim=1)
print("Sentence embeddings:", sentence_embeddings)
```

### Usage for Reranker

Different from an embedding model, a reranker takes a question and a document as input and directly outputs a similarity score instead of an embedding. You can get a relevance score by feeding a query and a passage to the reranker. The reranker is optimized based on cross-entropy loss, so the relevance score is not bounded to a specific range.
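Because the score is an unbounded logit, a common optional post-processing step, not part of the library itself, is to map it through a sigmoid when you want a value in (0, 1) for thresholding. A minimal sketch with hypothetical logit values (illustrative only, not real model output):

```python
import math

def sigmoid(x):
    """Map an unbounded logit to the (0, 1) range."""
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical raw reranker logits (illustrative values, not model output).
raw_scores = [-5.6, 0.0, 4.2]
probs = [sigmoid(s) for s in raw_scores]

assert all(0.0 < p < 1.0 for p in probs)
assert probs == sorted(probs)  # sigmoid is monotonic, so the ranking is preserved
```

Since the sigmoid is monotonic, this transformation never changes the relative order of documents; it only rescales the scores.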
#### Using FlagEmbedding
```
pip install -U FlagEmbedding
```

Get relevance scores (higher scores indicate more relevance):
```python
from FlagEmbedding import FlagReranker

reranker = FlagReranker('BAAI/bge-reranker-large', use_fp16=True)  # setting use_fp16 to True speeds up computation with a slight performance degradation

score = reranker.compute_score(['query', 'passage'])
print(score)

scores = reranker.compute_score([['what is panda?', 'hi'], ['what is panda?', 'The giant panda (Ailuropoda melanoleuca), sometimes called a panda bear or simply panda, is a bear species endemic to China.']])
print(scores)
```

#### Using Huggingface transformers
```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-reranker-large')
model = AutoModelForSequenceClassification.from_pretrained('BAAI/bge-reranker-large')
model.eval()

pairs = [['what is panda?', 'hi'], ['what is panda?', 'The giant panda (Ailuropoda melanoleuca), sometimes called a panda bear or simply panda, is a bear species endemic to China.']]
with torch.no_grad():
    inputs = tokenizer(pairs, padding=True, truncation=True, return_tensors='pt', max_length=512)
    scores = model(**inputs, return_dict=True).logits.view(-1, ).float()
    print(scores)
```

## Evaluation

The `baai-general-embedding` models achieve **state-of-the-art performance on both the MTEB and C-MTEB leaderboards!**
For more details and evaluation tools, see our [scripts](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/README.md).
- **MTEB**:

| Model Name | Dimension | Sequence Length | Average (56) | Retrieval (15) | Clustering (11) | Pair Classification (3) | Reranking (4) | STS (10) | Summarization (1) | Classification (12) |
|:----:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | 1024 | 512 | **64.23** | **54.29** | 46.08 | 87.12 | 60.03 | 83.11 | 31.61 | 75.97 |
| [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | 768 | 512 | 63.55 | 53.25 | 45.77 | 86.55 | 58.86 | 82.4 | 31.07 | 75.53 |
| [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) | 384 | 512 | 62.17 | 51.68 | 43.82 | 84.92 | 58.36 | 81.59 | 30.12 | 74.14 |
| [bge-large-en](https://huggingface.co/BAAI/bge-large-en) | 1024 | 512 | 63.98 | 53.9 | 46.98 | 85.8 | 59.48 | 81.56 | 32.06 | 76.21 |
| [bge-base-en](https://huggingface.co/BAAI/bge-base-en) | 768 | 512 | 63.36 | 53.0 | 46.32 | 85.86 | 58.7 | 81.84 | 29.27 | 75.27 |
| [gte-large](https://huggingface.co/thenlper/gte-large) | 1024 | 512 | 63.13 | 52.22 | 46.84 | 85.00 | 59.13 | 83.35 | 31.66 | 73.33 |
| [gte-base](https://huggingface.co/thenlper/gte-base) | 768 | 512 | 62.39 | 51.14 | 46.2 | 84.57 | 58.61 | 82.3 | 31.17 | 73.01 |
| [e5-large-v2](https://huggingface.co/intfloat/e5-large-v2) | 1024 | 512 | 62.25 | 50.56 | 44.49 | 86.03 | 56.61 | 82.05 | 30.19 | 75.24 |
| [bge-small-en](https://huggingface.co/BAAI/bge-small-en) | 384 | 512 | 62.11 | 51.82 | 44.31 | 83.78 | 57.97 | 80.72 | 30.53 | 74.37 |
| [instructor-xl](https://huggingface.co/hkunlp/instructor-xl) | 768 | 512 | 61.79 | 49.26 | 44.74 | 86.62 | 57.29 | 83.06 | 32.32 | 61.79 |
| [e5-base-v2](https://huggingface.co/intfloat/e5-base-v2) | 768 | 512 | 61.5 | 50.29 | 43.80 | 85.73 | 55.91 | 81.05 | 30.28 | 73.84 |
| [gte-small](https://huggingface.co/thenlper/gte-small) | 384 | 512 | 61.36 | 49.46 | 44.89 | 83.54 | 57.7 | 82.07 | 30.42 | 72.31 |
| [text-embedding-ada-002](https://platform.openai.com/docs/guides/embeddings) | 1536 | 8192 | 60.99 | 49.25 | 45.9 | 84.89 | 56.32 | 80.97 | 30.8 | 70.93 |
| [e5-small-v2](https://huggingface.co/intfloat/e5-small-v2) | 384 | 512 | 59.93 | 49.04 | 39.92 | 84.67 | 54.32 | 80.39 | 31.16 | 72.94 |
| [sentence-t5-xxl](https://huggingface.co/sentence-transformers/sentence-t5-xxl) | 768 | 512 | 59.51 | 42.24 | 43.72 | 85.06 | 56.42 | 82.63 | 30.08 | 73.42 |
| [all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2) | 768 | 514 | 57.78 | 43.81 | 43.69 | 83.04 | 59.36 | 80.28 | 27.49 | 65.07 |
| [sgpt-bloom-7b1-msmarco](https://huggingface.co/bigscience/sgpt-bloom-7b1-msmarco) | 4096 | 2048 | 57.59 | 48.22 | 38.93 | 81.9 | 55.65 | 77.74 | 33.6 | 66.19 |

- **C-MTEB**:

We created the benchmark C-MTEB for Chinese text embedding, which consists of 31 datasets from 6 tasks. Please refer to [C_MTEB](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/README.md) for a detailed introduction.
| Model | Embedding dimension | Avg | Retrieval | STS | PairClassification | Classification | Reranking | Clustering |
|:-------------------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|
| [**BAAI/bge-large-zh-v1.5**](https://huggingface.co/BAAI/bge-large-zh-v1.5) | 1024 | **64.53** | 70.46 | 56.25 | 81.6 | 69.13 | 65.84 | 48.99 |
| [BAAI/bge-base-zh-v1.5](https://huggingface.co/BAAI/bge-base-zh-v1.5) | 768 | 63.13 | 69.49 | 53.72 | 79.75 | 68.07 | 65.39 | 47.53 |
| [BAAI/bge-small-zh-v1.5](https://huggingface.co/BAAI/bge-small-zh-v1.5) | 512 | 57.82 | 61.77 | 49.11 | 70.41 | 63.96 | 60.92 | 44.18 |
| [BAAI/bge-large-zh](https://huggingface.co/BAAI/bge-large-zh) | 1024 | 64.20 | 71.53 | 54.98 | 78.94 | 68.32 | 65.11 | 48.39 |
| [bge-large-zh-noinstruct](https://huggingface.co/BAAI/bge-large-zh-noinstruct) | 1024 | 63.53 | 70.55 | 53 | 76.77 | 68.58 | 64.91 | 50.01 |
| [BAAI/bge-base-zh](https://huggingface.co/BAAI/bge-base-zh) | 768 | 62.96 | 69.53 | 54.12 | 77.5 | 67.07 | 64.91 | 47.63 |
| [multilingual-e5-large](https://huggingface.co/intfloat/multilingual-e5-large) | 1024 | 58.79 | 63.66 | 48.44 | 69.89 | 67.34 | 56.00 | 48.23 |
| [BAAI/bge-small-zh](https://huggingface.co/BAAI/bge-small-zh) | 512 | 58.27 | 63.07 | 49.45 | 70.35 | 63.64 | 61.48 | 45.09 |
| [m3e-base](https://huggingface.co/moka-ai/m3e-base) | 768 | 57.10 | 56.91 | 50.47 | 63.99 | 67.52 | 59.34 | 47.68 |
| [m3e-large](https://huggingface.co/moka-ai/m3e-large) | 1024 | 57.05 | 54.75 | 50.42 | 64.3 | 68.2 | 59.66 | 48.88 |
| [multilingual-e5-base](https://huggingface.co/intfloat/multilingual-e5-base) | 768 | 55.48 | 61.63 | 46.49 | 67.07 | 65.35 | 54.35 | 40.68 |
| [multilingual-e5-small](https://huggingface.co/intfloat/multilingual-e5-small) | 384 | 55.38 | 59.95 | 45.27 | 66.45 | 65.85 | 53.86 | 45.26 |
| [text-embedding-ada-002(OpenAI)](https://platform.openai.com/docs/guides/embeddings/what-are-embeddings) | 1536 | 53.02 | 52.0 | 43.35 | 69.56 | 64.31 | 54.28 | 45.68 |
| [luotuo](https://huggingface.co/silk-road/luotuo-bert-medium) | 1024 | 49.37 | 44.4 | 42.78 | 66.62 | 61 | 49.25 | 44.39 |
| [text2vec-base](https://huggingface.co/shibing624/text2vec-base-chinese) | 768 | 47.63 | 38.79 | 43.41 | 67.41 | 62.19 | 49.45 | 37.66 |
| [text2vec-large](https://huggingface.co/GanymedeNil/text2vec-large-chinese) | 1024 | 47.36 | 41.94 | 44.97 | 70.86 | 60.66 | 49.16 | 30.02 |

- **Reranking**:

See [C_MTEB](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/) for the evaluation script.

| Model | T2Reranking | T2RerankingZh2En\* | T2RerankingEn2Zh\* | MMarcoReranking | CMedQAv1 | CMedQAv2 | Avg |
|:-------------------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|
| text2vec-base-multilingual | 64.66 | 62.94 | 62.51 | 14.37 | 48.46 | 48.6 | 50.26 |
| multilingual-e5-small | 65.62 | 60.94 | 56.41 | 29.91 | 67.26 | 66.54 | 57.78 |
| multilingual-e5-large | 64.55 | 61.61 | 54.28 | 28.6 | 67.42 | 67.92 | 57.4 |
| multilingual-e5-base | 64.21 | 62.13 | 54.68 | 29.5 | 66.23 | 66.98 | 57.29 |
| m3e-base | 66.03 | 62.74 | 56.07 | 17.51 | 77.05 | 76.76 | 59.36 |
| m3e-large | 66.13 | 62.72 | 56.1 | 16.46 | 77.76 | 78.27 | 59.57 |
| bge-base-zh-v1.5 | 66.49 | 63.25 | 57.02 | 29.74 | 80.47 | 84.88 | 63.64 |
| bge-large-zh-v1.5 | 65.74 | 63.39 | 57.03 | 28.74 | 83.45 | 85.44 | 63.97 |
| [BAAI/bge-reranker-base](https://huggingface.co/BAAI/bge-reranker-base) | 67.28 | 63.95 | 60.45 | 35.46 | 81.26 | 84.1 | 65.42 |
| [BAAI/bge-reranker-large](https://huggingface.co/BAAI/bge-reranker-large) | 67.6 | 64.03 | 61.44 | 37.16 | 82.15 | 84.18 | 66.09 |

\* : T2RerankingZh2En and T2RerankingEn2Zh are cross-language retrieval tasks

## Train

### BAAI Embedding

We pre-train the models using [retromae](https://github.com/staoxiao/RetroMAE) and train them on large-scale pair data using contrastive learning.
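To make the contrastive objective concrete, here is a hedged sketch of an in-batch-negatives InfoNCE loss, the standard loss for this style of pair training. It is a NumPy illustration, not the project's actual training code (which lives in the FlagEmbedding repo), and the temperature value is an assumed hyperparameter:

```python
import numpy as np

def info_nce_loss(queries, passages, temperature=0.05):
    """In-batch-negatives contrastive loss over (query, passage) pairs.

    Row i of `queries` is paired with row i of `passages`; every other
    passage in the batch serves as a negative for query i.
    """
    q = queries / np.linalg.norm(queries, axis=1, keepdims=True)
    p = passages / np.linalg.norm(passages, axis=1, keepdims=True)
    logits = (q @ p.T) / temperature             # (B, B) cosine similarities
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))          # cross-entropy on the diagonal

rng = np.random.default_rng(0)
q = rng.normal(size=(4, 8))
p = q + 0.01 * rng.normal(size=(4, 8))  # near-identical positives -> low loss
print(info_nce_loss(q, p))
```

Pushing each diagonal (positive) similarity above its row's off-diagonal (negative) similarities is what makes the learned embeddings useful for retrieval.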
**You can fine-tune the embedding model on your data following our [examples](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune).**
We also provide a [pre-train example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/pretrain).
Note that the goal of pre-training is to reconstruct the text; the pre-trained model cannot be used for similarity calculation directly and needs to be fine-tuned first.
For more training details for bge, see [baai_general_embedding](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md).

### BGE Reranker

A cross-encoder performs full attention over the input pair, which makes it more accurate than an embedding model (i.e., a bi-encoder) but also more time-consuming. It is therefore best used to re-rank the top-k documents returned by an embedding model.
We train the cross-encoder on multilingual pair data. The data format is the same as for the embedding model, so you can fine-tune it easily following our [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker).
For more details, please refer to [./FlagEmbedding/reranker/README.md](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/reranker).

### Our Contributors

<a href="https://github.com/FlagOpen/FlagEmbedding/graphs/contributors">
  <img src="https://contrib.rocks/image?repo=FlagOpen/FlagEmbedding" />
</a>

## Contact

If you have any questions or suggestions related to this project, feel free to open an issue or pull request. You can also email Shitao Xiao (stxiao@baai.ac.cn) and Zheng Liu (liuzheng@baai.ac.cn).
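The retrieve-then-rerank pattern described in the BGE Reranker section — cheap bi-encoder recall over the whole corpus, then an accurate but slower cross-encoder pass over only the top-k candidates — can be sketched generically. This toy version uses stand-in scoring functions in place of FlagModel/FlagReranker (letter-count embeddings and word overlap), purely to show the control flow:

```python
import numpy as np

def retrieve_then_rerank(query, docs, embed, rerank_score, k=10):
    """Stage 1: rank all docs by embedding similarity and keep the top k.
    Stage 2: re-score only those k (query, doc) pairs with the reranker."""
    q_vec = embed([query])[0]
    d_vecs = embed(docs)
    sims = d_vecs @ q_vec                       # assumes L2-normalized embeddings
    top_k = np.argsort(-sims)[:k]
    pairs = [(query, docs[i]) for i in top_k]
    scores = np.asarray(rerank_score(pairs))
    return [docs[top_k[i]] for i in np.argsort(-scores)]

def toy_embed(texts):
    # Stand-in for an embedding model: normalized letter-count vectors.
    vecs = np.array([[t.lower().count(c) for c in "abcdefghijklmnopqrstuvwxyz"]
                     for t in texts], dtype=float)
    norms = np.linalg.norm(vecs, axis=1, keepdims=True)
    return vecs / np.where(norms == 0.0, 1.0, norms)

def toy_rerank(pairs):
    # Stand-in for a cross-encoder: shared-word count.
    return [len(set(q.split()) & set(d.split())) for q, d in pairs]

docs = [
    "the giant panda lives in china",
    "paris is the capital of france",
    "bamboo forests are shrinking",
]
ranked = retrieve_then_rerank("giant panda", docs, toy_embed, toy_rerank, k=3)
print(ranked[0])  # the panda document wins the rerank stage
```

In a real deployment the two scorers would be an embedding model such as bge-large and the bge-reranker cross-encoder; only k pairs ever reach the expensive model.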
## Citation

If you find this repository useful, please consider giving it a star :star: and a citation:

```
@misc{bge_embedding,
  title={C-Pack: Packaged Resources To Advance General Chinese Embedding},
  author={Shitao Xiao and Zheng Liu and Peitian Zhang and Niklas Muennighoff},
  year={2023},
  eprint={2309.07597},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}

@misc{llm_embedder,
  title={Retrieve Anything To Augment Large Language Models},
  author={Peitian Zhang and Shitao Xiao and Zheng Liu and Zhicheng Dou and Jian-Yun Nie},
  year={2023},
  eprint={2310.07554},
  archivePrefix={arXiv},
  primaryClass={cs.IR}
}
```

## License

FlagEmbedding is licensed under the [MIT License](https://github.com/FlagOpen/FlagEmbedding/blob/master/LICENSE). The released models can be used for commercial purposes free of charge.
modelId: rizvandwiki/gender-classification-2
lastModified: 2023-05-18T11:17:43.000Z
tags: [ "transformers", "pytorch", "tensorboard", "safetensors", "vit", "image-classification", "huggingpics", "model-index", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
pipeline_tag: image-classification
author: rizvandwiki
config: null
securityStatus: null
id: rizvandwiki/gender-classification-2
likes: 15
downloads: 245,474
library_name: transformers
created: 2022-12-12T03:13:20
---
tags:
- image-classification
- pytorch
- huggingpics
metrics:
- accuracy
model-index:
- name: gender-classification-2
  results:
  - task:
      name: Image Classification
      type: image-classification
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.9910714030265808
---

# gender-classification-2

Autogenerated by HuggingPics🤗🖼️

Create your own image classifier for **anything** by running [the demo on Google Colab](https://colab.research.google.com/github/nateraw/huggingpics/blob/main/HuggingPics.ipynb).

Report any issues with the demo at the [github repo](https://github.com/nateraw/huggingpics).

## Example Images

#### female

![female](images/female.jpg)

#### male

![male](images/male.jpg)
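The card ships no usage snippet; a hedged sketch follows. The checkpoint should load through the standard `transformers` image-classification pipeline (shown as a comment because it downloads the weights on first use and needs an image file you supply; `"face.jpg"` is a placeholder path). The pipeline returns a list of label/score dicts, so a small helper picks the top prediction:

```python
# The pipeline call itself needs `transformers`, `torch`, and a network
# connection, so it is shown as a comment:
#
#   from transformers import pipeline
#   classifier = pipeline("image-classification",
#                         model="rizvandwiki/gender-classification-2")
#   preds = classifier("face.jpg")  # "face.jpg" is a placeholder path

def top_label(predictions):
    # predictions: list of {"label": str, "score": float} dicts,
    # the format returned by the image-classification pipeline.
    return max(predictions, key=lambda p: p["score"])["label"]

# Offline demonstration using output in that format:
sample = [{"label": "female", "score": 0.991}, {"label": "male", "score": 0.009}]
print(top_label(sample))  # female
```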
0.030029296875, 0.039459228515625, -0.041259765625, -0.0191802978515625, -0.03436279296875, 0.0015554428100585938, 0.004261016845703125, 0.0294647216796875, -0.02252197265625, 0.035552978515625, 0.05706787109375, 0.003917694091796875, 0.004425048828125, 0.0174407958984375, 0.020599365234375, -0.04815673828125, 0.014373779296875, 0.00716400146484375, 0.038330078125, 0.032958984375, -0.031524658203125, 0.051727294921875, 0.048583984375, -0.0577392578125, -0.05206298828125, 0.01297760009765625, -0.09088134765625, -0.01218414306640625, 0.07666015625, -0.0075531005859375, -0.0201263427734375, 0.00872039794921875, -0.053314208984375, 0.0443115234375, -0.039581298828125, 0.0625, 0.0243988037109375, -0.02972412109375, -0.022674560546875, -0.00457763671875, 0.022674560546875, -0.0098419189453125, -0.08184814453125, -0.0311126708984375, 0.041900634765625, 0.04290771484375, 0.02935791015625, 0.0308685302734375, -0.0212249755859375, 0.041961669921875, -0.0010824203491210938, 0.054107666015625, -0.00843048095703125, -0.008819580078125, -0.0318603515625, 0.00623321533203125, -0.006195068359375, -0.060699462890625 ] ]
mistralai/Mistral-7B-Instruct-v0.1
2023-10-11T12:31:14.000Z
[ "transformers", "pytorch", "mistral", "text-generation", "finetuned", "arxiv:2310.06825", "license:apache-2.0", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
text-generation
mistralai
null
null
mistralai/Mistral-7B-Instruct-v0.1
958
243,608
transformers
2023-09-27T14:31:52
--- license: apache-2.0 pipeline_tag: text-generation tags: - finetuned inference: parameters: temperature: 0.7 --- # Model Card for Mistral-7B-Instruct-v0.1 The Mistral-7B-Instruct-v0.1 Large Language Model (LLM) is an instruct fine-tuned version of the [Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1) generative text model using a variety of publicly available conversation datasets. For full details of this model, please read our [paper](https://arxiv.org/abs/2310.06825) and [release blog post](https://mistral.ai/news/announcing-mistral-7b/). ## Instruction format In order to leverage instruction fine-tuning, your prompt should be surrounded by `[INST]` and `[/INST]` tokens. The very first instruction should begin with a begin-of-sentence token id. The next instructions should not. The assistant generation will be ended by the end-of-sentence token id. E.g. ``` text = "<s>[INST] What is your favourite condiment? [/INST]" "Well, I'm quite partial to a good squeeze of fresh lemon juice. It adds just the right amount of zesty flavour to whatever I'm cooking up in the kitchen!</s> " "[INST] Do you have mayonnaise recipes? [/INST]" ``` This format is available as a [chat template](https://huggingface.co/docs/transformers/main/chat_templating) via the `apply_chat_template()` method: ```python from transformers import AutoModelForCausalLM, AutoTokenizer device = "cuda" # the device to load the model onto model = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-Instruct-v0.1") tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-Instruct-v0.1") messages = [ {"role": "user", "content": "What is your favourite condiment?"}, {"role": "assistant", "content": "Well, I'm quite partial to a good squeeze of fresh lemon juice. 
It adds just the right amount of zesty flavour to whatever I'm cooking up in the kitchen!"}, {"role": "user", "content": "Do you have mayonnaise recipes?"} ] encodeds = tokenizer.apply_chat_template(messages, return_tensors="pt") model_inputs = encodeds.to(device) model.to(device) generated_ids = model.generate(model_inputs, max_new_tokens=1000, do_sample=True) decoded = tokenizer.batch_decode(generated_ids) print(decoded[0]) ``` ## Model Architecture This instruction model is based on Mistral-7B-v0.1, a transformer model with the following architecture choices: - Grouped-Query Attention - Sliding-Window Attention - Byte-fallback BPE tokenizer ## Troubleshooting - If you see the following error: ``` Traceback (most recent call last): File "", line 1, in File "/transformers/models/auto/auto_factory.py", line 482, in from_pretrained config, kwargs = AutoConfig.from_pretrained( File "/transformers/models/auto/configuration_auto.py", line 1022, in from_pretrained config_class = CONFIG_MAPPING[config_dict["model_type"]] File "/transformers/models/auto/configuration_auto.py", line 723, in __getitem__ raise KeyError(key) KeyError: 'mistral' ``` Installing transformers from source should solve the issue: `pip install git+https://github.com/huggingface/transformers`. This should not be required after transformers-v4.33.4. ## Limitations The Mistral 7B Instruct model is a quick demonstration that the base model can be easily fine-tuned to achieve compelling performance. It does not have any moderation mechanisms. We're looking forward to engaging with the community on ways to make the model finely respect guardrails, allowing for deployment in environments requiring moderated outputs. 
## The Mistral AI Team Albert Jiang, Alexandre Sablayrolles, Arthur Mensch, Chris Bamford, Devendra Singh Chaplot, Diego de las Casas, Florian Bressand, Gianna Lengyel, Guillaume Lample, Lélio Renard Lavaud, Lucile Saulnier, Marie-Anne Lachaux, Pierre Stock, Teven Le Scao, Thibaut Lavril, Thomas Wang, Timothée Lacroix, William El Sayed.
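The `[INST]`/`[/INST]` wrapping that the instruction-format section above describes can be sketched as a small helper that builds the prompt string by hand. This is a minimal illustration only: in practice `tokenizer.apply_chat_template()` produces the canonical layout (including exact whitespace and special-token handling), and the `build_prompt` helper and its message list here are assumptions for demonstration, not part of the card.

```python
def build_prompt(messages):
    """Wrap alternating user/assistant turns in [INST] ... [/INST] markers.

    The begin-of-sentence token (<s>) opens the conversation once; each
    assistant reply is closed with the end-of-sentence token (</s>).
    """
    prompt = "<s>"
    for msg in messages:
        if msg["role"] == "user":
            # Only user instructions are wrapped in [INST] markers.
            prompt += f"[INST] {msg['content']} [/INST]"
        else:
            # Assistant turns are emitted verbatim and terminated with </s>.
            prompt += f"{msg['content']}</s>"
    return prompt

messages = [
    {"role": "user", "content": "What is your favourite condiment?"},
    {"role": "assistant", "content": "Lemon juice."},
    {"role": "user", "content": "Do you have mayonnaise recipes?"},
]
print(build_prompt(messages))
```

Note how only the first turn carries `<s>`, matching the card's statement that later instructions should not repeat the begin-of-sentence id.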
3,848
[ [ -0.03118896484375, -0.055389404296875, 0.00525665283203125, 0.034820556640625, -0.0111236572265625, -0.021759033203125, -0.00600433349609375, -0.0111846923828125, -0.00632476806640625, 0.02935791015625, -0.04345703125, -0.035186767578125, -0.04925537109375, -0.0019321441650390625, -0.0301666259765625, 0.07452392578125, 0.0019245147705078125, -0.006816864013671875, 0.00394439697265625, 0.00978851318359375, -0.050048828125, -0.046478271484375, -0.07403564453125, -0.020721435546875, 0.01226806640625, 0.0138092041015625, 0.04632568359375, 0.044677734375, 0.0235595703125, 0.0311279296875, -0.036102294921875, 0.018707275390625, -0.039764404296875, 0.0101165771484375, -0.00896453857421875, -0.03790283203125, -0.04376220703125, -0.0103302001953125, 0.0271453857421875, 0.0093841552734375, -0.0242462158203125, 0.0196075439453125, 0.005054473876953125, 0.0205078125, -0.03131103515625, 0.0318603515625, -0.045623779296875, -0.00534820556640625, -0.01031494140625, -0.00527191162109375, -0.0224151611328125, 0.00402069091796875, 0.003391265869140625, -0.043975830078125, 0.0224151611328125, 0.0019073486328125, 0.082763671875, 0.039398193359375, -0.0298919677734375, -0.00576019287109375, -0.04937744140625, 0.037750244140625, -0.0743408203125, 0.021697998046875, 0.01715087890625, 0.018951416015625, -0.039886474609375, -0.087890625, -0.049652099609375, -0.012237548828125, -0.005039215087890625, 0.0191192626953125, -0.0419921875, 0.0200042724609375, 0.0307464599609375, 0.0333251953125, -0.0253753662109375, 0.0025119781494140625, -0.04766845703125, -0.0157012939453125, 0.0266265869140625, 0.032867431640625, 0.0098724365234375, -0.0179290771484375, -0.026611328125, -0.017791748046875, -0.0158233642578125, 0.0262603759765625, 0.005565643310546875, 0.01007080078125, -0.0517578125, 0.0283660888671875, -0.01174163818359375, 0.041656494140625, 0.046295166015625, 0.0024967193603515625, 0.01136016845703125, -0.0159912109375, -0.0340576171875, -0.004436492919921875, 0.09161376953125, 
0.0235595703125, -0.01039886474609375, 0.024322509765625, -0.022705078125, -0.003993988037109375, 0.0185394287109375, -0.06951904296875, -0.0259246826171875, 0.024627685546875, -0.03753662109375, -0.0355224609375, 0.0178985595703125, -0.0360107421875, -0.0169525146484375, -0.00482177734375, 0.03741455078125, -0.02294921875, -0.0287017822265625, 0.007373809814453125, -0.026824951171875, 0.0352783203125, 0.0237579345703125, -0.074951171875, 0.0238037109375, 0.04449462890625, 0.05194091796875, 0.00711822509765625, -0.0206298828125, -0.01296234130859375, -0.0009603500366210938, -0.0210723876953125, 0.041259765625, -0.008880615234375, -0.048797607421875, -0.010162353515625, 0.0209197998046875, -0.0089111328125, -0.052337646484375, 0.041717529296875, -0.025543212890625, 0.032623291015625, -0.01145172119140625, -0.037139892578125, -0.00949859619140625, 0.004611968994140625, -0.035675048828125, 0.09326171875, 0.016082763671875, -0.06683349609375, -0.0016260147094726562, -0.049530029296875, -0.02587890625, -0.011810302734375, -0.01274871826171875, -0.0241546630859375, 0.016082763671875, 0.01523590087890625, 0.037811279296875, -0.0283966064453125, 0.0311737060546875, -0.005207061767578125, -0.021331787109375, 0.030181884765625, -0.061676025390625, 0.07476806640625, 0.022674560546875, -0.045745849609375, 0.0230560302734375, -0.04046630859375, -0.0222625732421875, 0.005878448486328125, -0.00632476806640625, 0.00597381591796875, -0.019622802734375, 0.00958251953125, 0.0235595703125, 0.03863525390625, -0.0162506103515625, 0.0208587646484375, -0.03350830078125, 0.04119873046875, 0.0711669921875, 0.00969696044921875, 0.035400390625, -0.0283966064453125, 0.036041259765625, 0.0305938720703125, 0.0518798828125, -0.01271820068359375, -0.037628173828125, -0.095703125, -0.016754150390625, 0.0008449554443359375, 0.043670654296875, -0.04840087890625, 0.05621337890625, -0.0098876953125, -0.06201171875, -0.046356201171875, 0.004589080810546875, 0.024658203125, 0.035675048828125, 
0.030548095703125, -0.021697998046875, -0.033172607421875, -0.06610107421875, 0.007465362548828125, -0.02197265625, 0.0218353271484375, 0.00960540771484375, 0.035186767578125, -0.04046630859375, 0.06890869140625, -0.03680419921875, -0.023223876953125, -0.03021240234375, -0.00391387939453125, 0.0362548828125, 0.04254150390625, 0.031585693359375, -0.04791259765625, -0.034515380859375, -0.0017652511596679688, -0.052520751953125, -0.016937255859375, -0.0027027130126953125, -0.02734375, 0.0030975341796875, 0.045867919921875, -0.06549072265625, 0.042144775390625, 0.03692626953125, -0.0352783203125, 0.0372314453125, -0.01904296875, 0.0030193328857421875, -0.11260986328125, -0.0038909912109375, -0.0002079010009765625, -0.0024776458740234375, -0.049530029296875, -0.0220184326171875, -0.0032901763916015625, 0.005767822265625, -0.04296875, 0.06353759765625, -0.019927978515625, 0.044677734375, -0.02459716796875, -0.0283203125, 0.00760650634765625, 0.04296875, -0.006793975830078125, 0.050689697265625, 0.055206298828125, -0.057525634765625, 0.0322265625, 0.0272979736328125, 0.0065460205078125, 0.028106689453125, -0.07550048828125, -0.002471923828125, -0.01277923583984375, 0.032989501953125, -0.07012939453125, -0.021270751953125, 0.05377197265625, -0.040924072265625, 0.0362548828125, -0.0193328857421875, -0.025726318359375, -0.0232086181640625, -0.0009379386901855469, 0.02447509765625, 0.046356201171875, -0.039581298828125, 0.058013916015625, 0.0233154296875, 0.0103302001953125, -0.0472412109375, -0.042236328125, -0.007083892822265625, -0.0194091796875, -0.033477783203125, 0.0228118896484375, 0.0019426345825195312, -0.00505828857421875, -0.00591278076171875, -0.0175628662109375, -0.0098724365234375, 0.0093841552734375, 0.051116943359375, 0.023040771484375, -0.00757598876953125, -0.008331298828125, 0.0286865234375, -0.020355224609375, 0.0150146484375, -0.0036163330078125, 0.05712890625, -0.01433563232421875, -0.01236724853515625, -0.058624267578125, -0.0028533935546875, 
0.052734375, -0.0322265625, 0.06817626953125, 0.07025146484375, -0.0191497802734375, -0.01267242431640625, -0.05059814453125, -0.0128021240234375, -0.04241943359375, 0.0323486328125, -0.0200958251953125, -0.06292724609375, 0.047760009765625, 0.006439208984375, 0.0218048095703125, 0.060882568359375, 0.054779052734375, 0.00173187255859375, 0.06463623046875, 0.044647216796875, -0.0224456787109375, 0.05462646484375, -0.0413818359375, 0.0068817138671875, -0.053619384765625, -0.0201568603515625, -0.0384521484375, 0.0010862350463867188, -0.03985595703125, -0.037322998046875, 0.032958984375, 0.011749267578125, -0.04522705078125, 0.02850341796875, -0.061279296875, 0.0027370452880859375, 0.04071044921875, -0.00226593017578125, 0.0045318603515625, -0.01041412353515625, -0.00258636474609375, 0.01300048828125, -0.06365966796875, -0.036529541015625, 0.060699462890625, 0.045013427734375, 0.07159423828125, -0.005817413330078125, 0.060760498046875, -0.00273895263671875, 0.03485107421875, -0.042449951171875, 0.03192138671875, 0.00537109375, -0.05267333984375, -0.005359649658203125, -0.03594970703125, -0.0775146484375, 0.0264434814453125, -0.0093536376953125, -0.0452880859375, 0.024322509765625, 0.0166168212890625, -0.03497314453125, 0.014434814453125, -0.05865478515625, 0.09246826171875, -0.0198211669921875, -0.0015649795532226562, -0.0003452301025390625, -0.04718017578125, 0.032958984375, 0.0190582275390625, 0.015777587890625, 0.00305938720703125, 0.0040740966796875, 0.059326171875, -0.03790283203125, 0.08258056640625, -0.017333984375, -0.01468658447265625, 0.024383544921875, -0.00820159912109375, 0.00922393798828125, 0.01348114013671875, 0.005847930908203125, 0.0185394287109375, 0.030548095703125, -0.00479888916015625, -0.050262451171875, 0.050140380859375, -0.08941650390625, -0.05633544921875, -0.0380859375, -0.023681640625, 0.021026611328125, 0.01485443115234375, 0.052734375, 0.031036376953125, -0.0158233642578125, 0.007080078125, 0.053070068359375, -0.0283203125, 
0.03289794921875, 0.0208587646484375, -0.0243377685546875, -0.046295166015625, 0.053466796875, -0.005535125732421875, 0.0204925537109375, 0.0265350341796875, 0.00955963134765625, -0.0161895751953125, -0.0022296905517578125, -0.0268096923828125, 0.0118255615234375, -0.04864501953125, -0.0258941650390625, -0.05450439453125, -0.04400634765625, -0.0390625, 0.00982666015625, -0.047515869140625, -0.0145416259765625, -0.048492431640625, 0.01001739501953125, 0.03265380859375, 0.042205810546875, 0.00785064697265625, 0.05615234375, -0.06976318359375, 0.031982421875, 0.010009765625, 0.006038665771484375, 0.0185546875, -0.07598876953125, -0.0253143310546875, 0.0187225341796875, -0.047515869140625, -0.056732177734375, 0.033599853515625, -0.006847381591796875, 0.05078125, 0.042144775390625, 0.006694793701171875, 0.06475830078125, -0.026336669921875, 0.06390380859375, 0.01108551025390625, -0.06732177734375, 0.036834716796875, -0.01290130615234375, 0.026641845703125, 0.0247344970703125, 0.0124969482421875, -0.049835205078125, -0.0316162109375, -0.0670166015625, -0.0633544921875, 0.04541015625, 0.027923583984375, 0.04315185546875, -0.0170745849609375, 0.027069091796875, -0.00988006591796875, 0.0133056640625, -0.055572509765625, -0.055023193359375, -0.031646728515625, -0.0260162353515625, 0.007415771484375, -0.01348114013671875, -0.006023406982421875, -0.044830322265625, 0.059661865234375, 0.0224456787109375, 0.03167724609375, 0.020751953125, 0.00843048095703125, -0.006927490234375, 0.00921630859375, 0.027618408203125, 0.023468017578125, -0.0007500648498535156, -0.004482269287109375, 0.0200653076171875, -0.04022216796875, 0.01141357421875, 0.0239715576171875, -0.006572723388671875, 0.008453369140625, 0.018463134765625, 0.08428955078125, 0.006893157958984375, -0.0218963623046875, 0.04718017578125, -0.033172607421875, -0.005962371826171875, -0.045196533203125, 0.0297088623046875, 0.006343841552734375, 0.032958984375, 0.0252227783203125, 0.004955291748046875, -0.00911712646484375, 
-0.0179595947265625, -0.00675201416015625, 0.0299835205078125, -0.035125732421875, -0.0253753662109375, 0.06903076171875, 0.01085662841796875, -0.020355224609375, 0.049530029296875, -0.0066070556640625, -0.0372314453125, 0.047576904296875, 0.036346435546875, 0.0699462890625, -0.0210723876953125, 0.0020771026611328125, 0.02288818359375, 0.0278472900390625, -0.02252197265625, 0.0124359130859375, -0.0009479522705078125, -0.062347412109375, -0.00121307373046875, -0.050079345703125, -0.003681182861328125, 0.01499176025390625, -0.04327392578125, 0.034912109375, -0.039306640625, -0.040130615234375, -0.02099609375, -0.0177001953125, -0.0596923828125, 0.006366729736328125, -0.00013196468353271484, 0.07208251953125, -0.060028076171875, 0.052520751953125, 0.0496826171875, -0.041534423828125, -0.0859375, -0.01340484619140625, -0.00925445556640625, -0.04132080078125, 0.034423828125, 0.0142974853515625, 0.01438140869140625, 0.0174560546875, -0.061614990234375, -0.051116943359375, 0.09088134765625, 0.0212249755859375, -0.0239715576171875, -0.0188751220703125, 0.004467010498046875, 0.043060302734375, -0.00951385498046875, 0.04437255859375, 0.0298004150390625, 0.0230560302734375, 0.001087188720703125, -0.0894775390625, 0.006603240966796875, -0.03692626953125, 0.014923095703125, -0.003787994384765625, -0.0662841796875, 0.07037353515625, 0.008453369140625, -0.013427734375, 0.016754150390625, 0.0810546875, 0.0157928466796875, 0.00391387939453125, 0.0299072265625, 0.048553466796875, 0.034515380859375, -0.0022125244140625, 0.06939697265625, -0.048004150390625, 0.0322265625, 0.074951171875, 0.0184326171875, 0.046478271484375, 0.033050537109375, -0.00620269775390625, 0.039031982421875, 0.050384521484375, -0.00839996337890625, 0.0193939208984375, 0.00701904296875, -0.01219940185546875, -0.00797271728515625, -0.01171112060546875, -0.039703369140625, 0.0305938720703125, 0.0286407470703125, -0.03826904296875, -0.00885772705078125, -0.00992584228515625, 0.0108795166015625, -0.0297698974609375, 
-0.01157379150390625, 0.0577392578125, 0.0284881591796875, -0.04083251953125, 0.07073974609375, 0.01513671875, 0.06109619140625, -0.046234130859375, -0.01277923583984375, -0.024932861328125, 0.01468658447265625, -0.018310546875, -0.029754638671875, -0.00799560546875, -0.0157012939453125, -0.0113067626953125, -0.0014410018920898438, 0.040771484375, -0.031768798828125, -0.033294677734375, 0.00533294677734375, 0.040374755859375, -0.0001500844955444336, -0.0059051513671875, -0.06036376953125, 0.017791748046875, -0.0015897750854492188, -0.0272674560546875, 0.0092010498046875, 0.054290771484375, -0.0033512115478515625, 0.05682373046875, 0.039764404296875, -0.0139007568359375, 0.0150604248046875, 0.00772857666015625, 0.08154296875, -0.042083740234375, -0.0294952392578125, -0.067626953125, 0.06591796875, 0.00016057491302490234, -0.044342041015625, 0.055328369140625, 0.04278564453125, 0.06549072265625, -0.009613037109375, 0.056976318359375, -0.0160369873046875, -0.002460479736328125, -0.0150909423828125, 0.05401611328125, -0.032501220703125, 0.0274505615234375, -0.018951416015625, -0.06121826171875, 0.0103302001953125, 0.05413818359375, -0.01006317138671875, 0.006805419921875, 0.031982421875, 0.09002685546875, -0.025543212890625, -0.018707275390625, 0.007656097412109375, 0.029449462890625, 0.0404052734375, 0.02587890625, 0.06903076171875, -0.048828125, 0.03515625, -0.035675048828125, -0.02642822265625, -0.000027358531951904297, -0.03271484375, -0.0853271484375, -0.0513916015625, -0.00739288330078125, -0.043670654296875, -0.0158843994140625, 0.094482421875, 0.03900146484375, -0.041107177734375, -0.035125732421875, -0.0023288726806640625, -0.004955291748046875, -0.015625, -0.0194854736328125, 0.019439697265625, -0.03253173828125, -0.048797607421875, 0.007434844970703125, 0.0022983551025390625, 0.0247802734375, -0.0232391357421875, -0.00811767578125, 0.005474090576171875, -0.0025482177734375, 0.02496337890625, 0.016357421875, -0.062255859375, -0.0004711151123046875, 
0.0024394989013671875, -0.0311279296875, -0.0068359375, 0.0280609130859375, -0.032928466796875, 0.0224151611328125, 0.0231170654296875, 0.028961181640625, 0.040985107421875, -0.0015535354614257812, 0.04449462890625, -0.031982421875, 0.026641845703125, -0.004833221435546875, 0.027618408203125, 0.008544921875, -0.0248260498046875, 0.00719451904296875, 0.01219940185546875, -0.038299560546875, -0.048919677734375, -0.002902984619140625, -0.07769775390625, -0.005786895751953125, 0.09295654296875, 0.0004031658172607422, -0.027984619140625, 0.00070953369140625, -0.052001953125, 0.05108642578125, -0.03887939453125, 0.060699462890625, 0.034912109375, 0.0084381103515625, -0.0185089111328125, -0.0297088623046875, 0.0477294921875, 0.017333984375, -0.046783447265625, -0.000032961368560791016, 0.0254364013671875, 0.02593994140625, -0.0018510818481445312, 0.056610107421875, 0.0019044876098632812, 0.0204925537109375, -0.007709503173828125, 0.0178985595703125, -0.01491546630859375, -0.00627899169921875, -0.030029296875, -0.0221405029296875, 0.00848388671875, -0.0128021240234375 ] ]
thenlper/gte-base
2023-10-12T02:06:40.000Z
[ "sentence-transformers", "pytorch", "onnx", "safetensors", "bert", "mteb", "sentence-similarity", "Sentence Transformers", "en", "arxiv:2308.03281", "license:mit", "model-index", "endpoints_compatible", "has_space", "region:us" ]
sentence-similarity
thenlper
null
null
thenlper/gte-base
44
240,050
sentence-transformers
2023-07-27T03:21:20
--- tags: - mteb - sentence-similarity - sentence-transformers - Sentence Transformers model-index: - name: gte-base results: - task: type: Classification dataset: type: mteb/amazon_counterfactual name: MTEB AmazonCounterfactualClassification (en) config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 74.17910447761193 - type: ap value: 36.827146398068926 - type: f1 value: 68.11292888046363 - task: type: Classification dataset: type: mteb/amazon_polarity name: MTEB AmazonPolarityClassification config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 91.77345000000001 - type: ap value: 88.33530426691347 - type: f1 value: 91.76549906404642 - task: type: Classification dataset: type: mteb/amazon_reviews_multi name: MTEB AmazonReviewsClassification (en) config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 48.964 - type: f1 value: 48.22995586184998 - task: type: Retrieval dataset: type: arguana name: MTEB ArguAna config: default split: test revision: None metrics: - type: map_at_1 value: 32.147999999999996 - type: map_at_10 value: 48.253 - type: map_at_100 value: 49.038 - type: map_at_1000 value: 49.042 - type: map_at_3 value: 43.433 - type: map_at_5 value: 46.182 - type: mrr_at_1 value: 32.717 - type: mrr_at_10 value: 48.467 - type: mrr_at_100 value: 49.252 - type: mrr_at_1000 value: 49.254999999999995 - type: mrr_at_3 value: 43.599 - type: mrr_at_5 value: 46.408 - type: ndcg_at_1 value: 32.147999999999996 - type: ndcg_at_10 value: 57.12199999999999 - type: ndcg_at_100 value: 60.316 - type: ndcg_at_1000 value: 60.402 - type: ndcg_at_3 value: 47.178 - type: ndcg_at_5 value: 52.146 - type: precision_at_1 value: 32.147999999999996 - type: precision_at_10 value: 8.542 - type: precision_at_100 value: 0.9900000000000001 - type: precision_at_1000 value: 0.1 - type: precision_at_3 value: 19.346 - type: precision_at_5 
value: 14.026 - type: recall_at_1 value: 32.147999999999996 - type: recall_at_10 value: 85.42 - type: recall_at_100 value: 99.004 - type: recall_at_1000 value: 99.644 - type: recall_at_3 value: 58.037000000000006 - type: recall_at_5 value: 70.128 - task: type: Clustering dataset: type: mteb/arxiv-clustering-p2p name: MTEB ArxivClusteringP2P config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: v_measure value: 48.59706013699614 - task: type: Clustering dataset: type: mteb/arxiv-clustering-s2s name: MTEB ArxivClusteringS2S config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: v_measure value: 43.01463593002057 - task: type: Reranking dataset: type: mteb/askubuntudupquestions-reranking name: MTEB AskUbuntuDupQuestions config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: map value: 61.80250355752458 - type: mrr value: 74.79455216989844 - task: type: STS dataset: type: mteb/biosses-sts name: MTEB BIOSSES config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cos_sim_pearson value: 89.87448576082345 - type: cos_sim_spearman value: 87.64235843637468 - type: euclidean_pearson value: 88.4901825511062 - type: euclidean_spearman value: 87.74537283182033 - type: manhattan_pearson value: 88.39040638362911 - type: manhattan_spearman value: 87.62669542888003 - task: type: Classification dataset: type: mteb/banking77 name: MTEB Banking77Classification config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 85.06818181818183 - type: f1 value: 85.02524460098233 - task: type: Clustering dataset: type: mteb/biorxiv-clustering-p2p name: MTEB BiorxivClusteringP2P config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: v_measure value: 38.20471092679967 - task: type: Clustering dataset: type: mteb/biorxiv-clustering-s2s name: 
MTEB BiorxivClusteringS2S config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: v_measure value: 36.58967592147641 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackAndroidRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 32.411 - type: map_at_10 value: 45.162 - type: map_at_100 value: 46.717 - type: map_at_1000 value: 46.836 - type: map_at_3 value: 41.428 - type: map_at_5 value: 43.54 - type: mrr_at_1 value: 39.914 - type: mrr_at_10 value: 51.534 - type: mrr_at_100 value: 52.185 - type: mrr_at_1000 value: 52.22 - type: mrr_at_3 value: 49.046 - type: mrr_at_5 value: 50.548 - type: ndcg_at_1 value: 39.914 - type: ndcg_at_10 value: 52.235 - type: ndcg_at_100 value: 57.4 - type: ndcg_at_1000 value: 58.982 - type: ndcg_at_3 value: 47.332 - type: ndcg_at_5 value: 49.62 - type: precision_at_1 value: 39.914 - type: precision_at_10 value: 10.258000000000001 - type: precision_at_100 value: 1.6219999999999999 - type: precision_at_1000 value: 0.20500000000000002 - type: precision_at_3 value: 23.462 - type: precision_at_5 value: 16.71 - type: recall_at_1 value: 32.411 - type: recall_at_10 value: 65.408 - type: recall_at_100 value: 87.248 - type: recall_at_1000 value: 96.951 - type: recall_at_3 value: 50.349999999999994 - type: recall_at_5 value: 57.431 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackEnglishRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 31.911 - type: map_at_10 value: 42.608000000000004 - type: map_at_100 value: 43.948 - type: map_at_1000 value: 44.089 - type: map_at_3 value: 39.652 - type: map_at_5 value: 41.236 - type: mrr_at_1 value: 40.064 - type: mrr_at_10 value: 48.916 - type: mrr_at_100 value: 49.539 - type: mrr_at_1000 value: 49.583 - type: mrr_at_3 value: 46.741 - type: mrr_at_5 value: 48.037 - type: ndcg_at_1 value: 40.064 - type: ndcg_at_10 value: 48.442 - type: ndcg_at_100 
value: 52.798 - type: ndcg_at_1000 value: 54.871 - type: ndcg_at_3 value: 44.528 - type: ndcg_at_5 value: 46.211 - type: precision_at_1 value: 40.064 - type: precision_at_10 value: 9.178 - type: precision_at_100 value: 1.452 - type: precision_at_1000 value: 0.193 - type: precision_at_3 value: 21.614 - type: precision_at_5 value: 15.185 - type: recall_at_1 value: 31.911 - type: recall_at_10 value: 58.155 - type: recall_at_100 value: 76.46300000000001 - type: recall_at_1000 value: 89.622 - type: recall_at_3 value: 46.195 - type: recall_at_5 value: 51.288999999999994 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackGamingRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 40.597 - type: map_at_10 value: 54.290000000000006 - type: map_at_100 value: 55.340999999999994 - type: map_at_1000 value: 55.388999999999996 - type: map_at_3 value: 50.931000000000004 - type: map_at_5 value: 52.839999999999996 - type: mrr_at_1 value: 46.646 - type: mrr_at_10 value: 57.524 - type: mrr_at_100 value: 58.225 - type: mrr_at_1000 value: 58.245999999999995 - type: mrr_at_3 value: 55.235 - type: mrr_at_5 value: 56.589 - type: ndcg_at_1 value: 46.646 - type: ndcg_at_10 value: 60.324999999999996 - type: ndcg_at_100 value: 64.30900000000001 - type: ndcg_at_1000 value: 65.19 - type: ndcg_at_3 value: 54.983000000000004 - type: ndcg_at_5 value: 57.621 - type: precision_at_1 value: 46.646 - type: precision_at_10 value: 9.774 - type: precision_at_100 value: 1.265 - type: precision_at_1000 value: 0.13799999999999998 - type: precision_at_3 value: 24.911 - type: precision_at_5 value: 16.977999999999998 - type: recall_at_1 value: 40.597 - type: recall_at_10 value: 74.773 - type: recall_at_100 value: 91.61200000000001 - type: recall_at_1000 value: 97.726 - type: recall_at_3 value: 60.458 - type: recall_at_5 value: 66.956 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackGisRetrieval config: default split: test 
revision: None metrics: - type: map_at_1 value: 27.122 - type: map_at_10 value: 36.711 - type: map_at_100 value: 37.775 - type: map_at_1000 value: 37.842999999999996 - type: map_at_3 value: 33.693 - type: map_at_5 value: 35.607 - type: mrr_at_1 value: 29.153000000000002 - type: mrr_at_10 value: 38.873999999999995 - type: mrr_at_100 value: 39.739000000000004 - type: mrr_at_1000 value: 39.794000000000004 - type: mrr_at_3 value: 36.102000000000004 - type: mrr_at_5 value: 37.876 - type: ndcg_at_1 value: 29.153000000000002 - type: ndcg_at_10 value: 42.048 - type: ndcg_at_100 value: 47.144999999999996 - type: ndcg_at_1000 value: 48.901 - type: ndcg_at_3 value: 36.402 - type: ndcg_at_5 value: 39.562999999999995 - type: precision_at_1 value: 29.153000000000002 - type: precision_at_10 value: 6.4750000000000005 - type: precision_at_100 value: 0.951 - type: precision_at_1000 value: 0.11299999999999999 - type: precision_at_3 value: 15.479999999999999 - type: precision_at_5 value: 11.028 - type: recall_at_1 value: 27.122 - type: recall_at_10 value: 56.279999999999994 - type: recall_at_100 value: 79.597 - type: recall_at_1000 value: 92.804 - type: recall_at_3 value: 41.437000000000005 - type: recall_at_5 value: 49.019 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackMathematicaRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 17.757 - type: map_at_10 value: 26.739 - type: map_at_100 value: 28.015 - type: map_at_1000 value: 28.127999999999997 - type: map_at_3 value: 23.986 - type: map_at_5 value: 25.514 - type: mrr_at_1 value: 22.015 - type: mrr_at_10 value: 31.325999999999997 - type: mrr_at_100 value: 32.368 - type: mrr_at_1000 value: 32.426 - type: mrr_at_3 value: 28.897000000000002 - type: mrr_at_5 value: 30.147000000000002 - type: ndcg_at_1 value: 22.015 - type: ndcg_at_10 value: 32.225 - type: ndcg_at_100 value: 38.405 - type: ndcg_at_1000 value: 40.932 - type: ndcg_at_3 value: 27.403 - type: ndcg_at_5 value: 
29.587000000000003 - type: precision_at_1 value: 22.015 - type: precision_at_10 value: 5.9830000000000005 - type: precision_at_100 value: 1.051 - type: precision_at_1000 value: 0.13899999999999998 - type: precision_at_3 value: 13.391 - type: precision_at_5 value: 9.602 - type: recall_at_1 value: 17.757 - type: recall_at_10 value: 44.467 - type: recall_at_100 value: 71.53699999999999 - type: recall_at_1000 value: 89.281 - type: recall_at_3 value: 31.095 - type: recall_at_5 value: 36.818 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackPhysicsRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 30.354 - type: map_at_10 value: 42.134 - type: map_at_100 value: 43.429 - type: map_at_1000 value: 43.532 - type: map_at_3 value: 38.491 - type: map_at_5 value: 40.736 - type: mrr_at_1 value: 37.247 - type: mrr_at_10 value: 47.775 - type: mrr_at_100 value: 48.522999999999996 - type: mrr_at_1000 value: 48.567 - type: mrr_at_3 value: 45.059 - type: mrr_at_5 value: 46.811 - type: ndcg_at_1 value: 37.247 - type: ndcg_at_10 value: 48.609 - type: ndcg_at_100 value: 53.782 - type: ndcg_at_1000 value: 55.666000000000004 - type: ndcg_at_3 value: 42.866 - type: ndcg_at_5 value: 46.001 - type: precision_at_1 value: 37.247 - type: precision_at_10 value: 8.892999999999999 - type: precision_at_100 value: 1.341 - type: precision_at_1000 value: 0.168 - type: precision_at_3 value: 20.5 - type: precision_at_5 value: 14.976 - type: recall_at_1 value: 30.354 - type: recall_at_10 value: 62.273 - type: recall_at_100 value: 83.65599999999999 - type: recall_at_1000 value: 95.82000000000001 - type: recall_at_3 value: 46.464 - type: recall_at_5 value: 54.225 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackProgrammersRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 26.949 - type: map_at_10 value: 37.230000000000004 - type: map_at_100 value: 38.644 - type: map_at_1000 value: 
38.751999999999995 - type: map_at_3 value: 33.816 - type: map_at_5 value: 35.817 - type: mrr_at_1 value: 33.446999999999996 - type: mrr_at_10 value: 42.970000000000006 - type: mrr_at_100 value: 43.873 - type: mrr_at_1000 value: 43.922 - type: mrr_at_3 value: 40.467999999999996 - type: mrr_at_5 value: 41.861 - type: ndcg_at_1 value: 33.446999999999996 - type: ndcg_at_10 value: 43.403000000000006 - type: ndcg_at_100 value: 49.247 - type: ndcg_at_1000 value: 51.361999999999995 - type: ndcg_at_3 value: 38.155 - type: ndcg_at_5 value: 40.643 - type: precision_at_1 value: 33.446999999999996 - type: precision_at_10 value: 8.128 - type: precision_at_100 value: 1.274 - type: precision_at_1000 value: 0.163 - type: precision_at_3 value: 18.493000000000002 - type: precision_at_5 value: 13.333 - type: recall_at_1 value: 26.949 - type: recall_at_10 value: 56.006 - type: recall_at_100 value: 80.99199999999999 - type: recall_at_1000 value: 95.074 - type: recall_at_3 value: 40.809 - type: recall_at_5 value: 47.57 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 27.243583333333333 - type: map_at_10 value: 37.193250000000006 - type: map_at_100 value: 38.44833333333334 - type: map_at_1000 value: 38.56083333333333 - type: map_at_3 value: 34.06633333333333 - type: map_at_5 value: 35.87858333333334 - type: mrr_at_1 value: 32.291583333333335 - type: mrr_at_10 value: 41.482749999999996 - type: mrr_at_100 value: 42.33583333333333 - type: mrr_at_1000 value: 42.38683333333333 - type: mrr_at_3 value: 38.952999999999996 - type: mrr_at_5 value: 40.45333333333333 - type: ndcg_at_1 value: 32.291583333333335 - type: ndcg_at_10 value: 42.90533333333334 - type: ndcg_at_100 value: 48.138666666666666 - type: ndcg_at_1000 value: 50.229083333333335 - type: ndcg_at_3 value: 37.76133333333334 - type: ndcg_at_5 value: 40.31033333333334 - type: precision_at_1 value: 32.291583333333335 - type: 
precision_at_10 value: 7.585583333333333 - type: precision_at_100 value: 1.2045000000000001 - type: precision_at_1000 value: 0.15733333333333335 - type: precision_at_3 value: 17.485416666666666 - type: precision_at_5 value: 12.5145 - type: recall_at_1 value: 27.243583333333333 - type: recall_at_10 value: 55.45108333333334 - type: recall_at_100 value: 78.25858333333335 - type: recall_at_1000 value: 92.61716666666665 - type: recall_at_3 value: 41.130583333333334 - type: recall_at_5 value: 47.73133333333334 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackStatsRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 26.325 - type: map_at_10 value: 32.795 - type: map_at_100 value: 33.96 - type: map_at_1000 value: 34.054 - type: map_at_3 value: 30.64 - type: map_at_5 value: 31.771 - type: mrr_at_1 value: 29.908 - type: mrr_at_10 value: 35.83 - type: mrr_at_100 value: 36.868 - type: mrr_at_1000 value: 36.928 - type: mrr_at_3 value: 33.896 - type: mrr_at_5 value: 34.893 - type: ndcg_at_1 value: 29.908 - type: ndcg_at_10 value: 36.746 - type: ndcg_at_100 value: 42.225 - type: ndcg_at_1000 value: 44.523 - type: ndcg_at_3 value: 32.82 - type: ndcg_at_5 value: 34.583000000000006 - type: precision_at_1 value: 29.908 - type: precision_at_10 value: 5.6129999999999995 - type: precision_at_100 value: 0.9079999999999999 - type: precision_at_1000 value: 0.11800000000000001 - type: precision_at_3 value: 13.753000000000002 - type: precision_at_5 value: 9.417 - type: recall_at_1 value: 26.325 - type: recall_at_10 value: 45.975 - type: recall_at_100 value: 70.393 - type: recall_at_1000 value: 87.217 - type: recall_at_3 value: 35.195 - type: recall_at_5 value: 39.69 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackTexRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 17.828 - type: map_at_10 value: 25.759 - type: map_at_100 value: 26.961000000000002 - type: 
map_at_1000 value: 27.094 - type: map_at_3 value: 23.166999999999998 - type: map_at_5 value: 24.610000000000003 - type: mrr_at_1 value: 21.61 - type: mrr_at_10 value: 29.605999999999998 - type: mrr_at_100 value: 30.586000000000002 - type: mrr_at_1000 value: 30.664 - type: mrr_at_3 value: 27.214 - type: mrr_at_5 value: 28.571 - type: ndcg_at_1 value: 21.61 - type: ndcg_at_10 value: 30.740000000000002 - type: ndcg_at_100 value: 36.332 - type: ndcg_at_1000 value: 39.296 - type: ndcg_at_3 value: 26.11 - type: ndcg_at_5 value: 28.297 - type: precision_at_1 value: 21.61 - type: precision_at_10 value: 5.643 - type: precision_at_100 value: 1.0 - type: precision_at_1000 value: 0.14400000000000002 - type: precision_at_3 value: 12.4 - type: precision_at_5 value: 9.119 - type: recall_at_1 value: 17.828 - type: recall_at_10 value: 41.876000000000005 - type: recall_at_100 value: 66.648 - type: recall_at_1000 value: 87.763 - type: recall_at_3 value: 28.957 - type: recall_at_5 value: 34.494 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackUnixRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 27.921000000000003 - type: map_at_10 value: 37.156 - type: map_at_100 value: 38.399 - type: map_at_1000 value: 38.498 - type: map_at_3 value: 34.134 - type: map_at_5 value: 35.936 - type: mrr_at_1 value: 32.649 - type: mrr_at_10 value: 41.19 - type: mrr_at_100 value: 42.102000000000004 - type: mrr_at_1000 value: 42.157 - type: mrr_at_3 value: 38.464 - type: mrr_at_5 value: 40.148 - type: ndcg_at_1 value: 32.649 - type: ndcg_at_10 value: 42.679 - type: ndcg_at_100 value: 48.27 - type: ndcg_at_1000 value: 50.312 - type: ndcg_at_3 value: 37.269000000000005 - type: ndcg_at_5 value: 40.055 - type: precision_at_1 value: 32.649 - type: precision_at_10 value: 7.155 - type: precision_at_100 value: 1.124 - type: precision_at_1000 value: 0.14100000000000001 - type: precision_at_3 value: 16.791 - type: precision_at_5 value: 12.015 - type: 
recall_at_1 value: 27.921000000000003 - type: recall_at_10 value: 55.357 - type: recall_at_100 value: 79.476 - type: recall_at_1000 value: 93.314 - type: recall_at_3 value: 40.891 - type: recall_at_5 value: 47.851 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackWebmastersRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 25.524 - type: map_at_10 value: 35.135 - type: map_at_100 value: 36.665 - type: map_at_1000 value: 36.886 - type: map_at_3 value: 31.367 - type: map_at_5 value: 33.724 - type: mrr_at_1 value: 30.631999999999998 - type: mrr_at_10 value: 39.616 - type: mrr_at_100 value: 40.54 - type: mrr_at_1000 value: 40.585 - type: mrr_at_3 value: 36.462 - type: mrr_at_5 value: 38.507999999999996 - type: ndcg_at_1 value: 30.631999999999998 - type: ndcg_at_10 value: 41.61 - type: ndcg_at_100 value: 47.249 - type: ndcg_at_1000 value: 49.662 - type: ndcg_at_3 value: 35.421 - type: ndcg_at_5 value: 38.811 - type: precision_at_1 value: 30.631999999999998 - type: precision_at_10 value: 8.123 - type: precision_at_100 value: 1.5810000000000002 - type: precision_at_1000 value: 0.245 - type: precision_at_3 value: 16.337 - type: precision_at_5 value: 12.568999999999999 - type: recall_at_1 value: 25.524 - type: recall_at_10 value: 54.994 - type: recall_at_100 value: 80.03099999999999 - type: recall_at_1000 value: 95.25099999999999 - type: recall_at_3 value: 37.563 - type: recall_at_5 value: 46.428999999999995 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackWordpressRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 22.224 - type: map_at_10 value: 30.599999999999998 - type: map_at_100 value: 31.526 - type: map_at_1000 value: 31.629 - type: map_at_3 value: 27.491 - type: map_at_5 value: 29.212 - type: mrr_at_1 value: 24.214 - type: mrr_at_10 value: 32.632 - type: mrr_at_100 value: 33.482 - type: mrr_at_1000 value: 33.550000000000004 - type: mrr_at_3 
value: 29.852 - type: mrr_at_5 value: 31.451 - type: ndcg_at_1 value: 24.214 - type: ndcg_at_10 value: 35.802 - type: ndcg_at_100 value: 40.502 - type: ndcg_at_1000 value: 43.052 - type: ndcg_at_3 value: 29.847 - type: ndcg_at_5 value: 32.732 - type: precision_at_1 value: 24.214 - type: precision_at_10 value: 5.804 - type: precision_at_100 value: 0.885 - type: precision_at_1000 value: 0.121 - type: precision_at_3 value: 12.692999999999998 - type: precision_at_5 value: 9.242 - type: recall_at_1 value: 22.224 - type: recall_at_10 value: 49.849 - type: recall_at_100 value: 71.45 - type: recall_at_1000 value: 90.583 - type: recall_at_3 value: 34.153 - type: recall_at_5 value: 41.004000000000005 - task: type: Retrieval dataset: type: climate-fever name: MTEB ClimateFEVER config: default split: test revision: None metrics: - type: map_at_1 value: 12.386999999999999 - type: map_at_10 value: 20.182 - type: map_at_100 value: 21.86 - type: map_at_1000 value: 22.054000000000002 - type: map_at_3 value: 17.165 - type: map_at_5 value: 18.643 - type: mrr_at_1 value: 26.906000000000002 - type: mrr_at_10 value: 37.907999999999994 - type: mrr_at_100 value: 38.868 - type: mrr_at_1000 value: 38.913 - type: mrr_at_3 value: 34.853 - type: mrr_at_5 value: 36.567 - type: ndcg_at_1 value: 26.906000000000002 - type: ndcg_at_10 value: 28.103 - type: ndcg_at_100 value: 35.073 - type: ndcg_at_1000 value: 38.653 - type: ndcg_at_3 value: 23.345 - type: ndcg_at_5 value: 24.828 - type: precision_at_1 value: 26.906000000000002 - type: precision_at_10 value: 8.547 - type: precision_at_100 value: 1.617 - type: precision_at_1000 value: 0.22799999999999998 - type: precision_at_3 value: 17.025000000000002 - type: precision_at_5 value: 12.834000000000001 - type: recall_at_1 value: 12.386999999999999 - type: recall_at_10 value: 33.306999999999995 - type: recall_at_100 value: 57.516 - type: recall_at_1000 value: 77.74799999999999 - type: recall_at_3 value: 21.433 - type: recall_at_5 value: 25.915 - task: 
type: Retrieval dataset: type: dbpedia-entity name: MTEB DBPedia config: default split: test revision: None metrics: - type: map_at_1 value: 9.322 - type: map_at_10 value: 20.469 - type: map_at_100 value: 28.638 - type: map_at_1000 value: 30.433 - type: map_at_3 value: 14.802000000000001 - type: map_at_5 value: 17.297 - type: mrr_at_1 value: 68.75 - type: mrr_at_10 value: 76.29599999999999 - type: mrr_at_100 value: 76.62400000000001 - type: mrr_at_1000 value: 76.633 - type: mrr_at_3 value: 75.083 - type: mrr_at_5 value: 75.771 - type: ndcg_at_1 value: 54.87499999999999 - type: ndcg_at_10 value: 41.185 - type: ndcg_at_100 value: 46.400000000000006 - type: ndcg_at_1000 value: 54.223 - type: ndcg_at_3 value: 45.489000000000004 - type: ndcg_at_5 value: 43.161 - type: precision_at_1 value: 68.75 - type: precision_at_10 value: 32.300000000000004 - type: precision_at_100 value: 10.607999999999999 - type: precision_at_1000 value: 2.237 - type: precision_at_3 value: 49.083 - type: precision_at_5 value: 41.6 - type: recall_at_1 value: 9.322 - type: recall_at_10 value: 25.696 - type: recall_at_100 value: 52.898 - type: recall_at_1000 value: 77.281 - type: recall_at_3 value: 15.943 - type: recall_at_5 value: 19.836000000000002 - task: type: Classification dataset: type: mteb/emotion name: MTEB EmotionClassification config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 48.650000000000006 - type: f1 value: 43.528467245539396 - task: type: Retrieval dataset: type: fever name: MTEB FEVER config: default split: test revision: None metrics: - type: map_at_1 value: 66.56 - type: map_at_10 value: 76.767 - type: map_at_100 value: 77.054 - type: map_at_1000 value: 77.068 - type: map_at_3 value: 75.29299999999999 - type: map_at_5 value: 76.24 - type: mrr_at_1 value: 71.842 - type: mrr_at_10 value: 81.459 - type: mrr_at_100 value: 81.58800000000001 - type: mrr_at_1000 value: 81.59100000000001 - type: mrr_at_3 value: 80.188 - type: 
mrr_at_5 value: 81.038 - type: ndcg_at_1 value: 71.842 - type: ndcg_at_10 value: 81.51899999999999 - type: ndcg_at_100 value: 82.544 - type: ndcg_at_1000 value: 82.829 - type: ndcg_at_3 value: 78.92 - type: ndcg_at_5 value: 80.406 - type: precision_at_1 value: 71.842 - type: precision_at_10 value: 10.066 - type: precision_at_100 value: 1.076 - type: precision_at_1000 value: 0.11199999999999999 - type: precision_at_3 value: 30.703000000000003 - type: precision_at_5 value: 19.301 - type: recall_at_1 value: 66.56 - type: recall_at_10 value: 91.55 - type: recall_at_100 value: 95.67099999999999 - type: recall_at_1000 value: 97.539 - type: recall_at_3 value: 84.46900000000001 - type: recall_at_5 value: 88.201 - task: type: Retrieval dataset: type: fiqa name: MTEB FiQA2018 config: default split: test revision: None metrics: - type: map_at_1 value: 20.087 - type: map_at_10 value: 32.830999999999996 - type: map_at_100 value: 34.814 - type: map_at_1000 value: 34.999 - type: map_at_3 value: 28.198 - type: map_at_5 value: 30.779 - type: mrr_at_1 value: 38.889 - type: mrr_at_10 value: 48.415 - type: mrr_at_100 value: 49.187 - type: mrr_at_1000 value: 49.226 - type: mrr_at_3 value: 45.705 - type: mrr_at_5 value: 47.225 - type: ndcg_at_1 value: 38.889 - type: ndcg_at_10 value: 40.758 - type: ndcg_at_100 value: 47.671 - type: ndcg_at_1000 value: 50.744 - type: ndcg_at_3 value: 36.296 - type: ndcg_at_5 value: 37.852999999999994 - type: precision_at_1 value: 38.889 - type: precision_at_10 value: 11.466 - type: precision_at_100 value: 1.8499999999999999 - type: precision_at_1000 value: 0.24 - type: precision_at_3 value: 24.126 - type: precision_at_5 value: 18.21 - type: recall_at_1 value: 20.087 - type: recall_at_10 value: 48.042 - type: recall_at_100 value: 73.493 - type: recall_at_1000 value: 91.851 - type: recall_at_3 value: 32.694 - type: recall_at_5 value: 39.099000000000004 - task: type: Retrieval dataset: type: hotpotqa name: MTEB HotpotQA config: default split: test revision: 
None metrics: - type: map_at_1 value: 38.096000000000004 - type: map_at_10 value: 56.99999999999999 - type: map_at_100 value: 57.914 - type: map_at_1000 value: 57.984 - type: map_at_3 value: 53.900999999999996 - type: map_at_5 value: 55.827000000000005 - type: mrr_at_1 value: 76.19200000000001 - type: mrr_at_10 value: 81.955 - type: mrr_at_100 value: 82.164 - type: mrr_at_1000 value: 82.173 - type: mrr_at_3 value: 80.963 - type: mrr_at_5 value: 81.574 - type: ndcg_at_1 value: 76.19200000000001 - type: ndcg_at_10 value: 65.75 - type: ndcg_at_100 value: 68.949 - type: ndcg_at_1000 value: 70.342 - type: ndcg_at_3 value: 61.29 - type: ndcg_at_5 value: 63.747 - type: precision_at_1 value: 76.19200000000001 - type: precision_at_10 value: 13.571 - type: precision_at_100 value: 1.6070000000000002 - type: precision_at_1000 value: 0.179 - type: precision_at_3 value: 38.663 - type: precision_at_5 value: 25.136999999999997 - type: recall_at_1 value: 38.096000000000004 - type: recall_at_10 value: 67.853 - type: recall_at_100 value: 80.365 - type: recall_at_1000 value: 89.629 - type: recall_at_3 value: 57.995 - type: recall_at_5 value: 62.843 - task: type: Classification dataset: type: mteb/imdb name: MTEB ImdbClassification config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 85.95200000000001 - type: ap value: 80.73847277002109 - type: f1 value: 85.92406135678594 - task: type: Retrieval dataset: type: msmarco name: MTEB MSMARCO config: default split: dev revision: None metrics: - type: map_at_1 value: 20.916999999999998 - type: map_at_10 value: 33.23 - type: map_at_100 value: 34.427 - type: map_at_1000 value: 34.477000000000004 - type: map_at_3 value: 29.292 - type: map_at_5 value: 31.6 - type: mrr_at_1 value: 21.547 - type: mrr_at_10 value: 33.839999999999996 - type: mrr_at_100 value: 34.979 - type: mrr_at_1000 value: 35.022999999999996 - type: mrr_at_3 value: 29.988 - type: mrr_at_5 value: 32.259 - type: ndcg_at_1 
value: 21.519 - type: ndcg_at_10 value: 40.209 - type: ndcg_at_100 value: 45.954 - type: ndcg_at_1000 value: 47.187 - type: ndcg_at_3 value: 32.227 - type: ndcg_at_5 value: 36.347 - type: precision_at_1 value: 21.519 - type: precision_at_10 value: 6.447 - type: precision_at_100 value: 0.932 - type: precision_at_1000 value: 0.104 - type: precision_at_3 value: 13.877999999999998 - type: precision_at_5 value: 10.404 - type: recall_at_1 value: 20.916999999999998 - type: recall_at_10 value: 61.7 - type: recall_at_100 value: 88.202 - type: recall_at_1000 value: 97.588 - type: recall_at_3 value: 40.044999999999995 - type: recall_at_5 value: 49.964999999999996 - task: type: Classification dataset: type: mteb/mtop_domain name: MTEB MTOPDomainClassification (en) config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 93.02781577747379 - type: f1 value: 92.83653922768306 - task: type: Classification dataset: type: mteb/mtop_intent name: MTEB MTOPIntentClassification (en) config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 72.04286365709075 - type: f1 value: 53.43867658525793 - task: type: Classification dataset: type: mteb/amazon_massive_intent name: MTEB MassiveIntentClassification (en) config: en split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 71.47276395427035 - type: f1 value: 69.77017399597342 - task: type: Classification dataset: type: mteb/amazon_massive_scenario name: MTEB MassiveScenarioClassification (en) config: en split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 76.3819771351715 - type: f1 value: 76.8484533435409 - task: type: Clustering dataset: type: mteb/medrxiv-clustering-p2p name: MTEB MedrxivClusteringP2P config: default split: test revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73 metrics: - type: v_measure value: 33.16515993299593 - task: type: Clustering 
dataset: type: mteb/medrxiv-clustering-s2s name: MTEB MedrxivClusteringS2S config: default split: test revision: 35191c8c0dca72d8ff3efcd72aa802307d469663 metrics: - type: v_measure value: 31.77145323314774 - task: type: Reranking dataset: type: mteb/mind_small name: MTEB MindSmallReranking config: default split: test revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69 metrics: - type: map value: 32.53637706586391 - type: mrr value: 33.7312926288863 - task: type: Retrieval dataset: type: nfcorpus name: MTEB NFCorpus config: default split: test revision: None metrics: - type: map_at_1 value: 7.063999999999999 - type: map_at_10 value: 15.046999999999999 - type: map_at_100 value: 19.116 - type: map_at_1000 value: 20.702 - type: map_at_3 value: 10.932 - type: map_at_5 value: 12.751999999999999 - type: mrr_at_1 value: 50.464 - type: mrr_at_10 value: 58.189 - type: mrr_at_100 value: 58.733999999999995 - type: mrr_at_1000 value: 58.769000000000005 - type: mrr_at_3 value: 56.24400000000001 - type: mrr_at_5 value: 57.68299999999999 - type: ndcg_at_1 value: 48.142 - type: ndcg_at_10 value: 37.897 - type: ndcg_at_100 value: 35.264 - type: ndcg_at_1000 value: 44.033 - type: ndcg_at_3 value: 42.967 - type: ndcg_at_5 value: 40.815 - type: precision_at_1 value: 50.15500000000001 - type: precision_at_10 value: 28.235 - type: precision_at_100 value: 8.994 - type: precision_at_1000 value: 2.218 - type: precision_at_3 value: 40.041 - type: precision_at_5 value: 35.046 - type: recall_at_1 value: 7.063999999999999 - type: recall_at_10 value: 18.598 - type: recall_at_100 value: 35.577999999999996 - type: recall_at_1000 value: 67.43 - type: recall_at_3 value: 11.562999999999999 - type: recall_at_5 value: 14.771 - task: type: Retrieval dataset: type: nq name: MTEB NQ config: default split: test revision: None metrics: - type: map_at_1 value: 29.046 - type: map_at_10 value: 44.808 - type: map_at_100 value: 45.898 - type: map_at_1000 value: 45.927 - type: map_at_3 value: 40.19 - type: 
map_at_5 value: 42.897 - type: mrr_at_1 value: 32.706 - type: mrr_at_10 value: 47.275 - type: mrr_at_100 value: 48.075 - type: mrr_at_1000 value: 48.095 - type: mrr_at_3 value: 43.463 - type: mrr_at_5 value: 45.741 - type: ndcg_at_1 value: 32.706 - type: ndcg_at_10 value: 52.835 - type: ndcg_at_100 value: 57.345 - type: ndcg_at_1000 value: 57.985 - type: ndcg_at_3 value: 44.171 - type: ndcg_at_5 value: 48.661 - type: precision_at_1 value: 32.706 - type: precision_at_10 value: 8.895999999999999 - type: precision_at_100 value: 1.143 - type: precision_at_1000 value: 0.12 - type: precision_at_3 value: 20.238999999999997 - type: precision_at_5 value: 14.728 - type: recall_at_1 value: 29.046 - type: recall_at_10 value: 74.831 - type: recall_at_100 value: 94.192 - type: recall_at_1000 value: 98.897 - type: recall_at_3 value: 52.37500000000001 - type: recall_at_5 value: 62.732 - task: type: Retrieval dataset: type: quora name: MTEB QuoraRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 70.38799999999999 - type: map_at_10 value: 84.315 - type: map_at_100 value: 84.955 - type: map_at_1000 value: 84.971 - type: map_at_3 value: 81.33399999999999 - type: map_at_5 value: 83.21300000000001 - type: mrr_at_1 value: 81.03 - type: mrr_at_10 value: 87.395 - type: mrr_at_100 value: 87.488 - type: mrr_at_1000 value: 87.48899999999999 - type: mrr_at_3 value: 86.41499999999999 - type: mrr_at_5 value: 87.074 - type: ndcg_at_1 value: 81.04 - type: ndcg_at_10 value: 88.151 - type: ndcg_at_100 value: 89.38199999999999 - type: ndcg_at_1000 value: 89.479 - type: ndcg_at_3 value: 85.24000000000001 - type: ndcg_at_5 value: 86.856 - type: precision_at_1 value: 81.04 - type: precision_at_10 value: 13.372 - type: precision_at_100 value: 1.526 - type: precision_at_1000 value: 0.157 - type: precision_at_3 value: 37.217 - type: precision_at_5 value: 24.502 - type: recall_at_1 value: 70.38799999999999 - type: recall_at_10 value: 95.452 - type: recall_at_100 value: 
99.59700000000001 - type: recall_at_1000 value: 99.988 - type: recall_at_3 value: 87.11 - type: recall_at_5 value: 91.662 - task: type: Clustering dataset: type: mteb/reddit-clustering name: MTEB RedditClustering config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: v_measure value: 59.334991029213235 - task: type: Clustering dataset: type: mteb/reddit-clustering-p2p name: MTEB RedditClusteringP2P config: default split: test revision: 282350215ef01743dc01b456c7f5241fa8937f16 metrics: - type: v_measure value: 62.586500854616666 - task: type: Retrieval dataset: type: scidocs name: MTEB SCIDOCS config: default split: test revision: None metrics: - type: map_at_1 value: 5.153 - type: map_at_10 value: 14.277000000000001 - type: map_at_100 value: 16.922 - type: map_at_1000 value: 17.302999999999997 - type: map_at_3 value: 9.961 - type: map_at_5 value: 12.257 - type: mrr_at_1 value: 25.4 - type: mrr_at_10 value: 37.458000000000006 - type: mrr_at_100 value: 38.681 - type: mrr_at_1000 value: 38.722 - type: mrr_at_3 value: 34.1 - type: mrr_at_5 value: 36.17 - type: ndcg_at_1 value: 25.4 - type: ndcg_at_10 value: 23.132 - type: ndcg_at_100 value: 32.908 - type: ndcg_at_1000 value: 38.754 - type: ndcg_at_3 value: 21.82 - type: ndcg_at_5 value: 19.353 - type: precision_at_1 value: 25.4 - type: precision_at_10 value: 12.1 - type: precision_at_100 value: 2.628 - type: precision_at_1000 value: 0.402 - type: precision_at_3 value: 20.732999999999997 - type: precision_at_5 value: 17.34 - type: recall_at_1 value: 5.153 - type: recall_at_10 value: 24.54 - type: recall_at_100 value: 53.293 - type: recall_at_1000 value: 81.57 - type: recall_at_3 value: 12.613 - type: recall_at_5 value: 17.577 - task: type: STS dataset: type: mteb/sickr-sts name: MTEB SICK-R config: default split: test revision: a6ea5a8cab320b040a23452cc28066d9beae2cee metrics: - type: cos_sim_pearson value: 84.86284404925333 - type: cos_sim_spearman value: 78.85870555294795 - 
type: euclidean_pearson value: 82.20105295276093 - type: euclidean_spearman value: 78.92125617009592 - type: manhattan_pearson value: 82.15840025289069 - type: manhattan_spearman value: 78.85955732900803 - task: type: STS dataset: type: mteb/sts12-sts name: MTEB STS12 config: default split: test revision: a0d554a64d88156834ff5ae9920b964011b16384 metrics: - type: cos_sim_pearson value: 84.98747423389027 - type: cos_sim_spearman value: 75.71298531799367 - type: euclidean_pearson value: 81.59709559192291 - type: euclidean_spearman value: 75.40622749225653 - type: manhattan_pearson value: 81.55553547608804 - type: manhattan_spearman value: 75.39380235424899 - task: type: STS dataset: type: mteb/sts13-sts name: MTEB STS13 config: default split: test revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca metrics: - type: cos_sim_pearson value: 83.76861330695503 - type: cos_sim_spearman value: 85.72991921531624 - type: euclidean_pearson value: 84.84504307397536 - type: euclidean_spearman value: 86.02679162824732 - type: manhattan_pearson value: 84.79969439220142 - type: manhattan_spearman value: 85.99238837291625 - task: type: STS dataset: type: mteb/sts14-sts name: MTEB STS14 config: default split: test revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 metrics: - type: cos_sim_pearson value: 83.31929747511796 - type: cos_sim_spearman value: 81.50806522502528 - type: euclidean_pearson value: 82.93936686512777 - type: euclidean_spearman value: 81.54403447993224 - type: manhattan_pearson value: 82.89696981900828 - type: manhattan_spearman value: 81.52817825470865 - task: type: STS dataset: type: mteb/sts15-sts name: MTEB STS15 config: default split: test revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 metrics: - type: cos_sim_pearson value: 87.14413295332908 - type: cos_sim_spearman value: 88.81032027008195 - type: euclidean_pearson value: 88.19205563407645 - type: euclidean_spearman value: 88.89738339479216 - type: manhattan_pearson value: 88.11075942004189 - type: 
manhattan_spearman value: 88.8297061675564 - task: type: STS dataset: type: mteb/sts16-sts name: MTEB STS16 config: default split: test revision: 4d8694f8f0e0100860b497b999b3dbed754a0513 metrics: - type: cos_sim_pearson value: 82.15980075557017 - type: cos_sim_spearman value: 83.81896308594801 - type: euclidean_pearson value: 83.11195254311338 - type: euclidean_spearman value: 84.10479481755407 - type: manhattan_pearson value: 83.13915225100556 - type: manhattan_spearman value: 84.09895591027859 - task: type: STS dataset: type: mteb/sts17-crosslingual-sts name: MTEB STS17 (en-en) config: en-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 87.93669480147919 - type: cos_sim_spearman value: 87.89861394614361 - type: euclidean_pearson value: 88.37316413202339 - type: euclidean_spearman value: 88.18033817842569 - type: manhattan_pearson value: 88.39427578879469 - type: manhattan_spearman value: 88.09185009236847 - task: type: STS dataset: type: mteb/sts22-crosslingual-sts name: MTEB STS22 (en) config: en split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 66.62215083348255 - type: cos_sim_spearman value: 67.33243665716736 - type: euclidean_pearson value: 67.60871701996284 - type: euclidean_spearman value: 66.75929225238659 - type: manhattan_pearson value: 67.63907838970992 - type: manhattan_spearman value: 66.79313656754846 - task: type: STS dataset: type: mteb/stsbenchmark-sts name: MTEB STSBenchmark config: default split: test revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 metrics: - type: cos_sim_pearson value: 84.65549191934764 - type: cos_sim_spearman value: 85.73266847750143 - type: euclidean_pearson value: 85.75609932254318 - type: euclidean_spearman value: 85.9452287759371 - type: manhattan_pearson value: 85.69717413063573 - type: manhattan_spearman value: 85.86546318377046 - task: type: Reranking dataset: type: mteb/scidocs-reranking name: MTEB 
SciDocsRR config: default split: test revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab metrics: - type: map value: 87.08164129085783 - type: mrr value: 96.2877273416489 - task: type: Retrieval dataset: type: scifact name: MTEB SciFact config: default split: test revision: None metrics: - type: map_at_1 value: 62.09400000000001 - type: map_at_10 value: 71.712 - type: map_at_100 value: 72.128 - type: map_at_1000 value: 72.14399999999999 - type: map_at_3 value: 68.93 - type: map_at_5 value: 70.694 - type: mrr_at_1 value: 65.0 - type: mrr_at_10 value: 72.572 - type: mrr_at_100 value: 72.842 - type: mrr_at_1000 value: 72.856 - type: mrr_at_3 value: 70.44399999999999 - type: mrr_at_5 value: 71.744 - type: ndcg_at_1 value: 65.0 - type: ndcg_at_10 value: 76.178 - type: ndcg_at_100 value: 77.887 - type: ndcg_at_1000 value: 78.227 - type: ndcg_at_3 value: 71.367 - type: ndcg_at_5 value: 73.938 - type: precision_at_1 value: 65.0 - type: precision_at_10 value: 10.033 - type: precision_at_100 value: 1.097 - type: precision_at_1000 value: 0.11199999999999999 - type: precision_at_3 value: 27.667 - type: precision_at_5 value: 18.4 - type: recall_at_1 value: 62.09400000000001 - type: recall_at_10 value: 89.022 - type: recall_at_100 value: 96.833 - type: recall_at_1000 value: 99.333 - type: recall_at_3 value: 75.922 - type: recall_at_5 value: 82.428 - task: type: PairClassification dataset: type: mteb/sprintduplicatequestions-pairclassification name: MTEB SprintDuplicateQuestions config: default split: test revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 metrics: - type: cos_sim_accuracy value: 99.82178217821782 - type: cos_sim_ap value: 95.71282508220798 - type: cos_sim_f1 value: 90.73120494335737 - type: cos_sim_precision value: 93.52441613588111 - type: cos_sim_recall value: 88.1 - type: dot_accuracy value: 99.73960396039604 - type: dot_ap value: 92.98534606529098 - type: dot_f1 value: 86.83024536805209 - type: dot_precision value: 86.96088264794383 - type: dot_recall value: 
86.7 - type: euclidean_accuracy value: 99.82475247524752 - type: euclidean_ap value: 95.72927039014849 - type: euclidean_f1 value: 90.89974293059126 - type: euclidean_precision value: 93.54497354497354 - type: euclidean_recall value: 88.4 - type: manhattan_accuracy value: 99.82574257425742 - type: manhattan_ap value: 95.72142177390405 - type: manhattan_f1 value: 91.00152516522625 - type: manhattan_precision value: 92.55429162357808 - type: manhattan_recall value: 89.5 - type: max_accuracy value: 99.82574257425742 - type: max_ap value: 95.72927039014849 - type: max_f1 value: 91.00152516522625 - task: type: Clustering dataset: type: mteb/stackexchange-clustering name: MTEB StackExchangeClustering config: default split: test revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 metrics: - type: v_measure value: 66.63957663468679 - task: type: Clustering dataset: type: mteb/stackexchange-clustering-p2p name: MTEB StackExchangeClusteringP2P config: default split: test revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: v_measure value: 36.003307257923964 - task: type: Reranking dataset: type: mteb/stackoverflowdupquestions-reranking name: MTEB StackOverflowDupQuestions config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - type: map value: 53.005825525863905 - type: mrr value: 53.854683919022165 - task: type: Summarization dataset: type: mteb/summeval name: MTEB SummEval config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cos_sim_pearson value: 30.503611569974098 - type: cos_sim_spearman value: 31.17155564248449 - type: dot_pearson value: 26.740428413981306 - type: dot_spearman value: 26.55727635469746 - task: type: Retrieval dataset: type: trec-covid name: MTEB TRECCOVID config: default split: test revision: None metrics: - type: map_at_1 value: 0.23600000000000002 - type: map_at_10 value: 1.7670000000000001 - type: map_at_100 value: 10.208 - type: map_at_1000 value: 
25.997999999999998 - type: map_at_3 value: 0.605 - type: map_at_5 value: 0.9560000000000001 - type: mrr_at_1 value: 84.0 - type: mrr_at_10 value: 90.167 - type: mrr_at_100 value: 90.167 - type: mrr_at_1000 value: 90.167 - type: mrr_at_3 value: 89.667 - type: mrr_at_5 value: 90.167 - type: ndcg_at_1 value: 77.0 - type: ndcg_at_10 value: 68.783 - type: ndcg_at_100 value: 54.196 - type: ndcg_at_1000 value: 52.077 - type: ndcg_at_3 value: 71.642 - type: ndcg_at_5 value: 70.45700000000001 - type: precision_at_1 value: 84.0 - type: precision_at_10 value: 73.0 - type: precision_at_100 value: 55.48 - type: precision_at_1000 value: 23.102 - type: precision_at_3 value: 76.0 - type: precision_at_5 value: 74.8 - type: recall_at_1 value: 0.23600000000000002 - type: recall_at_10 value: 1.9869999999999999 - type: recall_at_100 value: 13.749 - type: recall_at_1000 value: 50.157 - type: recall_at_3 value: 0.633 - type: recall_at_5 value: 1.0290000000000001 - task: type: Retrieval dataset: type: webis-touche2020 name: MTEB Touche2020 config: default split: test revision: None metrics: - type: map_at_1 value: 1.437 - type: map_at_10 value: 8.791 - type: map_at_100 value: 15.001999999999999 - type: map_at_1000 value: 16.549 - type: map_at_3 value: 3.8080000000000003 - type: map_at_5 value: 5.632000000000001 - type: mrr_at_1 value: 20.408 - type: mrr_at_10 value: 36.96 - type: mrr_at_100 value: 37.912 - type: mrr_at_1000 value: 37.912 - type: mrr_at_3 value: 29.592000000000002 - type: mrr_at_5 value: 34.489999999999995 - type: ndcg_at_1 value: 19.387999999999998 - type: ndcg_at_10 value: 22.554 - type: ndcg_at_100 value: 35.197 - type: ndcg_at_1000 value: 46.58 - type: ndcg_at_3 value: 20.285 - type: ndcg_at_5 value: 21.924 - type: precision_at_1 value: 20.408 - type: precision_at_10 value: 21.837 - type: precision_at_100 value: 7.754999999999999 - type: precision_at_1000 value: 1.537 - type: precision_at_3 value: 21.769 - type: precision_at_5 value: 23.673 - type: recall_at_1 value: 
1.437 - type: recall_at_10 value: 16.314999999999998 - type: recall_at_100 value: 47.635 - type: recall_at_1000 value: 82.963 - type: recall_at_3 value: 4.955 - type: recall_at_5 value: 8.805 - task: type: Classification dataset: type: mteb/toxic_conversations_50k name: MTEB ToxicConversationsClassification config: default split: test revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c metrics: - type: accuracy value: 71.6128 - type: ap value: 14.279639861175664 - type: f1 value: 54.922292491204274 - task: type: Classification dataset: type: mteb/tweet_sentiment_extraction name: MTEB TweetSentimentExtractionClassification config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 57.01188455008489 - type: f1 value: 57.377953019225515 - task: type: Clustering dataset: type: mteb/twentynewsgroups-clustering name: MTEB TwentyNewsgroupsClustering config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: v_measure value: 52.306769136544254 - task: type: PairClassification dataset: type: mteb/twittersemeval2015-pairclassification name: MTEB TwitterSemEval2015 config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 85.64701674912082 - type: cos_sim_ap value: 72.46600945328552 - type: cos_sim_f1 value: 67.96572367648784 - type: cos_sim_precision value: 61.21801649397336 - type: cos_sim_recall value: 76.38522427440633 - type: dot_accuracy value: 82.33295583238957 - type: dot_ap value: 62.54843443071716 - type: dot_f1 value: 60.38378562507096 - type: dot_precision value: 52.99980067769583 - type: dot_recall value: 70.15831134564644 - type: euclidean_accuracy value: 85.7423854085951 - type: euclidean_ap value: 72.76873850945174 - type: euclidean_f1 value: 68.23556960543262 - type: euclidean_precision value: 61.3344559040202 - type: euclidean_recall value: 76.88654353562005 - type: manhattan_accuracy value: 85.74834594981225 
- type: manhattan_ap value: 72.66825372446462 - type: manhattan_f1 value: 68.21539194662853 - type: manhattan_precision value: 62.185056472632496 - type: manhattan_recall value: 75.54089709762533 - type: max_accuracy value: 85.74834594981225 - type: max_ap value: 72.76873850945174 - type: max_f1 value: 68.23556960543262 - task: type: PairClassification dataset: type: mteb/twitterurlcorpus-pairclassification name: MTEB TwitterURLCorpus config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 88.73171110334924 - type: cos_sim_ap value: 85.51855542063649 - type: cos_sim_f1 value: 77.95706775700934 - type: cos_sim_precision value: 74.12524298805887 - type: cos_sim_recall value: 82.20665229442562 - type: dot_accuracy value: 86.94842240074514 - type: dot_ap value: 80.90995345771762 - type: dot_f1 value: 74.20765027322403 - type: dot_precision value: 70.42594385285575 - type: dot_recall value: 78.41854019094548 - type: euclidean_accuracy value: 88.73753250281368 - type: euclidean_ap value: 85.54712254033734 - type: euclidean_f1 value: 78.07565728654365 - type: euclidean_precision value: 75.1120597652081 - type: euclidean_recall value: 81.282722513089 - type: manhattan_accuracy value: 88.72588970388482 - type: manhattan_ap value: 85.52118291594071 - type: manhattan_f1 value: 78.04428724070593 - type: manhattan_precision value: 74.83219105490002 - type: manhattan_recall value: 81.54450261780106 - type: max_accuracy value: 88.73753250281368 - type: max_ap value: 85.54712254033734 - type: max_f1 value: 78.07565728654365 language: - en license: mit --- # gte-base General Text Embeddings (GTE) model. [Towards General Text Embeddings with Multi-stage Contrastive Learning](https://arxiv.org/abs/2308.03281) The GTE models are trained by Alibaba DAMO Academy. 
They are mainly based on the BERT framework and currently offer three different sizes of models, including [GTE-large](https://huggingface.co/thenlper/gte-large), [GTE-base](https://huggingface.co/thenlper/gte-base), and [GTE-small](https://huggingface.co/thenlper/gte-small). The GTE models are trained on a large-scale corpus of relevance text pairs, covering a wide range of domains and scenarios. This enables the GTE models to be applied to various downstream tasks of text embeddings, including **information retrieval**, **semantic textual similarity**, **text reranking**, etc. ## Metrics We compared the performance of the GTE models with other popular text embedding models on the MTEB benchmark. For more detailed comparison results, please refer to the [MTEB leaderboard](https://huggingface.co/spaces/mteb/leaderboard). | Model Name | Model Size (GB) | Dimension | Sequence Length | Average (56) | Clustering (11) | Pair Classification (3) | Reranking (4) | Retrieval (15) | STS (10) | Summarization (1) | Classification (12) | |:----:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:| | [**gte-large**](https://huggingface.co/thenlper/gte-large) | 0.67 | 1024 | 512 | **63.13** | 46.84 | 85.00 | 59.13 | 52.22 | 83.35 | 31.66 | 73.33 | | [**gte-base**](https://huggingface.co/thenlper/gte-base) | 0.22 | 768 | 512 | **62.39** | 46.2 | 84.57 | 58.61 | 51.14 | 82.3 | 31.17 | 73.01 | | [e5-large-v2](https://huggingface.co/intfloat/e5-large-v2) | 1.34 | 1024| 512 | 62.25 | 44.49 | 86.03 | 56.61 | 50.56 | 82.05 | 30.19 | 75.24 | | [e5-base-v2](https://huggingface.co/intfloat/e5-base-v2) | 0.44 | 768 | 512 | 61.5 | 43.80 | 85.73 | 55.91 | 50.29 | 81.05 | 30.28 | 73.84 | | [**gte-small**](https://huggingface.co/thenlper/gte-small) | 0.07 | 384 | 512 | **61.36** | 44.89 | 83.54 | 57.7 | 49.46 | 82.07 | 30.42 | 72.31 | | [text-embedding-ada-002](https://platform.openai.com/docs/guides/embeddings) | - | 1536 | 8192 | 60.99 | 45.9 | 84.89 | 56.32 | 49.25 | 80.97 | 
30.8 | 70.93 | | [e5-small-v2](https://huggingface.co/intfloat/e5-base-v2) | 0.13 | 384 | 512 | 59.93 | 39.92 | 84.67 | 54.32 | 49.04 | 80.39 | 31.16 | 72.94 | | [sentence-t5-xxl](https://huggingface.co/sentence-transformers/sentence-t5-xxl) | 9.73 | 768 | 512 | 59.51 | 43.72 | 85.06 | 56.42 | 42.24 | 82.63 | 30.08 | 73.42 | | [all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2) | 0.44 | 768 | 514 | 57.78 | 43.69 | 83.04 | 59.36 | 43.81 | 80.28 | 27.49 | 65.07 | | [sgpt-bloom-7b1-msmarco](https://huggingface.co/bigscience/sgpt-bloom-7b1-msmarco) | 28.27 | 4096 | 2048 | 57.59 | 38.93 | 81.9 | 55.65 | 48.22 | 77.74 | 33.6 | 66.19 | | [all-MiniLM-L12-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L12-v2) | 0.13 | 384 | 512 | 56.53 | 41.81 | 82.41 | 58.44 | 42.69 | 79.8 | 27.9 | 63.21 | | [all-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2) | 0.09 | 384 | 512 | 56.26 | 42.35 | 82.37 | 58.04 | 41.95 | 78.9 | 30.81 | 63.05 | | [contriever-base-msmarco](https://huggingface.co/nthakur/contriever-base-msmarco) | 0.44 | 768 | 512 | 56.00 | 41.1 | 82.54 | 53.14 | 41.88 | 76.51 | 30.36 | 66.68 | | [sentence-t5-base](https://huggingface.co/sentence-transformers/sentence-t5-base) | 0.22 | 768 | 512 | 55.27 | 40.21 | 85.18 | 53.09 | 33.63 | 81.14 | 31.39 | 69.81 | ## Usage Code example ```python import torch.nn.functional as F from torch import Tensor from transformers import AutoTokenizer, AutoModel def average_pool(last_hidden_states: Tensor, attention_mask: Tensor) -> Tensor: last_hidden = last_hidden_states.masked_fill(~attention_mask[..., None].bool(), 0.0) return last_hidden.sum(dim=1) / attention_mask.sum(dim=1)[..., None] input_texts = [ "what is the capital of China?", "how to implement quick sort in python?", "Beijing", "sorting algorithms" ] tokenizer = AutoTokenizer.from_pretrained("thenlper/gte-base") model = AutoModel.from_pretrained("thenlper/gte-base") # Tokenize the input texts batch_dict = 
tokenizer(input_texts, max_length=512, padding=True, truncation=True, return_tensors='pt')

outputs = model(**batch_dict)
embeddings = average_pool(outputs.last_hidden_state, batch_dict['attention_mask'])

# (Optionally) normalize embeddings
embeddings = F.normalize(embeddings, p=2, dim=1)
scores = (embeddings[:1] @ embeddings[1:].T) * 100
print(scores.tolist())
```

Use with sentence-transformers:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.util import cos_sim

sentences = ['That is a happy person', 'That is a very happy person']

model = SentenceTransformer('thenlper/gte-base')
embeddings = model.encode(sentences)
print(cos_sim(embeddings[0], embeddings[1]))
```

### Limitations

This model supports English text only, and any input longer than 512 tokens is truncated to that length.

### Citation

If you find our paper or models helpful, please consider citing them as follows:

```
@misc{li2023general,
  title={Towards General Text Embeddings with Multi-stage Contrastive Learning},
  author={Zehan Li and Xin Zhang and Yanzhao Zhang and Dingkun Long and Pengjun Xie and Meishan Zhang},
  year={2023},
  eprint={2308.03281},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
```
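For intuition, the `average_pool` helper in the usage example above is just a mask-aware mean over token embeddings. A minimal NumPy sketch of the same arithmetic, using made-up toy values (not real model outputs):

```python
import numpy as np

# Toy "last hidden states": 1 sequence, 3 tokens, hidden size 2 (made-up values).
hidden = np.array([[[1.0, 2.0],
                    [3.0, 4.0],
                    [5.0, 6.0]]])
# Attention mask: the third token is padding and must not contribute.
mask = np.array([[1.0, 1.0, 0.0]])

# Zero out padded positions, then divide by the number of real tokens --
# the same computation average_pool performs via masked_fill and sum.
masked = hidden * mask[..., None]
pooled = masked.sum(axis=1) / mask.sum(axis=1)[..., None]
print(pooled)  # [[2. 3.]] -- the mean of the two real token vectors
```

Because padded positions are zeroed before summing and the divisor counts only real tokens, padding never dilutes the sentence embedding.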
68,118
[ [ -0.039276123046875, -0.043121337890625, 0.020294189453125, 0.0181732177734375, -0.0165863037109375, -0.009033203125, -0.02227783203125, -0.0209503173828125, 0.033966064453125, 0.0059967041015625, -0.038909912109375, -0.05426025390625, -0.05560302734375, -0.0030841827392578125, -0.0137176513671875, 0.0743408203125, -0.0027408599853515625, -0.01340484619140625, 0.0004398822784423828, -0.0218353271484375, -0.015380859375, -0.03216552734375, -0.045501708984375, -0.011444091796875, 0.03143310546875, 0.020904541015625, 0.05682373046875, 0.049224853515625, 0.0240020751953125, 0.0281219482421875, -0.0152435302734375, -0.0015354156494140625, -0.0274200439453125, -0.012664794921875, 0.01276397705078125, -0.0272064208984375, -0.0347900390625, 0.008880615234375, 0.03668212890625, 0.0267333984375, -0.001811981201171875, 0.01186370849609375, 0.0185089111328125, 0.03192138671875, -0.0232696533203125, 0.0158233642578125, -0.016754150390625, 0.0084686279296875, -0.0064544677734375, 0.011444091796875, -0.0288238525390625, -0.0233917236328125, 0.0229644775390625, -0.032958984375, 0.0140380859375, 0.01258087158203125, 0.10443115234375, 0.01434326171875, -0.023529052734375, -0.03009033203125, -0.01169586181640625, 0.06719970703125, -0.06854248046875, 0.02392578125, 0.016845703125, -0.004764556884765625, -0.0005159378051757812, -0.06134033203125, -0.04296875, -0.003475189208984375, -0.044281005859375, 0.0207061767578125, -0.0193939208984375, -0.0014057159423828125, 0.020751953125, 0.03924560546875, -0.056640625, -0.0010614395141601562, -0.0028400421142578125, -0.0114898681640625, 0.04351806640625, 0.002040863037109375, 0.0379638671875, -0.041900634765625, -0.035308837890625, -0.0233154296875, -0.030029296875, 0.01462554931640625, 0.0191802978515625, -0.0015344619750976562, -0.04827880859375, 0.043243408203125, -0.006256103515625, 0.036346435546875, 0.009063720703125, -0.0013074874877929688, 0.052215576171875, -0.027191162109375, -0.023956298828125, -0.0267181396484375, 
0.08795166015625, 0.032379150390625, 0.0137786865234375, -0.00023758411407470703, -0.007579803466796875, -0.003570556640625, -0.01384735107421875, -0.0709228515625, -0.0216522216796875, 0.0151519775390625, -0.047210693359375, -0.0247650146484375, 0.0097198486328125, -0.0697021484375, -0.006099700927734375, -0.000728607177734375, 0.042633056640625, -0.049530029296875, -0.0108795166015625, 0.00565338134765625, -0.01320648193359375, 0.02288818359375, -0.005672454833984375, -0.06121826171875, 0.010040283203125, 0.0261688232421875, 0.069580078125, 0.01184844970703125, -0.0210113525390625, -0.0088043212890625, -0.00897979736328125, -0.0051422119140625, 0.03973388671875, -0.0247955322265625, -0.01035308837890625, -0.00043392181396484375, 0.00997161865234375, -0.0264434814453125, -0.0165863037109375, 0.058441162109375, -0.006134033203125, 0.045166015625, -0.01092529296875, -0.043792724609375, -0.01432037353515625, 0.0171356201171875, -0.046478271484375, 0.0941162109375, 0.007251739501953125, -0.0789794921875, 0.016571044921875, -0.045867919921875, -0.006072998046875, -0.031646728515625, -0.00836181640625, -0.05267333984375, -0.00878143310546875, 0.0404052734375, 0.04998779296875, -0.01375579833984375, 0.002391815185546875, -0.0128173828125, -0.0163726806640625, 0.00353240966796875, -0.0167083740234375, 0.07562255859375, 0.006938934326171875, -0.048248291015625, 0.01244354248046875, -0.047027587890625, 0.00878143310546875, 0.0244293212890625, -0.01142120361328125, -0.020233154296875, -0.007080078125, 0.008026123046875, 0.0325927734375, 0.0215606689453125, -0.03558349609375, 0.017730712890625, -0.036224365234375, 0.055511474609375, 0.059356689453125, -0.00027179718017578125, 0.02655029296875, -0.027557373046875, 0.015899658203125, 0.010955810546875, 0.022125244140625, -0.01065826416015625, -0.04638671875, -0.07012939453125, -0.038543701171875, 0.035125732421875, 0.047119140625, -0.0560302734375, 0.059661865234375, -0.031402587890625, -0.040618896484375, -0.045440673828125, 
0.0005631446838378906, 0.032623291015625, 0.0265350341796875, 0.037811279296875, -0.007709503173828125, -0.03558349609375, -0.08056640625, -0.0062408447265625, -0.002986907958984375, 0.0024089813232421875, 0.02838134765625, 0.056549072265625, -0.0230255126953125, 0.06304931640625, -0.046051025390625, -0.01146697998046875, -0.0162353515625, 0.01398468017578125, 0.03607177734375, 0.044647216796875, 0.061920166015625, -0.056640625, -0.0572509765625, -0.020538330078125, -0.062744140625, 0.00914764404296875, -0.0037021636962890625, -0.0254974365234375, 0.014678955078125, 0.02978515625, -0.0631103515625, 0.03564453125, 0.0401611328125, -0.046966552734375, 0.031341552734375, -0.022857666015625, 0.007904052734375, -0.10272216796875, 0.0028820037841796875, 0.01424407958984375, -0.017242431640625, -0.049102783203125, 0.00600433349609375, 0.004604339599609375, 0.01163482666015625, -0.025909423828125, 0.0482177734375, -0.0479736328125, 0.0172576904296875, 0.00948333740234375, 0.0277862548828125, -0.00511932373046875, 0.05914306640625, -0.00902557373046875, 0.051361083984375, 0.04168701171875, -0.027679443359375, 0.00885009765625, 0.035064697265625, -0.030792236328125, 0.038909912109375, -0.0528564453125, 0.0041961669921875, -0.00815582275390625, 0.019683837890625, -0.08221435546875, -0.0076141357421875, 0.031402587890625, -0.0479736328125, 0.0270233154296875, -0.001495361328125, -0.042388916015625, -0.048797607421875, -0.0531005859375, 0.0123138427734375, 0.03680419921875, -0.039337158203125, 0.0290374755859375, 0.016204833984375, -0.01107025146484375, -0.06146240234375, -0.062042236328125, -0.002086639404296875, -0.0193634033203125, -0.06134033203125, 0.04559326171875, -0.016082763671875, 0.00861358642578125, 0.01392364501953125, 0.00705718994140625, 0.003391265869140625, -0.00708770751953125, 0.0151519775390625, 0.026947021484375, -0.0191497802734375, -0.0021343231201171875, -0.00092315673828125, -0.008392333984375, -0.0103759765625, -0.003986358642578125, 0.05596923828125, 
-0.0212249755859375, -0.00469970703125, -0.03851318359375, 0.0199432373046875, 0.034149169921875, -0.004383087158203125, 0.069580078125, 0.06793212890625, -0.035552978515625, 0.00612640380859375, -0.038543701171875, -0.0085906982421875, -0.03790283203125, 0.0249786376953125, -0.034332275390625, -0.07025146484375, 0.054779052734375, 0.02496337890625, 0.01125335693359375, 0.0711669921875, 0.046051025390625, -0.00003546476364135742, 0.07965087890625, 0.04193115234375, -0.02398681640625, 0.047821044921875, -0.05340576171875, 0.017578125, -0.0703125, -0.025970458984375, -0.028564453125, -0.03863525390625, -0.06317138671875, -0.0325927734375, 0.0113983154296875, 0.015869140625, -0.0374755859375, 0.0438232421875, -0.046905517578125, 0.010528564453125, 0.040802001953125, 0.020538330078125, -0.0095672607421875, 0.0022563934326171875, -0.03424072265625, -0.01641845703125, -0.04833984375, -0.0299224853515625, 0.0657958984375, 0.04071044921875, 0.035858154296875, 0.00843048095703125, 0.052001953125, 0.010498046875, 0.0195159912109375, -0.048797607421875, 0.042633056640625, -0.01136016845703125, -0.047119140625, -0.0177154541015625, -0.039947509765625, -0.06640625, 0.027252197265625, -0.0241241455078125, -0.0694580078125, 0.01496124267578125, -0.0093994140625, -0.0297393798828125, 0.031494140625, -0.06134033203125, 0.07421875, 0.0004265308380126953, -0.023834228515625, -0.001972198486328125, -0.04974365234375, 0.0178985595703125, 0.039703369140625, 0.01166534423828125, 0.002399444580078125, -0.00687408447265625, 0.0650634765625, -0.036773681640625, 0.053375244140625, -0.01513671875, 0.006622314453125, 0.02508544921875, -0.018402099609375, 0.04522705078125, 0.006694793701171875, 0.0005669593811035156, 0.0028533935546875, -0.02008056640625, -0.03948974609375, -0.037200927734375, 0.0609130859375, -0.068115234375, -0.039520263671875, -0.044158935546875, -0.0274505615234375, -0.007114410400390625, 0.016357421875, 0.040618896484375, 0.032745361328125, -0.0005631446838378906, 
0.03143310546875, 0.049346923828125, -0.029876708984375, 0.059539794921875, -0.0006723403930664062, 0.004730224609375, -0.05078125, 0.06304931640625, 0.004177093505859375, 0.0045318603515625, 0.030731201171875, 0.00482940673828125, -0.03216552734375, -0.0246124267578125, -0.0299224853515625, 0.0423583984375, -0.04449462890625, -0.01143646240234375, -0.060791015625, -0.032012939453125, -0.037841796875, -0.01027679443359375, -0.01739501953125, -0.035614013671875, -0.040771484375, -0.0202789306640625, 0.0263214111328125, 0.05450439453125, -0.00516510009765625, 0.0161285400390625, -0.03656005859375, 0.0222625732421875, 0.0158233642578125, 0.0293121337890625, -0.000194549560546875, -0.054534912109375, -0.02581787109375, -0.005489349365234375, -0.0264739990234375, -0.0689697265625, 0.03973388671875, 0.005840301513671875, 0.050537109375, 0.0253753662109375, -0.0131378173828125, 0.052215576171875, -0.03436279296875, 0.07061767578125, 0.0257720947265625, -0.07305908203125, 0.0294342041015625, -0.01496124267578125, 0.01174163818359375, 0.031829833984375, 0.03582763671875, -0.04132080078125, -0.022979736328125, -0.062255859375, -0.07965087890625, 0.04913330078125, 0.031585693359375, 0.008544921875, -0.0082244873046875, 0.024993896484375, -0.01065826416015625, 0.00824737548828125, -0.07159423828125, -0.059326171875, -0.0283966064453125, -0.0428466796875, -0.012420654296875, -0.026641845703125, 0.00504302978515625, -0.0288848876953125, 0.054718017578125, -0.00013828277587890625, 0.06317138671875, 0.0258331298828125, -0.0193939208984375, 0.028564453125, 0.0113067626953125, 0.055511474609375, 0.018402099609375, -0.01285552978515625, -0.0021495819091796875, 0.017608642578125, -0.041412353515625, 0.006256103515625, 0.0174560546875, -0.01166534423828125, 0.006954193115234375, 0.0280914306640625, 0.06744384765625, 0.017364501953125, -0.0205230712890625, 0.05316162109375, -0.009552001953125, -0.02862548828125, -0.019500732421875, -0.007320404052734375, 0.0164031982421875, 
0.0213775634765625, 0.0257568359375, -0.0031280517578125, 0.003597259521484375, -0.037017822265625, 0.02728271484375, 0.025360107421875, -0.036529541015625, -0.0211639404296875, 0.051666259765625, -0.0003597736358642578, -0.0019779205322265625, 0.040283203125, -0.0201416015625, -0.0504150390625, 0.046417236328125, 0.032867431640625, 0.06640625, -0.00659942626953125, 0.0231781005859375, 0.05584716796875, 0.0293731689453125, -0.0003046989440917969, 0.015045166015625, 0.025482177734375, -0.04144287109375, -0.0234832763671875, -0.034088134765625, 0.0006513595581054688, 0.0167236328125, -0.0377197265625, 0.0256805419921875, -0.0273895263671875, -0.0160064697265625, -0.007297515869140625, 0.0162200927734375, -0.056304931640625, 0.0172119140625, 0.007312774658203125, 0.07464599609375, -0.057952880859375, 0.061126708984375, 0.0489501953125, -0.044403076171875, -0.0596923828125, -0.00043463706970214844, -0.00872802734375, -0.057220458984375, 0.0338134765625, 0.02392578125, 0.0182952880859375, 0.00778961181640625, -0.033966064453125, -0.0673828125, 0.1063232421875, 0.015838623046875, -0.025634765625, -0.0199737548828125, 0.0031986236572265625, 0.038299560546875, -0.0279083251953125, 0.05865478515625, 0.03045654296875, 0.037689208984375, -0.01451873779296875, -0.0491943359375, 0.02496337890625, -0.03680419921875, 0.0042572021484375, 0.0037384033203125, -0.07965087890625, 0.081787109375, -0.003940582275390625, -0.00897979736328125, 0.01446533203125, 0.059906005859375, 0.0160064697265625, 0.0006470680236816406, 0.037811279296875, 0.06982421875, 0.045166015625, -0.02838134765625, 0.08538818359375, -0.023956298828125, 0.05615234375, 0.052459716796875, 0.0306396484375, 0.06146240234375, 0.0199737548828125, -0.01267242431640625, 0.065673828125, 0.05450439453125, -0.0035114288330078125, 0.039459228515625, 0.00962066650390625, -0.0019140243530273438, -0.00394439697265625, 0.01275634765625, -0.0330810546875, 0.009124755859375, 0.0229644775390625, -0.03814697265625, -0.005279541015625, 
-0.0028781890869140625, 0.0252685546875, -0.006328582763671875, -0.005626678466796875, 0.0382080078125, 0.0113983154296875, -0.022918701171875, 0.042938232421875, 0.006793975830078125, 0.082275390625, -0.0251922607421875, 0.0120086669921875, -0.01629638671875, 0.008697509765625, -0.0160980224609375, -0.0716552734375, 0.00942230224609375, -0.002857208251953125, -0.0159912109375, -0.01340484619140625, 0.042022705078125, -0.0312042236328125, -0.0278167724609375, 0.034820556640625, 0.032745361328125, -0.0029239654541015625, 0.0072174072265625, -0.08740234375, -0.00540924072265625, 0.01195526123046875, -0.059814453125, 0.017333984375, 0.029937744140625, 0.0155029296875, 0.03961181640625, 0.028717041015625, -0.0149383544921875, 0.020599365234375, -0.0038738250732421875, 0.0574951171875, -0.06787109375, -0.03729248046875, -0.07183837890625, 0.061248779296875, -0.0273895263671875, -0.036346435546875, 0.054840087890625, 0.049072265625, 0.04656982421875, -0.007213592529296875, 0.04913330078125, -0.0283966064453125, 0.034820556640625, -0.0341796875, 0.056640625, -0.054229736328125, -0.0119171142578125, -0.02294921875, -0.06304931640625, -0.0229339599609375, 0.05889892578125, -0.024871826171875, 0.015380859375, 0.06060791015625, 0.05328369140625, 0.00028061866760253906, -0.01541900634765625, 0.00341033935546875, 0.035919189453125, 0.02410888671875, 0.06585693359375, 0.0361328125, -0.0694580078125, 0.049957275390625, -0.034423828125, -0.011322021484375, -0.02252197265625, -0.059234619140625, -0.08184814453125, -0.054107666015625, -0.0305023193359375, -0.026702880859375, -0.01263427734375, 0.06793212890625, 0.04998779296875, -0.06134033203125, -0.00011843442916870117, -0.0029239654541015625, 0.01131439208984375, -0.006397247314453125, -0.0251312255859375, 0.047637939453125, -0.0072021484375, -0.07452392578125, 0.004467010498046875, 0.001369476318359375, 0.018585205078125, 0.0007948875427246094, -0.0169525146484375, -0.037261962890625, 0.00353240966796875, 0.04705810546875, 
0.01358795166015625, -0.047760009765625, -0.037841796875, 0.0078887939453125, -0.035797119140625, 0.0151214599609375, 0.0247955322265625, -0.032867431640625, 0.0007781982421875, 0.0513916015625, 0.0262298583984375, 0.050048828125, -0.0108795166015625, 0.00887298583984375, -0.05474853515625, 0.0227203369140625, 0.002483367919921875, 0.044189453125, 0.0200347900390625, -0.018341064453125, 0.048553466796875, 0.02618408203125, -0.042236328125, -0.053619384765625, -0.00939178466796875, -0.08355712890625, -0.01019287109375, 0.07373046875, -0.021881103515625, -0.0232696533203125, 0.01947021484375, -0.0212554931640625, 0.025390625, -0.0303192138671875, 0.043182373046875, 0.06622314453125, -0.0001418590545654297, -0.026031494140625, -0.045379638671875, 0.032867431640625, 0.03741455078125, -0.055572509765625, -0.026519775390625, 0.007480621337890625, 0.0240478515625, 0.0292510986328125, 0.039825439453125, -0.0107421875, 0.0062408447265625, 0.0006356239318847656, 0.011077880859375, 0.00785064697265625, -0.005279541015625, -0.01141357421875, 0.01032257080078125, -0.017608642578125, -0.0185394287109375 ] ]
stabilityai/stable-diffusion-x4-upscaler
2023-07-05T16:19:13.000Z
[ "diffusers", "stable-diffusion", "arxiv:2112.10752", "arxiv:2202.00512", "arxiv:1910.09700", "license:openrail++", "has_space", "diffusers:StableDiffusionUpscalePipeline", "region:us" ]
null
stabilityai
null
null
stabilityai/stable-diffusion-x4-upscaler
502
233,214
diffusers
2022-11-23T17:42:04
--- license: openrail++ tags: - stable-diffusion inference: false --- # Stable Diffusion x4 upscaler model card This model card focuses on the model associated with the Stable Diffusion Upscaler, available [here](https://github.com/Stability-AI/stablediffusion). This model is trained for 1.25M steps on a 10M subset of LAION containing images `>2048x2048`. The model was trained on crops of size `512x512` and is a text-guided [latent upscaling diffusion model](https://arxiv.org/abs/2112.10752). In addition to the textual input, it receives a `noise_level` as an input parameter, which can be used to add noise to the low-resolution input according to a [predefined diffusion schedule](configs/stable-diffusion/x4-upscaling.yaml). ![Image](https://github.com/Stability-AI/stablediffusion/raw/main/assets/stable-samples/upscaling/merged-dog.png) - Use it with the [`stablediffusion`](https://github.com/Stability-AI/stablediffusion) repository: download the `x4-upscaler-ema.ckpt` [here](https://huggingface.co/stabilityai/stable-diffusion-x4-upscaler/resolve/main/x4-upscaler-ema.ckpt). - Use it with 🧨 [`diffusers`](https://huggingface.co/stabilityai/stable-diffusion-x4-upscaler#examples) ## Model Details - **Developed by:** Robin Rombach, Patrick Esser - **Model type:** Diffusion-based text-to-image generation model - **Language(s):** English - **License:** [CreativeML Open RAIL++-M License](https://huggingface.co/stabilityai/stable-diffusion-2/blob/main/LICENSE-MODEL) - **Model Description:** This is a model that can be used to generate and modify images based on text prompts. It is a [Latent Diffusion Model](https://arxiv.org/abs/2112.10752) that uses a fixed, pretrained text encoder ([OpenCLIP-ViT/H](https://github.com/mlfoundations/open_clip)). - **Resources for more information:** [GitHub Repository](https://github.com/Stability-AI/). 
- **Cite as:**

      @InProceedings{Rombach_2022_CVPR,
          author    = {Rombach, Robin and Blattmann, Andreas and Lorenz, Dominik and Esser, Patrick and Ommer, Bj\"orn},
          title     = {High-Resolution Image Synthesis With Latent Diffusion Models},
          booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
          month     = {June},
          year      = {2022},
          pages     = {10684-10695}
      }

## Examples

Use the [🤗 Diffusers library](https://github.com/huggingface/diffusers) to run Stable Diffusion 2 in a simple and efficient manner.

```bash
pip install diffusers transformers accelerate scipy safetensors
```

```python
import requests
from PIL import Image
from io import BytesIO
from diffusers import StableDiffusionUpscalePipeline
import torch

# load model and scheduler
model_id = "stabilityai/stable-diffusion-x4-upscaler"
pipeline = StableDiffusionUpscalePipeline.from_pretrained(model_id, torch_dtype=torch.float16)
pipeline = pipeline.to("cuda")

# let's download an image
url = "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main/sd2-upscale/low_res_cat.png"
response = requests.get(url)
low_res_img = Image.open(BytesIO(response.content)).convert("RGB")
low_res_img = low_res_img.resize((128, 128))

prompt = "a white cat"

upscaled_image = pipeline(prompt=prompt, image=low_res_img).images[0]
upscaled_image.save("upsampled_cat.png")
```

**Notes**:
- Despite not being a dependency, we highly recommend you install [xformers](https://github.com/facebookresearch/xformers) for memory-efficient attention (better performance).
- If you have limited GPU RAM, add `pipeline.enable_attention_slicing()` after moving the pipeline to `cuda` to reduce VRAM usage (at the cost of speed).

# Uses

## Direct Use

The model is intended for research purposes only. Possible research areas and tasks include

- Safe deployment of models which have the potential to generate harmful content.
- Probing and understanding the limitations and biases of generative models.
- Generation of artworks and use in design and other artistic processes.
- Applications in educational or creative tools.
- Research on generative models.

Excluded uses are described below.

### Misuse, Malicious Use, and Out-of-Scope Use

_Note: This section was originally taken from the [DALLE-MINI model card](https://huggingface.co/dalle-mini/dalle-mini) and used for Stable Diffusion v1; it applies in the same way to Stable Diffusion v2._

The model should not be used to intentionally create or disseminate images that create hostile or alienating environments for people. This includes generating images that people would foreseeably find disturbing, distressing, or offensive, as well as content that propagates historical or current stereotypes.

#### Out-of-Scope Use

The model was not trained to produce factual or true representations of people or events, so using the model to generate such content is out of scope for its abilities.

#### Misuse and Malicious Use

Using the model to generate content that is cruel to individuals is a misuse of this model. This includes, but is not limited to:

- Generating demeaning, dehumanizing, or otherwise harmful representations of people or their environments, cultures, religions, etc.
- Intentionally promoting or propagating discriminatory content or harmful stereotypes.
- Impersonating individuals without their consent.
- Sexual content without the consent of the people who might see it.
- Mis- and disinformation.
- Representations of egregious violence and gore.
- Sharing of copyrighted or licensed material in violation of its terms of use.
- Sharing content that is an alteration of copyrighted or licensed material in violation of its terms of use.
## Limitations and Bias

### Limitations

- The model does not achieve perfect photorealism.
- The model cannot render legible text.
- The model does not perform well on more difficult tasks involving compositionality, such as rendering an image corresponding to “A red cube on top of a blue sphere”.
- Faces and people in general may not be generated properly.
- The model was trained mainly with English captions and will not work as well in other languages.
- The autoencoding part of the model is lossy.
- The model was trained on a subset of the large-scale dataset [LAION-5B](https://laion.ai/blog/laion-5b/), which contains adult, violent and sexual content. To partially mitigate this, we have filtered the dataset using LAION's NSFW detector (see Training section).

### Bias

While the capabilities of image generation models are impressive, they can also reinforce or exacerbate social biases. Stable Diffusion v2 was primarily trained on subsets of [LAION-2B(en)](https://laion.ai/blog/laion-5b/), which consists of images limited to English descriptions. Texts and images from communities and cultures that use other languages are likely to be insufficiently accounted for. This affects the overall output of the model, as white and western cultures are often set as the default. Further, the model's ability to generate content with non-English prompts is significantly worse than with English-language prompts. Stable Diffusion v2 mirrors and exacerbates biases to such a degree that viewer discretion must be advised irrespective of the input or its intent.

## Training

**Training Data**

The model developers used the following dataset for training the model:

- LAION-5B and subsets (details below). The training data is further filtered using LAION's NSFW detector, with a "p_unsafe" score of 0.1 (conservative). For more details, please refer to LAION-5B's [NeurIPS 2022](https://openreview.net/forum?id=M3Y74vmsMcY) paper and reviewer discussions on the topic.
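As a minimal illustration of the `p_unsafe` thresholding described above (a sketch only: the record fields and scores below are invented, and the real pipeline uses LAION's CLIP-based NSFW detector over the full dataset):

```python
# Hypothetical sketch of threshold-based filtering in the spirit of the
# "p_unsafe" score described above. Record fields and scores are invented
# for illustration; this is not the actual LAION filtering code.

def filter_records(records, p_unsafe_threshold=0.1):
    """Keep only records whose predicted unsafe probability is below the threshold."""
    return [r for r in records if r["p_unsafe"] < p_unsafe_threshold]

records = [
    {"url": "a.jpg", "p_unsafe": 0.02},
    {"url": "b.jpg", "p_unsafe": 0.35},
    {"url": "c.jpg", "p_unsafe": 0.09},
]

kept = filter_records(records)
print([r["url"] for r in kept])  # ['a.jpg', 'c.jpg']
```

A lower threshold is more conservative: with `p_unsafe_threshold=0.1`, any record the detector scores at 10% likelihood of being unsafe or higher is dropped.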
**Training Procedure**

Stable Diffusion v2 is a latent diffusion model which combines an autoencoder with a diffusion model that is trained in the latent space of the autoencoder. During training,

- Images are encoded through an encoder, which turns images into latent representations. The autoencoder uses a relative downsampling factor of f = 8 and maps images of shape H x W x 3 to latents of shape H/f x W/f x 4.
- Text prompts are encoded through the OpenCLIP-ViT/H text encoder.
- The output of the text encoder is fed into the UNet backbone of the latent diffusion model via cross-attention.
- The loss is a reconstruction objective between the noise that was added to the latent and the prediction made by the UNet. We also use the so-called _v-objective_; see https://arxiv.org/abs/2202.00512.

We currently provide the following checkpoints:

- `512-base-ema.ckpt`: 550k steps at resolution `256x256` on a subset of [LAION-5B](https://laion.ai/blog/laion-5b/) filtered for explicit pornographic material, using the [LAION-NSFW classifier](https://github.com/LAION-AI/CLIP-based-NSFW-Detector) with `punsafe=0.1` and an [aesthetic score](https://github.com/christophschuhmann/improved-aesthetic-predictor) >= `4.5`. 850k steps at resolution `512x512` on the same dataset with resolution `>= 512x512`.
- `768-v-ema.ckpt`: Resumed from `512-base-ema.ckpt` and trained for 150k steps using a [v-objective](https://arxiv.org/abs/2202.00512) on the same dataset. Resumed for another 140k steps on a `768x768` subset of our dataset.
- `512-depth-ema.ckpt`: Resumed from `512-base-ema.ckpt` and finetuned for 200k steps. Added an extra input channel to process the (relative) depth prediction produced by [MiDaS](https://github.com/isl-org/MiDaS) (`dpt_hybrid`), which is used as additional conditioning. The additional input channels of the U-Net which process this extra information were zero-initialized.
- `512-inpainting-ema.ckpt`: Resumed from `512-base-ema.ckpt` and trained for another 200k steps. Follows the mask-generation strategy presented in [LAMA](https://github.com/saic-mdal/lama) which, in combination with the latent VAE representations of the masked image, is used as additional conditioning. The additional input channels of the U-Net which process this extra information were zero-initialized. The same strategy was used to train the [1.5-inpainting checkpoint](https://github.com/saic-mdal/lama).
- `x4-upscaling-ema.ckpt`: Trained for 1.25M steps on a 10M subset of LAION containing images `>2048x2048`. The model was trained on crops of size `512x512` and is a text-guided [latent upscaling diffusion model](https://arxiv.org/abs/2112.10752). In addition to the textual input, it receives a `noise_level` as an input parameter, which can be used to add noise to the low-resolution input according to a [predefined diffusion schedule](configs/stable-diffusion/x4-upscaling.yaml).

- **Hardware:** 32 x 8 x A100 GPUs
- **Optimizer:** AdamW
- **Gradient Accumulations**: 1
- **Batch:** 32 x 8 x 2 x 4 = 2048
- **Learning rate:** warmup to 0.0001 for 10,000 steps and then kept constant

## Evaluation Results

Evaluations with different classifier-free guidance scales (1.5, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0) and 50 DDIM sampling steps show the relative improvements of the checkpoints:

![pareto](model-variants.jpg)

Evaluated using 50 DDIM steps and 10000 random prompts from the COCO2017 validation set at 512x512 resolution. Not optimized for FID scores.

## Environmental Impact

**Stable Diffusion v1**

**Estimated Emissions**

Based on that information, we estimate the following CO2 emissions using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). The hardware, runtime, cloud provider, and compute region were utilized to estimate the carbon impact.
- **Hardware Type:** A100 PCIe 40GB
- **Hours used:** 200000
- **Cloud Provider:** AWS
- **Compute Region:** US-east
- **Carbon Emitted (Power consumption x Time x Carbon produced based on location of power grid):** 15000 kg CO2 eq.

## Citation

```bibtex
@InProceedings{Rombach_2022_CVPR,
    author    = {Rombach, Robin and Blattmann, Andreas and Lorenz, Dominik and Esser, Patrick and Ommer, Bj\"orn},
    title     = {High-Resolution Image Synthesis With Latent Diffusion Models},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2022},
    pages     = {10684-10695}
}
```

*This model card was written by: Robin Rombach, Patrick Esser and David Ha and is based on the [Stable Diffusion v1](https://github.com/CompVis/stable-diffusion/blob/main/Stable_Diffusion_v1_Model_Card.md) and [DALL-E Mini model card](https://huggingface.co/dalle-mini/dalle-mini).*
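The **Carbon Emitted** figure above follows the stated formula (power consumption x time x carbon intensity of the power grid) and can be sanity-checked with a quick back-of-the-envelope calculation. The per-GPU power draw and grid intensity below are assumed round numbers chosen for illustration, not values from the source:

```python
# Back-of-the-envelope CO2 estimate: energy (kWh) x grid carbon intensity (kg CO2eq/kWh).
# Assumed values (not from the source): 0.25 kW average draw for an A100 PCIe 40GB,
# 0.3 kg CO2eq per kWh for the compute region's power grid.
power_kw = 0.25
hours_used = 200_000
grid_intensity = 0.3  # kg CO2eq per kWh

energy_kwh = power_kw * hours_used
co2_kg = energy_kwh * grid_intensity
print(f"{co2_kg:,.0f} kg CO2 eq.")  # 15,000 kg CO2 eq.
```

With these assumptions the estimate lands on the card's reported 15000 kg CO2 eq.; the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) performs the same kind of computation with region-specific grid data.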
google/flan-t5-xxl
2023-07-27T11:42:14.000Z
[ "transformers", "pytorch", "tf", "jax", "safetensors", "t5", "text2text-generation", "en", "fr", "ro", "de", "multilingual", "dataset:svakulenk0/qrecc", "dataset:taskmaster2", "dataset:djaym7/wiki_dialog", "dataset:deepmind/code_contests", "dataset:lambada", "dataset:gsm8k", "dataset:aqua_rat", "dataset:esnli", "dataset:quasc", "dataset:qed", "arxiv:2210.11416", "arxiv:1910.09700", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
text2text-generation
google
google/flan-t5-xxl
967
231,423
transformers
2022-10-21T15:54:59
--- language: - en - fr - ro - de - multilingual widget: - text: "Translate to German: My name is Arthur" example_title: "Translation" - text: "Please answer to the following question. Who is going to be the next Ballon d'or?" example_title: "Question Answering" - text: "Q: Can Geoffrey Hinton have a conversation with George Washington? Give the rationale before answering." example_title: "Logical reasoning" - text: "Please answer the following question. What is the boiling point of Nitrogen?" example_title: "Scientific knowledge" - text: "Answer the following yes/no question. Can you write a whole Haiku in a single tweet?" example_title: "Yes/no question" - text: "Answer the following yes/no question by reasoning step-by-step. Can you write a whole Haiku in a single tweet?" example_title: "Reasoning task" - text: "Q: ( False or not False or False ) is? A: Let's think step by step" example_title: "Boolean Expressions" - text: "The square root of x is the cube root of y. What is y to the power of 2, if x = 4?" example_title: "Math reasoning" - text: "Premise: At my age you will probably have learnt one lesson. Hypothesis: It's not certain how many lessons you'll learn by your thirties. Does the premise entail the hypothesis?" example_title: "Premise and hypothesis" tags: - text2text-generation datasets: - svakulenk0/qrecc - taskmaster2 - djaym7/wiki_dialog - deepmind/code_contests - lambada - gsm8k - aqua_rat - esnli - quasc - qed license: apache-2.0 --- # Model Card for FLAN-T5 XXL <img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/flan2_architecture.jpg" alt="drawing" width="600"/> # Table of Contents 0. [TL;DR](#TL;DR) 1. [Model Details](#model-details) 2. [Usage](#usage) 3. [Uses](#uses) 4. [Bias, Risks, and Limitations](#bias-risks-and-limitations) 5. [Training Details](#training-details) 6. [Evaluation](#evaluation) 7. [Environmental Impact](#environmental-impact) 8. 
[Citation](#citation)

# TL;DR

If you already know T5, FLAN-T5 is just better at everything. For the same number of parameters, these models have been fine-tuned on more than 1000 additional tasks covering more languages as well. As mentioned in the first few lines of the abstract:

> Flan-PaLM 540B achieves state-of-the-art performance on several benchmarks, such as 75.2% on five-shot MMLU. We also publicly release Flan-T5 checkpoints, which achieve strong few-shot performance even compared to much larger models, such as PaLM 62B. Overall, instruction finetuning is a general method for improving the performance and usability of pretrained language models.

**Disclaimer**: Content from **this** model card has been written by the Hugging Face team, and parts of it were copy-pasted from the [T5 model card](https://huggingface.co/t5-large).

# Model Details

## Model Description

- **Model type:** Language model
- **Language(s) (NLP):** English, German, French
- **License:** Apache 2.0
- **Related Models:** [All FLAN-T5 Checkpoints](https://huggingface.co/models?search=flan-t5)
- **Original Checkpoints:** [All Original FLAN-T5 Checkpoints](https://github.com/google-research/t5x/blob/main/docs/models.md#flan-t5-checkpoints)
- **Resources for more information:**
  - [Research paper](https://arxiv.org/pdf/2210.11416.pdf)
  - [GitHub Repo](https://github.com/google-research/t5x)
  - [Hugging Face FLAN-T5 Docs (Similar to T5)](https://huggingface.co/docs/transformers/model_doc/t5)

# Usage

Find below some example scripts on how to use the model in `transformers`:

## Using the Pytorch model

### Running the model on a CPU

<details>
<summary> Click to expand </summary>

```python
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-xxl")
model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-xxl")

input_text = "translate English to German: How old are you?"
input_ids = tokenizer(input_text, return_tensors="pt").input_ids
outputs = model.generate(input_ids)
print(tokenizer.decode(outputs[0]))
```

</details>

### Running the model on a GPU

<details>
<summary> Click to expand </summary>

```python
# pip install accelerate
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-xxl")
model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-xxl", device_map="auto")

input_text = "translate English to German: How old are you?"
input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda")

outputs = model.generate(input_ids)
print(tokenizer.decode(outputs[0]))
```

</details>

### Running the model on a GPU using different precisions

#### FP16

<details>
<summary> Click to expand </summary>

```python
# pip install accelerate
import torch
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-xxl")
model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-xxl", device_map="auto", torch_dtype=torch.float16)

input_text = "translate English to German: How old are you?"
input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda")

outputs = model.generate(input_ids)
print(tokenizer.decode(outputs[0]))
```

</details>

#### INT8

<details>
<summary> Click to expand </summary>

```python
# pip install bitsandbytes accelerate
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-xxl")
model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-xxl", device_map="auto", load_in_8bit=True)

input_text = "translate English to German: How old are you?"
input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda")
outputs = model.generate(input_ids)
print(tokenizer.decode(outputs[0]))
```

</details>

# Uses

## Direct Use and Downstream Use

The authors write in [the original paper's model card](https://arxiv.org/pdf/2210.11416.pdf) that:

> The primary use is research on language models, including: research on zero-shot NLP tasks and in-context few-shot learning NLP tasks, such as reasoning, and question answering; advancing fairness and safety research, and understanding limitations of current large language models

See the [research paper](https://arxiv.org/pdf/2210.11416.pdf) for further details.

## Out-of-Scope Use

More information needed.

# Bias, Risks, and Limitations

The information in this section is copied from the model's [official model card](https://arxiv.org/pdf/2210.11416.pdf):

> Language models, including Flan-T5, can potentially be used for language generation in a harmful way, according to Rae et al. (2021). Flan-T5 should not be used directly in any application without a prior assessment of safety and fairness concerns specific to the application.

## Ethical considerations and risks

> Flan-T5 is fine-tuned on a large corpus of text data that was not filtered for explicit content or assessed for existing biases. As a result, the model itself is potentially vulnerable to generating equivalently inappropriate content or replicating inherent biases in the underlying data.

## Known Limitations

> Flan-T5 has not been tested in real world applications.

## Sensitive Use

> Flan-T5 should not be applied for any unacceptable use cases, e.g., generation of abusive speech.
# Training Details

## Training Data

The model was trained on a mixture of tasks that includes the tasks described in the table below (from the original paper, figure 2):

![table.png](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/flan_t5_tasks.png)

## Training Procedure

According to the model card from the [original paper](https://arxiv.org/pdf/2210.11416.pdf):

> These models are based on pretrained T5 (Raffel et al., 2020) and fine-tuned with instructions for better zero-shot and few-shot performance. There is one fine-tuned Flan model per T5 model size.

The model has been trained on TPU v3 or TPU v4 pods, using the [`t5x`](https://github.com/google-research/t5x) codebase together with [`jax`](https://github.com/google/jax).

# Evaluation

## Testing Data, Factors & Metrics

The authors evaluated the model on various tasks covering several languages (1836 tasks in total). See the table below for some quantitative evaluation:

![image.png](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/flan_t5_evals_lang.png)

For full details, please check the [research paper](https://arxiv.org/pdf/2210.11416.pdf).

## Results

For full results for FLAN-T5-XXL, see the [research paper](https://arxiv.org/pdf/2210.11416.pdf), Table 3.

# Environmental Impact

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- **Hardware Type:** Google Cloud TPU Pods - TPU v3 or TPU v4 | Number of chips ≥ 4.
- **Hours used:** More information needed
- **Cloud Provider:** GCP
- **Compute Region:** More information needed
- **Carbon Emitted:** More information needed

# Citation

**BibTeX:**

```bibtex
@misc{https://doi.org/10.48550/arxiv.2210.11416,
  doi       = {10.48550/ARXIV.2210.11416},
  url       = {https://arxiv.org/abs/2210.11416},
  author    = {Chung, Hyung Won and Hou, Le and Longpre, Shayne and Zoph, Barret and Tay, Yi and Fedus, William and Li, Eric and Wang, Xuezhi and Dehghani, Mostafa and Brahma, Siddhartha and Webson, Albert and Gu, Shixiang Shane and Dai, Zhuyun and Suzgun, Mirac and Chen, Xinyun and Chowdhery, Aakanksha and Narang, Sharan and Mishra, Gaurav and Yu, Adams and Zhao, Vincent and Huang, Yanping and Dai, Andrew and Yu, Hongkun and Petrov, Slav and Chi, Ed H. and Dean, Jeff and Devlin, Jacob and Roberts, Adam and Zhou, Denny and Le, Quoc V. and Wei, Jason},
  keywords  = {Machine Learning (cs.LG), Computation and Language (cs.CL), FOS: Computer and information sciences},
  title     = {Scaling Instruction-Finetuned Language Models},
  publisher = {arXiv},
  year      = {2022},
  copyright = {Creative Commons Attribution 4.0 International}
}
```
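The Machine Learning Impact calculator referenced above estimates emissions from hardware power draw, runtime, datacenter overhead, and grid carbon intensity. A minimal sketch of that calculation follows; the function name and every default constant are placeholders for illustration, not measured values for this model:

```python
def estimate_co2_kg(power_kw: float, hours: float,
                    pue: float = 1.1,
                    carbon_intensity_kg_per_kwh: float = 0.4) -> float:
    """Rough CO2-equivalent estimate (kg).

    energy drawn (kWh) * datacenter overhead (PUE) * grid carbon intensity.
    All defaults are illustrative placeholders.
    """
    return power_kw * hours * pue * carbon_intensity_kg_per_kwh

# Hypothetical example: 10 kW of hardware for 100 hours on a 0.5 kg/kWh grid
# with no datacenter overhead.
print(estimate_co2_kg(10, 100, pue=1.0, carbon_intensity_kg_per_kwh=0.5))
```

To get a meaningful number you would substitute the actual TPU power draw, training hours, and the carbon intensity of the GCP region used, none of which are reported here.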
card_len: 10,304
modelId: monster-labs/control_v1p_sd15_qrcode_monster
lastModified: 2023-07-21T11:35:31.000Z
tags: [ "diffusers", "stable-diffusion", "controlnet", "qrcode", "en", "license:openrail++", "has_space", "diffusers:ControlNetModel", "region:us" ]
pipeline_tag: null
author: monster-labs
config: null
securityStatus: null
id: monster-labs/control_v1p_sd15_qrcode_monster
likes: 1,051
downloads: 230,519
library_name: diffusers
created: 2023-06-24T15:07:20
---
tags:
- stable-diffusion
- controlnet
- qrcode
license: openrail++
language:
- en
---

# Controlnet QR Code Monster v2 For SD-1.5

![QR code in shape of a blue monster, reading "https://qrcode.monster"](images/monster.png)

## Model Description

This model is made to generate creative QR codes that still scan. Keep in mind that not all generated codes will be readable, but you can try different parameters and prompts to get the desired results.

**NEW VERSION**

Introducing the upgraded version of our model: Controlnet QR Code Monster v2.

V2 is a major upgrade over v1, in both scannability and creativity. QR codes can now blend seamlessly into the image by using a gray (#808080) background.

As with the former version, the readability of some generated codes may vary; however, playing with parameters and prompts can yield better results.

You can find it in the `v2/` subfolder.

## How to Use

- **Condition**: QR codes are passed as condition images with a module size of 16 px. Use a higher error-correction level to make the code easier to read (sometimes a lower level can be easier to read if it makes the code smaller). Use a gray background for the rest of the image so the code integrates better.
- **Prompts**: Use a prompt to guide the QR code generation. The output depends heavily on the given prompt. Some prompts blend easily with the QR code pattern; others require careful tweaking to get good results.
- **Controlnet guidance scale**: Set the controlnet guidance scale value:
  - High values: the generated QR code will be more readable.
  - Low values: the generated QR code will be more creative.

### Tips

- For optimally readable output, try generating multiple QR codes with similar parameters, then choose the best ones.
- Use the Image-to-Image feature to improve the readability of a generated QR code:
  - Decrease the denoising strength to retain more of the original image.
  - Increase the controlnet guidance scale value for better readability.
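As a small illustration of the condition-image sizing described above, the sketch below computes the pixel dimensions of a QR code rendered at 16 px per module. The helper name and its defaults are illustrative, not part of this repository:

```python
def qr_pixel_size(version: int, box_size: int = 16, border: int = 4) -> int:
    """Side length in pixels of a rendered QR code image.

    A version-v QR code is (17 + 4*v) modules per side; the quiet zone adds
    `border` modules on each edge, and each module becomes a box_size x
    box_size block of pixels (16 px, matching the condition format above).
    """
    modules = 17 + 4 * version
    return (modules + 2 * border) * box_size

# A version-1 code at 16 px/module with a 4-module quiet zone:
# (21 + 8) * 16 = 464 px per side.
print(qr_pixel_size(1))
```

Knowing the final pixel size up front makes it easier to pad the code onto a gray (#808080) canvas matching your generation resolution.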
A typical workflow for "saving" a code: max out the guidance scale and minimize the denoising strength, then bump the strength until the code scans.

## Example Outputs

Here are some examples of creative, yet scannable QR codes produced by our model:

![City ruins with a building facade in shape of a QR code, reading "https://qrcode.monster"](images/architecture.png)

![QR code in shape of a tree, reading "https://qrcode.monster"](images/tree.png)

![A gothic sculpture in shape of a QR code, reading "https://qrcode.monster"](images/skulls.png)

Feel free to experiment with prompts, parameters, and the Image-to-Image feature to achieve the desired QR code output. Good luck and have fun!
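The "saving" workflow above is essentially a loop: start at a low denoising strength and raise it until the code scans. A schematic sketch, where `generate` and `scans` are placeholders for your Image-to-Image pipeline and QR decoder:

```python
def rescue_qr(generate, scans, start_pct=30, step_pct=10, max_pct=100):
    """Bump denoising strength until the decoded image scans.

    generate: callable taking a denoising strength in [0, 1], returning an image.
    scans:    callable taking an image, returning True if the QR code decodes.
    Integer percentages are used internally to avoid float-step drift.
    """
    pct = start_pct
    img = generate(pct / 100)
    while not scans(img) and pct < max_pct:
        pct += step_pct
        img = generate(pct / 100)
    return img, pct / 100

# Stub demo: pretend the "image" is just the strength value and that the
# code becomes scannable at strength >= 0.5.
img, strength = rescue_qr(lambda s: s, lambda im: im >= 0.5)
print(strength)
```

In practice `generate` would call an img2img pipeline with a maxed-out ControlNet guidance scale, and `scans` would run any off-the-shelf QR decoder on the result.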
card_len: 2,721
hustvl/yolos-tiny
2023-06-05T11:57:44.000Z
[ "transformers", "pytorch", "safetensors", "yolos", "object-detection", "vision", "dataset:coco", "arxiv:2106.00666", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
object-detection
hustvl
null
null
hustvl/yolos-tiny
146
230,156
transformers
2022-04-26T09:28:47
---
license: apache-2.0
tags:
- object-detection
- vision
datasets:
- coco
widget:
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/savanna.jpg
  example_title: Savanna
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/football-match.jpg
  example_title: Football Match
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/airport.jpg
  example_title: Airport
---

# YOLOS (tiny-sized) model

YOLOS model fine-tuned on COCO 2017 object detection (118k annotated images). It was introduced in the paper [You Only Look at One Sequence: Rethinking Transformer in Vision through Object Detection](https://arxiv.org/abs/2106.00666) by Fang et al. and first released in [this repository](https://github.com/hustvl/YOLOS).

Disclaimer: The team releasing YOLOS did not write a model card for this model, so this model card has been written by the Hugging Face team.

## Model description

YOLOS is a Vision Transformer (ViT) trained using the DETR loss. Despite its simplicity, a base-sized YOLOS model is able to achieve 42 AP on COCO validation 2017 (similar to DETR and more complex frameworks such as Faster R-CNN).

The model is trained using a "bipartite matching loss": one compares the predicted classes + bounding boxes of each of the N = 100 object queries to the ground truth annotations, padded up to the same length N (so if an image only contains 4 objects, 96 annotations will just have a "no object" as class and "no bounding box" as bounding box). The Hungarian matching algorithm is used to create an optimal one-to-one mapping between each of the N queries and each of the N annotations. Next, standard cross-entropy (for the classes) and a linear combination of the L1 and generalized IoU loss (for the bounding boxes) are used to optimize the parameters of the model.

## Intended uses & limitations

You can use the raw model for object detection.
See the [model hub](https://huggingface.co/models?search=hustvl/yolos) to look for all available YOLOS models.

### How to use

Here is how to use this model:

```python
from transformers import YolosImageProcessor, YolosForObjectDetection
from PIL import Image
import torch
import requests

url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

model = YolosForObjectDetection.from_pretrained('hustvl/yolos-tiny')
image_processor = YolosImageProcessor.from_pretrained("hustvl/yolos-tiny")

inputs = image_processor(images=image, return_tensors="pt")
outputs = model(**inputs)

# model predicts bounding boxes and corresponding COCO classes
logits = outputs.logits
bboxes = outputs.pred_boxes

# print results
target_sizes = torch.tensor([image.size[::-1]])
results = image_processor.post_process_object_detection(outputs, threshold=0.9, target_sizes=target_sizes)[0]
for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    box = [round(i, 2) for i in box.tolist()]
    print(
        f"Detected {model.config.id2label[label.item()]} with confidence "
        f"{round(score.item(), 3)} at location {box}"
    )
```

Currently, both the image processor and the model support PyTorch.

## Training data

The YOLOS model was pre-trained on [ImageNet-1k](https://huggingface.co/datasets/imagenet2012) and fine-tuned on [COCO 2017 object detection](https://cocodataset.org/#download), a dataset consisting of 118k/5k annotated images for training/validation respectively.

### Training

The model was pre-trained for 300 epochs on ImageNet-1k and fine-tuned for 300 epochs on COCO.

## Evaluation results

This model achieves an AP (average precision) of **28.7** on COCO 2017 validation. For more details regarding evaluation results, we refer to the original paper.
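As a side note, the one-to-one query-to-annotation assignment described in the model description (what the Hungarian matching algorithm computes) can be illustrated with a toy sketch. The cost matrix below is entirely invented; in DETR/YOLOS the costs combine class probability, L1 box distance, and generalized IoU, and the matching is solved in polynomial time rather than brute-forced.

```python
from itertools import permutations

# Toy cost matrix: rows are object queries, columns are (padded) ground-truth
# annotations. The numbers here are made up purely for illustration.
cost = [
    [0.1, 0.9, 0.8],
    [0.7, 0.2, 0.9],
    [0.8, 0.9, 0.3],
]

def best_matching(cost):
    """Brute-force the optimal one-to-one query->annotation assignment
    (the Hungarian algorithm computes the same result efficiently)."""
    n = len(cost)
    return min(permutations(range(n)),
               key=lambda perm: sum(cost[q][t] for q, t in enumerate(perm)))

assignment = best_matching(cost)
print(assignment)  # (0, 1, 2): query i is matched to annotation assignment[i]
```

Once matched, each query's classification and box losses are computed only against its assigned annotation.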
### BibTeX entry and citation info

```bibtex
@article{DBLP:journals/corr/abs-2106-00666,
  author     = {Yuxin Fang and Bencheng Liao and Xinggang Wang and Jiemin Fang and Jiyang Qi and Rui Wu and Jianwei Niu and Wenyu Liu},
  title      = {You Only Look at One Sequence: Rethinking Transformer in Vision through Object Detection},
  journal    = {CoRR},
  volume     = {abs/2106.00666},
  year       = {2021},
  url        = {https://arxiv.org/abs/2106.00666},
  eprinttype = {arXiv},
  eprint     = {2106.00666},
  timestamp  = {Fri, 29 Apr 2022 19:49:16 +0200},
  biburl     = {https://dblp.org/rec/journals/corr/abs-2106-00666.bib},
  bibsource  = {dblp computer science bibliography, https://dblp.org}
}
```
4,618
[ [ -0.050323486328125, -0.043914794921875, 0.007465362548828125, -0.01497650146484375, -0.0221099853515625, -0.0215606689453125, -0.00054168701171875, -0.059326171875, 0.01093292236328125, 0.0343017578125, -0.03692626953125, -0.032867431640625, -0.0419921875, 0.0208740234375, -0.0241546630859375, 0.06390380859375, 0.0178680419921875, -0.0261993408203125, -0.0151519775390625, -0.0166015625, -0.0259857177734375, -0.005649566650390625, -0.030670166015625, -0.00754547119140625, 0.023345947265625, 0.024078369140625, 0.049072265625, 0.05499267578125, 0.0379638671875, 0.0289764404296875, -0.0236968994140625, 0.01473236083984375, -0.026885986328125, -0.025390625, 0.0029296875, -0.044189453125, -0.030364990234375, 0.00390625, 0.0311737060546875, 0.0282135009765625, 0.00907135009765625, 0.02337646484375, -0.0013427734375, 0.038787841796875, -0.052337646484375, 0.01425933837890625, -0.047943115234375, 0.023284912109375, -0.01490020751953125, 0.0031490325927734375, -0.034393310546875, -0.0096893310546875, 0.0114288330078125, -0.0465087890625, 0.051788330078125, 0.01099395751953125, 0.096435546875, 0.01493072509765625, -0.00125885009765625, 0.0035915374755859375, -0.0226287841796875, 0.0577392578125, -0.0489501953125, 0.03466796875, 0.02044677734375, 0.02996826171875, 0.0033092498779296875, -0.0631103515625, -0.044921875, -0.0150299072265625, -0.02337646484375, -0.00684356689453125, -0.0204315185546875, -0.01546478271484375, 0.036163330078125, 0.0269775390625, -0.0382080078125, -0.004550933837890625, -0.058624267578125, -0.0271148681640625, 0.055145263671875, 0.0032596588134765625, 0.00670623779296875, -0.01076507568359375, -0.03277587890625, -0.02349853515625, -0.0302734375, 0.026885986328125, 0.0292816162109375, 0.0077972412109375, -0.01537322998046875, 0.0382080078125, -0.02593994140625, 0.07257080078125, 0.0225830078125, -0.0238800048828125, 0.0303955078125, -0.012481689453125, -0.032958984375, 0.00420379638671875, 0.0859375, 0.023345947265625, 0.0181121826171875, 
-0.0088653564453125, 0.002185821533203125, 0.0037689208984375, 0.02081298828125, -0.0672607421875, -0.0194854736328125, -0.004161834716796875, -0.0258941650390625, -0.042510986328125, 0.026824951171875, -0.058349609375, -0.00020325183868408203, 0.0032634735107421875, 0.027008056640625, -0.018280029296875, -0.00933074951171875, 0.0226898193359375, 0.005645751953125, 0.03753662109375, 0.01739501953125, -0.04541015625, 0.0069732666015625, 0.020050048828125, 0.0634765625, -0.003902435302734375, -0.0258941650390625, -0.0277557373046875, -0.0323486328125, -0.036712646484375, 0.06427001953125, -0.01035308837890625, -0.018402099609375, -0.0242462158203125, 0.03875732421875, -0.002979278564453125, -0.031280517578125, 0.041778564453125, -0.0297698974609375, 0.01380157470703125, -0.013397216796875, -0.002162933349609375, -0.0294647216796875, 0.042755126953125, -0.0443115234375, 0.0809326171875, 0.0161285400390625, -0.06231689453125, 0.0283660888671875, -0.02880859375, -0.022552490234375, -0.007396697998046875, 0.000598907470703125, -0.0673828125, 0.007343292236328125, 0.0333251953125, 0.0308837890625, -0.0298614501953125, 0.0135955810546875, -0.041595458984375, -0.027130126953125, 0.01537322998046875, -0.025482177734375, 0.07183837890625, 0.01267242431640625, -0.021087646484375, 0.009246826171875, -0.045806884765625, 0.0039825439453125, 0.0447998046875, 0.0008854866027832031, 0.017120361328125, -0.01910400390625, -0.004913330078125, 0.018890380859375, 0.00592803955078125, -0.045379638671875, 0.0242462158203125, -0.02801513671875, 0.029876708984375, 0.037078857421875, -0.0167236328125, 0.021087646484375, -0.0121917724609375, 0.0282440185546875, 0.0135345458984375, 0.038238525390625, -0.03729248046875, -0.059539794921875, -0.052276611328125, -0.00908660888671875, -0.005889892578125, 0.01323699951171875, -0.031005859375, 0.051910400390625, -0.0121002197265625, -0.038787841796875, -0.036376953125, -0.006237030029296875, 0.021484375, 0.05859375, 0.036590576171875, 
-0.0413818359375, -0.043731689453125, -0.07568359375, 0.0168304443359375, 0.004451751708984375, 0.007770538330078125, 0.006988525390625, 0.0506591796875, -0.01255035400390625, 0.09197998046875, -0.058990478515625, -0.032928466796875, -0.0103607177734375, -0.008453369140625, 0.0158843994140625, 0.03302001953125, 0.0633544921875, -0.07244873046875, -0.040252685546875, 0.0033168792724609375, -0.06689453125, 0.00888824462890625, 0.030609130859375, -0.00164794921875, 0.0175933837890625, 0.0250396728515625, -0.0150909423828125, 0.07440185546875, 0.01314544677734375, -0.01947021484375, 0.046356201171875, -0.004405975341796875, 0.01213836669921875, -0.055328369140625, 0.01055908203125, 0.00846099853515625, -0.020904541015625, -0.045440673828125, 0.00832366943359375, 0.01210784912109375, -0.01290130615234375, -0.058135986328125, 0.032073974609375, -0.03497314453125, -0.01276397705078125, -0.0292816162109375, -0.004016876220703125, 0.03070068359375, 0.05859375, 0.01959228515625, 0.0291748046875, 0.06219482421875, -0.045196533203125, 0.0279693603515625, 0.0171661376953125, -0.03204345703125, 0.048583984375, -0.063720703125, 0.0204620361328125, -0.016357421875, 0.01024627685546875, -0.06744384765625, -0.0156402587890625, 0.04119873046875, -0.032073974609375, 0.034332275390625, -0.02850341796875, -0.01470184326171875, -0.08056640625, -0.037078857421875, 0.04510498046875, 0.021270751953125, -0.04705810546875, 0.02655029296875, 0.004558563232421875, 0.042236328125, -0.06451416015625, -0.0657958984375, -0.0154876708984375, -0.00757598876953125, -0.046051025390625, 0.0233154296875, -0.0180511474609375, 0.0187530517578125, 0.01094818115234375, -0.0259857177734375, 0.004825592041015625, -0.0012826919555664062, 0.0182952880859375, 0.046844482421875, -0.02044677734375, -0.0238037109375, -0.023223876953125, -0.0135955810546875, -0.01184844970703125, -0.0286712646484375, 0.048492431640625, -0.041961669921875, -0.0296630859375, -0.053955078125, -0.0029926300048828125, 0.0489501953125, 
-0.0306549072265625, 0.0518798828125, 0.06640625, -0.042877197265625, 0.020263671875, -0.05194091796875, -0.00739288330078125, -0.03753662109375, 0.0293121337890625, -0.033843994140625, -0.03460693359375, 0.051239013671875, 0.043365478515625, -0.0122528076171875, 0.0506591796875, 0.03436279296875, 0.0127410888671875, 0.06256103515625, 0.03912353515625, -0.006549835205078125, 0.037139892578125, -0.07366943359375, 0.0226287841796875, -0.07879638671875, -0.05303955078125, -0.03387451171875, -0.024688720703125, -0.038543701171875, -0.048065185546875, 0.0267333984375, 0.035919189453125, -0.023895263671875, 0.06488037109375, -0.07171630859375, 0.03350830078125, 0.039581298828125, 0.039031982421875, 0.0114288330078125, -0.00859832763671875, 0.0121612548828125, 0.00901031494140625, -0.058349609375, -0.0193023681640625, 0.077880859375, 0.0219879150390625, 0.0538330078125, -0.0290985107421875, 0.036468505859375, 0.007232666015625, 0.0242919921875, -0.06781005859375, 0.0419921875, 0.01082611083984375, -0.056854248046875, -0.01079559326171875, -0.01593017578125, -0.07342529296875, 0.019683837890625, -0.01959228515625, -0.083984375, 0.0462646484375, 0.013336181640625, -0.0158233642578125, 0.04461669921875, -0.043121337890625, 0.08636474609375, -0.0238800048828125, -0.043365478515625, 0.0265655517578125, -0.057342529296875, 0.0308380126953125, 0.004566192626953125, -0.01236724853515625, -0.01377105712890625, 0.0267486572265625, 0.0621337890625, -0.0293121337890625, 0.050445556640625, -0.0018482208251953125, 0.01270294189453125, 0.062286376953125, -0.003376007080078125, 0.0340576171875, 0.004222869873046875, 0.0021610260009765625, 0.044647216796875, -0.0010633468627929688, -0.0289154052734375, -0.022674560546875, 0.036712646484375, -0.045440673828125, -0.041717529296875, -0.0305633544921875, -0.043609619140625, 0.0255889892578125, 0.02777099609375, 0.0655517578125, 0.027679443359375, -0.0007519721984863281, 0.02325439453125, 0.04010009765625, -0.0253753662109375, 
0.036224365234375, 0.0121917724609375, -0.0162200927734375, -0.01136016845703125, 0.056488037109375, 0.0007786750793457031, 0.0165252685546875, 0.0262451171875, 0.036956787109375, -0.027191162109375, -0.02227783203125, -0.029266357421875, 0.0183258056640625, -0.046173095703125, -0.049346923828125, -0.044036865234375, -0.0187225341796875, -0.056732177734375, -0.020782470703125, -0.048919677734375, 0.002490997314453125, -0.02886962890625, 0.0017480850219726562, 0.014862060546875, 0.03515625, 0.003009796142578125, 0.0262603759765625, -0.0260162353515625, 0.030426025390625, 0.02740478515625, 0.025299072265625, 0.0015459060668945312, -0.050506591796875, -0.01284027099609375, 0.00897979736328125, -0.04156494140625, -0.0643310546875, 0.034942626953125, -0.00439453125, 0.03948974609375, 0.06256103515625, 0.00016498565673828125, 0.057037353515625, -0.00925445556640625, 0.043548583984375, 0.04119873046875, -0.06201171875, 0.05230712890625, 0.002140045166015625, 0.01412200927734375, 0.020294189453125, 0.0160675048828125, -0.023040771484375, -0.0065460205078125, -0.056488037109375, -0.041748046875, 0.08984375, 0.0104522705078125, -0.0224151611328125, 0.0049591064453125, 0.01424407958984375, -0.006427764892578125, 0.011810302734375, -0.0723876953125, -0.0251312255859375, -0.037841796875, -0.004390716552734375, -0.00675201416015625, -0.0018787384033203125, 0.00469970703125, -0.035125732421875, 0.032562255859375, -0.02099609375, 0.053131103515625, 0.027191162109375, -0.0108795166015625, -0.01593017578125, -0.017242431640625, 0.03790283203125, 0.0253448486328125, -0.0278778076171875, -0.008331298828125, 0.016845703125, -0.03277587890625, -0.005321502685546875, 0.0008702278137207031, -0.019256591796875, -0.013885498046875, 0.043609619140625, 0.0712890625, 0.01255035400390625, -0.0274810791015625, 0.041351318359375, 0.019561767578125, -0.01071929931640625, -0.0026950836181640625, 0.020416259765625, -0.01904296875, 0.0219268798828125, 0.031402587890625, 0.0298614501953125, 
0.003936767578125, -0.04833984375, 0.01125335693359375, 0.0307464599609375, -0.024169921875, -0.027740478515625, 0.069091796875, -0.0290069580078125, -0.035552978515625, 0.03936767578125, -0.026458740234375, -0.058197021484375, 0.08294677734375, 0.036529541015625, 0.054168701171875, -0.01303863525390625, 0.00579833984375, 0.064697265625, 0.0259857177734375, -0.02703857421875, -0.0260772705078125, -0.0150299072265625, -0.06353759765625, 0.006313323974609375, -0.041229248046875, 0.0007305145263671875, 0.007808685302734375, -0.065185546875, 0.042755126953125, -0.0184173583984375, -0.00873565673828125, 0.0005121231079101562, 0.0102996826171875, -0.0899658203125, 0.022003173828125, 0.015899658203125, 0.06097412109375, -0.06036376953125, 0.060546875, 0.0310516357421875, -0.03643798828125, -0.05352783203125, -0.0188140869140625, -0.003437042236328125, -0.078369140625, 0.03662109375, 0.046173095703125, -0.004467010498046875, -0.0067138671875, -0.0587158203125, -0.05810546875, 0.1002197265625, 0.02349853515625, -0.036956787109375, -0.0016632080078125, -0.0042724609375, 0.0264739990234375, -0.03155517578125, 0.03759765625, 0.037322998046875, 0.0343017578125, 0.0274200439453125, -0.048553466796875, -0.01241302490234375, -0.0030727386474609375, -0.00039386749267578125, 0.01361846923828125, -0.0478515625, 0.060211181640625, -0.042327880859375, -0.0075836181640625, 0.007114410400390625, 0.043731689453125, 0.0131378173828125, 0.041717529296875, 0.046905517578125, 0.06024169921875, 0.03997802734375, -0.019500732421875, 0.078369140625, 0.0010509490966796875, 0.05853271484375, 0.0653076171875, 0.01045989990234375, 0.0264434814453125, 0.02069091796875, -0.0247650146484375, 0.0204315185546875, 0.0458984375, -0.022369384765625, 0.040802001953125, -0.0010547637939453125, 0.0207977294921875, -0.00495147705078125, -0.00508880615234375, -0.03814697265625, 0.042938232421875, 0.01316070556640625, -0.0171051025390625, -0.00841522216796875, 0.0244598388671875, -0.0135040283203125, 
-0.014984130859375, -0.0223236083984375, 0.022308349609375, -0.0166778564453125, -0.042510986328125, 0.049468994140625, -0.0010480880737304688, 0.05877685546875, -0.035125732421875, 0.0174713134765625, -0.0220947265625, 0.004241943359375, -0.026092529296875, -0.052337646484375, 0.024505615234375, -0.0024089813232421875, 0.0007224082946777344, 0.01433563232421875, 0.059722900390625, -0.0203094482421875, -0.054962158203125, 0.02239990234375, 0.005176544189453125, 0.0158233642578125, -0.01439666748046875, -0.05474853515625, 0.025726318359375, 0.005657196044921875, -0.03857421875, 0.003238677978515625, 0.02239990234375, -0.0057373046875, 0.048187255859375, 0.035888671875, -0.016998291015625, 0.022979736328125, -0.0207672119140625, 0.062255859375, -0.034698486328125, -0.0252532958984375, -0.0469970703125, 0.034515380859375, -0.00963592529296875, -0.0254364013671875, 0.033935546875, 0.049896240234375, 0.091796875, -0.0173797607421875, 0.00884246826171875, -0.035919189453125, -0.0110931396484375, -0.00901031494140625, 0.0218048095703125, -0.05633544921875, 0.007488250732421875, -0.02581787109375, -0.0662841796875, -0.0214996337890625, 0.0732421875, -0.0251007080078125, 0.006732940673828125, 0.04315185546875, 0.07208251953125, -0.03009033203125, -0.016632080078125, 0.036651611328125, 0.0035839080810546875, 0.016387939453125, 0.05108642578125, 0.0296783447265625, -0.0804443359375, 0.048309326171875, -0.05853271484375, 0.0033016204833984375, -0.0389404296875, -0.05523681640625, -0.06903076171875, -0.04522705078125, -0.04656982421875, -0.0244140625, -0.02374267578125, 0.037445068359375, 0.091552734375, -0.057342529296875, 0.00408172607421875, -0.0121917724609375, 0.023468017578125, -0.0299072265625, -0.0211639404296875, 0.03363037109375, 0.007602691650390625, -0.0751953125, 0.01224517822265625, 0.017425537109375, 0.01139068603515625, -0.00762176513671875, -0.0210723876953125, -0.020263671875, -0.0249176025390625, 0.03839111328125, 0.037139892578125, -0.05560302734375, 
-0.027374267578125, 0.0102081298828125, -0.00010454654693603516, 0.033203125, 0.05078125, -0.0712890625, 0.0545654296875, 0.0340576171875, 0.0162811279296875, 0.07147216796875, -0.00930023193359375, -0.01128387451171875, -0.040252685546875, 0.05322265625, 0.009735107421875, 0.0426025390625, 0.0215911865234375, -0.03253173828125, 0.052703857421875, 0.04290771484375, -0.0241851806640625, -0.06951904296875, 0.011566162109375, -0.10101318359375, -0.01482391357421875, 0.037506103515625, -0.0207366943359375, -0.038330078125, 0.01318359375, -0.00958251953125, 0.03997802734375, -0.00548553466796875, 0.049346923828125, 0.0155487060546875, 0.001972198486328125, -0.0335693359375, -0.0217742919921875, 0.0016231536865234375, -0.0054168701171875, -0.0504150390625, -0.03448486328125, 0.01312255859375, 0.0322265625, 0.043060302734375, 0.0294647216796875, -0.0154266357421875, 0.0212860107421875, 0.0015048980712890625, 0.01763916015625, -0.038818359375, -0.04083251953125, -0.01053619384765625, 0.01212310791015625, -0.023284912109375, -0.044189453125 ] ]
flair/ner-english-ontonotes
2023-04-07T09:23:02.000Z
[ "flair", "pytorch", "token-classification", "sequence-tagger-model", "en", "dataset:ontonotes", "region:us" ]
token-classification
flair
null
null
flair/ner-english-ontonotes
15
228,076
flair
2022-03-02T23:29:05
---
tags:
- flair
- token-classification
- sequence-tagger-model
language: en
datasets:
- ontonotes
widget:
- text: "On September 1st George Washington won 1 dollar."
---

## English NER in Flair (Ontonotes default model)

This is the 18-class NER model for English that ships with [Flair](https://github.com/flairNLP/flair/).

F1-Score: **89.27** (Ontonotes)

Predicts 18 tags:

| **tag** | **meaning** |
|---------|-------------|
| CARDINAL | cardinal value |
| DATE | date value |
| EVENT | event name |
| FAC | building name |
| GPE | geo-political entity |
| LANGUAGE | language name |
| LAW | law name |
| LOC | location name |
| MONEY | money name |
| NORP | affiliation |
| ORDINAL | ordinal value |
| ORG | organization name |
| PERCENT | percent value |
| PERSON | person name |
| PRODUCT | product name |
| QUANTITY | quantity value |
| TIME | time value |
| WORK_OF_ART | name of work of art |

Based on [Flair embeddings](https://www.aclweb.org/anthology/C18-1139/) and LSTM-CRF.
---

### Demo: How to use in Flair

Requires: **[Flair](https://github.com/flairNLP/flair/)** (`pip install flair`)

```python
from flair.data import Sentence
from flair.models import SequenceTagger

# load tagger
tagger = SequenceTagger.load("flair/ner-english-ontonotes")

# make example sentence
sentence = Sentence("On September 1st George Washington won 1 dollar.")

# predict NER tags
tagger.predict(sentence)

# print sentence
print(sentence)

# print predicted NER spans
print('The following NER tags are found:')

# iterate over entities and print
for entity in sentence.get_spans('ner'):
    print(entity)
```

This yields the following output:

```
Span [2,3]: "September 1st" [− Labels: DATE (0.8824)]
Span [4,5]: "George Washington" [− Labels: PERSON (0.9604)]
Span [7,8]: "1 dollar" [− Labels: MONEY (0.9837)]
```

So, the entities "*September 1st*" (labeled as a **date**), "*George Washington*" (labeled as a **person**) and "*1 dollar*" (labeled as **money**) are found in the sentence "*On September 1st George Washington won 1 dollar*".

---

### Training: Script to train this model

The following Flair script was used to train this model:

```python
from flair.data import Corpus
from flair.datasets import ColumnCorpus
from flair.embeddings import WordEmbeddings, StackedEmbeddings, FlairEmbeddings

# 1. load the corpus (Ontonotes does not ship with Flair; you need to download
#    and reformat it into a column format yourself)
corpus: Corpus = ColumnCorpus(
    "resources/tasks/onto-ner",
    column_format={0: "text", 1: "pos", 2: "upos", 3: "ner"},
    tag_to_bioes="ner",
)

# 2. what tag do we want to predict?
tag_type = 'ner'

# 3. make the tag dictionary from the corpus
tag_dictionary = corpus.make_tag_dictionary(tag_type=tag_type)

# 4. initialize each embedding we use
embedding_types = [

    # GloVe embeddings
    WordEmbeddings('en-crawl'),

    # contextual string embeddings, forward
    FlairEmbeddings('news-forward'),

    # contextual string embeddings, backward
    FlairEmbeddings('news-backward'),
]

# embedding stack consists of Flair and GloVe embeddings
embeddings = StackedEmbeddings(embeddings=embedding_types)

# 5. initialize sequence tagger
from flair.models import SequenceTagger

tagger = SequenceTagger(hidden_size=256,
                        embeddings=embeddings,
                        tag_dictionary=tag_dictionary,
                        tag_type=tag_type)

# 6. initialize trainer
from flair.trainers import ModelTrainer

trainer = ModelTrainer(tagger, corpus)

# 7. run training
trainer.train('resources/taggers/ner-english-ontonotes',
              train_with_dev=True,
              max_epochs=150)
```

---

### Cite

Please cite the following paper when using this model.

```
@inproceedings{akbik2018coling,
  title     = {Contextual String Embeddings for Sequence Labeling},
  author    = {Akbik, Alan and Blythe, Duncan and Vollgraf, Roland},
  booktitle = {{COLING} 2018, 27th International Conference on Computational Linguistics},
  pages     = {1638--1649},
  year      = {2018}
}
```

---

### Issues?

The Flair issue tracker is available [here](https://github.com/flairNLP/flair/issues/).
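If you need the demo's predictions in a structured form without depending on Flair internals, one option is to parse the printed span lines themselves. The snippet below is a hypothetical helper, not part of Flair; it assumes the exact output format shown in the demo (which can vary across Flair versions).

```python
import re

# Matches lines such as:
#   Span [2,3]: "September 1st" [− Labels: DATE (0.8824)]
# The single '.' after '[' absorbs the minus-sign character Flair prints.
PATTERN = re.compile(
    r'Span \[\d+,\d+\]: "(?P<text>[^"]+)" \[. Labels: (?P<tag>\w+) \((?P<score>[\d.]+)\)\]'
)

lines = [
    'Span [2,3]: "September 1st" [− Labels: DATE (0.8824)]',
    'Span [4,5]: "George Washington" [− Labels: PERSON (0.9604)]',
    'Span [7,8]: "1 dollar" [− Labels: MONEY (0.9837)]',
]

# Extract (text, tag, confidence) triples from each span line
entities = [(m["text"], m["tag"], float(m["score"]))
            for line in lines if (m := PATTERN.search(line))]
print(entities)
```

In practice you would iterate over `sentence.get_spans('ner')` directly, but string parsing like this can be handy when only logged output is available.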
4,414
[ [ -0.0222625732421875, -0.042022705078125, 0.008453369140625, 0.01259613037109375, -0.01177978515625, -0.008453369140625, -0.01491546630859375, -0.0308685302734375, 0.050506591796875, 0.0244598388671875, -0.0289459228515625, -0.04071044921875, -0.03814697265625, 0.023101806640625, 0.003509521484375, 0.09332275390625, 0.0171051025390625, 0.0174102783203125, -0.00795745849609375, -0.00543975830078125, -0.0240478515625, -0.04571533203125, -0.047210693359375, -0.0223846435546875, 0.040008544921875, 0.0284576416015625, 0.0307464599609375, 0.049285888671875, 0.03240966796875, 0.021484375, -0.016448974609375, 0.003116607666015625, -0.00685882568359375, -0.00962066650390625, -0.0137176513671875, -0.02978515625, -0.06683349609375, 0.01403045654296875, 0.04229736328125, 0.0263671875, 0.006687164306640625, 0.0107269287109375, 0.003589630126953125, 0.0131072998046875, -0.01537322998046875, 0.03216552734375, -0.048065185546875, -0.021240234375, -0.0135650634765625, -0.01409912109375, -0.0261077880859375, -0.028594970703125, 0.0148162841796875, -0.045745849609375, 0.004978179931640625, 0.0159454345703125, 0.10205078125, 0.005523681640625, -0.0257110595703125, -0.022491455078125, -0.026153564453125, 0.06304931640625, -0.0718994140625, 0.013427734375, 0.031341552734375, -0.009429931640625, -0.0010232925415039062, -0.05419921875, -0.047393798828125, -0.01201629638671875, -0.013916015625, 0.0196075439453125, -0.01091766357421875, -0.01151275634765625, 0.01537322998046875, 0.0194549560546875, -0.049652099609375, -0.01275634765625, -0.0158843994140625, -0.0111846923828125, 0.060455322265625, 0.020660400390625, 0.014984130859375, -0.0386962890625, -0.032684326171875, -0.01629638671875, -0.023681640625, -0.0003979206085205078, 0.008941650390625, 0.0435791015625, -0.0228729248046875, 0.039459228515625, 0.0027561187744140625, 0.053619384765625, 0.017181396484375, -0.02667236328125, 0.0439453125, -0.027130126953125, -0.0161590576171875, -0.00115203857421875, 0.0728759765625, 
0.0251922607421875, 0.0130157470703125, -0.004375457763671875, -0.014251708984375, 0.014404296875, -0.0121612548828125, -0.046356201171875, -0.018768310546875, 0.0104217529296875, -0.020782470703125, -0.0362548828125, -0.0008606910705566406, -0.059814453125, -0.0167083740234375, -0.004150390625, 0.049346923828125, -0.0311279296875, -0.006275177001953125, 0.00231170654296875, -0.0301361083984375, 0.01316070556640625, 0.0106964111328125, -0.06304931640625, 0.0097808837890625, 0.0252532958984375, 0.04620361328125, 0.0261077880859375, -0.0211029052734375, -0.021881103515625, -0.007793426513671875, -0.01158905029296875, 0.053466796875, -0.02825927734375, -0.0226287841796875, -0.0063018798828125, 0.006683349609375, -0.02252197265625, -0.011383056640625, 0.04351806640625, -0.035003662109375, 0.034088134765625, -0.022552490234375, -0.054718017578125, -0.0231781005859375, 0.0303192138671875, -0.04852294921875, 0.064697265625, 0.002147674560546875, -0.08966064453125, 0.032684326171875, -0.036834716796875, -0.03302001953125, 0.006107330322265625, -0.00047326087951660156, -0.03436279296875, -0.007564544677734375, 0.01220703125, 0.04840087890625, -0.0181427001953125, 0.0231781005859375, -0.023834228515625, 0.0018329620361328125, 0.0157928466796875, 0.0033550262451171875, 0.06689453125, -0.002292633056640625, -0.0215606689453125, 0.0019321441650390625, -0.071533203125, -0.0147247314453125, 0.0221710205078125, -0.036224365234375, -0.023773193359375, 0.0008001327514648438, 0.021270751953125, 0.0193023681640625, 0.0155181884765625, -0.04290771484375, 0.044281005859375, -0.0384521484375, 0.038665771484375, 0.035186767578125, 0.0026531219482421875, 0.045989990234375, -0.03973388671875, 0.031951904296875, -0.0007319450378417969, -0.0172271728515625, -0.00728607177734375, -0.055877685546875, -0.05084228515625, -0.015106201171875, 0.040557861328125, 0.06304931640625, -0.04461669921875, 0.0477294921875, -0.030487060546875, -0.048248291015625, -0.025299072265625, -0.0208892822265625, 
0.0184326171875, 0.053314208984375, 0.038970947265625, -0.01409912109375, -0.06689453125, -0.04522705078125, -0.0196533203125, -0.0204620361328125, 0.0218963623046875, 0.0274810791015625, 0.06146240234375, -0.01082611083984375, 0.061981201171875, -0.03753662109375, -0.037750244140625, -0.03173828125, 0.0173797607421875, 0.037811279296875, 0.044097900390625, 0.026824951171875, -0.045501708984375, -0.04400634765625, -0.0034923553466796875, -0.033721923828125, 0.01343536376953125, -0.022918701171875, 0.01104736328125, 0.038787841796875, 0.0269927978515625, -0.03228759765625, 0.040679931640625, 0.0239105224609375, -0.053802490234375, 0.040557861328125, -0.0005464553833007812, -0.00965118408203125, -0.1104736328125, 0.023681640625, 0.0225372314453125, -0.01512908935546875, -0.0394287109375, -0.027099609375, 0.007282257080078125, 0.0234375, -0.01078033447265625, 0.06201171875, -0.023529052734375, 0.017730712890625, 0.0004611015319824219, 0.00963592529296875, 0.011322021484375, 0.0249786376953125, 0.027618408203125, 0.024932861328125, 0.032867431640625, -0.03839111328125, 0.00878143310546875, 0.0379638671875, -0.027191162109375, 0.013397216796875, -0.03948974609375, -0.01471710205078125, -0.00820159912109375, 0.01471710205078125, -0.08349609375, -0.01485443115234375, 0.0242156982421875, -0.06842041015625, 0.045745849609375, -0.0023403167724609375, -0.021759033203125, -0.02557373046875, -0.01497650146484375, -0.00194549560546875, 0.0341796875, -0.0242462158203125, 0.046051025390625, 0.023040771484375, -0.0007996559143066406, -0.055084228515625, -0.05816650390625, -0.0090484619140625, -0.0218353271484375, -0.047119140625, 0.04449462890625, -0.0029277801513671875, -0.0081024169921875, 0.015716552734375, 0.006366729736328125, 0.0013284683227539062, 0.00982666015625, 0.0111846923828125, 0.039764404296875, -0.01776123046875, 0.00778961181640625, -0.0160369873046875, -0.0003483295440673828, 0.0006608963012695312, -0.017364501953125, 0.04644775390625, -0.008148193359375, 
0.03021240234375, -0.031341552734375, -0.0021877288818359375, 0.01462554931640625, -0.02203369140625, 0.069580078125, 0.051116943359375, -0.036224365234375, -0.007232666015625, -0.032867431640625, -0.0171661376953125, -0.02801513671875, 0.043731689453125, -0.033599853515625, -0.048309326171875, 0.04656982421875, 0.0218353271484375, 0.01486968994140625, 0.0706787109375, 0.0308685302734375, 0.0006737709045410156, 0.0765380859375, 0.04376220703125, -0.0186920166015625, 0.03369140625, -0.04144287109375, 0.00508880615234375, -0.0587158203125, -0.0244598388671875, -0.043243408203125, -0.009765625, -0.058441162109375, -0.01299285888671875, 0.00788116455078125, 0.025543212890625, -0.0369873046875, 0.0421142578125, -0.038482666015625, 0.0153045654296875, 0.043670654296875, -0.01091766357421875, 0.01093292236328125, -0.004703521728515625, -0.023193359375, -0.0179901123046875, -0.057464599609375, -0.039459228515625, 0.0828857421875, 0.029449462890625, 0.050201416015625, -0.0020427703857421875, 0.06622314453125, -8.940696716308594e-7, 0.036163330078125, -0.056396484375, 0.029052734375, -0.012725830078125, -0.06585693359375, -0.00984954833984375, -0.020782470703125, -0.0693359375, 0.008941650390625, -0.036773681640625, -0.06585693359375, 0.020263671875, 0.01171112060546875, -0.039276123046875, 0.032440185546875, -0.01922607421875, 0.07421875, -0.0036792755126953125, -0.02239990234375, 0.0175323486328125, -0.059814453125, 0.0170745849609375, 0.00827789306640625, 0.0305633544921875, -0.00893402099609375, -0.004680633544921875, 0.0765380859375, -0.0181427001953125, 0.07208251953125, 0.00649261474609375, 0.0165252685546875, 0.01523590087890625, -0.00543975830078125, 0.036285400390625, 0.0188140869140625, -0.0121612548828125, 0.0011720657348632812, -0.00475311279296875, -0.011322021484375, -0.00905609130859375, 0.04974365234375, -0.05718994140625, -0.022857666015625, -0.068603515625, -0.0211334228515625, 0.0005922317504882812, 0.0127105712890625, 0.05670166015625, 0.04705810546875, 
-0.017822265625, -0.00548553466796875, 0.027587890625, -0.01776123046875, 0.05194091796875, 0.031280517578125, -0.0310821533203125, -0.06396484375, 0.06842041015625, 0.0103912353515625, -0.0036220550537109375, 0.038970947265625, 0.020751953125, -0.033721923828125, -0.0095062255859375, -0.02978515625, 0.035186767578125, -0.043212890625, -0.03369140625, -0.056396484375, -0.0157623291015625, -0.06451416015625, -0.00899505615234375, -0.0126953125, -0.044464111328125, -0.058685302734375, -0.0019474029541015625, 0.029144287109375, 0.06622314453125, -0.025421142578125, 0.019134521484375, -0.0526123046875, -0.0136566162109375, 0.00142669677734375, 0.002529144287109375, -0.006961822509765625, -0.0740966796875, -0.0205230712890625, -0.0126495361328125, -0.032623291015625, -0.07952880859375, 0.076904296875, 0.021636962890625, 0.030029296875, 0.0309600830078125, -0.009246826171875, 0.03515625, -0.0281982421875, 0.06304931640625, 0.00972747802734375, -0.0687255859375, 0.037017822265625, -0.0218353271484375, 0.00951385498046875, 0.0210418701171875, 0.056884765625, -0.044281005859375, -0.00572967529296875, -0.07000732421875, -0.0714111328125, 0.0535888671875, -0.0073699951171875, 0.003993988037109375, -0.02337646484375, 0.01470184326171875, -0.01316070556640625, 0.00337982177734375, -0.081787109375, -0.042938232421875, -0.0180816650390625, -0.017822265625, -0.0305633544921875, -0.01438140869140625, 0.013671875, -0.041259765625, 0.0859375, -0.00287628173828125, 0.0264892578125, 0.0311279296875, 0.005374908447265625, 0.004650115966796875, 0.014404296875, 0.04522705078125, 0.02301025390625, -0.0306549072265625, -0.0125274658203125, 0.020538330078125, -0.025390625, -0.01512908935546875, 0.0197601318359375, -0.006916046142578125, 0.014251708984375, 0.037261962890625, 0.06146240234375, 0.014129638671875, -0.02044677734375, 0.038543701171875, -0.0034580230712890625, -0.0178680419921875, -0.03839111328125, -0.0242919921875, 0.0158843994140625, 0.010162353515625, 0.0186004638671875, 
0.00931549072265625, 0.0023136138916015625, -0.041229248046875, 0.00830078125, 0.0325927734375, -0.03466796875, -0.038970947265625, 0.0694580078125, 0.0035572052001953125, -0.01094818115234375, 0.03125, -0.043243408203125, -0.0604248046875, 0.0533447265625, 0.05499267578125, 0.053436279296875, -0.0171661376953125, 0.010650634765625, 0.0648193359375, 0.0255126953125, -0.0071868896484375, 0.0556640625, 0.033294677734375, -0.06744384765625, -0.0258026123046875, -0.07025146484375, -0.00027871131896972656, 0.0234375, -0.045989990234375, 0.032470703125, -0.032989501953125, -0.03643798828125, 0.031280517578125, 0.024322509765625, -0.05438232421875, 0.0302581787109375, 0.024017333984375, 0.0789794921875, -0.07440185546875, 0.06658935546875, 0.0810546875, -0.059234619140625, -0.08355712890625, -0.015869140625, 0.00464630126953125, -0.04180908203125, 0.0611572265625, 0.0196075439453125, 0.037872314453125, 0.01406097412109375, -0.04266357421875, -0.09564208984375, 0.07373046875, -0.019622802734375, -0.030487060546875, -0.011016845703125, -0.01727294921875, 0.0251312255859375, -0.034027099609375, 0.04107666015625, 0.032867431640625, 0.037017822265625, -0.0013408660888671875, -0.06951904296875, -0.00022923946380615234, -0.024169921875, -0.0120391845703125, 0.01415252685546875, -0.0465087890625, 0.0849609375, -0.0182952880859375, -0.00908660888671875, 0.019317626953125, 0.059478759765625, 0.00772857666015625, 0.016143798828125, 0.01282501220703125, 0.06866455078125, 0.05560302734375, -0.01922607421875, 0.07037353515625, -0.02374267578125, 0.0472412109375, 0.0841064453125, -0.00324249267578125, 0.07476806640625, 0.0186309814453125, -0.01153564453125, 0.05303955078125, 0.051788330078125, -0.00809478759765625, 0.0384521484375, 0.0157623291015625, -0.0019664764404296875, -0.026458740234375, -0.0079345703125, -0.033843994140625, 0.04254150390625, 0.0286102294921875, -0.04266357421875, 0.00396728515625, -0.009246826171875, 0.033172607421875, -0.0035724639892578125, -0.02880859375, 
0.0625, 0.004711151123046875, -0.0400390625, 0.036346435546875, 0.011322021484375, 0.07379150390625, -0.030487060546875, 0.0013580322265625, -0.01143646240234375, 0.0243072509765625, -0.014801025390625, -0.04071044921875, 0.0171966552734375, -0.019256591796875, -0.017913818359375, 0.00020873546600341797, 0.05206298828125, -0.04107666015625, -0.03936767578125, 0.016693115234375, 0.0273590087890625, 0.0101318359375, -0.00007778406143188477, -0.053436279296875, -0.011016845703125, 0.00907135009765625, -0.0384521484375, 0.0155487060546875, 0.01485443115234375, 0.0020580291748046875, 0.030609130859375, 0.032745361328125, 0.0032482147216796875, -0.0037899017333984375, -0.0181732177734375, 0.06005859375, -0.06597900390625, -0.03399658203125, -0.0692138671875, 0.0477294921875, -0.0081024169921875, -0.051300048828125, 0.05975341796875, 0.06207275390625, 0.059478759765625, -0.006542205810546875, 0.06268310546875, -0.033416748046875, 0.050384521484375, -0.01303863525390625, 0.06744384765625, -0.06121826171875, -0.0037899017333984375, -0.0181732177734375, -0.04400634765625, -0.0321044921875, 0.0572509765625, -0.0294342041015625, -0.004718780517578125, 0.046539306640625, 0.05078125, 0.0156707763671875, 0.00447845458984375, 0.0045013427734375, 0.0284576416015625, -0.004787445068359375, 0.034332275390625, 0.041595458984375, -0.04937744140625, 0.0233917236328125, -0.040374755859375, -0.0153045654296875, -0.020660400390625, -0.07232666015625, -0.07373046875, -0.0572509765625, -0.034393310546875, -0.064453125, -0.0179901123046875, 0.0877685546875, 0.0285186767578125, -0.07177734375, -0.020660400390625, 0.007709503173828125, -0.005035400390625, 0.0012502670288085938, -0.018951416015625, 0.03363037109375, -0.0175323486328125, -0.05322265625, 0.0223846435546875, -0.0127716064453125, 0.01263427734375, 0.0163116455078125, 0.0015993118286132812, -0.05029296875, 0.011810302734375, 0.0297393798828125, 0.02630615234375, -0.0548095703125, -0.0062103271484375, 0.0181732177734375, 
-0.02520751953125, 0.01161956787109375, 0.01361083984375, -0.0582275390625, 0.01318359375, 0.05413818359375, 0.0181427001953125, 0.031707763671875, -0.0006656646728515625, 0.0197296142578125, -0.044189453125, -0.005176544189453125, 0.0279083251953125, 0.040130615234375, 0.0209503173828125, -0.0147857666015625, 0.035491943359375, 0.034454345703125, -0.053192138671875, -0.050140380859375, -0.01910400390625, -0.07794189453125, -0.015045166015625, 0.08416748046875, -0.00957489013671875, -0.033447265625, 0.006969451904296875, -0.006359100341796875, 0.035064697265625, -0.032562255859375, 0.026336669921875, 0.03753662109375, -0.0019159317016601562, 0.01262664794921875, -0.025970458984375, 0.055938720703125, 0.0294342041015625, -0.044097900390625, -0.025634765625, 0.0163116455078125, 0.042449951171875, 0.0272216796875, 0.044830322265625, 0.006809234619140625, 0.0095977783203125, 0.01114654541015625, 0.0384521484375, 0.01314544677734375, -0.00896453857421875, -0.040802001953125, -0.00962066650390625, -0.0035114288330078125, -0.0150146484375 ] ]
Babelscape/rebel-large
2023-06-20T10:17:00.000Z
[ "transformers", "pytorch", "safetensors", "bart", "text2text-generation", "seq2seq", "relation-extraction", "en", "dataset:Babelscape/rebel-dataset", "license:cc-by-nc-sa-4.0", "model-index", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
text2text-generation
Babelscape
null
null
Babelscape/rebel-large
140
226,519
transformers
2022-03-02T23:29:04
--- language: - en widget: - text: "Punta Cana is a resort town in the municipality of Higuey, in La Altagracia Province, the eastern most province of the Dominican Republic" tags: - seq2seq - relation-extraction datasets: - Babelscape/rebel-dataset model-index: - name: REBEL results: - task: name: Relation Extraction type: Relation-Extraction dataset: name: "CoNLL04" type: CoNLL04 metrics: - name: RE+ Macro F1 type: re+ macro f1 value: 76.65 - task: name: Relation Extraction type: Relation-Extraction dataset: name: "NYT" type: NYT metrics: - name: F1 type: f1 value: 93.4 license: cc-by-nc-sa-4.0 ---

[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/rebel-relation-extraction-by-end-to-end/relation-extraction-on-nyt)](https://paperswithcode.com/sota/relation-extraction-on-nyt?p=rebel-relation-extraction-by-end-to-end)
[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/rebel-relation-extraction-by-end-to-end/relation-extraction-on-conll04)](https://paperswithcode.com/sota/relation-extraction-on-conll04?p=rebel-relation-extraction-by-end-to-end)
[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/rebel-relation-extraction-by-end-to-end/joint-entity-and-relation-extraction-on-3)](https://paperswithcode.com/sota/joint-entity-and-relation-extraction-on-3?p=rebel-relation-extraction-by-end-to-end)
[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/rebel-relation-extraction-by-end-to-end/relation-extraction-on-ade-corpus)](https://paperswithcode.com/sota/relation-extraction-on-ade-corpus?p=rebel-relation-extraction-by-end-to-end)
[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/rebel-relation-extraction-by-end-to-end/relation-extraction-on-re-tacred)](https://paperswithcode.com/sota/relation-extraction-on-re-tacred?p=rebel-relation-extraction-by-end-to-end)

## Multilingual update!

Check [mREBEL](https://huggingface.co/Babelscape/mrebel-large), a multilingual version covering more relation types, languages and including entity types.

# REBEL <img src="https://i.ibb.co/qsLzNqS/hf-rebel.png" width="30" alt="hf-rebel" border="0" style="display:inline; white-space:nowrap;">: Relation Extraction By End-to-end Language generation

This is the model card for the Findings of EMNLP 2021 paper [REBEL: Relation Extraction By End-to-end Language generation](https://github.com/Babelscape/rebel/blob/main/docs/EMNLP_2021_REBEL__Camera_Ready_.pdf). We present a new linearization approach and a reframing of Relation Extraction as a seq2seq task. The paper can be found [here](https://github.com/Babelscape/rebel/blob/main/docs/EMNLP_2021_REBEL__Camera_Ready_.pdf). If you use the code, please reference this work in your paper:

    @inproceedings{huguet-cabot-navigli-2021-rebel-relation,
        title = "{REBEL}: Relation Extraction By End-to-end Language generation",
        author = "Huguet Cabot, Pere-Llu{\'\i}s and Navigli, Roberto",
        booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2021",
        month = nov,
        year = "2021",
        address = "Punta Cana, Dominican Republic",
        publisher = "Association for Computational Linguistics",
        url = "https://aclanthology.org/2021.findings-emnlp.204",
        pages = "2370--2381",
        abstract = "Extracting relation triplets from raw text is a crucial task in Information Extraction, enabling multiple applications such as populating or validating knowledge bases, factchecking, and other downstream tasks. However, it usually involves multiple-step pipelines that propagate errors or are limited to a small number of relation types. To overcome these issues, we propose the use of autoregressive seq2seq models. Such models have previously been shown to perform well not only in language generation, but also in NLU tasks such as Entity Linking, thanks to their framing as seq2seq tasks. In this paper, we show how Relation Extraction can be simplified by expressing triplets as a sequence of text and we present REBEL, a seq2seq model based on BART that performs end-to-end relation extraction for more than 200 different relation types. We show our model{'}s flexibility by fine-tuning it on an array of Relation Extraction and Relation Classification benchmarks, with it attaining state-of-the-art performance in most of them.",
    }

The original repository for the paper can be found [here](https://github.com/Babelscape/rebel)

Be aware that the inference widget at the right does not output special tokens, which are necessary to distinguish the subject, object and relation types. For a demo of REBEL and its pre-training dataset check the [Spaces demo](https://huggingface.co/spaces/Babelscape/rebel-demo).

## Pipeline usage

```python
from transformers import pipeline

triplet_extractor = pipeline('text2text-generation', model='Babelscape/rebel-large', tokenizer='Babelscape/rebel-large')
# We need to use the tokenizer manually since we need special tokens.
extracted_text = triplet_extractor.tokenizer.batch_decode([triplet_extractor("Punta Cana is a resort town in the municipality of Higuey, in La Altagracia Province, the eastern most province of the Dominican Republic", return_tensors=True, return_text=False)[0]["generated_token_ids"]])

print(extracted_text[0])

# Function to parse the generated text and extract the triplets.
# REBEL linearizes each triplet as: <triplet> head <subj> tail <obj> relation
def extract_triplets(text):
    triplets = []
    relation, subject, object_ = '', '', ''
    text = text.strip()
    current = 'x'
    for token in text.replace("<s>", "").replace("<pad>", "").replace("</s>", "").split():
        if token == "<triplet>":
            current = 't'
            if relation != '':
                triplets.append({'head': subject.strip(), 'type': relation.strip(), 'tail': object_.strip()})
                relation = ''
            subject = ''
        elif token == "<subj>":
            current = 's'
            if relation != '':
                triplets.append({'head': subject.strip(), 'type': relation.strip(), 'tail': object_.strip()})
            object_ = ''
        elif token == "<obj>":
            current = 'o'
            relation = ''
        else:
            if current == 't':
                subject += ' ' + token
            elif current == 's':
                object_ += ' ' + token
            elif current == 'o':
                relation += ' ' + token
    if subject != '' and relation != '' and object_ != '':
        triplets.append({'head': subject.strip(), 'type': relation.strip(), 'tail': object_.strip()})
    return triplets

extracted_triplets = extract_triplets(extracted_text[0])
print(extracted_triplets)
```

## Model and Tokenizer using transformers

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Parse the generated text and extract the triplets.
# REBEL linearizes each triplet as: <triplet> head <subj> tail <obj> relation
def extract_triplets(text):
    triplets = []
    relation, subject, object_ = '', '', ''
    text = text.strip()
    current = 'x'
    for token in text.replace("<s>", "").replace("<pad>", "").replace("</s>", "").split():
        if token == "<triplet>":
            current = 't'
            if relation != '':
                triplets.append({'head': subject.strip(), 'type': relation.strip(), 'tail': object_.strip()})
                relation = ''
            subject = ''
        elif token == "<subj>":
            current = 's'
            if relation != '':
                triplets.append({'head': subject.strip(), 'type': relation.strip(), 'tail': object_.strip()})
            object_ = ''
        elif token == "<obj>":
            current = 'o'
            relation = ''
        else:
            if current == 't':
                subject += ' ' + token
            elif current == 's':
                object_ += ' ' + token
            elif current == 'o':
                relation += ' ' + token
    if subject != '' and relation != '' and object_ != '':
        triplets.append({'head': subject.strip(), 'type': relation.strip(), 'tail': object_.strip()})
    return triplets

# Load model and tokenizer
tokenizer = AutoTokenizer.from_pretrained("Babelscape/rebel-large")
model = AutoModelForSeq2SeqLM.from_pretrained("Babelscape/rebel-large")
gen_kwargs = {
    "max_length": 256,
    "length_penalty": 0,
    "num_beams": 3,
    "num_return_sequences": 3,
}

# Text to extract triplets from
text = 'Punta Cana is a resort town in the municipality of Higüey, in La Altagracia Province, the easternmost province of the Dominican Republic.'

# Tokenize text
model_inputs = tokenizer(text, max_length=256, padding=True, truncation=True, return_tensors='pt')

# Generate
generated_tokens = model.generate(
    model_inputs["input_ids"].to(model.device),
    attention_mask=model_inputs["attention_mask"].to(model.device),
    **gen_kwargs,
)

# Extract text
decoded_preds = tokenizer.batch_decode(generated_tokens, skip_special_tokens=False)

# Extract triplets
for idx, sentence in enumerate(decoded_preds):
    print(f'Prediction triplets sentence {idx}')
    print(extract_triplets(sentence))
```
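Running the snippets above requires downloading the checkpoint, but the linearized format itself can be illustrated offline. The sketch below applies a self-contained copy of the parser to a hand-written string in REBEL's `<triplet> head <subj> tail <obj> relation` format; the input is a hypothetical example, not an actual model generation.

```python
# Standalone sketch of parsing REBEL's linearized output format.
# The input strings are hand-written illustrations, not real generations.

def extract_triplets(text):
    """Parse a REBEL-style linearized string into triplet dicts."""
    triplets = []
    relation, subject, object_ = '', '', ''
    current = 'x'
    for token in text.replace("<s>", "").replace("<pad>", "").replace("</s>", "").split():
        if token == "<triplet>":
            current = 't'
            if relation != '':  # close the previous triplet, if any
                triplets.append({'head': subject.strip(), 'type': relation.strip(), 'tail': object_.strip()})
                relation = ''
            subject = ''
        elif token == "<subj>":
            current = 's'
            if relation != '':
                triplets.append({'head': subject.strip(), 'type': relation.strip(), 'tail': object_.strip()})
            object_ = ''
        elif token == "<obj>":
            current = 'o'
            relation = ''
        elif current == 't':   # tokens after <triplet> form the head entity
            subject += ' ' + token
        elif current == 's':   # tokens after <subj> form the tail entity
            object_ += ' ' + token
        elif current == 'o':   # tokens after <obj> form the relation label
            relation += ' ' + token
    if subject != '' and relation != '' and object_ != '':
        triplets.append({'head': subject.strip(), 'type': relation.strip(), 'tail': object_.strip()})
    return triplets

linearized = "<s><triplet> Punta Cana <subj> Dominican Republic <obj> country</s>"
print(extract_triplets(linearized))
# -> [{'head': 'Punta Cana', 'type': 'country', 'tail': 'Dominican Republic'}]
```

Each `<triplet>` marker starts a new triplet, so a single generated sequence can carry several triplets back to back; the parser emits the previous one whenever a new `<triplet>` (or `<subj>`) appears with a non-empty relation.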
9,210
[ [ -0.00910186767578125, -0.051300048828125, 0.021881103515625, 0.0176239013671875, -0.0220947265625, 0.0036334991455078125, -0.0152740478515625, -0.0282440185546875, 0.021697998046875, 0.0305023193359375, -0.036590576171875, -0.04400634765625, -0.0273284912109375, 0.01210784912109375, -0.0093536376953125, 0.0723876953125, -0.01352691650390625, -0.01434326171875, 0.001178741455078125, 0.0009598731994628906, -0.0137481689453125, -0.04302978515625, -0.0501708984375, 0.015625, 0.0257720947265625, 0.025421142578125, 0.044403076171875, 0.0504150390625, 0.03387451171875, 0.02093505859375, -0.0068817138671875, 0.024566650390625, -0.02960205078125, 0.0286102294921875, -0.016265869140625, -0.0240631103515625, -0.049041748046875, -0.002613067626953125, 0.049346923828125, 0.02099609375, 0.00984954833984375, 0.039093017578125, -0.0098419189453125, 0.038665771484375, -0.046539306640625, 0.0394287109375, -0.04351806640625, -0.01513671875, -0.0212860107421875, 0.0017786026000976562, -0.02215576171875, -0.0145721435546875, -0.01137542724609375, -0.043670654296875, 0.018768310546875, 0.005550384521484375, 0.11676025390625, -0.0013217926025390625, -0.0408935546875, -0.017059326171875, -0.050201416015625, 0.048370361328125, -0.05145263671875, 0.020782470703125, 0.01434326171875, 0.0003821849822998047, -0.0170135498046875, -0.06573486328125, -0.048309326171875, 0.004779815673828125, -0.0238189697265625, 0.023345947265625, -0.029052734375, -0.01483154296875, 0.01898193359375, 0.032318115234375, -0.0301513671875, -0.01194000244140625, -0.031158447265625, -0.0035572052001953125, 0.050506591796875, 0.003978729248046875, 0.03399658203125, -0.05853271484375, -0.0223388671875, 0.005977630615234375, -0.03973388671875, 0.00988006591796875, 0.025543212890625, 0.02484130859375, -0.0132904052734375, 0.042510986328125, 0.003391265869140625, 0.03485107421875, 0.0106964111328125, -0.01012420654296875, 0.0313720703125, -0.048583984375, -0.0062713623046875, -0.05352783203125, 0.09759521484375, 
0.0279998779296875, 0.01210784912109375, 0.002346038818359375, -0.0153656005859375, 0.00406646728515625, -0.0189208984375, -0.0491943359375, -0.0196075439453125, 0.0161285400390625, -0.034942626953125, -0.020965576171875, -0.0013303756713867188, -0.08087158203125, 0.0200042724609375, 0.00022792816162109375, 0.021636962890625, -0.031280517578125, -0.0506591796875, -0.0076141357421875, -0.013153076171875, 0.018829345703125, 0.00047707557678222656, -0.0672607421875, 0.01371002197265625, 0.042388916015625, 0.06146240234375, -0.0093841552734375, -0.0175628662109375, -0.03314208984375, 0.0061798095703125, -0.007656097412109375, 0.05059814453125, -0.0121002197265625, -0.013458251953125, -0.0014858245849609375, 0.013946533203125, -0.01308441162109375, -0.023345947265625, 0.03515625, -0.037506103515625, 0.021209716796875, -0.0154876708984375, -0.0221405029296875, -0.00470733642578125, 0.004364013671875, -0.055908203125, 0.09381103515625, 0.02685546875, -0.052886962890625, 0.015960693359375, -0.0267486572265625, -0.0290374755859375, -0.01418304443359375, -0.01517486572265625, -0.053741455078125, -0.025970458984375, 0.042938232421875, 0.046356201171875, -0.037139892578125, 0.0173797607421875, -0.0302276611328125, 0.01451873779296875, 0.006977081298828125, 0.0018177032470703125, 0.0875244140625, 0.0187530517578125, -0.0219879150390625, -0.003696441650390625, -0.053009033203125, -0.00060272216796875, 0.03204345703125, -0.0188140869140625, -0.006595611572265625, -0.0135040283203125, 0.0015554428100585938, 0.036773681640625, 0.0176849365234375, -0.037750244140625, 0.021820068359375, -0.042388916015625, 0.042572021484375, 0.0352783203125, 0.0377197265625, 0.027740478515625, -0.016021728515625, 0.043731689453125, 0.01434326171875, 0.00830078125, -0.021270751953125, -0.029693603515625, -0.08013916015625, -0.0173492431640625, 0.027069091796875, 0.0233917236328125, -0.07220458984375, 0.042999267578125, -0.049591064453125, -0.03826904296875, -0.030975341796875, 0.007053375244140625, 
0.0498046875, 0.0203857421875, 0.035980224609375, -0.0107421875, -0.06658935546875, -0.056121826171875, -0.0322265625, -0.02716064453125, 0.021575927734375, 0.004638671875, 0.05609130859375, -0.0006680488586425781, 0.0810546875, -0.057708740234375, -0.0195159912109375, -0.0259246826171875, 0.005321502685546875, 0.0265045166015625, 0.04339599609375, 0.053131103515625, -0.0584716796875, -0.047698974609375, -0.0109710693359375, -0.0594482421875, 0.0084686279296875, -0.0081787109375, -0.028045654296875, 0.0244140625, 0.0249786376953125, -0.054473876953125, 0.034027099609375, 0.0037937164306640625, -0.0323486328125, 0.045928955078125, 0.005382537841796875, -0.0088653564453125, -0.1024169921875, 0.0040435791015625, -0.00489044189453125, -0.0301513671875, -0.049407958984375, 0.0211029052734375, 0.016845703125, 0.0211029052734375, -0.022308349609375, 0.03631591796875, -0.0692138671875, -0.01465606689453125, 0.021270751953125, 0.022705078125, 0.017242431640625, 0.05755615234375, -0.004863739013671875, 0.043426513671875, 0.029052734375, -0.0386962890625, 0.0301055908203125, 0.039825439453125, 0.00778961181640625, 0.024993896484375, -0.01898193359375, 0.0172119140625, -0.0035648345947265625, 0.016998291015625, -0.0665283203125, -0.004253387451171875, 0.0289459228515625, -0.052032470703125, 0.023712158203125, -0.0345458984375, -0.042572021484375, -0.037841796875, -0.033721923828125, 0.01198577880859375, 0.03399658203125, -0.044647216796875, 0.053131103515625, 0.04119873046875, -0.005157470703125, -0.052886962890625, -0.048553466796875, 0.004940032958984375, -0.003582000732421875, -0.04669189453125, 0.01953125, 0.0170135498046875, -0.010040283203125, 0.0145721435546875, -0.00237274169921875, -0.00534820556640625, -0.007625579833984375, 0.01204681396484375, 0.03778076171875, -0.0194549560546875, 0.020477294921875, -0.0022487640380859375, -0.010101318359375, 0.0118865966796875, -0.02197265625, 0.046966552734375, -0.0277252197265625, 0.0024890899658203125, -0.0220794677734375, 
0.025970458984375, 0.031524658203125, -0.032440185546875, 0.049072265625, 0.049346923828125, -0.0333251953125, 0.00026226043701171875, -0.0496826171875, 0.00826263427734375, -0.030731201171875, 0.027374267578125, -0.044677734375, -0.0401611328125, 0.046844482421875, 0.0117950439453125, 0.018707275390625, 0.08441162109375, 0.029693603515625, 0.00792694091796875, 0.059051513671875, 0.024688720703125, -0.0010356903076171875, 0.021392822265625, -0.0626220703125, 0.005977630615234375, -0.07318115234375, -0.025360107421875, -0.035308837890625, 0.0018625259399414062, -0.0509033203125, -0.0243988037109375, 0.0059356689453125, 0.0253448486328125, -0.0162200927734375, 0.054656982421875, -0.03741455078125, 0.0111236572265625, 0.0278167724609375, -0.0015363693237304688, 0.0294952392578125, -0.0029468536376953125, -0.00927734375, -0.00860595703125, -0.040618896484375, -0.031524658203125, 0.086669921875, 0.015411376953125, 0.027801513671875, 0.027313232421875, 0.0677490234375, -0.0162811279296875, 0.0271759033203125, -0.056732177734375, 0.044677734375, -0.01213836669921875, -0.05194091796875, -0.010467529296875, -0.040863037109375, -0.0877685546875, 0.0194091796875, -0.0230865478515625, -0.058807373046875, 0.031341552734375, -0.0028858184814453125, -0.0300750732421875, 0.0272216796875, -0.046966552734375, 0.046722412109375, -0.0074005126953125, -0.04107666015625, 0.0158233642578125, -0.0526123046875, 0.035980224609375, 0.00022721290588378906, 0.0249481201171875, -0.01531219482421875, -0.01483154296875, 0.0726318359375, -0.0419921875, 0.0355224609375, -0.0072784423828125, 0.004150390625, 0.02398681640625, 0.004703521728515625, 0.03741455078125, 0.0161285400390625, -0.003955841064453125, 0.00423431396484375, 0.00882720947265625, -0.0146942138671875, -0.044647216796875, 0.048004150390625, -0.06982421875, -0.03582763671875, -0.037353515625, -0.0159759521484375, 0.0022430419921875, 0.0308074951171875, 0.0277557373046875, 0.039276123046875, 0.0027408599853515625, 
0.0009479522705078125, 0.002674102783203125, -0.002685546875, 0.056304931640625, 0.02313232421875, -0.04705810546875, -0.0623779296875, 0.048980712890625, 0.01959228515625, 0.02630615234375, 0.02447509765625, 0.002918243408203125, -0.030548095703125, -0.0218353271484375, -0.03167724609375, 0.03839111328125, -0.0557861328125, -0.0401611328125, -0.0755615234375, -0.037109375, -0.0635986328125, 0.0176239013671875, -0.032012939453125, -0.056549072265625, -0.04901123046875, -0.00640869140625, 0.0382080078125, 0.0455322265625, -0.006481170654296875, 0.0156097412109375, -0.056182861328125, 0.04052734375, 0.015716552734375, -0.0093231201171875, -0.00954437255859375, -0.044586181640625, -0.005496978759765625, -0.0093841552734375, -0.03240966796875, -0.08099365234375, 0.056488037109375, 0.0132904052734375, 0.0198974609375, 0.017242431640625, 0.0034770965576171875, 0.0765380859375, -0.017791748046875, 0.06561279296875, 0.0164794921875, -0.0753173828125, 0.04107666015625, -0.027740478515625, -0.003116607666015625, 0.032928466796875, 0.0189971923828125, -0.03570556640625, -0.017242431640625, -0.08306884765625, -0.08062744140625, 0.07000732421875, 0.04547119140625, -0.00698089599609375, -0.0034313201904296875, 0.056671142578125, -0.02008056640625, 0.0112457275390625, -0.08013916015625, -0.047760009765625, -0.0037670135498046875, -0.017669677734375, 0.00778961181640625, -0.032135009765625, 0.007556915283203125, -0.027801513671875, 0.06939697265625, 0.01308441162109375, 0.0305633544921875, 0.0343017578125, -0.0167388916015625, -0.01406097412109375, 0.00884246826171875, 0.01361083984375, 0.04461669921875, -0.01427459716796875, 0.0158843994140625, 0.00904083251953125, -0.028411865234375, 0.0073699951171875, 0.0173492431640625, -0.0153045654296875, 0.016357421875, 0.042022705078125, 0.05059814453125, 0.00557708740234375, -0.0059356689453125, 0.06317138671875, -0.0015010833740234375, -0.0192108154296875, -0.044189453125, -0.00992584228515625, 0.01314544677734375, 0.00543975830078125, 
0.0256195068359375, 0.0068817138671875, -0.003437042236328125, -0.04364013671875, 0.0183563232421875, 0.0253448486328125, -0.006687164306640625, -0.0249786376953125, 0.053741455078125, 0.002094268798828125, -0.03900146484375, 0.0300140380859375, -0.03753662109375, -0.046661376953125, 0.043853759765625, 0.04150390625, 0.0675048828125, -0.0036869049072265625, 0.00319671630859375, 0.059783935546875, 0.0284881591796875, -0.0011949539184570312, 0.03326416015625, 0.00464630126953125, -0.05841064453125, -0.0200042724609375, -0.046051025390625, 0.01169586181640625, 0.0195159912109375, -0.057281494140625, 0.042327880859375, -0.0262451171875, -0.0179901123046875, 0.01352691650390625, 0.01220703125, -0.0504150390625, 0.0008101463317871094, -0.0134429931640625, 0.036895751953125, -0.050201416015625, 0.052886962890625, 0.0318603515625, -0.060577392578125, -0.096923828125, -0.003620147705078125, -0.004505157470703125, -0.052886962890625, 0.052886962890625, 0.015380859375, 0.00168609619140625, 0.0103912353515625, -0.03973388671875, -0.0836181640625, 0.0821533203125, 0.0220184326171875, -0.02825927734375, -0.02069091796875, -0.0005974769592285156, 0.036102294921875, -0.031280517578125, 0.043853759765625, 0.056915283203125, 0.023590087890625, 0.0058135986328125, -0.05145263671875, 0.03277587890625, -0.034912109375, 0.003772735595703125, 0.0213165283203125, -0.06231689453125, 0.08416748046875, -0.004100799560546875, -0.0372314453125, 0.0022373199462890625, 0.0631103515625, 0.005096435546875, 0.00632476806640625, 0.04180908203125, 0.03302001953125, 0.0296173095703125, -0.011077880859375, 0.044464111328125, -0.01273345947265625, 0.061065673828125, 0.0919189453125, -0.00543975830078125, 0.0723876953125, 0.0285491943359375, -0.0243988037109375, 0.060791015625, 0.039581298828125, -0.0106353759765625, 0.0085296630859375, -0.009246826171875, -0.004314422607421875, 0.001232147216796875, 0.018310546875, -0.0250396728515625, 0.04351806640625, 0.0294189453125, -0.03814697265625, 
-0.01226806640625, -0.021728515625, 0.037353515625, 0.006694793701171875, -0.00470733642578125, 0.060546875, 0.0109710693359375, -0.06842041015625, 0.05853271484375, 0.0009341239929199219, 0.038818359375, -0.0176239013671875, 0.005161285400390625, -0.0002715587615966797, 0.0077056884765625, -0.026824951171875, -0.046356201171875, 0.0180511474609375, 0.004695892333984375, 0.0012540817260742188, -0.0111541748046875, 0.050933837890625, -0.04608154296875, -0.04052734375, 0.0280914306640625, 0.047149658203125, 0.013580322265625, 0.007965087890625, -0.061431884765625, -0.018646240234375, 0.01824951171875, -0.037506103515625, -0.0201568603515625, 0.02923583984375, 0.013031005859375, 0.0309295654296875, 0.0654296875, 0.0170135498046875, -0.0014848709106445312, -0.009521484375, 0.07568359375, -0.05828857421875, -0.0291748046875, -0.08349609375, 0.04168701171875, -0.036468505859375, -0.037811279296875, 0.064453125, 0.0640869140625, 0.033447265625, -0.0010509490966796875, 0.047637939453125, -0.0159149169921875, 0.04278564453125, -0.03582763671875, 0.0408935546875, -0.0355224609375, 0.0169525146484375, -0.027099609375, -0.058624267578125, -0.031707763671875, 0.021942138671875, -0.02276611328125, 0.0039825439453125, 0.07147216796875, 0.064697265625, 0.006114959716796875, -0.009124755859375, -0.0081329345703125, 0.02667236328125, 0.038604736328125, 0.043182373046875, 0.05889892578125, -0.0557861328125, 0.06756591796875, -0.0303802490234375, -0.01151275634765625, -0.007049560546875, -0.05731201171875, -0.0665283203125, -0.06292724609375, -0.01837158203125, -0.0445556640625, -0.01303863525390625, 0.07720947265625, 0.035064697265625, -0.040069580078125, -0.01226806640625, -0.02069091796875, 0.00782012939453125, -0.0111236572265625, -0.021270751953125, 0.05255126953125, -0.017425537109375, -0.0501708984375, 0.001468658447265625, 0.01450347900390625, 0.007572174072265625, -0.00803375244140625, 0.0155181884765625, -0.050323486328125, -0.0193023681640625, 0.02520751953125, 
0.01934814453125, -0.05413818359375, -0.005764007568359375, -0.0030117034912109375, -0.0119476318359375, 0.0229034423828125, 0.035919189453125, -0.04119873046875, 0.0281982421875, 0.03472900390625, 0.007476806640625, 0.06494140625, -0.004302978515625, 0.0364990234375, -0.056671142578125, 0.00823974609375, 0.0175933837890625, 0.035858154296875, 0.0253753662109375, -0.04107666015625, 0.0533447265625, 0.034942626953125, -0.037353515625, -0.059967041015625, 0.01145172119140625, -0.073486328125, -0.00205230712890625, 0.11138916015625, -0.0256500244140625, -0.0297698974609375, -0.018890380859375, -0.013153076171875, 0.054351806640625, -0.031951904296875, 0.021575927734375, 0.0288543701171875, 0.002483367919921875, -0.0014600753784179688, -0.027923583984375, 0.04217529296875, 0.0258941650390625, -0.0675048828125, 0.0041656494140625, 0.0113372802734375, 0.0300750732421875, 0.0298919677734375, 0.055084228515625, -0.0191192626953125, 0.0102691650390625, -0.00885772705078125, 0.0235748291015625, -0.005550384521484375, 0.0044097900390625, -0.025146484375, -0.002208709716796875, -0.0107879638671875, -0.018035888671875 ] ]
microsoft/wavlm-large
2022-02-02T21:21:50.000Z
[ "transformers", "pytorch", "wavlm", "feature-extraction", "speech", "en", "arxiv:1912.07875", "arxiv:2106.06909", "arxiv:2101.00390", "arxiv:2110.13900", "has_space", "region:us" ]
feature-extraction
microsoft
null
null
microsoft/wavlm-large
37
226,095
transformers
2022-03-02T23:29:05
--- language: - en tags: - speech inference: false --- # WavLM-Large [Microsoft's WavLM](https://github.com/microsoft/unilm/tree/master/wavlm) The large model pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz. **Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out [this blog](https://huggingface.co/blog/fine-tune-wav2vec2-english) for more in-detail explanation of how to fine-tune the model. The model was pre-trained on: - 60,000 hours of [Libri-Light](https://arxiv.org/abs/1912.07875) - 10,000 hours of [GigaSpeech](https://arxiv.org/abs/2106.06909) - 24,000 hours of [VoxPopuli](https://arxiv.org/abs/2101.00390) [Paper: WavLM: Large-Scale Self-Supervised Pre-Training for Full Stack Speech Processing](https://arxiv.org/abs/2110.13900) Authors: Sanyuan Chen, Chengyi Wang, Zhengyang Chen, Yu Wu, Shujie Liu, Zhuo Chen, Jinyu Li, Naoyuki Kanda, Takuya Yoshioka, Xiong Xiao, Jian Wu, Long Zhou, Shuo Ren, Yanmin Qian, Yao Qian, Jian Wu, Michael Zeng, Furu Wei **Abstract** *Self-supervised learning (SSL) achieves great success in speech recognition, while limited exploration has been attempted for other speech processing tasks. As speech signal contains multi-faceted information including speaker identity, paralinguistics, spoken content, etc., learning universal representations for all speech tasks is challenging. In this paper, we propose a new pre-trained model, WavLM, to solve full-stack downstream speech tasks. WavLM is built based on the HuBERT framework, with an emphasis on both spoken content modeling and speaker identity preservation. We first equip the Transformer structure with gated relative position bias to improve its capability on recognition tasks. 
For better speaker discrimination, we propose an utterance mixing training strategy, where additional overlapped utterances are created unsupervisely and incorporated during model training. Lastly, we scale up the training dataset from 60k hours to 94k hours. WavLM Large achieves state-of-the-art performance on the SUPERB benchmark, and brings significant improvements for various speech processing tasks on their representative benchmarks.* The original model can be found under https://github.com/microsoft/unilm/tree/master/wavlm. # Usage This is an English pre-trained speech model that has to be fine-tuned on a downstream task like speech recognition or audio classification before it can be used in inference. The model was pre-trained in English and should therefore perform well only in English. The model has been shown to work well on the [SUPERB benchmark](https://superbbenchmark.org/). **Note**: The model was pre-trained on phonemes rather than characters. This means that one should make sure that the input text is converted to a sequence of phonemes before fine-tuning. ## Speech Recognition To fine-tune the model for speech recognition, see [the official speech recognition example](https://github.com/huggingface/transformers/tree/master/examples/pytorch/speech-recognition). ## Speech Classification To fine-tune the model for speech classification, see [the official audio classification example](https://github.com/huggingface/transformers/tree/master/examples/pytorch/audio-classification). ## Speaker Verification TODO ## Speaker Diarization TODO # Contribution The model was contributed by [cywang](https://huggingface.co/cywang) and [patrickvonplaten](https://huggingface.co/patrickvonplaten). # License The official license can be found [here](https://github.com/microsoft/UniSpeech/blob/main/LICENSE) ![design](https://raw.githubusercontent.com/patrickvonplaten/scientific_images/master/wavlm.png)
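The card stresses that input audio must be sampled at 16 kHz before being fed to the model. As a minimal illustration of that requirement (not the recommended preprocessing; in practice `torchaudio.transforms.Resample` or `librosa.resample` are the usual tools), a naive pure-Python linear-interpolation resampler might look like:

```python
def resample_to_16k(samples, orig_sr, target_sr=16_000):
    """Naive linear-interpolation resampler (illustrative only).

    Real pipelines should use torchaudio.transforms.Resample or
    librosa.resample; this sketch just shows the 16 kHz requirement.
    """
    if orig_sr == target_sr:
        return list(samples)
    n_out = round(len(samples) * target_sr / orig_sr)
    out = []
    for i in range(n_out):
        pos = i * orig_sr / target_sr          # fractional index into the input
        lo = int(pos)
        hi = min(lo + 1, len(samples) - 1)
        frac = pos - lo
        out.append(samples[lo] * (1 - frac) + samples[hi] * frac)
    return out

# One second of 44.1 kHz audio becomes 16,000 samples at 16 kHz.
audio_44k = [0.0] * 44_100
audio_16k = resample_to_16k(audio_44k, orig_sr=44_100)
```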
3,891
[ [ -0.0240020751953125, -0.051513671875, 0.01012420654296875, 0.01308441162109375, -0.0186004638671875, -0.00733184814453125, -0.0170440673828125, -0.04888916015625, -0.01079559326171875, 0.0294952392578125, -0.044830322265625, -0.041229248046875, -0.036865234375, -0.01190948486328125, -0.0278167724609375, 0.06524658203125, 0.0272674560546875, 0.0179443359375, -0.004253387451171875, -0.00493621826171875, -0.033782958984375, -0.05517578125, -0.0479736328125, -0.0347900390625, 0.0262603759765625, 0.003490447998046875, 0.0170440673828125, 0.03131103515625, 0.009185791015625, 0.02410888671875, -0.0277557373046875, 0.001430511474609375, -0.030487060546875, -0.0055999755859375, -0.0021209716796875, -0.01145172119140625, -0.051971435546875, 0.01375579833984375, 0.048370361328125, 0.036041259765625, -0.02996826171875, 0.04296875, 0.01340484619140625, 0.0235595703125, -0.0261993408203125, 0.0126953125, -0.0628662109375, -0.0022106170654296875, -0.0264739990234375, -0.0013856887817382812, -0.02191162109375, 0.0100860595703125, 0.01419830322265625, -0.0306854248046875, 0.004001617431640625, 0.01007843017578125, 0.059844970703125, 0.027252197265625, -0.0259857177734375, -0.0037288665771484375, -0.0614013671875, 0.076416015625, -0.069580078125, 0.06256103515625, 0.036834716796875, 0.01538848876953125, 0.000026881694793701172, -0.0654296875, -0.03948974609375, -0.0240020751953125, 0.022857666015625, 0.01641845703125, -0.034027099609375, 0.0134429931640625, 0.033355712890625, 0.01776123046875, -0.05450439453125, 0.028564453125, -0.042022705078125, -0.0445556640625, 0.060882568359375, -0.00885009765625, 0.00402069091796875, 0.0024776458740234375, -0.0255279541015625, -0.003215789794921875, -0.0305633544921875, 0.02288818359375, 0.013427734375, 0.03582763671875, -0.0265045166015625, 0.025390625, 0.0016088485717773438, 0.05810546875, -0.01030731201171875, -0.0299835205078125, 0.0518798828125, -0.007434844970703125, -0.0235595703125, 0.0263214111328125, 0.06640625, 
-0.00815582275390625, 0.0265045166015625, 0.00836181640625, -0.0254058837890625, 0.0007061958312988281, 0.00799560546875, -0.0521240234375, -0.019195556640625, 0.033721923828125, -0.03759765625, -0.003948211669921875, -0.01367950439453125, -0.0074920654296875, 0.003753662109375, -0.05535888671875, 0.04730224609375, -0.029510498046875, -0.028594970703125, -0.0016498565673828125, 0.0080718994140625, 0.01325225830078125, 0.0167236328125, -0.04852294921875, 0.01175689697265625, 0.04345703125, 0.0545654296875, -0.0168609619140625, -0.0290069580078125, -0.058624267578125, 0.00012934207916259766, -0.00994873046875, 0.03143310546875, -0.01065826416015625, -0.02813720703125, -0.01383209228515625, -0.004993438720703125, 0.01158905029296875, -0.03680419921875, 0.045379638671875, -0.028076171875, 0.016571044921875, 0.017303466796875, -0.059814453125, -0.002471923828125, -0.010833740234375, -0.0288543701171875, 0.0767822265625, -0.00720977783203125, -0.056976318359375, 0.00968170166015625, -0.05218505859375, -0.044281005859375, 0.005954742431640625, 0.00959014892578125, -0.031463623046875, -0.004245758056640625, 0.007236480712890625, 0.039031982421875, -0.0091705322265625, 0.0178070068359375, -0.006702423095703125, -0.02288818359375, 0.00836181640625, -0.03570556640625, 0.07879638671875, 0.0361328125, -0.0205078125, 0.033050537109375, -0.081787109375, 0.005916595458984375, 0.00647735595703125, -0.0289154052734375, -0.0032215118408203125, -0.00489044189453125, 0.033111572265625, 0.0249786376953125, 0.024444580078125, -0.05517578125, -0.01219940185546875, -0.040008544921875, 0.048309326171875, 0.044952392578125, -0.021026611328125, 0.0189971923828125, -0.005977630615234375, 0.022430419921875, -0.004817962646484375, 0.02288818359375, -0.0058746337890625, -0.03662109375, -0.0413818359375, -0.0208587646484375, 0.038665771484375, 0.0545654296875, -0.020050048828125, 0.0616455078125, -0.008697509765625, -0.038330078125, -0.07000732421875, 0.01129150390625, 0.03448486328125, 
0.04803466796875, 0.056396484375, 0.0005335807800292969, -0.0650634765625, -0.05279541015625, -0.0018815994262695312, -0.01082611083984375, -0.02294921875, 0.013214111328125, 0.0195159912109375, -0.026611328125, 0.072998046875, -0.0176849365234375, -0.03863525390625, -0.004268646240234375, 0.0032253265380859375, 0.0172576904296875, 0.0506591796875, 0.01351165771484375, -0.061859130859375, -0.01024627685546875, -0.0181427001953125, -0.0297393798828125, -0.0051727294921875, 0.020111083984375, 0.00930023193359375, 0.0165863037109375, 0.050323486328125, -0.034637451171875, 0.0191802978515625, 0.054473876953125, -0.0037078857421875, 0.0447998046875, -0.0164031982421875, -0.0295867919921875, -0.08331298828125, 0.007045745849609375, -0.005359649658203125, -0.035369873046875, -0.050384521484375, -0.02825927734375, 0.0012912750244140625, -0.0252685546875, -0.050994873046875, 0.03729248046875, -0.032470703125, -0.004611968994140625, -0.021484375, 0.0129547119140625, -0.0166473388671875, 0.036163330078125, 0.003917694091796875, 0.052398681640625, 0.05621337890625, -0.05316162109375, 0.0369873046875, 0.017913818359375, -0.0289154052734375, 0.024169921875, -0.067626953125, 0.0264129638671875, 0.000774383544921875, 0.0216522216796875, -0.075927734375, 0.011627197265625, -0.01184844970703125, -0.04913330078125, 0.038543701171875, -0.00994873046875, -0.0234375, -0.048187255859375, 0.009246826171875, 0.020904541015625, 0.07159423828125, -0.042327880859375, 0.0455322265625, 0.046600341796875, -0.004055023193359375, -0.032867431640625, -0.0579833984375, -0.00757598876953125, -0.00797271728515625, -0.039825439453125, 0.039398193359375, -0.015716552734375, 0.0003097057342529297, -0.0229949951171875, -0.0104522705078125, 0.00923919677734375, -0.00226593017578125, 0.0297393798828125, 0.0171966552734375, -0.0159759521484375, 0.0218658447265625, -0.023406982421875, -0.0121307373046875, -0.0021648406982421875, -0.03643798828125, 0.05303955078125, -0.008392333984375, -0.0176849365234375, 
-0.0634765625, 0.0166168212890625, 0.03973388671875, -0.046783447265625, 0.015716552734375, 0.07977294921875, -0.02288818359375, -0.012451171875, -0.06170654296875, -0.0033416748046875, -0.035400390625, 0.0438232421875, -0.027923583984375, -0.07403564453125, 0.0196685791015625, 0.01059722900390625, 0.0117034912109375, 0.039794921875, 0.0322265625, -0.0214385986328125, 0.0806884765625, 0.0347900390625, -0.03271484375, 0.0445556640625, -0.0169677734375, 0.0005464553833007812, -0.0682373046875, -0.02105712890625, -0.0401611328125, -0.006534576416015625, -0.03460693359375, -0.02935791015625, 0.00623321533203125, 0.0064239501953125, -0.0228118896484375, 0.0238037109375, -0.05218505859375, -0.0006589889526367188, 0.049346923828125, -0.001438140869140625, 0.0022068023681640625, 0.005023956298828125, -0.0098724365234375, 0.0007891654968261719, -0.041168212890625, -0.0225982666015625, 0.07122802734375, 0.042999267578125, 0.048675537109375, -0.0136260986328125, 0.058868408203125, 0.0214691162109375, -0.006252288818359375, -0.0599365234375, 0.030975341796875, -0.01165008544921875, -0.03271484375, -0.03948974609375, -0.034210205078125, -0.08172607421875, 0.024200439453125, -0.022064208984375, -0.05548095703125, -0.01336669921875, 0.0245819091796875, -0.0177001953125, 0.0151824951171875, -0.05072021484375, 0.055389404296875, -0.02447509765625, 0.0017309188842773438, -0.0185546875, -0.05615234375, 0.0091552734375, -0.002044677734375, 0.0299224853515625, -0.020263671875, 0.0239715576171875, 0.0745849609375, -0.0235748291015625, 0.0455322265625, -0.037872314453125, -0.0181427001953125, 0.020782470703125, -0.02264404296875, 0.032379150390625, -0.026885986328125, -0.005390167236328125, 0.031219482421875, 0.020416259765625, -0.026275634765625, -0.0264739990234375, 0.03472900390625, -0.06854248046875, -0.033935546875, -0.0066070556640625, -0.03314208984375, -0.031646728515625, 0.0115509033203125, 0.035308837890625, 0.051971435546875, -0.0106964111328125, 0.0249176025390625, 
0.0460205078125, -0.00933074951171875, 0.036041259765625, 0.05181884765625, -0.03082275390625, -0.0248870849609375, 0.07220458984375, 0.0278472900390625, 0.0096588134765625, 0.026611328125, 0.028411865234375, -0.04388427734375, -0.056732177734375, -0.01181793212890625, 0.00693511962890625, -0.031951904296875, -0.004016876220703125, -0.055633544921875, -0.03955078125, -0.05511474609375, 0.0291900634765625, -0.040771484375, -0.0134429931640625, -0.038970947265625, 0.00276947021484375, 0.02447509765625, 0.044342041015625, 0.00015032291412353516, 0.0131683349609375, -0.0537109375, 0.035980224609375, 0.035858154296875, 0.01067352294921875, 0.01104736328125, -0.077392578125, -0.01496124267578125, 0.0231781005859375, -0.00560760498046875, -0.040557861328125, 0.02044677734375, 0.0186920166015625, 0.058563232421875, 0.0223236083984375, 0.00286865234375, 0.05255126953125, -0.05279541015625, 0.06976318359375, 0.0259246826171875, -0.08258056640625, 0.059234619140625, -0.01265716552734375, 0.043212890625, 0.0249786376953125, 0.0189361572265625, -0.029998779296875, -0.01605224609375, -0.053436279296875, -0.060699462890625, 0.04718017578125, 0.01220703125, 0.01458740234375, 0.0233001708984375, 0.00412750244140625, -0.0106201171875, 0.01183319091796875, -0.053466796875, -0.03424072265625, -0.031280517578125, -0.01445770263671875, -0.0311126708984375, -0.02288818359375, 0.006763458251953125, -0.058563232421875, 0.06512451171875, 0.013885498046875, 0.0104522705078125, 0.020599365234375, -0.0221710205078125, 0.01934814453125, 0.022979736328125, 0.051361083984375, 0.04412841796875, -0.0214080810546875, 0.00463104248046875, 0.028839111328125, -0.041961669921875, 0.0033740997314453125, 0.025604248046875, 0.01169586181640625, 0.004413604736328125, 0.03265380859375, 0.09136962890625, 0.0185546875, -0.039154052734375, 0.038818359375, 0.00284576416015625, -0.0286102294921875, -0.03759765625, 0.0040740966796875, 0.0178070068359375, 0.0167694091796875, 0.04248046875, -0.0073699951171875, 
0.004108428955078125, -0.034393310546875, 0.01541900634765625, 0.032501220703125, -0.0411376953125, -0.01540374755859375, 0.05828857421875, 0.020721435546875, -0.051727294921875, 0.0386962890625, -0.01259613037109375, -0.04254150390625, 0.02056884765625, 0.0574951171875, 0.057891845703125, -0.05450439453125, -0.0104522705078125, 0.0228118896484375, 0.006908416748046875, 0.00814056396484375, 0.028106689453125, -0.0321044921875, -0.04150390625, -0.032470703125, -0.07342529296875, -0.01216888427734375, 0.033233642578125, -0.047943115234375, 0.01032257080078125, -0.025634765625, -0.0233917236328125, 0.0181427001953125, 0.0065765380859375, -0.05889892578125, 0.0301055908203125, 0.0300750732421875, 0.06573486328125, -0.0445556640625, 0.0904541015625, 0.03466796875, -0.01788330078125, -0.08099365234375, -0.0122222900390625, -0.006839752197265625, -0.058807373046875, 0.046875, 0.0098876953125, -0.023468017578125, 0.026336669921875, -0.052703857421875, -0.07135009765625, 0.07647705078125, 0.0187225341796875, -0.07666015625, 0.00106048583984375, -0.006084442138671875, 0.03985595703125, -0.0199737548828125, -0.0015869140625, 0.0310211181640625, 0.0195770263671875, 0.01197052001953125, -0.10003662109375, -0.0108642578125, -0.017822265625, -0.0017566680908203125, -0.0162353515625, -0.032958984375, 0.06671142578125, -0.01261138916015625, -0.00811004638671875, -0.00717926025390625, 0.05853271484375, 0.0184173583984375, 0.0162506103515625, 0.060394287109375, 0.0298919677734375, 0.0809326171875, 0.0035419464111328125, 0.0557861328125, -0.018798828125, 0.024566650390625, 0.1055908203125, -0.01910400390625, 0.0902099609375, 0.0230255126953125, -0.042938232421875, 0.022064208984375, 0.0361328125, -0.01470184326171875, 0.0305938720703125, 0.03106689453125, -0.005298614501953125, -0.021575927734375, -0.00719451904296875, -0.05206298828125, 0.05706787109375, 0.0114593505859375, -0.01279449462890625, 0.01264190673828125, 0.031982421875, -0.016357421875, -0.0224761962890625, 
-0.03778076171875, 0.060333251953125, 0.02606201171875, -0.016845703125, 0.061279296875, -0.01384735107421875, 0.077392578125, -0.057525634765625, 0.0167694091796875, 0.0235595703125, 0.0041656494140625, -0.026763916015625, -0.040374755859375, -0.0024261474609375, -0.0024204254150390625, -0.0161895751953125, -0.0147247314453125, 0.054046630859375, -0.04876708984375, -0.035430908203125, 0.043182373046875, 0.018096923828125, 0.028228759765625, -0.024322509765625, -0.061676025390625, 0.0179290771484375, 0.0017852783203125, -0.0140380859375, 0.0164642333984375, 0.006336212158203125, 0.0174407958984375, 0.0401611328125, 0.07080078125, 0.0189056396484375, 0.005191802978515625, 0.03607177734375, 0.036865234375, -0.0428466796875, -0.060211181640625, -0.052490234375, 0.044708251953125, -0.003528594970703125, -0.0203857421875, 0.057647705078125, 0.047515869140625, 0.06463623046875, 0.00432586669921875, 0.0487060546875, 0.0255279541015625, 0.062744140625, -0.04522705078125, 0.06072998046875, -0.05487060546875, -0.00212860107421875, -0.0323486328125, -0.07073974609375, -0.0096588134765625, 0.048370361328125, -0.005619049072265625, 0.01392364501953125, 0.022064208984375, 0.04620361328125, -0.001373291015625, 0.004184722900390625, 0.050567626953125, 0.0311737060546875, 0.020233154296875, 0.01715087890625, 0.06292724609375, -0.0298919677734375, 0.046661376953125, -0.0131683349609375, -0.00952911376953125, -0.007244110107421875, -0.0501708984375, -0.06536865234375, -0.06536865234375, -0.02911376953125, -0.02459716796875, -0.00453948974609375, 0.08477783203125, 0.09088134765625, -0.06298828125, -0.0196533203125, 0.00725555419921875, -0.01338958740234375, -0.00566864013671875, -0.01422119140625, 0.03106689453125, -0.018096923828125, -0.0411376953125, 0.047119140625, -0.00084686279296875, 0.02728271484375, -0.01258087158203125, -0.016754150390625, -0.01425933837890625, -0.00862884521484375, 0.05743408203125, 0.0233612060546875, -0.07135009765625, -0.00725555419921875, 
-0.0214996337890625, 0.0018558502197265625, 0.009429931640625, 0.042633056640625, -0.056610107421875, 0.032989501953125, 0.0217437744140625, 0.031005859375, 0.06927490234375, -0.0007548332214355469, 0.0173797607421875, -0.06182861328125, 0.0186767578125, 0.0258636474609375, 0.0357666015625, 0.031494140625, -0.007965087890625, 0.006832122802734375, 0.00968170166015625, -0.051666259765625, -0.06939697265625, 0.004337310791015625, -0.09130859375, -0.0201873779296875, 0.0849609375, 0.004364013671875, -0.01024627685546875, -0.0171051025390625, -0.038665771484375, 0.046600341796875, -0.0278472900390625, 0.0225677490234375, 0.039459228515625, -0.005901336669921875, -0.01207733154296875, -0.037109375, 0.0455322265625, 0.0213470458984375, -0.0400390625, 0.007328033447265625, 0.02532958984375, 0.03826904296875, 0.0117950439453125, 0.05322265625, 0.003814697265625, 0.0032787322998046875, -0.008758544921875, 0.0289306640625, -0.0270233154296875, -0.0132904052734375, -0.04345703125, 0.006465911865234375, 0.0007386207580566406, -0.0380859375 ] ]
TheBloke/Llama-2-7B-32K-Instruct-GPTQ
2023-09-27T12:45:53.000Z
[ "transformers", "safetensors", "llama", "text-generation", "custom_code", "en", "dataset:togethercomputer/llama-instruct", "arxiv:2307.03172", "license:llama2", "text-generation-inference", "region:us" ]
text-generation
TheBloke
null
null
TheBloke/Llama-2-7B-32K-Instruct-GPTQ
21
225,318
transformers
2023-08-21T12:18:32
--- language: - en license: llama2 library_name: transformers datasets: - togethercomputer/llama-instruct model_name: Llama2 7B 32K Instruct base_model: togethercomputer/Llama-2-7B-32K-Instruct inference: false model_creator: Together model_type: llama prompt_template: '[INST] {prompt} [\INST] ' quantized_by: TheBloke --- <!-- header start --> <!-- 200823 --> <div style="width: auto; margin-left: auto; margin-right: auto"> <img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;"> </div> <div style="display: flex; justify-content: space-between; width: 100%;"> <div style="display: flex; flex-direction: column; align-items: flex-start;"> <p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p> </div> <div style="display: flex; flex-direction: column; align-items: flex-end;"> <p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p> </div> </div> <div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div> <hr style="margin-top: 1.0em; margin-bottom: 1.0em;"> <!-- header end --> # Llama2 7B 32K Instruct - GPTQ - Model creator: [Together](https://huggingface.co/togethercomputer) - Original model: [Llama2 7B 32K Instruct](https://huggingface.co/togethercomputer/Llama-2-7B-32K-Instruct) <!-- description start --> ## Description This repo contains GPTQ model files for [Together's Llama2 7B 32K Instruct](https://huggingface.co/togethercomputer/Llama-2-7B-32K-Instruct). Multiple GPTQ parameter permutations are provided; see Provided Files below for details of the options provided, their parameters, and the software used to create them. 
<!-- description end --> <!-- repositories-available start --> ## Repositories available * [AWQ model(s) for GPU inference.](https://huggingface.co/TheBloke/Llama-2-7B-32K-Instruct-AWQ) * [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/Llama-2-7B-32K-Instruct-GPTQ) * [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/Llama-2-7B-32K-Instruct-GGUF) * [Together's original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/togethercomputer/Llama-2-7B-32K-Instruct) <!-- repositories-available end --> <!-- prompt-template start --> ## Prompt template: Llama2-Instruct-Only ``` [INST] {prompt} [\INST] ``` <!-- prompt-template end --> <!-- README_GPTQ.md-provided-files start --> ## Provided files and GPTQ parameters Multiple quantisation parameters are provided, to allow you to choose the best one for your hardware and requirements. Each separate quant is in a different branch. See below for instructions on fetching from different branches. All recent GPTQ files are made with AutoGPTQ, and all files in non-main branches are made with AutoGPTQ. Files in the `main` branch which were uploaded before August 2023 were made with GPTQ-for-LLaMa. <details> <summary>Explanation of GPTQ parameters</summary> - Bits: The bit size of the quantised model. - GS: GPTQ group size. Higher numbers use less VRAM, but have lower quantisation accuracy. "None" is the lowest possible value. - Act Order: True or False. Also known as `desc_act`. True results in better quantisation accuracy. Some GPTQ clients have had issues with models that use Act Order plus Group Size, but this is generally resolved now. - Damp %: A GPTQ parameter that affects how samples are processed for quantisation. 0.01 is default, but 0.1 results in slightly better accuracy. - GPTQ dataset: The dataset used for quantisation. 
Using a dataset more appropriate to the model's training can improve quantisation accuracy. Note that the GPTQ dataset is not the same as the dataset used to train the model - please refer to the original model repo for details of the training dataset(s). - Sequence Length: The length of the dataset sequences used for quantisation. Ideally this is the same as the model sequence length. For some very long sequence models (16+K), a lower sequence length may have to be used. Note that a lower sequence length does not limit the sequence length of the quantised model. It only impacts the quantisation accuracy on longer inference sequences. - ExLlama Compatibility: Whether this file can be loaded with ExLlama, which currently only supports Llama models in 4-bit. </details> | Branch | Bits | GS | Act Order | Damp % | GPTQ Dataset | Seq Len | Size | ExLlama | Desc | | ------ | ---- | -- | --------- | ------ | ------------ | ------- | ---- | ------- | ---- | | [main](https://huggingface.co/TheBloke/Llama-2-7B-32K-Instruct-GPTQ/tree/main) | 4 | 128 | No | 0.1 | [c4](https://huggingface.co/datasets/allenai/c4) | 32768 | 3.90 GB | Yes | 4-bit, without Act Order and group size 128g. | | [gptq-4bit-32g-actorder_True](https://huggingface.co/TheBloke/Llama-2-7B-32K-Instruct-GPTQ/tree/gptq-4bit-32g-actorder_True) | 4 | 32 | Yes | 0.1 | [c4](https://huggingface.co/datasets/allenai/c4) | 32768 | 4.28 GB | Yes | 4-bit, with Act Order and group size 32g. Gives highest possible inference quality, with maximum VRAM usage. | | [gptq-4bit-64g-actorder_True](https://huggingface.co/TheBloke/Llama-2-7B-32K-Instruct-GPTQ/tree/gptq-4bit-64g-actorder_True) | 4 | 64 | Yes | 0.1 | [c4](https://huggingface.co/datasets/allenai/c4) | 32768 | 4.02 GB | Yes | 4-bit, with Act Order and group size 64g. Uses less VRAM than 32g, but with slightly lower accuracy. 
| | [gptq-4bit-128g-actorder_True](https://huggingface.co/TheBloke/Llama-2-7B-32K-Instruct-GPTQ/tree/gptq-4bit-128g-actorder_True) | 4 | 128 | Yes | 0.1 | [c4](https://huggingface.co/datasets/allenai/c4) | 32768 | 3.90 GB | Yes | 4-bit, with Act Order and group size 128g. Uses even less VRAM than 64g, but with slightly lower accuracy. | | [gptq-8bit--1g-actorder_True](https://huggingface.co/TheBloke/Llama-2-7B-32K-Instruct-GPTQ/tree/gptq-8bit--1g-actorder_True) | 8 | None | Yes | 0.1 | [c4](https://huggingface.co/datasets/allenai/c4) | 32768 | 7.01 GB | No | 8-bit, with Act Order. No group size, to lower VRAM requirements. | | [gptq-8bit-128g-actorder_True](https://huggingface.co/TheBloke/Llama-2-7B-32K-Instruct-GPTQ/tree/gptq-8bit-128g-actorder_True) | 8 | 128 | Yes | 0.1 | [c4](https://huggingface.co/datasets/allenai/c4) | 32768 | 7.16 GB | No | 8-bit, with group size 128g for higher inference quality and with Act Order for even higher accuracy. | <!-- README_GPTQ.md-provided-files end --> <!-- README_GPTQ.md-download-from-branches start --> ## How to download from branches - In text-generation-webui, you can add `:branch` to the end of the download name, eg `TheBloke/Llama-2-7B-32K-Instruct-GPTQ:main` - With Git, you can clone a branch with: ``` git clone --single-branch --branch main https://huggingface.co/TheBloke/Llama-2-7B-32K-Instruct-GPTQ ``` - In Python Transformers code, the branch is the `revision` parameter; see below. <!-- README_GPTQ.md-download-from-branches end --> <!-- README_GPTQ.md-text-generation-webui start --> ## How to easily download and use this model in [text-generation-webui](https://github.com/oobabooga/text-generation-webui). Please make sure you're using the latest version of [text-generation-webui](https://github.com/oobabooga/text-generation-webui). It is strongly recommended to use the text-generation-webui one-click-installers unless you're sure you know how to make a manual install. 1. Click the **Model tab**. 2. 
Under **Download custom model or LoRA**, enter `TheBloke/Llama-2-7B-32K-Instruct-GPTQ`. - To download from a specific branch, enter for example `TheBloke/Llama-2-7B-32K-Instruct-GPTQ:main` - see Provided Files above for the list of branches for each option. 3. Click **Download**. 4. The model will start downloading. Once it's finished it will say "Done". 5. In the top left, click the refresh icon next to **Model**. 6. In the **Model** dropdown, choose the model you just downloaded: `Llama-2-7B-32K-Instruct-GPTQ` 7. The model will automatically load, and is now ready for use! 8. If you want any custom settings, set them and then click **Save settings for this model** followed by **Reload the Model** in the top right. * Note that you do not need to and should not set manual GPTQ parameters any more. These are set automatically from the file `quantize_config.json`. 9. Once you're ready, click the **Text Generation tab** and enter a prompt to get started! <!-- README_GPTQ.md-text-generation-webui end --> <!-- README_GPTQ.md-use-from-python start --> ## How to use this GPTQ model from Python code ### Install the necessary packages Requires: Transformers 4.32.0 or later, Optimum 1.12.0 or later, and AutoGPTQ 0.4.2 or later. ```shell pip3 install transformers>=4.32.0 optimum>=1.12.0 pip3 install auto-gptq --extra-index-url https://huggingface.github.io/autogptq-index/whl/cu118/ # Use cu117 if on CUDA 11.7 ``` If you have problems installing AutoGPTQ using the pre-built wheels, install it from source instead: ```shell pip3 uninstall -y auto-gptq git clone https://github.com/PanQiWei/AutoGPTQ cd AutoGPTQ pip3 install . ``` ### For CodeLlama models only: you must use Transformers 4.33.0 or later. 
If 4.33.0 is not yet released when you read this, you will need to install Transformers from source: ```shell pip3 uninstall -y transformers pip3 install git+https://github.com/huggingface/transformers.git ``` ### You can then use the following code ```python from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline model_name_or_path = "TheBloke/Llama-2-7B-32K-Instruct-GPTQ" # To use a different branch, change revision # For example: revision="main" model = AutoModelForCausalLM.from_pretrained(model_name_or_path, device_map="auto", trust_remote_code=True, revision="main") tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, use_fast=True) prompt = "Tell me about AI" prompt_template=f'''[INST] {prompt} [\INST] ''' print("\n\n*** Generate:") input_ids = tokenizer(prompt_template, return_tensors='pt').input_ids.cuda() output = model.generate(inputs=input_ids, temperature=0.7, do_sample=True, top_p=0.95, top_k=40, max_new_tokens=512) print(tokenizer.decode(output[0])) # Inference can also be done using transformers' pipeline print("*** Pipeline:") pipe = pipeline( "text-generation", model=model, tokenizer=tokenizer, max_new_tokens=512, do_sample=True, temperature=0.7, top_p=0.95, top_k=40, repetition_penalty=1.1 ) print(pipe(prompt_template)[0]['generated_text']) ``` <!-- README_GPTQ.md-use-from-python end --> <!-- README_GPTQ.md-compatibility start --> ## Compatibility The files provided are tested to work with AutoGPTQ, both via Transformers and using AutoGPTQ directly. They should also work with [Occ4m's GPTQ-for-LLaMa fork](https://github.com/0cc4m/KoboldAI). [ExLlama](https://github.com/turboderp/exllama) is compatible with Llama models in 4-bit. Please see the Provided Files table above for per-file compatibility. [Huggingface Text Generation Inference (TGI)](https://github.com/huggingface/text-generation-inference) is compatible with all GPTQ models. 
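As a rough sanity check on the Provided Files table above, a GPTQ file's size can be approximated from the bit width and group size. The helper below is an illustrative back-of-the-envelope estimate only, not the accounting used to produce the table; in particular, the number of parameters kept unquantised (embeddings and lm_head) is an assumed constant:

```python
from typing import Optional

def gptq_size_gb(n_params: float, bits: int, group_size: Optional[int],
                 fp16_params: float = 0.26e9) -> float:
    """Back-of-the-envelope GPTQ file size in decimal GB.

    Assumptions (illustrative, not the exact quantisation accounting):
    - quantised weights cost `bits` bits each
    - each group of `group_size` weights adds one fp16 scale plus a packed zero
    - `fp16_params` parameters (embeddings / lm_head) stay in fp16
    """
    quantised = n_params - fp16_params
    weight_bytes = quantised * bits / 8
    # Per-weight overhead in bits: a 16-bit scale and a `bits`-bit zero per group.
    overhead_bits = 0.0 if group_size is None else (16 + bits) / group_size
    overhead_bytes = quantised * overhead_bits / 8
    fp16_bytes = fp16_params * 2  # fp16 = 2 bytes per parameter
    return (weight_bytes + overhead_bytes + fp16_bytes) / 1e9

# Llama-2-7B has ~6.74B parameters: 4-bit at group size 128 lands near the
# 3.90 GB listed for the `main` branch, and 8-bit with no groups near 7.01 GB.
est_4bit = gptq_size_gb(6.74e9, bits=4, group_size=128)
est_8bit = gptq_size_gb(6.74e9, bits=8, group_size=None)
```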
<!-- README_GPTQ.md-compatibility end --> <!-- footer start --> <!-- 200823 --> ## Discord For further support, and discussions on these models and AI in general, join us at: [TheBloke AI's Discord server](https://discord.gg/theblokeai) ## Thanks, and how to contribute Thanks to the [chirper.ai](https://chirper.ai) team! Thanks to Clay from [gpus.llm-utils.org](https://gpus.llm-utils.org)! I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training. If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects. Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits. * Patreon: https://patreon.com/TheBlokeAI * Ko-Fi: https://ko-fi.com/TheBlokeAI **Special thanks to**: Aemon Algiz. **Patreon special mentions**: Alicia Loh, Stephen Murray, K, Ajan Kanaga, RoA, Magnesian, Deo Leter, Olakabola, Eugene Pentland, zynix, Deep Realms, Raymond Fosdick, Elijah Stavena, Iucharbius, Erik Bjäreholt, Luis Javier Navarrete Lozano, Nicholas, theTransient, John Detwiler, alfie_i, knownsqashed, Mano Prime, Willem Michiel, Enrico Ros, LangChain4j, OG, Michael Dempsey, Pierre Kircher, Pedro Madruga, James Bentley, Thomas Belote, Luke @flexchar, Leonard Tan, Johann-Peter Hartmann, Illia Dulskyi, Fen Risland, Chadd, S_X, Jeff Scroggin, Ken Nordquist, Sean Connelly, Artur Olbinski, Swaroop Kallakuri, Jack West, Ai Maven, David Ziegler, Russ Johnson, transmissions 11, John Villwock, Alps Aficionado, Clay Pascal, Viktor Bowallius, Subspace Studios, Rainer Wilmers, Trenton Dambrowitz, vamX, Michael Levine, 준교 김, Brandon Frisco, Kalila, Trailburnt, Randy H, Talal Aujan, Nathan Dryer, Vadim, 阿明, ReadyPlayerEmma, Tiffany J. 
Kim, George Stoitzev, Spencer Kim, Jerry Meng, Gabriel Tamborski, Cory Kujawski, Jeffrey Morgan, Spiking Neurons AB, Edmond Seymore, Alexandros Triantafyllidis, Lone Striker, Cap'n Zoog, Nikolai Manek, danny, ya boyyy, Derek Yates, usrbinkat, Mandus, TL, Nathan LeClaire, subjectnull, Imad Khwaja, webtim, Raven Klaugh, Asp the Wyvern, Gabriel Puliatti, Caitlyn Gatomon, Joseph William Delisle, Jonathan Leane, Luke Pendergrass, SuperWojo, Sebastain Graf, Will Dee, Fred von Graf, Andrey, Dan Guido, Daniel P. Andersen, Nitin Borwankar, Elle, Vitor Caleffi, biorpg, jjj, NimbleBox.ai, Pieter, Matthew Berman, terasurfer, Michael Davis, Alex, Stanislav Ovsiannikov Thank you to all my generous patrons and donaters! And thank you again to a16z for their generous grant. <!-- footer end --> # Original model card: Together's Llama2 7B 32K Instruct # Llama-2-7B-32K-Instruct ## Model Description Llama-2-7B-32K-Instruct is an open-source, long-context chat model finetuned from [Llama-2-7B-32K](https://huggingface.co/togethercomputer/Llama-2-7B-32K), over high-quality instruction and chat data. We built Llama-2-7B-32K-Instruct with less than 200 lines of Python script using [Together API](https://together.ai/blog/api-announcement), and we also make the [recipe fully available](https://github.com/togethercomputer/Llama-2-7B-32K-Instruct). We hope that this can enable everyone to finetune their own version of [Llama-2-7B-32K](https://huggingface.co/togethercomputer/Llama-2-7B-32K) — play with [Together API](https://together.ai/blog/api-announcement) and give us feedback! ## Data Collection Details Llama-2-7B-32K-Instruct is fine-tuned over a combination of two parts: 1. **19K single- and multi-round conversations generated by human instructions and [Llama-2-70B-Chat](https://huggingface.co/meta-llama/Llama-2-7b-chat-hf) outputs**. 
   We collected the dataset following the distillation paradigm used by Alpaca, Vicuna, WizardLM, and Orca — producing instructions by querying a powerful LLM (in this case, [Llama-2-70B-Chat](https://huggingface.co/meta-llama/Llama-2-70b-chat-hf)). The complete dataset is also released [here](https://huggingface.co/datasets/togethercomputer/llama-instruct). We also share the complete recipe for the data collection process [here](https://github.com/togethercomputer/Llama-2-7B-32K-Instruct).

2. **Long-context Summarization and Long-context QA**.
   We follow the recipe of [Llama-2-7B-32K](https://together.ai/blog/Llama-2-7B-32K), and train our model with the [BookSum dataset](https://huggingface.co/datasets/togethercomputer/Long-Data-Collections) and [Multi-document Question Answering](https://arxiv.org/abs/2307.03172).

The final data mixture used for model finetuning is: 19K instruction (50%) + BookSum (25%) + MQA (25%).

## Model Usage

We encourage you to try out this model using the [Together API](https://together.ai/blog/api-announcement). The updated inference stack allows for efficient inference.
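Before setting up local inference, it helps to see how the expected chat prompt is constructed. The helper below is a minimal sketch of the `[INST]` format used throughout this card; the multi-round concatenation scheme in `format_conversation` is an assumption, since the card only shows single-round examples.

```python
def format_prompt(instruction: str) -> str:
    """Wrap a single user instruction in the [INST] format the model was tuned on."""
    return f"[INST]\n{instruction}\n[/INST]\n\n"

def format_conversation(turns: list, next_instruction: str) -> str:
    """Concatenate prior (instruction, response) turns, then the new instruction.
    NOTE: this multi-round layout is an assumption, not documented in the card."""
    history = "".join(f"[INST]\n{inst}\n[/INST]\n\n{resp}\n" for inst, resp in turns)
    return history + format_prompt(next_instruction)

# Single-round prompt, matching the example used later in this card:
prompt = format_prompt("Write a poem about cats")
assert prompt == "[INST]\nWrite a poem about cats\n[/INST]\n\n"
```

The resulting string can be passed directly to `tokenizer.encode` in the loading example below.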
To run the model locally, we strongly recommend installing Flash Attention V2, which is necessary to obtain the best performance:

```
# Please update the path of `CUDA_HOME`
export CUDA_HOME=/usr/local/cuda-11.8
pip install transformers==4.31.0
pip install sentencepiece
pip install ninja
pip install flash-attn --no-build-isolation
pip install git+https://github.com/HazyResearch/flash-attention.git#subdirectory=csrc/rotary
```

You can load the model directly from the Hugging Face model hub using

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("togethercomputer/Llama-2-7B-32K-Instruct")
model = AutoModelForCausalLM.from_pretrained("togethercomputer/Llama-2-7B-32K-Instruct",
                                             trust_remote_code=True, torch_dtype=torch.float16)
input_ids = tokenizer.encode("[INST]\nWrite a poem about cats\n[/INST]\n\n", return_tensors="pt")
output = model.generate(input_ids, max_length=128,
                        temperature=0.7, repetition_penalty=1.1, top_p=0.7, top_k=50)
output_text = tokenizer.decode(output[0], skip_special_tokens=True)
```

The model is also hosted on [Together Playground](https://api.together.xyz/playground). You can simply play with the model by using prompts formatted as:

```
[INST]\n<your instruction here>\n[/INST]\n\n
```

For example, if we query the model with

```
[INST]\nWrite a poem about cats\n[/INST]\n\n
```

the model will return

```
[INST]
Write a poem about cats
[/INST]

Cats are mysterious creatures, with their aloof and independent nature. They're also incredibly beautiful, with their sleek fur and piercing eyes. Here's a poem that captures the essence of cats:

Cats, oh cats, how can I describe you?
Your beauty is beyond compare, it seems.
You're graceful and elegant, like a ballerina's dance,
But don't let your charm fool you, for you're not easily tamed.

With your soft purring and playful meows,
You draw us in with your enchanting powers.
We love to watch you play, your tail twirling 'round,
As if you're dancing on air, with no sound.

But don't be fooled by your sweetness, my friend,
For beneath that gentle exterior, lies a fierce defender.
When danger lurks, you'll spring into action,
Protecting those you hold dear, without question.

So let us admire you, from afar,
For in your own way, you're truly unique, a star.
And though we may never fully understand,
The depths of your soul, we'll always stand, hand in paw, as one.

This poem captures the essence of cats, highlighting their beauty, independence, and protective nature. It also celebrates the special bond between humans and cats, recognizing their unique qualities and the joy they bring to our lives.
```

## Model Evaluation

We evaluate the model from three aspects: 1) [Alpaca Eval](https://tatsu-lab.github.io/alpaca_eval/); 2) [Rouge score over BookSum](https://together.ai/blog/Llama-2-7B-32K); and 3) [Accuracy over Multi-document Question Answering (MQA)](https://together.ai/blog/Llama-2-7B-32K). We compare with models including [GPT-3.5-Turbo-16K](https://platform.openai.com/docs/models/gpt-3-5), [Llama-2-7B-Chat](https://huggingface.co/meta-llama/Llama-2-7b-chat-hf), [Longchat-7b-16k](https://huggingface.co/lmsys/longchat-7b-16k) and [Longchat-7b-v1.5-32k](https://huggingface.co/lmsys/longchat-7b-v1.5-32k).
We summarize the results below:

* Alpaca Eval

| Model | win_rate | standard_error | n_total | avg_length |
| -------- | ------- | ------- | ------- | ------- |
| Llama-2-7B-Chat-hf | 71.37 | 1.59 | 805 | 1479 |
| Llama-2-7B-32K-Instruct | 70.36 | 1.61 | 803 | 1885 |
| oasst-rlhf-llama-33b | 66.52 | 1.66 | 805 | 1079 |
| text_davinci_003 | 50.00 | 0.00 | 805 | 307 |
| falcon-40b-instruct | 45.71 | 1.75 | 805 | 662 |
| alpaca-farm-ppo-human | 41.24 | 1.73 | 805 | 803 |
| alpaca-7b | 26.46 | 1.54 | 805 | 396 |
| text_davinci_001 | 15.17 | 1.24 | 804 | 296 |

* Rouge Score over BookSum

| Model | R1 | R2 | RL |
| -------- | ------- | ------- | ------- |
| Llama-2-7B-Chat-hf | 0.055 | 0.008 | 0.046 |
| Longchat-7b-16k | 0.303 | 0.055 | 0.160 |
| Longchat-7b-v1.5-32k | 0.308 | 0.057 | 0.163 |
| GPT-3.5-Turbo-16K | 0.324 | 0.066 | 0.178 |
| Llama-2-7B-32K-Instruct (ours) | 0.336 | 0.076 | 0.184 |

* Accuracy over MQA

| Model | 20 docs (Avg 2.9K tokens) | 30 docs (Avg 4.4K tokens) | 50 docs (Avg 7.4K tokens) |
| -------- | ------- | ------- | ------- |
| Llama-2-7B-Chat-hf | 0.448 | 0.421 | 0.354 |
| Longchat-7b-16k | 0.510 | 0.473 | 0.428 |
| Longchat-7b-v1.5-32k | 0.534 | 0.516 | 0.479 |
| GPT-3.5-Turbo-16K | 0.622 | 0.609 | 0.577 |
| Llama-2-7B-32K-Instruct (ours) | 0.622 | 0.604 | 0.589 |

## Limitations and Bias

As with all language models, Llama-2-7B-32K-Instruct may generate incorrect or biased content. It's important to keep this in mind when using the model.

## Community

Join us on [Together Discord](https://discord.gg/6ZVDU8tTD4)
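For readers unfamiliar with the BookSum metric above: R1 is ROUGE-1, a unigram-overlap F-measure between a generated summary and the reference. The sketch below illustrates the idea only; the official implementation adds stemming and other preprocessing omitted here.

```python
from collections import Counter

def rouge1_f(candidate: str, reference: str) -> float:
    """Unigram ROUGE-1 F1: harmonic mean of unigram precision and recall."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

score = rouge1_f("the cat sat on the mat", "the cat lay on the mat")
# 5 of 6 tokens match in each direction, so F1 = 5/6
assert abs(score - 5 / 6) < 1e-9
```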
21,548
xlnet-base-cased
2023-01-24T14:50:31.000Z
[ "transformers", "pytorch", "tf", "rust", "xlnet", "text-generation", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:1906.08237", "license:mit", "endpoints_compatible", "has_space", "region:us" ]
text-generation
null
null
null
xlnet-base-cased
48
224,522
transformers
2022-03-02T23:29:04
---
language: en
license: mit
datasets:
- bookcorpus
- wikipedia
---

# XLNet (base-sized model)

XLNet model pre-trained on English language. It was introduced in the paper [XLNet: Generalized Autoregressive Pretraining for Language Understanding](https://arxiv.org/abs/1906.08237) by Yang et al. and first released in [this repository](https://github.com/zihangdai/xlnet/).

Disclaimer: The team releasing XLNet did not write a model card for this model so this model card has been written by the Hugging Face team.

## Model description

XLNet is a new unsupervised language representation learning method based on a novel generalized permutation language modeling objective. Additionally, XLNet employs Transformer-XL as the backbone model, exhibiting excellent performance for language tasks involving long context. Overall, XLNet achieves state-of-the-art (SOTA) results on various downstream language tasks including question answering, natural language inference, sentiment analysis, and document ranking.

## Intended uses & limitations

The model is mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?search=xlnet) to look for fine-tuned versions on a task that interests you.

Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation, you should look at models like GPT2.
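The permutation language modeling objective described above can be illustrated with a toy sketch: sample one factorization order over the sequence, and note that each token is then predicted using only the tokens earlier in that order, regardless of their original positions. This is an illustration of the idea only, not XLNet's actual two-stream attention implementation.

```python
import random

def permutation_contexts(seq_len, seed=0):
    """For one sampled factorization order, map each position to the sorted
    list of positions visible when predicting it under a permutation LM."""
    order = list(range(seq_len))
    random.Random(seed).shuffle(order)  # one factorization order z
    return {pos: sorted(order[:step]) for step, pos in enumerate(order)}

contexts = permutation_contexts(5, seed=1)
# Exactly one position is predicted with no context, one with one token of
# context, and so on up to all other tokens; no position ever sees itself.
assert sorted(len(v) for v in contexts.values()) == [0, 1, 2, 3, 4]
assert all(pos not in visible for pos, visible in contexts.items())
```

Averaging the training loss over many sampled orders is what lets the model learn bidirectional context while remaining autoregressive.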
## Usage

Here is how to use this model to get the features of a given text in PyTorch:

```python
from transformers import XLNetTokenizer, XLNetModel

tokenizer = XLNetTokenizer.from_pretrained('xlnet-base-cased')
model = XLNetModel.from_pretrained('xlnet-base-cased')

inputs = tokenizer("Hello, my dog is cute", return_tensors="pt")
outputs = model(**inputs)

last_hidden_states = outputs.last_hidden_state
```

### BibTeX entry and citation info

```bibtex
@article{DBLP:journals/corr/abs-1906-08237,
  author     = {Zhilin Yang and Zihang Dai and Yiming Yang and Jaime G. Carbonell and Ruslan Salakhutdinov and Quoc V. Le},
  title      = {XLNet: Generalized Autoregressive Pretraining for Language Understanding},
  journal    = {CoRR},
  volume     = {abs/1906.08237},
  year       = {2019},
  url        = {http://arxiv.org/abs/1906.08237},
  eprinttype = {arXiv},
  eprint     = {1906.08237},
  timestamp  = {Mon, 24 Jun 2019 17:28:45 +0200},
  biburl     = {https://dblp.org/rec/journals/corr/abs-1906-08237.bib},
  bibsource  = {dblp computer science bibliography, https://dblp.org}
}
```
2,696
[ [ -0.029205322265625, -0.05377197265625, 0.0181427001953125, 0.004222869873046875, -0.01241302490234375, -0.01016998291015625, -0.0179901123046875, -0.034271240234375, 0.02215576171875, 0.029144287109375, -0.0306243896484375, -0.029388427734375, -0.046630859375, 0.0059661865234375, -0.034820556640625, 0.0843505859375, -0.00745391845703125, -0.0142974853515625, 0.005985260009765625, -0.0186920166015625, -0.005039215087890625, -0.06561279296875, -0.06365966796875, -0.03546142578125, 0.044525146484375, 0.0024547576904296875, 0.037506103515625, 0.04986572265625, 0.0145721435546875, 0.032806396484375, -0.0226898193359375, -0.0018663406372070312, -0.0270233154296875, -0.0163421630859375, 0.0027294158935546875, -0.033966064453125, -0.049072265625, 0.01287078857421875, 0.050079345703125, 0.06463623046875, 0.0031604766845703125, 0.0145721435546875, 0.016021728515625, 0.036468505859375, -0.0369873046875, 0.00872802734375, -0.0289306640625, 0.0145721435546875, -0.01116943359375, 0.00820159912109375, -0.0210418701171875, -0.003223419189453125, 0.0171051025390625, -0.0287933349609375, 0.016265869140625, 0.01506805419921875, 0.0926513671875, -0.00872039794921875, -0.0294342041015625, -0.005344390869140625, -0.0311737060546875, 0.059417724609375, -0.05621337890625, 0.0330810546875, 0.0214691162109375, 0.0022716522216796875, 0.01465606689453125, -0.0838623046875, -0.04931640625, -0.02764892578125, -0.0262603759765625, 0.017333984375, -0.03192138671875, 0.006153106689453125, 0.0247344970703125, 0.035430908203125, -0.06451416015625, 0.00768280029296875, -0.0233154296875, -0.011444091796875, 0.040863037109375, -0.0018815994262695312, 0.022735595703125, -0.029815673828125, -0.0198211669921875, -0.020111083984375, -0.032562255859375, 0.021087646484375, 0.039581298828125, 0.0237274169921875, -0.0228118896484375, 0.033111572265625, -0.0109710693359375, 0.05010986328125, 0.01495361328125, 0.0229949951171875, 0.043853759765625, -0.026611328125, -0.0272674560546875, -0.0031890869140625, 
0.09271240234375, -0.004604339599609375, 0.01456451416015625, -0.01055908203125, -0.0184326171875, -0.01306915283203125, 0.01358795166015625, -0.062469482421875, -0.01666259765625, 0.0188751220703125, -0.042510986328125, -0.02435302734375, 0.0069732666015625, -0.0256805419921875, 0.00743865966796875, -0.0293121337890625, 0.04052734375, -0.0303955078125, -0.04168701171875, -0.00995635986328125, 0.0096282958984375, 0.006404876708984375, -0.00563812255859375, -0.05743408203125, 0.020660400390625, 0.038482666015625, 0.072265625, -0.00830841064453125, -0.02996826171875, -0.021697998046875, -0.036895751953125, -0.02813720703125, 0.04473876953125, -0.020660400390625, 0.004604339599609375, -0.00493621826171875, 0.0191497802734375, -0.0079803466796875, -0.02825927734375, 0.0182647705078125, -0.03533935546875, 0.0223236083984375, 0.0013608932495117188, -0.032989501953125, -0.0193939208984375, 0.0127410888671875, -0.052459716796875, 0.079833984375, 0.0108184814453125, -0.07269287109375, 0.00829315185546875, -0.053558349609375, -0.01092529296875, -0.0090179443359375, -0.005084991455078125, -0.044219970703125, -0.005008697509765625, 0.00481414794921875, 0.040283203125, -0.0023021697998046875, 0.0197601318359375, -0.0137481689453125, -0.00856781005859375, 0.013580322265625, -0.0110321044921875, 0.08587646484375, 0.0300140380859375, -0.0341796875, 0.0124664306640625, -0.0562744140625, 0.0151824951171875, 0.0102081298828125, -0.016204833984375, -0.0224609375, -0.0189361572265625, 0.0188140869140625, 0.0167083740234375, 0.0207672119140625, -0.03485107421875, 0.0055999755859375, -0.04718017578125, 0.040435791015625, 0.03521728515625, -0.028350830078125, 0.03631591796875, -0.00710296630859375, 0.032989501953125, 0.01776123046875, 0.004749298095703125, -0.0187835693359375, -0.0233612060546875, -0.064208984375, 0.00665283203125, 0.0458984375, 0.046478271484375, -0.043365478515625, 0.04949951171875, -0.0160980224609375, -0.0321044921875, -0.02728271484375, 0.00811004638671875, 
0.050323486328125, 0.0308837890625, 0.031829833984375, -0.0225067138671875, -0.057281494140625, -0.064453125, -0.00809478759765625, -0.008697509765625, 0.0057220458984375, 0.0191802978515625, 0.043548583984375, -0.0247344970703125, 0.06854248046875, -0.0249786376953125, -0.00986480712890625, -0.053558349609375, 0.036834716796875, 0.031494140625, 0.038360595703125, 0.04833984375, -0.05419921875, -0.05267333984375, 0.00820159912109375, -0.052978515625, -0.00910186767578125, 0.00768280029296875, -0.00980377197265625, 0.044036865234375, 0.041656494140625, -0.0322265625, 0.03582763671875, 0.057861328125, -0.0350341796875, 0.042510986328125, -0.007720947265625, -0.01065826416015625, -0.1046142578125, 0.0239105224609375, 0.0016508102416992188, -0.034149169921875, -0.049468994140625, -0.00006222724914550781, 0.00757598876953125, -0.0060577392578125, -0.018280029296875, 0.05059814453125, -0.053070068359375, 0.007671356201171875, -0.0248565673828125, 0.0123748779296875, -0.003612518310546875, 0.04522705078125, 0.0164642333984375, 0.0469970703125, 0.045745849609375, -0.036529541015625, 0.03143310546875, 0.0176849365234375, -0.015625, 0.0195770263671875, -0.06561279296875, 0.0195159912109375, -0.007442474365234375, 0.00856781005859375, -0.0609130859375, 0.0080413818359375, 0.01358795166015625, -0.044921875, 0.039825439453125, -0.01467132568359375, -0.024932861328125, -0.037139892578125, 0.0006198883056640625, 0.019500732421875, 0.039886474609375, -0.0308074951171875, 0.04833984375, 0.0165863037109375, -0.00986480712890625, -0.06732177734375, -0.0626220703125, 0.011566162109375, 0.007965087890625, -0.0447998046875, 0.0347900390625, -0.0088348388671875, -0.002231597900390625, 0.01335906982421875, -0.00231170654296875, -0.015228271484375, -0.0129547119140625, 0.006893157958984375, 0.00855255126953125, -0.0330810546875, 0.000010251998901367188, -0.01348114013671875, -0.0211944580078125, -0.0008816719055175781, -0.03192138671875, 0.047637939453125, -0.01885986328125, 
(embedding vector values elided) ] ]
siebert/sentiment-roberta-large-english
2023-04-02T16:25:45.000Z
[ "transformers", "pytorch", "tf", "jax", "roberta", "text-classification", "sentiment", "twitter", "reviews", "siebert", "en", "arxiv:1907.11692", "endpoints_compatible", "has_space", "region:us" ]
text-classification
siebert
null
null
siebert/sentiment-roberta-large-english
83
222,812
transformers
2022-03-02T23:29:05
--- language: "en" tags: - sentiment - twitter - reviews - siebert --- ## SiEBERT - English-Language Sentiment Classification # Overview This model ("SiEBERT", prefix for "Sentiment in English") is a fine-tuned checkpoint of [RoBERTa-large](https://huggingface.co/roberta-large) ([Liu et al. 2019](https://arxiv.org/pdf/1907.11692.pdf)). It enables reliable binary sentiment analysis for various types of English-language text. For each instance, it predicts either positive (1) or negative (0) sentiment. The model was fine-tuned and evaluated on 15 data sets from diverse text sources to enhance generalization across different types of texts (reviews, tweets, etc.). Consequently, it outperforms models trained on only one type of text (e.g., movie reviews from the popular SST-2 benchmark) when used on new data as shown below. # Predictions on a data set If you want to predict sentiment for your own data, we provide an example script via [Google Colab](https://colab.research.google.com/notebooks/intro.ipynb). You can load your data to a Google Drive and run the script for free on a Colab GPU. Set-up only takes a few minutes. We suggest that you manually label a subset of your data to evaluate performance for your use case. For performance benchmark values across various sentiment analysis contexts, please refer to our paper ([Hartmann et al. 2022](https://www.sciencedirect.com/science/article/pii/S0167811622000477?via%3Dihub)). 
[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/chrsiebert/sentiment-roberta-large-english/blob/main/sentiment_roberta_prediction_example.ipynb) # Use in a Hugging Face pipeline The easiest way to use the model for single predictions is Hugging Face's [sentiment analysis pipeline](https://huggingface.co/transformers/quicktour.html#getting-started-on-a-task-with-a-pipeline), which only needs a couple of lines of code as shown in the following example: ```python from transformers import pipeline sentiment_analysis = pipeline("sentiment-analysis", model="siebert/sentiment-roberta-large-english") print(sentiment_analysis("I love this!")) ``` [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/chrsiebert/sentiment-roberta-large-english/blob/main/sentiment_roberta_pipeline.ipynb) # Use for further fine-tuning The model can also be used as a starting point for further fine-tuning of RoBERTa on your specific data. Please refer to Hugging Face's [documentation](https://huggingface.co/docs/transformers/training) for further details and example code. # Performance To evaluate the performance of our general-purpose sentiment analysis model, we set aside an evaluation set from each data set, which was not used for training. On average, our model outperforms a [DistilBERT-based model](https://huggingface.co/distilbert-base-uncased-finetuned-sst-2-english) (which is solely fine-tuned on the popular SST-2 data set) by more than 15 percentage points (78.1 vs. 93.2 percent, see table below). As a robustness check, we evaluate the model in a leave-one-out manner (training on 14 data sets, evaluating on the one left out), which decreases model performance by only about 3 percentage points on average and underscores its generalizability. Model performance is given as evaluation set accuracy in percent. 
|Dataset|DistilBERT SST-2|This model| |---|---|---| |McAuley and Leskovec (2013) (Reviews)|84.7|98.0| |McAuley and Leskovec (2013) (Review Titles)|65.5|87.0| |Yelp Academic Dataset|84.8|96.5| |Maas et al. (2011)|80.6|96.0| |Kaggle|87.2|96.0| |Pang and Lee (2005)|89.7|91.0| |Nakov et al. (2013)|70.1|88.5| |Shamma (2009)|76.0|87.0| |Blitzer et al. (2007) (Books)|83.0|92.5| |Blitzer et al. (2007) (DVDs)|84.5|92.5| |Blitzer et al. (2007) (Electronics)|74.5|95.0| |Blitzer et al. (2007) (Kitchen devices)|80.0|98.5| |Pang et al. (2002)|73.5|95.5| |Speriosu et al. (2011)|71.5|85.5| |Hartmann et al. (2019)|65.5|98.0| |**Average**|**78.1**|**93.2**| # Fine-tuning hyperparameters - learning_rate = 2e-5 - num_train_epochs = 3.0 - warmup_steps = 500 - weight_decay = 0.01 Other values were left at their defaults as listed [here](https://huggingface.co/transformers/main_classes/trainer.html#transformers.TrainingArguments). # Citation and contact Please cite [this paper](https://www.sciencedirect.com/science/article/pii/S0167811622000477) (published in the [IJRM](https://www.journals.elsevier.com/international-journal-of-research-in-marketing)) when you use our model. Feel free to reach out to [christian.siebert@uni-hamburg.de](mailto:christian.siebert@uni-hamburg.de) with any questions or feedback you may have. ``` @article{hartmann2023, title = {More than a Feeling: Accuracy and Application of Sentiment Analysis}, journal = {International Journal of Research in Marketing}, volume = {40}, number = {1}, pages = {75-87}, year = {2023}, doi = {https://doi.org/10.1016/j.ijresmar.2022.05.005}, url = {https://www.sciencedirect.com/science/article/pii/S0167811622000477}, author = {Jochen Hartmann and Mark Heitmann and Christian Siebert and Christina Schamp}, } ```
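As a quick sanity check, the reported averages (78.1 and 93.2 percent) can be reproduced from the per-dataset accuracies in the table above with a few lines of Python (values copied verbatim from the table; this is an illustrative check, not part of the model card's tooling):

```python
# Evaluation-set accuracies in percent, copied from the table above
# (15 datasets, in table order).
distilbert = [84.7, 65.5, 84.8, 80.6, 87.2, 89.7, 70.1, 76.0,
              83.0, 84.5, 74.5, 80.0, 73.5, 71.5, 65.5]
siebert = [98.0, 87.0, 96.5, 96.0, 96.0, 91.0, 88.5, 87.0,
           92.5, 92.5, 95.0, 98.5, 95.5, 85.5, 98.0]

def mean(xs):
    """Arithmetic mean of a list of numbers."""
    return sum(xs) / len(xs)

print(round(mean(distilbert), 1))  # 78.1
print(round(mean(siebert), 1))     # 93.2
```

Both rounded means match the **Average** row of the table.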
5,181
[ [ (embedding vector values elided) ] ]
THUDM/chatglm2-6b-int4
2023-10-09T08:23:08.000Z
[ "transformers", "pytorch", "chatglm", "glm", "thudm", "custom_code", "zh", "en", "arxiv:2103.10360", "arxiv:2210.02414", "arxiv:1911.02150", "endpoints_compatible", "has_space", "region:us" ]
null
THUDM
null
null
THUDM/chatglm2-6b-int4
207
214,817
transformers
2023-06-25T12:46:22
--- language: - zh - en tags: - glm - chatglm - thudm --- # ChatGLM2-6B <p align="center"> 💻 <a href="https://github.com/THUDM/ChatGLM2-6B" target="_blank">Github Repo</a> • 🐦 <a href="https://twitter.com/thukeg" target="_blank">Twitter</a> • 📃 <a href="https://arxiv.org/abs/2103.10360" target="_blank">[GLM@ACL 22]</a> <a href="https://github.com/THUDM/GLM" target="_blank">[GitHub]</a> • 📃 <a href="https://arxiv.org/abs/2210.02414" target="_blank">[GLM-130B@ICLR 23]</a> <a href="https://github.com/THUDM/GLM-130B" target="_blank">[GitHub]</a> <br> </p> <p align="center"> 👋 Join our <a href="https://join.slack.com/t/chatglm/shared_invite/zt-1y7pqoloy-9b1g6T6JjA8J0KxvUjbwJw" target="_blank">Slack</a> and <a href="https://github.com/THUDM/ChatGLM-6B/blob/main/resources/WECHAT.md" target="_blank">WeChat</a> </p> ## 介绍 ChatGLM**2**-6B 是开源中英双语对话模型 [ChatGLM-6B](https://github.com/THUDM/ChatGLM-6B) 的第二代版本,在保留了初代模型对话流畅、部署门槛较低等众多优秀特性的基础之上,ChatGLM**2**-6B 引入了如下新特性: 1. **更强大的性能**:基于 ChatGLM 初代模型的开发经验,我们全面升级了 ChatGLM2-6B 的基座模型。ChatGLM2-6B 使用了 [GLM](https://github.com/THUDM/GLM) 的混合目标函数,经过了 1.4T 中英标识符的预训练与人类偏好对齐训练,[评测结果](#评测结果)显示,相比于初代模型,ChatGLM2-6B 在 MMLU(+23%)、CEval(+33%)、GSM8K(+571%) 、BBH(+60%)等数据集上的性能取得了大幅度的提升,在同尺寸开源模型中具有较强的竞争力。 2. **更长的上下文**:基于 [FlashAttention](https://github.com/HazyResearch/flash-attention) 技术,我们将基座模型的上下文长度(Context Length)由 ChatGLM-6B 的 2K 扩展到了 32K,并在对话阶段使用 8K 的上下文长度训练,允许更多轮次的对话。但当前版本的 ChatGLM2-6B 对单轮超长文档的理解能力有限,我们会在后续迭代升级中着重进行优化。 3. **更高效的推理**:基于 [Multi-Query Attention](http://arxiv.org/abs/1911.02150) 技术,ChatGLM2-6B 有更高效的推理速度和更低的显存占用:在官方的模型实现下,推理速度相比初代提升了 42%,INT4 量化下,6G 显存支持的对话长度由 1K 提升到了 8K。 ChatGLM**2**-6B is the second-generation version of the open-source bilingual (Chinese-English) chat model [ChatGLM-6B](https://github.com/THUDM/ChatGLM-6B). It retains the smooth conversation flow and low deployment threshold of the first-generation model, while introducing the following new features: 1. 
**Stronger Performance**: Based on the development experience of the first-generation ChatGLM model, we have fully upgraded the base model of ChatGLM2-6B. ChatGLM2-6B uses the hybrid objective function of [GLM](https://github.com/THUDM/GLM), and has undergone pre-training with 1.4T bilingual tokens and human preference alignment training. The [evaluation results](README.md#evaluation-results) show that, compared to the first-generation model, ChatGLM2-6B has achieved substantial improvements in performance on datasets like MMLU (+23%), CEval (+33%), GSM8K (+571%), BBH (+60%), showing strong competitiveness among models of the same size. 2. **Longer Context**: Based on [FlashAttention](https://github.com/HazyResearch/flash-attention) technique, we have extended the context length of the base model from 2K in ChatGLM-6B to 32K, and trained with a context length of 8K during the dialogue alignment, allowing for more rounds of dialogue. However, the current version of ChatGLM2-6B has limited understanding of single-round ultra-long documents, which we will focus on optimizing in future iterations. 3. **More Efficient Inference**: Based on [Multi-Query Attention](http://arxiv.org/abs/1911.02150) technique, ChatGLM2-6B has more efficient inference speed and lower GPU memory usage: under the official implementation, the inference speed has increased by 42% compared to the first generation; under INT4 quantization, the dialogue length supported by 6G GPU memory has increased from 1K to 8K. 
## 软件依赖 (Dependencies) ```shell pip install protobuf transformers==4.30.2 cpm_kernels torch>=2.0 gradio mdtex2html sentencepiece accelerate ``` ## 代码调用 (Code Usage) 可以通过如下代码调用 ChatGLM2-6B 模型来生成对话 (The ChatGLM2-6B model can be invoked to generate dialogue with the following code): ```ipython >>> from transformers import AutoTokenizer, AutoModel >>> tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm2-6b-int4", trust_remote_code=True) >>> model = AutoModel.from_pretrained("THUDM/chatglm2-6b-int4", trust_remote_code=True).half().cuda() >>> model = model.eval() >>> response, history = model.chat(tokenizer, "你好", history=[]) >>> print(response) 你好👋!我是人工智能助手 ChatGLM-6B,很高兴见到你,欢迎问我任何问题。 >>> response, history = model.chat(tokenizer, "晚上睡不着应该怎么办", history=history) >>> print(response) 晚上睡不着可能会让你感到焦虑或不舒服,但以下是一些可以帮助你入睡的方法: 1. 制定规律的睡眠时间表:保持规律的睡眠时间表可以帮助你建立健康的睡眠习惯,使你更容易入睡。尽量在每天的相同时间上床,并在同一时间起床。 2. 创造一个舒适的睡眠环境:确保睡眠环境舒适,安静,黑暗且温度适宜。可以使用舒适的床上用品,并保持房间通风。 3. 放松身心:在睡前做些放松的活动,例如泡个热水澡,听些轻柔的音乐,阅读一些有趣的书籍等,有助于缓解紧张和焦虑,使你更容易入睡。 4. 避免饮用含有咖啡因的饮料:咖啡因是一种刺激性物质,会影响你的睡眠质量。尽量避免在睡前饮用含有咖啡因的饮料,例如咖啡,茶和可乐。 5. 避免在床上做与睡眠无关的事情:在床上做些与睡眠无关的事情,例如看电影,玩游戏或工作等,可能会干扰你的睡眠。 6. 尝试呼吸技巧:深呼吸是一种放松技巧,可以帮助你缓解紧张和焦虑,使你更容易入睡。试着慢慢吸气,保持几秒钟,然后缓慢呼气。 如果这些方法无法帮助你入睡,你可以考虑咨询医生或睡眠专家,寻求进一步的建议。 ``` 关于更多的使用说明,包括如何运行命令行和网页版本的 DEMO,以及使用模型量化以节省显存,请参考我们的 [Github Repo](https://github.com/THUDM/ChatGLM2-6B)。 For more instructions, including how to run CLI and web demos, and model quantization, please refer to our [Github Repo](https://github.com/THUDM/ChatGLM2-6B). 
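The `model.chat` API shown above threads a `history` list of (query, response) pairs through successive turns. Since the dialogue context was trained at 8K tokens, long conversations eventually need their history trimmed before being passed back in. A minimal sketch of keeping only the most recent turns is given below; the `truncate_history` helper, its character-based budget, and the `max_chars` value are illustrative assumptions, not part of the official ChatGLM2 API, which expects you to manage `history` yourself:

```python
def truncate_history(history, max_chars=8000):
    """Keep only as many recent (query, response) turns as fit a rough
    character budget -- a crude stand-in for a real token count."""
    kept, used = [], 0
    # Walk backwards so the most recent turns are retained first.
    for query, response in reversed(history):
        turn_len = len(query) + len(response)
        if used + turn_len > max_chars:
            break
        kept.append((query, response))
        used += turn_len
    # Restore chronological order before handing back to model.chat.
    return list(reversed(kept))

# Example: with a tiny budget, only the latest turn survives.
history = [("q1", "a" * 50), ("q2", "b" * 50), ("q3", "c" * 50)]
print(truncate_history(history, max_chars=60))  # keeps only the ("q3", ...) turn
```

In a real loop you would call `response, history = model.chat(tokenizer, query, history=truncate_history(history))` each turn.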
## Change Log * v1.0 ## 协议 (License) 本仓库的代码依照 [Apache-2.0](LICENSE) 协议开源,ChatGLM2-6B 模型的权重的使用则需要遵循 [Model License](MODEL_LICENSE)。 The code in this repository is open source under the [Apache-2.0](LICENSE) license; use of the ChatGLM2-6B model weights must follow the [Model License](MODEL_LICENSE). ## 引用 (Citation) 如果你觉得我们的工作有帮助的话,请考虑引用下列论文,ChatGLM2-6B 的论文会在近期公布,尽情期待~ If you find our work helpful, please consider citing the following papers. The ChatGLM2-6B paper will be released soon, stay tuned! ``` @article{zeng2022glm, title={Glm-130b: An open bilingual pre-trained model}, author={Zeng, Aohan and Liu, Xiao and Du, Zhengxiao and Wang, Zihan and Lai, Hanyu and Ding, Ming and Yang, Zhuoyi and Xu, Yifan and Zheng, Wendi and Xia, Xiao and others}, journal={arXiv preprint arXiv:2210.02414}, year={2022} } ``` ``` @inproceedings{du2022glm, title={GLM: General Language Model Pretraining with Autoregressive Blank Infilling}, author={Du, Zhengxiao and Qian, Yujie and Liu, Xiao and Ding, Ming and Qiu, Jiezhong and Yang, Zhilin and Tang, Jie}, booktitle={Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)}, pages={320--335}, year={2022} } ```
5,783
[ [ (embedding vector values elided)
-0.0233917236328125, -0.026519775390625, -0.03961181640625, 0.01491546630859375, 0.034332275390625, -0.01517486572265625, 0.058807373046875, 0.08050537109375, -0.0131378173828125, 0.01519775390625, -0.054412841796875, -0.0180206298828125, -0.04150390625, 0.026947021484375, -0.0005178451538085938, -0.07464599609375, 0.060882568359375, 0.0230560302734375, 0.0205535888671875, 0.04669189453125, 0.050048828125, 0.0065460205078125, 0.09124755859375, 0.026702880859375, -0.0228271484375, 0.041168212890625, -0.032684326171875, 0.017913818359375, -0.06097412109375, -0.0255584716796875, -0.029083251953125, -0.0208282470703125, -0.053314208984375, -0.039642333984375, 0.0292510986328125, 0.01110076904296875, -0.015716552734375, 0.0111541748046875, -0.033599853515625, -0.0015058517456054688, 0.042877197265625, 0.004119873046875, 0.006072998046875, -0.009246826171875, -0.007781982421875, 0.00165557861328125, -0.054931640625, -0.03631591796875, 0.0633544921875, 0.032440185546875, 0.05169677734375, 0.0252685546875, 0.038848876953125, -0.002788543701171875, 0.0227813720703125, -0.044647216796875, 0.04791259765625, 0.00598907470703125, -0.0601806640625, -0.0323486328125, -0.034881591796875, -0.072265625, 0.040313720703125, -0.0084075927734375, -0.07861328125, -0.005184173583984375, 0.011260986328125, -0.0169525146484375, 0.01561737060546875, -0.065673828125, 0.06524658203125, -0.03228759765625, -0.0251312255859375, -0.007259368896484375, -0.06298828125, 0.04254150390625, 0.01898193359375, 0.0311737060546875, -0.030242919921875, 0.0004467964172363281, 0.055694580078125, -0.036590576171875, 0.06158447265625, -0.0192413330078125, -0.00405120849609375, 0.03955078125, -0.007232666015625, 0.047454833984375, 0.01551055908203125, 0.01502227783203125, 0.023101806640625, 0.0012655258178710938, -0.02984619140625, -0.04144287109375, 0.05206298828125, -0.057220458984375, -0.05859375, -0.0312347412109375, -0.032684326171875, -0.01104736328125, 0.022064208984375, 0.0310516357421875, 
0.020416259765625, -0.0038318634033203125, 0.01558685302734375, 0.0267486572265625, -0.032989501953125, 0.04522705078125, 0.043731689453125, -0.041748046875, -0.038116455078125, 0.05120849609375, 0.002071380615234375, 0.033477783203125, 0.0158843994140625, 0.007781982421875, -0.03399658203125, -0.03399658203125, -0.02978515625, 0.0263671875, -0.032379150390625, -0.003795623779296875, -0.0543212890625, -0.03509521484375, -0.052276611328125, 0.0102081298828125, -0.0240478515625, -0.00859832763671875, -0.031524658203125, 0.0060577392578125, 0.0313720703125, 0.0061798095703125, 0.0016803741455078125, 0.0124359130859375, -0.07208251953125, 0.027435302734375, 0.0202789306640625, 0.0182952880859375, 0.01551055908203125, -0.057861328125, -0.0361328125, 0.0394287109375, -0.01617431640625, -0.04046630859375, 0.045196533203125, 0.0113067626953125, 0.049560546875, 0.0300445556640625, -0.0036945343017578125, 0.054107666015625, -0.02288818359375, 0.07171630859375, 0.02630615234375, -0.06915283203125, 0.0406494140625, -0.03704833984375, 0.03302001953125, 0.0218505859375, 0.0194854736328125, -0.048828125, -0.03546142578125, -0.055145263671875, -0.06842041015625, 0.07867431640625, 0.041595458984375, 0.03369140625, 0.0028667449951171875, -0.0014324188232421875, -0.0231781005859375, 0.007232666015625, -0.049530029296875, -0.055816650390625, -0.00803375244140625, -0.00554656982421875, -0.0007219314575195312, -0.0377197265625, -0.006259918212890625, -0.037200927734375, 0.059967041015625, 0.0021228790283203125, 0.047393798828125, 0.0019283294677734375, 0.00226593017578125, 0.00804901123046875, 0.01499176025390625, 0.05084228515625, 0.0517578125, -0.02362060546875, -0.0097198486328125, 0.0238037109375, -0.042724609375, -0.0036220550537109375, 0.005100250244140625, -0.0181427001953125, 0.0096435546875, 0.017486572265625, 0.0848388671875, 0.0206451416015625, -0.035888671875, 0.041595458984375, -0.0199737548828125, -0.02215576171875, -0.022491455078125, 0.0209503173828125, 0.02362060546875, 
0.01244354248046875, 0.045684814453125, -0.02056884765625, 0.0007729530334472656, -0.052581787109375, -0.001644134521484375, 0.0433349609375, -0.022186279296875, -0.0201416015625, 0.05401611328125, 0.01499176025390625, -0.01187896728515625, 0.0318603515625, -0.019805908203125, -0.046600341796875, 0.044281005859375, 0.04150390625, 0.06256103515625, -0.018585205078125, 0.00971221923828125, 0.05206298828125, 0.0162506103515625, -0.024810791015625, 0.024383544921875, 0.01309967041015625, -0.055145263671875, -0.0206146240234375, -0.035308837890625, -0.0149383544921875, 0.0134735107421875, -0.039764404296875, 0.0248565673828125, -0.026153564453125, -0.0244140625, -0.01213836669921875, 0.007442474365234375, -0.026885986328125, 0.01116943359375, 0.00991058349609375, 0.052276611328125, -0.031005859375, 0.06341552734375, 0.034698486328125, -0.023223876953125, -0.06591796875, -0.0160369873046875, 0.01479339599609375, -0.0577392578125, 0.0294036865234375, 0.0095672607421875, -0.0029087066650390625, 0.0026721954345703125, -0.040985107421875, -0.08502197265625, 0.0958251953125, 0.019439697265625, -0.028167724609375, -0.00827789306640625, 0.0003495216369628906, 0.0509033203125, -0.01294708251953125, 0.04541015625, 0.019561767578125, 0.031890869140625, 0.0186767578125, -0.09576416015625, 0.01233673095703125, -0.044036865234375, 0.01113128662109375, 0.0024890899658203125, -0.081298828125, 0.08050537109375, -0.01030731201171875, -0.03302001953125, -0.005184173583984375, 0.05328369140625, 0.0182952880859375, 0.01087188720703125, 0.0229949951171875, 0.024871826171875, 0.04058837890625, -0.0246734619140625, 0.06500244140625, -0.039642333984375, 0.0594482421875, 0.07305908203125, 0.0025482177734375, 0.049163818359375, 0.017242431640625, -0.0292510986328125, 0.031707763671875, 0.041534423828125, -0.0095062255859375, 0.0374755859375, -0.00605010986328125, -0.02032470703125, -0.0033588409423828125, 0.01088714599609375, -0.0501708984375, 0.0176849365234375, 0.031494140625, 
-0.01430511474609375, -0.01141357421875, -0.00827789306640625, 0.0195465087890625, -0.0303955078125, -0.0099334716796875, 0.0645751953125, 0.0163726806640625, -0.05059814453125, 0.07958984375, 0.0095367431640625, 0.0755615234375, -0.06304931640625, 0.01290130615234375, -0.0225982666015625, 0.00983428955078125, -0.01727294921875, -0.043975830078125, 0.00970458984375, -0.01004791259765625, 0.00302886962890625, -0.0023555755615234375, 0.06341552734375, -0.039337158203125, -0.031768798828125, 0.039520263671875, 0.03289794921875, 0.010986328125, 0.00677490234375, -0.072998046875, 0.006114959716796875, 0.021270751953125, -0.03375244140625, 0.032684326171875, 0.0218505859375, 0.003932952880859375, 0.05621337890625, 0.052154541015625, -0.005924224853515625, 0.0103607177734375, -0.0004343986511230469, 0.061492919921875, -0.046478271484375, -0.038238525390625, -0.074462890625, 0.050201416015625, -0.01145172119140625, -0.02130126953125, 0.07684326171875, 0.045745849609375, 0.061614990234375, 0.0000756382942199707, 0.059326171875, -0.02142333984375, 0.039031982421875, -0.035797119140625, 0.056671142578125, -0.0418701171875, 0.01611328125, -0.019775390625, -0.044525146484375, -0.01617431640625, 0.037750244140625, -0.0234527587890625, 0.025543212890625, 0.04974365234375, 0.07110595703125, 0.01291656494140625, -0.015899658203125, 0.0140838623046875, 0.0194244384765625, 0.0281982421875, 0.06610107421875, 0.042327880859375, -0.055267333984375, 0.054718017578125, -0.019866943359375, -0.002643585205078125, -0.037994384765625, -0.037261962890625, -0.08148193359375, -0.038330078125, -0.01543426513671875, -0.033477783203125, -0.00724029541015625, 0.063720703125, 0.045166015625, -0.05303955078125, -0.031768798828125, 0.01378631591796875, 0.011810302734375, -0.0195159912109375, -0.019805908203125, 0.034210205078125, -0.0285491943359375, -0.06500244140625, 0.0014972686767578125, 0.0200347900390625, 0.0185546875, -0.014862060546875, -0.0236053466796875, -0.0333251953125, 
0.004032135009765625, 0.037994384765625, 0.02545166015625, -0.06402587890625, -0.00835418701171875, 0.0089569091796875, -0.037078857421875, 0.0149383544921875, 0.01378631591796875, -0.0325927734375, 0.02984619140625, 0.0440673828125, 0.00313568115234375, 0.05462646484375, -0.000008404254913330078, 0.027679443359375, -0.038055419921875, 0.031280517578125, 0.000988006591796875, 0.0218505859375, 0.00890350341796875, -0.022552490234375, 0.04510498046875, 0.0145416259765625, -0.02813720703125, -0.05902099609375, -0.01470184326171875, -0.080078125, -0.009033203125, 0.1080322265625, -0.01611328125, -0.0187835693359375, -0.0003762245178222656, -0.0391845703125, 0.030792236328125, -0.034393310546875, 0.06396484375, 0.05743408203125, -0.0009655952453613281, -0.012054443359375, -0.048431396484375, 0.0421142578125, 0.030914306640625, -0.0645751953125, -0.01134490966796875, 0.0277862548828125, 0.0216522216796875, 0.008636474609375, 0.07366943359375, -0.01369476318359375, 0.0261383056640625, -0.021087646484375, 0.01071929931640625, -0.01233673095703125, 0.0126495361328125, -0.010589599609375, -0.011444091796875, -0.01068878173828125, -0.0209503173828125 ] ]
facebook/opt-350m
2023-09-15T13:09:50.000Z
[ "transformers", "pytorch", "tf", "jax", "opt", "text-generation", "en", "arxiv:2205.01068", "arxiv:2005.14165", "license:other", "has_space", "text-generation-inference", "region:us" ]
text-generation
facebook
null
null
facebook/opt-350m
77
214,166
transformers
2022-05-11T08:25:39
--- language: en inference: false tags: - text-generation license: other commercial: false --- # OPT : Open Pre-trained Transformer Language Models OPT was first introduced in [Open Pre-trained Transformer Language Models](https://arxiv.org/abs/2205.01068) and first released in [metaseq's repository](https://github.com/facebookresearch/metaseq) on May 3rd 2022 by Meta AI. **Disclaimer**: The team releasing OPT wrote an official model card, which is available in Appendix D of the [paper](https://arxiv.org/pdf/2205.01068.pdf). Content from **this** model card has been written by the Hugging Face team. ## Intro To quote the first two paragraphs of the [official paper](https://arxiv.org/abs/2205.01068) > Large language models trained on massive text collections have shown surprising emergent > capabilities to generate text and perform zero- and few-shot learning. While in some cases the public > can interact with these models through paid APIs, full model access is currently limited to only a > few highly resourced labs. This restricted access has limited researchers’ ability to study how and > why these large language models work, hindering progress on improving known challenges in areas > such as robustness, bias, and toxicity. > We present Open Pretrained Transformers (OPT), a suite of decoder-only pre-trained transformers ranging from 125M > to 175B parameters, which we aim to fully and responsibly share with interested researchers. We train the OPT models to roughly match > the performance and sizes of the GPT-3 class of models, while also applying the latest best practices in data > collection and efficient training. Our aim in developing this suite of OPT models is to enable reproducible and responsible research at scale, and > to bring more voices to the table in studying the impact of these LLMs. 
Definitions of risk, harm, bias, and toxicity, etc., should be articulated by the > collective research community as a whole, which is only possible when models are available for study. ## Model description OPT was predominantly pretrained with English text, but a small amount of non-English data is still present within the training corpus via CommonCrawl. The model was pretrained using a causal language modeling (CLM) objective. OPT belongs to the same family of decoder-only models as [GPT-3](https://arxiv.org/abs/2005.14165). As such, it was pretrained using the self-supervised causal language modeling objective. For evaluation, OPT follows [GPT-3](https://arxiv.org/abs/2005.14165) by using their prompts and overall experimental setup. For more details, please read the [official paper](https://arxiv.org/abs/2205.01068). ## Intended uses & limitations The pretrained-only model can be used for prompting for evaluation of downstream tasks as well as text generation. In addition, the model can be fine-tuned on a downstream task using the [CLM example](https://github.com/huggingface/transformers/tree/main/examples/pytorch/language-modeling). For all other OPT checkpoints, please have a look at the [model hub](https://huggingface.co/models?filter=opt). ### How to use You can use this model directly with a pipeline for text generation. ```python >>> from transformers import pipeline >>> generator = pipeline('text-generation', model="facebook/opt-350m") >>> generator("What are we having for dinner?") [{'generated_text': "What are we having for dinner?\nI'm having a steak and a salad.\nI'm"}] ``` By default, generation is deterministic. To use top-k sampling, set `do_sample` to `True`. 
```python >>> from transformers import pipeline, set_seed >>> set_seed(32) >>> generator = pipeline('text-generation', model="facebook/opt-350m", do_sample=True) >>> generator("What are we having for dinner?") [{'generated_text': "What are we having for dinner?\n\nWith spring fast approaching, it’s only appropriate"}] ``` ### Limitations and bias As mentioned in Meta AI's model card, given that the training data used for this model contains a lot of unfiltered content from the internet, which is far from neutral, the model is strongly biased: > Like other large language models for which the diversity (or lack thereof) of training > data induces downstream impact on the quality of our model, OPT-175B has limitations in terms > of bias and safety. OPT-175B can also have quality issues in terms of generation diversity and > hallucination. In general, OPT-175B is not immune from the plethora of issues that plague modern > large language models. Here's an example of how the model can have biased predictions: ```python >>> from transformers import pipeline, set_seed >>> set_seed(32) >>> generator = pipeline('text-generation', model="facebook/opt-350m", do_sample=True, num_return_sequences=5) >>> generator("The woman worked as a") [{'generated_text': "The woman works as a substitute teacher for kids who have missed school. 
She's the teacher herself,"}, {'generated_text': 'The woman works as a security guard for another company and does an average of around $13/hour'}, {'generated_text': 'The woman works as a receptionist, she could at the least wait a week or two for her'}, {'generated_text': 'The woman works as a manager/intern/career development coach/advisor at a nursing home'}, {'generated_text': 'The woman works as a maid and has to clean the house but you can tell her to do it'}] ``` compared to: ```python >>> from transformers import pipeline, set_seed >>> set_seed(32) >>> generator = pipeline('text-generation', model="facebook/opt-350m", do_sample=True, num_return_sequences=5) >>> generator("The man worked as a") [{'generated_text': 'The man works as a security guard for the National Football League franchise. He has been a part of'}, {'generated_text': 'The man works as a security guard for another company and does an excellent job.\nI remember when'}, {'generated_text': 'The man works as a "secret agent" but at the same time he\'s working to protect the'}, {'generated_text': 'The man works as a manager/operator/servant for a grocery store and does a lot of'}, {'generated_text': 'The man works as a bouncer near the scene of the accident - how he could do that is'}] ``` This bias will also affect all fine-tuned versions of this model. ## Training data The Meta AI team wanted to train this model on a corpus as large as possible. It is composed of the union of the following 5 filtered datasets of textual documents: - BookCorpus, which consists of more than 10K unpublished books, - CC-Stories, which contains a subset of CommonCrawl data filtered to match the story-like style of Winograd schemas, - The Pile, from which *Pile-CC, OpenWebText2, USPTO, Project Gutenberg, OpenSubtitles, Wikipedia, DM Mathematics and HackerNews* were included. - Pushshift.io Reddit dataset that was developed in Baumgartner et al. (2020) and processed in Roller et al. 
(2021) - CCNewsV2 containing an updated version of the English portion of the CommonCrawl News dataset that was used in RoBERTa (Liu et al., 2019b) The final training data contains 180B tokens corresponding to 800GB of data. The validation split was made of 200MB of the pretraining data, sampled proportionally to each dataset’s size in the pretraining corpus. The dataset might contain offensive content, as parts of the dataset are a subset of public Common Crawl data, along with a subset of public Reddit data, which could contain sentences that, if viewed directly, can be insulting, threatening, or might otherwise cause anxiety. ### Collection process The dataset was collected from the internet and went through classic data processing algorithms and re-formatting practices, including removing repetitive/non-informative text like *Chapter One* or *This ebook by Project Gutenberg.* ## Training procedure ### Preprocessing The texts are tokenized using the **GPT2** byte-level version of Byte Pair Encoding (BPE) (for unicode characters) and a vocabulary size of 50272. The inputs are sequences of 2048 consecutive tokens. The 175B model was trained on 992 *80GB A100 GPUs*, with a training duration of roughly 33 days of continuous training. ### BibTeX entry and citation info ```bibtex @misc{zhang2022opt, title={OPT: Open Pre-trained Transformer Language Models}, author={Susan Zhang and Stephen Roller and Naman Goyal and Mikel Artetxe and Moya Chen and Shuohui Chen and Christopher Dewan and Mona Diab and Xian Li and Xi Victoria Lin and Todor Mihaylov and Myle Ott and Sam Shleifer and Kurt Shuster and Daniel Simig and Punit Singh Koura and Anjali Sridhar and Tianlu Wang and Luke Zettlemoyer}, year={2022}, eprint={2205.01068}, archivePrefix={arXiv}, primaryClass={cs.CL} } ```
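The tokenizer described in the preprocessing section is *byte-level*: before any BPE merges are applied, every Unicode string is mapped to its raw UTF-8 bytes, so no character can ever fall outside the base vocabulary. A minimal sketch of just that byte-level step (an illustration only, not the real tokenizer, which also applies the learned merge rules and a byte-to-unicode remapping):

```python
# Sketch of the byte-level step underlying GPT-2-style BPE (illustration only;
# the real tokenizer additionally applies learned merges over these base units).
def to_byte_units(text: str) -> list[int]:
    """Map any Unicode string to its UTF-8 byte values, the base alphabet
    that byte-level BPE starts from -- no character is out-of-vocabulary."""
    return list(text.encode("utf-8"))

units = to_byte_units("café")
print(units)       # [99, 97, 102, 195, 169]
print(len(units))  # 'é' occupies two bytes, so 4 characters become 5 base units
```

Because the 256 possible byte values are all in the vocabulary, any input text, in any script, reduces to known base units; the learned merges then recombine frequent byte sequences into larger tokens, which is how a 50,272-entry vocabulary covers arbitrary Unicode input.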
8,772
[ [ -0.017364501953125, -0.0628662109375, 0.0188446044921875, 0.0098724365234375, -0.0199737548828125, -0.0195159912109375, -0.0292205810546875, -0.031707763671875, 0.001804351806640625, 0.0491943359375, -0.053955078125, -0.029266357421875, -0.0472412109375, 0.021942138671875, -0.035552978515625, 0.09259033203125, -0.0012865066528320312, -0.0031566619873046875, 0.003475189208984375, 0.012603759765625, -0.0174407958984375, -0.038665771484375, -0.051788330078125, -0.00966644287109375, 0.025482177734375, 0.01477813720703125, 0.054779052734375, 0.04644775390625, 0.0321044921875, 0.0222625732421875, -0.003299713134765625, 0.005939483642578125, -0.052093505859375, -0.0156707763671875, -0.0038127899169921875, -0.0298919677734375, -0.0242462158203125, 0.0158843994140625, 0.04241943359375, 0.0406494140625, 0.0064239501953125, 0.016510009765625, 0.0093994140625, 0.04241943359375, -0.03826904296875, 0.01357269287109375, -0.059722900390625, -0.007411956787109375, -0.0233154296875, 0.008819580078125, -0.047698974609375, -0.0184326171875, 0.00763702392578125, -0.034881591796875, 0.01161956787109375, -0.005069732666015625, 0.09271240234375, 0.0261993408203125, -0.0208282470703125, -0.01155853271484375, -0.04876708984375, 0.06298828125, -0.062255859375, 0.0246734619140625, 0.024627685546875, 0.001956939697265625, 0.00496673583984375, -0.0635986328125, -0.0457763671875, -0.00705718994140625, -0.0174407958984375, 0.0225067138671875, -0.0236968994140625, 0.0035228729248046875, 0.0187530517578125, 0.021942138671875, -0.04425048828125, 0.006008148193359375, -0.038909912109375, -0.02081298828125, 0.054931640625, 0.003360748291015625, 0.02410888671875, -0.027587890625, -0.0205535888671875, -0.011993408203125, -0.041900634765625, -0.004009246826171875, 0.043853759765625, 0.031951904296875, -0.01425933837890625, 0.0479736328125, -0.0179901123046875, 0.0555419921875, 0.0025787353515625, 0.0021877288818359375, 0.035003662109375, -0.04071044921875, -0.00997161865234375, -0.01024627685546875, 
0.09033203125, 0.02569580078125, 0.03741455078125, 0.0018291473388671875, -0.005828857421875, 0.0089569091796875, 0.0197906494140625, -0.058502197265625, -0.007171630859375, 0.024444580078125, -0.0400390625, -0.030487060546875, 0.00014781951904296875, -0.06927490234375, -0.0008244514465332031, -0.0162353515625, 0.026092529296875, -0.031494140625, -0.026458740234375, 0.01345062255859375, -0.0033664703369140625, 0.0154266357421875, -0.0006093978881835938, -0.05889892578125, 0.004596710205078125, 0.036376953125, 0.05621337890625, -0.0045013427734375, -0.029815673828125, -0.0198974609375, -0.00846099853515625, -0.0128173828125, 0.040191650390625, -0.033935546875, 0.000766754150390625, 0.014495849609375, 0.005523681640625, -0.0157470703125, -0.019439697265625, 0.060333251953125, -0.035552978515625, 0.039794921875, -0.003658294677734375, -0.0256195068359375, -0.001373291015625, -0.0004334449768066406, -0.049652099609375, 0.0831298828125, 0.015106201171875, -0.08343505859375, 0.030242919921875, -0.052398681640625, -0.030853271484375, -0.0014047622680664062, 0.00818634033203125, -0.03204345703125, -0.01251220703125, 0.0296630859375, 0.036376953125, -0.0184478759765625, 0.03448486328125, -0.00850677490234375, -0.01580810546875, 0.00926971435546875, -0.0384521484375, 0.0782470703125, 0.028839111328125, -0.028656005859375, 0.0046844482421875, -0.047821044921875, -0.0008211135864257812, 0.0195159912109375, -0.0312042236328125, -0.0128326416015625, 0.0064849853515625, 0.0126800537109375, 0.0246124267578125, 0.0227813720703125, -0.0364990234375, 0.00701904296875, -0.04815673828125, 0.05340576171875, 0.060943603515625, -0.013275146484375, 0.034271240234375, -0.01593017578125, 0.032501220703125, 0.005092620849609375, 0.0193328857421875, -0.024566650390625, -0.0289764404296875, -0.0682373046875, -0.0140533447265625, 0.0260772705078125, 0.043121337890625, -0.053955078125, 0.049957275390625, -0.02764892578125, -0.045562744140625, -0.045989990234375, -0.0017566680908203125, 
0.033782958984375, 0.0242919921875, 0.036163330078125, -0.0033283233642578125, -0.053863525390625, -0.0653076171875, -0.032928466796875, -0.0100860595703125, 0.0022220611572265625, 0.01812744140625, 0.044342041015625, -0.036834716796875, 0.08343505859375, -0.045379638671875, -0.0208587646484375, -0.041595458984375, -0.0004775524139404297, 0.0313720703125, 0.035888671875, 0.03302001953125, -0.061737060546875, -0.047149658203125, -0.0164337158203125, -0.047088623046875, -0.0157928466796875, -0.0125274658203125, -0.0251312255859375, 0.0321044921875, 0.04083251953125, -0.060089111328125, 0.010467529296875, 0.049530029296875, -0.0233306884765625, 0.0516357421875, 0.01308441162109375, -0.0164794921875, -0.10003662109375, 0.014892578125, -0.0013399124145507812, -0.01508331298828125, -0.04815673828125, -0.006916046142578125, -0.004558563232421875, -0.0194091796875, -0.042816162109375, 0.04669189453125, -0.031005859375, 0.023529052734375, -0.0007381439208984375, 0.007030487060546875, -0.01454925537109375, 0.0457763671875, 0.009552001953125, 0.0509033203125, 0.043121337890625, -0.0460205078125, 0.002437591552734375, 0.0200958251953125, -0.0235443115234375, 0.015899658203125, -0.051544189453125, 0.004261016845703125, -0.018646240234375, 0.0242462158203125, -0.0633544921875, -0.0267333984375, 0.02490234375, -0.0418701171875, 0.017059326171875, 0.00576019287109375, -0.0400390625, -0.0528564453125, -0.01255035400390625, 0.01528167724609375, 0.0509033203125, -0.03436279296875, 0.0439453125, 0.0307464599609375, -0.004791259765625, -0.059051513671875, -0.049957275390625, -0.002597808837890625, -0.010589599609375, -0.05340576171875, 0.0279693603515625, -0.0032501220703125, -0.00733184814453125, 0.006114959716796875, 0.01029205322265625, -0.00882720947265625, -0.005771636962890625, 0.00548553466796875, 0.018280029296875, -0.005985260009765625, 0.005214691162109375, -0.0016012191772460938, -0.0157470703125, 0.00830841064453125, -0.0248565673828125, 0.061431884765625, 
-0.00949859619140625, -0.004100799560546875, -0.032806396484375, 0.01299285888671875, 0.0302581787109375, -0.0269317626953125, 0.0660400390625, 0.05413818359375, -0.0254669189453125, -0.0125274658203125, -0.0396728515625, -0.02093505859375, -0.03857421875, 0.051788330078125, -0.0014009475708007812, -0.06390380859375, 0.02484130859375, 0.01226043701171875, 0.0178680419921875, 0.053497314453125, 0.04400634765625, 0.01146697998046875, 0.077392578125, 0.0421142578125, -0.019378662109375, 0.04327392578125, -0.0226287841796875, 0.02093505859375, -0.0418701171875, 0.0008554458618164062, -0.043121337890625, -0.00408935546875, -0.04327392578125, -0.0212860107421875, 0.0070953369140625, 0.0038852691650390625, -0.03192138671875, 0.03948974609375, -0.049560546875, 0.034027099609375, 0.046539306640625, 0.00826263427734375, 0.004589080810546875, -0.0002491474151611328, -0.00646209716796875, -0.003620147705078125, -0.059417724609375, -0.045989990234375, 0.09429931640625, 0.033782958984375, 0.050506591796875, -0.0242156982421875, 0.057464599609375, 0.00974273681640625, 0.02716064453125, -0.035430908203125, 0.044403076171875, -0.0183563232421875, -0.0635986328125, -0.01500701904296875, -0.043487548828125, -0.0738525390625, 0.0104522705078125, -0.0161590576171875, -0.05377197265625, -0.00955963134765625, 0.0100860595703125, -0.00989532470703125, 0.0205535888671875, -0.0660400390625, 0.0863037109375, -0.023712158203125, -0.0287933349609375, 0.002254486083984375, -0.05426025390625, 0.03277587890625, -0.01212310791015625, 0.0261993408203125, 0.00978851318359375, 0.01519775390625, 0.06915283203125, -0.033966064453125, 0.0802001953125, -0.006954193115234375, 0.0035419464111328125, 0.037384033203125, -0.0175018310546875, 0.039276123046875, -0.01012420654296875, -0.01288604736328125, 0.03131103515625, -0.019287109375, -0.0206756591796875, -0.0034389495849609375, 0.032745361328125, -0.07501220703125, -0.0285186767578125, -0.031951904296875, -0.034881591796875, 0.006610870361328125, 
0.041107177734375, 0.051422119140625, 0.0243988037109375, -0.012542724609375, 0.020355224609375, 0.031036376953125, -0.039337158203125, 0.03948974609375, 0.0222015380859375, -0.01056671142578125, -0.032501220703125, 0.057373046875, 0.005947113037109375, 0.0247039794921875, 0.0347900390625, 0.01200103759765625, -0.031646728515625, -0.0220489501953125, -0.021484375, 0.03192138671875, -0.04571533203125, -0.021392822265625, -0.0762939453125, -0.03619384765625, -0.04095458984375, -0.0159454345703125, -0.037322998046875, -0.00872802734375, -0.03662109375, -0.01296234130859375, 0.0164337158203125, 0.039031982421875, 0.002666473388671875, 0.03948974609375, -0.054107666015625, 0.016510009765625, 0.008636474609375, 0.0210113525390625, -0.005016326904296875, -0.035736083984375, -0.0235443115234375, 0.023681640625, -0.0384521484375, -0.06402587890625, 0.03857421875, 0.00850677490234375, 0.04095458984375, 0.038238525390625, 0.01323699951171875, 0.029571533203125, -0.041961669921875, 0.070556640625, 0.009979248046875, -0.07293701171875, 0.03485107421875, -0.041168212890625, 0.0201263427734375, 0.042724609375, 0.0404052734375, -0.035736083984375, -0.041961669921875, -0.05413818359375, -0.0787353515625, 0.07354736328125, 0.033355712890625, 0.0301666259765625, -0.01413726806640625, 0.0209808349609375, 0.00141143798828125, 0.0197906494140625, -0.1075439453125, -0.024932861328125, -0.0226287841796875, -0.032318115234375, -0.01348876953125, -0.0269317626953125, 0.01413726806640625, -0.0197601318359375, 0.0601806640625, 0.0011396408081054688, 0.037841796875, 0.0106658935546875, -0.018157958984375, -0.0035381317138671875, 0.01531219482421875, 0.0283355712890625, 0.039031982421875, -0.01018524169921875, 0.00021791458129882812, 0.009246826171875, -0.04364013671875, -0.0069732666015625, 0.0163726806640625, -0.03167724609375, -0.00012314319610595703, 0.0304412841796875, 0.07427978515625, 0.002857208251953125, -0.051849365234375, 0.047576904296875, 0.0091400146484375, -0.0235595703125, 
-0.0310821533203125, 0.0040740966796875, 0.00830078125, -0.0005960464477539062, 0.0229339599609375, 0.00600433349609375, -0.00885772705078125, -0.03314208984375, 0.0176239013671875, 0.030029296875, -0.0250244140625, -0.0243682861328125, 0.066162109375, 0.0261993408203125, -0.0249481201171875, 0.05487060546875, -0.0197906494140625, -0.0638427734375, 0.04034423828125, 0.047882080078125, 0.07196044921875, -0.005550384521484375, 0.0286407470703125, 0.05340576171875, 0.049041748046875, -0.01233673095703125, 0.0029144287109375, 0.0197906494140625, -0.06439208984375, -0.041107177734375, -0.06219482421875, -0.00295257568359375, 0.026153564453125, -0.031982421875, 0.0406494140625, -0.00879669189453125, -0.0107421875, -0.01253509521484375, -0.007904052734375, -0.05859375, 0.016204833984375, 0.0013523101806640625, 0.063720703125, -0.08038330078125, 0.0430908203125, 0.044189453125, -0.041412353515625, -0.06591796875, 0.01044464111328125, -0.0207366943359375, -0.059417724609375, 0.046234130859375, 0.0439453125, 0.0272674560546875, 0.0162506103515625, -0.05767822265625, -0.07171630859375, 0.06585693359375, 0.023773193359375, -0.037933349609375, -0.0135650634765625, 0.0290374755859375, 0.05242919921875, -0.0209503173828125, 0.0372314453125, 0.033782958984375, 0.0386962890625, -0.0166015625, -0.0614013671875, 0.006862640380859375, -0.02001953125, -0.017120361328125, 0.0119171142578125, -0.05731201171875, 0.0838623046875, -0.0089874267578125, -0.0263824462890625, -0.01120758056640625, 0.035919189453125, 0.0008912086486816406, 0.0060577392578125, 0.038543701171875, 0.04766845703125, 0.04193115234375, -0.0169830322265625, 0.09124755859375, -0.03411865234375, 0.0423583984375, 0.071533203125, -0.00115203857421875, 0.0567626953125, 0.0191192626953125, -0.0199127197265625, 0.034881591796875, 0.0491943359375, -0.00943756103515625, 0.03619384765625, -0.0018758773803710938, 0.01132965087890625, -0.01617431640625, -0.0035991668701171875, -0.031646728515625, 0.03228759765625, 
0.0069427490234375, -0.04461669921875, -0.0133514404296875, -0.0038814544677734375, 0.023834228515625, -0.00885009765625, -0.01406097412109375, 0.052093505859375, 0.004360198974609375, -0.0718994140625, 0.044769287109375, 0.006649017333984375, 0.06463623046875, -0.057037353515625, 0.020477294921875, -0.0032024383544921875, 0.0242462158203125, -0.011932373046875, -0.045562744140625, 0.019378662109375, 0.004604339599609375, -0.0111236572265625, -0.028839111328125, 0.0601806640625, -0.0394287109375, -0.0472412109375, 0.0252838134765625, 0.0274658203125, 0.0098419189453125, -0.01904296875, -0.059112548828125, 0.0032939910888671875, 0.0160369873046875, -0.0296630859375, 0.00905609130859375, 0.019073486328125, 0.01219940185546875, 0.03912353515625, 0.056549072265625, -0.0015735626220703125, 0.0106048583984375, -0.0083770751953125, 0.07440185546875, -0.038330078125, -0.038665771484375, -0.07366943359375, 0.059173583984375, -0.0114593505859375, -0.032257080078125, 0.058441162109375, 0.046234130859375, 0.0814208984375, -0.018310546875, 0.0689697265625, -0.021575927734375, 0.034271240234375, -0.037017822265625, 0.05859375, -0.046234130859375, -0.002788543701171875, -0.046234130859375, -0.08013916015625, -0.006046295166015625, 0.048919677734375, -0.0333251953125, 0.0177764892578125, 0.06011962890625, 0.05963134765625, -0.0011444091796875, -0.006427764892578125, -0.004108428955078125, 0.033935546875, 0.021331787109375, 0.048492431640625, 0.055389404296875, -0.041900634765625, 0.062408447265625, -0.0237884521484375, -0.0193023681640625, -0.0184173583984375, -0.06219482421875, -0.0806884765625, -0.04364013671875, -0.016143798828125, -0.034027099609375, -0.002651214599609375, 0.056243896484375, 0.04791259765625, -0.04736328125, -0.0117340087890625, -0.0307159423828125, 0.0007696151733398438, -0.0145721435546875, -0.024749755859375, 0.027587890625, -0.034027099609375, -0.06646728515625, 0.0006265640258789062, -0.00934600830078125, -0.00797271728515625, -0.01270294189453125, 
-0.0036525726318359375, -0.028045654296875, 0.00634002685546875, 0.04071044921875, 0.0001876354217529297, -0.038360595703125, -0.00934600830078125, 0.00997161865234375, -0.01018524169921875, -0.007007598876953125, 0.03875732421875, -0.041595458984375, 0.0248260498046875, 0.035430908203125, 0.044036865234375, 0.031585693359375, 0.00525665283203125, 0.04095458984375, -0.050018310546875, 0.0125274658203125, 0.01641845703125, 0.0263519287109375, 0.0196685791015625, -0.035919189453125, 0.03973388671875, 0.0206451416015625, -0.051177978515625, -0.066650390625, 0.01505279541015625, -0.0653076171875, -0.020477294921875, 0.10882568359375, 0.0027027130126953125, -0.01540374755859375, 0.002574920654296875, -0.027008056640625, 0.02947998046875, -0.034149169921875, 0.047271728515625, 0.056915283203125, 0.0198211669921875, -0.005146026611328125, -0.04034423828125, 0.038360595703125, 0.0296783447265625, -0.06451416015625, 0.0086669921875, 0.0335693359375, 0.02197265625, 0.015289306640625, 0.062469482421875, -0.005390167236328125, 0.0003008842468261719, 0.0012521743774414062, 0.0132293701171875, -0.0020923614501953125, -0.019866943359375, -0.007049560546875, 0.0040435791015625, -0.023712158203125, -0.0011682510375976562 ] ]
Rakib/roberta-base-on-cuad
2023-01-18T12:18:53.000Z
[ "transformers", "pytorch", "roberta", "question-answering", "legal-contract-review", "cuad", "en", "dataset:cuad", "license:mit", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
question-answering
Rakib
null
null
Rakib/roberta-base-on-cuad
4
214,054
transformers
2022-03-02T23:29:04
--- language: - en license: mit datasets: - cuad pipeline_tag: question-answering tags: - legal-contract-review - roberta - cuad library_name: transformers --- # Model Card for roberta-base-on-cuad # Model Details ## Model Description - **Developed by:** Mohammed Rakib - **Shared by [Optional]:** More information needed - **Model type:** Question Answering - **Language(s) (NLP):** en - **License:** MIT - **Related Models:** - **Parent Model:** RoBERTa - **Resources for more information:** - GitHub Repo: [defactolaw](https://github.com/afra-tech/defactolaw) - Associated Paper: [An Open Source Contractual Language Understanding Application Using Machine Learning](https://aclanthology.org/2022.lateraisse-1.6/) # Uses ## Direct Use This model can be used for the task of Question Answering on Legal Documents. # Training Details Read: [An Open Source Contractual Language Understanding Application Using Machine Learning](https://aclanthology.org/2022.lateraisse-1.6/) for detailed information on training procedure, dataset preprocessing and evaluation. ## Training Data See [CUAD dataset card](https://huggingface.co/datasets/cuad) for more information. ## Training Procedure ### Preprocessing More information needed ### Speeds, Sizes, Times More information needed # Evaluation ## Testing Data, Factors & Metrics ### Testing Data See [CUAD dataset card](https://huggingface.co/datasets/cuad) for more information. 
### Factors More information needed ### Metrics More information needed ## Results More information needed # Model Examination More information needed # Environmental Impact - **Hardware Type:** More information needed - **Hours used:** More information needed - **Cloud Provider:** More information needed - **Compute Region:** More information needed - **Carbon Emitted:** More information needed # Technical Specifications [optional] ## Model Architecture and Objective More information needed ## Compute Infrastructure More information needed ### Hardware Used V100/P100 from Google Colab Pro ### Software Python, Transformers # Citation **BibTeX:** ``` @inproceedings{nawar-etal-2022-open, title = "An Open Source Contractual Language Understanding Application Using Machine Learning", author = "Nawar, Afra and Rakib, Mohammed and Hai, Salma Abdul and Haq, Sanaulla", booktitle = "Proceedings of the First Workshop on Language Technology and Resources for a Fair, Inclusive, and Safe Society within the 13th Language Resources and Evaluation Conference", month = jun, year = "2022", address = "Marseille, France", publisher = "European Language Resources Association", url = "https://aclanthology.org/2022.lateraisse-1.6", pages = "42--50", abstract = "Legal field is characterized by its exclusivity and non-transparency. Despite the frequency and relevance of legal dealings, legal documents like contracts remains elusive to non-legal professionals for the copious usage of legal jargon. There has been little advancement in making legal contracts more comprehensible. This paper presents how Machine Learning and NLP can be applied to solve this problem, further considering the challenges of applying ML to the high length of contract documents and training in a low resource environment. The largest open-source contract dataset so far, the Contract Understanding Atticus Dataset (CUAD) is utilized. 
Various pre-processing experiments and hyperparameter tuning have been carried out and we successfully managed to eclipse SOTA results presented for models in the CUAD dataset trained on RoBERTa-base. Our model, A-type-RoBERTa-base achieved an AUPR score of 46.6{\%} compared to 42.6{\%} on the original RoBERT-base. This model is utilized in our end to end contract understanding application which is able to take a contract and highlight the clauses a user is looking to find along with it{'}s descriptions to aid due diligence before signing. Alongside digital, i.e. searchable, contracts the system is capable of processing scanned, i.e. non-searchable, contracts using tesseract OCR. This application is aimed to not only make contract review a comprehensible process to non-legal professionals, but also to help lawyers and attorneys more efficiently review contracts.", } ``` # Glossary [optional] More information needed # More Information [optional] More information needed # Model Card Authors [optional] Mohammed Rakib in collaboration with Ezi Ozoani and the Hugging Face team # Model Card Contact More information needed # How to Get Started with the Model Use the code below to get started with the model. <details> <summary> Click to expand </summary> ```python from transformers import AutoTokenizer, AutoModelForQuestionAnswering tokenizer = AutoTokenizer.from_pretrained("Rakib/roberta-base-on-cuad") model = AutoModelForQuestionAnswering.from_pretrained("Rakib/roberta-base-on-cuad") ``` </details>
5,001
[ [ -0.019805908203125, -0.0626220703125, 0.026824951171875, 0.0031223297119140625, -0.02447509765625, -0.0129241943359375, -0.0158843994140625, -0.037261962890625, -0.00994873046875, 0.059417724609375, -0.0211944580078125, -0.055389404296875, -0.05218505859375, -0.00641632080078125, -0.0126800537109375, 0.07891845703125, 0.00005906820297241211, 0.00704193115234375, -0.0263824462890625, -0.031280517578125, -0.0214385986328125, -0.04327392578125, -0.050048828125, -0.034881591796875, 0.0013456344604492188, 0.0234527587890625, 0.05291748046875, 0.052276611328125, 0.025787353515625, 0.0264434814453125, -0.026336669921875, 0.00174713134765625, -0.006801605224609375, -0.0038547515869140625, -0.00640869140625, -0.04547119140625, -0.055908203125, 0.01593017578125, 0.02569580078125, 0.039306640625, -0.0218658447265625, 0.01763916015625, -0.0260009765625, 0.0462646484375, -0.03448486328125, 0.037353515625, -0.05499267578125, 0.00921630859375, 0.00855255126953125, -0.014373779296875, -0.039581298828125, -0.0061492919921875, 0.013763427734375, -0.05548095703125, 0.0170440673828125, 0.010650634765625, 0.10205078125, 0.0294189453125, -0.0274200439453125, -0.04254150390625, -0.0270538330078125, 0.049560546875, -0.0548095703125, 0.036865234375, 0.041656494140625, 0.0165557861328125, -0.00798797607421875, -0.0531005859375, -0.039093017578125, -0.057373046875, -0.006900787353515625, 0.0298919677734375, -0.01139068603515625, 0.00276947021484375, 0.03131103515625, 0.01531982421875, -0.054046630859375, 0.01366424560546875, -0.032562255859375, -0.006561279296875, 0.047210693359375, 0.00406646728515625, 0.01457977294921875, -0.0040740966796875, -0.0169677734375, 0.004550933837890625, -0.0265045166015625, 0.0269927978515625, 0.031219482421875, 0.00737762451171875, -0.040679931640625, 0.051849365234375, 0.00019621849060058594, 0.05963134765625, 0.00946807861328125, 0.005779266357421875, 0.02862548828125, -0.006153106689453125, -0.037689208984375, -0.007717132568359375, 0.058837890625, 
-0.00937652587890625, 0.01019287109375, -0.023345947265625, -0.0100250244140625, -0.011688232421875, 0.016265869140625, -0.033599853515625, -0.01088714599609375, 0.036102294921875, -0.04571533203125, -0.0133056640625, 0.021820068359375, -0.04132080078125, -0.0226898193359375, -0.048431396484375, -0.0087890625, -0.045196533203125, -0.014678955078125, 0.02545166015625, -0.00450897216796875, 0.042266845703125, 0.01285552978515625, -0.041595458984375, 0.020294189453125, 0.050445556640625, 0.04595947265625, 0.0223541259765625, -0.0156097412109375, -0.037353515625, -0.0057373046875, -0.00901031494140625, 0.032928466796875, -0.022430419921875, -0.018096923828125, -0.006610870361328125, -0.007965087890625, -0.00971221923828125, -0.027679443359375, 0.050506591796875, -0.054107666015625, 0.032623291015625, 0.0404052734375, -0.043212890625, -0.024688720703125, 0.04754638671875, -0.061004638671875, 0.0645751953125, 0.0264129638671875, -0.052001953125, 0.0172882080078125, -0.077880859375, -0.0170440673828125, 0.019805908203125, -0.015716552734375, -0.0751953125, -0.01377105712890625, 0.005077362060546875, 0.019805908203125, -0.033355712890625, 0.021881103515625, -0.003307342529296875, 0.007106781005859375, 0.029449462890625, -0.00033855438232421875, 0.09844970703125, 0.04510498046875, -0.0276947021484375, 0.0154876708984375, -0.0660400390625, 0.01061248779296875, 0.021484375, -0.0382080078125, -0.004856109619140625, -0.03546142578125, 0.00615692138671875, 0.0196533203125, 0.03271484375, -0.031768798828125, 0.002872467041015625, -0.044891357421875, 0.03961181640625, 0.036529541015625, -0.016815185546875, 0.035675048828125, -0.0285797119140625, 0.061279296875, 0.004024505615234375, 0.0230255126953125, 0.01788330078125, -0.04522705078125, -0.06195068359375, -0.0164337158203125, 0.04925537109375, 0.050537109375, -0.04473876953125, 0.049224853515625, -0.0176239013671875, -0.054840087890625, -0.061981201171875, 0.01122283935546875, 0.0601806640625, 0.0272216796875, 
0.0279998779296875, -0.004451751708984375, -0.051910400390625, -0.04669189453125, -0.0214080810546875, -0.0179443359375, 0.025421142578125, 0.03546142578125, 0.042266845703125, -0.01277923583984375, 0.0675048828125, -0.034332275390625, -0.037078857421875, -0.03271484375, 0.0186767578125, 0.03961181640625, 0.033966064453125, 0.03277587890625, -0.08099365234375, -0.04443359375, 0.01409912109375, -0.054443359375, 0.009490966796875, -0.017181396484375, -0.00232696533203125, 0.04595947265625, 0.0413818359375, -0.042724609375, 0.0230712890625, 0.0276947021484375, -0.059356689453125, 0.04510498046875, -0.0574951171875, -0.01392364501953125, -0.07666015625, 0.0245361328125, 0.0014200210571289062, -0.0188446044921875, -0.037139892578125, 0.0380859375, 0.0009021759033203125, -0.0081939697265625, -0.0533447265625, 0.035430908203125, -0.05316162109375, -0.00640869140625, -0.0051422119140625, 0.003833770751953125, 0.005840301513671875, 0.048736572265625, -0.0009274482727050781, 0.05609130859375, 0.033355712890625, -0.050689697265625, 0.021484375, 0.05218505859375, -0.0215606689453125, 0.03436279296875, -0.0380859375, 0.024139404296875, -0.0165863037109375, -0.005096435546875, -0.023284912109375, 0.0212249755859375, 0.0335693359375, -0.044586181640625, 0.0217132568359375, -0.04296875, -0.032958984375, -0.0283203125, 0.00643157958984375, 0.0114898681640625, 0.032257080078125, -0.0236358642578125, 0.0775146484375, 0.039306640625, 0.00672149658203125, -0.054595947265625, -0.05364990234375, 0.017578125, -0.0149078369140625, -0.0426025390625, 0.0287322998046875, -0.0013990402221679688, -0.01526641845703125, 0.01526641845703125, 0.0293121337890625, -0.037994384765625, 0.0212860107421875, 0.0298919677734375, 0.0304718017578125, -0.01392364501953125, 0.002521514892578125, -0.021209716796875, -0.025848388671875, 0.00469970703125, -0.0273590087890625, 0.0482177734375, -0.0252685546875, 0.0005993843078613281, -0.02850341796875, 0.007106781005859375, 0.040283203125, -0.045074462890625, 
0.0574951171875, 0.0309295654296875, -0.0208587646484375, -0.010894775390625, -0.041168212890625, 0.0293121337890625, -0.0299224853515625, 0.0167083740234375, -0.053253173828125, -0.051788330078125, 0.031829833984375, 0.00925445556640625, 0.0026874542236328125, 0.05938720703125, 0.061737060546875, 0.006134033203125, 0.03460693359375, 0.06048583984375, -0.00867462158203125, 0.0355224609375, -0.057373046875, 0.01352691650390625, -0.04571533203125, -0.03765869140625, -0.0732421875, 0.0201568603515625, -0.051910400390625, -0.0182037353515625, 0.0035190582275390625, -0.0010213851928710938, -0.0072174072265625, 0.062042236328125, -0.037017822265625, 0.025543212890625, 0.057342529296875, 0.0048980712890625, 0.0157470703125, -0.01538848876953125, -0.02667236328125, 0.0260162353515625, -0.052032470703125, -0.04510498046875, 0.08013916015625, 0.03564453125, 0.0288543701171875, 0.0028247833251953125, 0.0509033203125, 0.0177154541015625, 0.0018405914306640625, -0.04840087890625, 0.03961181640625, -0.01546478271484375, -0.036468505859375, -0.032958984375, -0.039886474609375, -0.099365234375, 0.0026416778564453125, -0.0282135009765625, -0.025054931640625, 0.00856781005859375, -0.0007853507995605469, -0.02276611328125, 0.006465911865234375, -0.037353515625, 0.074951171875, -0.0258026123046875, -0.00782012939453125, -0.01267242431640625, -0.041839599609375, 0.00677490234375, 0.002063751220703125, 0.030181884765625, 0.0006804466247558594, 0.0016126632690429688, 0.05474853515625, -0.048858642578125, 0.03289794921875, -0.024505615234375, -0.001605987548828125, 0.03662109375, -0.01012420654296875, 0.037567138671875, -0.009979248046875, -0.0285186767578125, 0.0120697021484375, 0.0130157470703125, -0.02191162109375, -0.033905029296875, 0.03875732421875, -0.07177734375, -0.007701873779296875, -0.035430908203125, -0.064208984375, -0.0006918907165527344, 0.0199432373046875, 0.010986328125, -0.0012950897216796875, 0.0021762847900390625, -0.006855010986328125, 0.035888671875, 
-0.0304107666015625, 0.013397216796875, 0.056243896484375, 0.014923095703125, -0.036102294921875, 0.058135986328125, 0.00464630126953125, -0.0117340087890625, 0.015411376953125, 0.014434814453125, -0.04913330078125, -0.054840087890625, -0.015899658203125, 0.034576416015625, -0.05419921875, -0.01323699951171875, -0.05987548828125, -0.025115966796875, -0.0501708984375, 0.0005450248718261719, -0.0005731582641601562, -0.034698486328125, -0.031005859375, -0.006923675537109375, 0.0269622802734375, 0.038421630859375, 0.020538330078125, 0.01239013671875, -0.0582275390625, 0.037750244140625, -0.01041412353515625, 0.04058837890625, -0.0260772705078125, -0.048126220703125, -0.0218963623046875, -0.0023670196533203125, -0.0243682861328125, -0.04254150390625, 0.03826904296875, 0.0137939453125, 0.055389404296875, 0.0035152435302734375, 0.00994873046875, 0.033447265625, -0.0181884765625, 0.0528564453125, 0.0122222900390625, -0.06787109375, 0.036865234375, -0.00921630859375, 0.01064300537109375, 0.047576904296875, 0.02252197265625, -0.0255279541015625, -0.04205322265625, -0.09039306640625, -0.063232421875, 0.06329345703125, 0.025665283203125, -0.00478363037109375, 0.018707275390625, 0.040771484375, 0.004848480224609375, 0.01134490966796875, -0.06622314453125, -0.025543212890625, 0.002788543701171875, -0.0269622802734375, 0.0029430389404296875, -0.0154266357421875, -0.0021686553955078125, -0.0036144256591796875, 0.08367919921875, 0.005359649658203125, 0.0239715576171875, 0.012847900390625, -0.035614013671875, 0.00960540771484375, 0.0335693359375, 0.04888916015625, 0.041839599609375, -0.0197906494140625, -0.012420654296875, 0.007171630859375, -0.031646728515625, 0.00731658935546875, 0.0276641845703125, -0.0279998779296875, 0.00560760498046875, 0.01611328125, 0.06103515625, -0.0181732177734375, -0.052337646484375, 0.042572021484375, 0.0022487640380859375, -0.059783935546875, -0.055755615234375, -0.00970458984375, 0.003170013427734375, 0.0223541259765625, -0.0047607421875, 
-0.00821685791015625, -0.0014286041259765625, -0.031951904296875, 0.019622802734375, 0.03057861328125, -0.031951904296875, 0.0220489501953125, 0.054229736328125, 0.0027256011962890625, -0.0322265625, 0.050872802734375, -0.02166748046875, -0.03741455078125, 0.07147216796875, 0.032958984375, 0.061279296875, 0.0165252685546875, 0.01158905029296875, 0.021087646484375, 0.00946807861328125, 0.01371002197265625, 0.041595458984375, -0.0014696121215820312, -0.033843994140625, -0.0235137939453125, -0.0285797119140625, -0.0291748046875, 0.0222930908203125, -0.042327880859375, 0.03173828125, -0.05718994140625, -0.00893402099609375, 0.00008761882781982422, 0.022216796875, -0.073486328125, 0.0199432373046875, -0.004230499267578125, 0.08148193359375, -0.038970947265625, 0.077880859375, 0.0611572265625, -0.0750732421875, -0.05389404296875, -0.0279541015625, -0.0261993408203125, -0.0496826171875, 0.06591796875, -0.002628326416015625, 0.01277923583984375, 0.0010986328125, -0.0275726318359375, -0.03436279296875, 0.08514404296875, 0.03619384765625, -0.031646728515625, -0.0083160400390625, 0.03314208984375, 0.046875, -0.0233917236328125, 0.03289794921875, 0.037353515625, 0.0416259765625, 0.003025054931640625, -0.05914306640625, 0.006450653076171875, -0.026458740234375, -0.00811004638671875, -0.004848480224609375, -0.0288238525390625, 0.05828857421875, -0.00725555419921875, -0.004100799560546875, 0.021484375, 0.046142578125, 0.0165252685546875, 0.030670166015625, 0.04840087890625, 0.05950927734375, 0.0579833984375, 0.0036716461181640625, 0.06732177734375, -0.043426513671875, 0.044097900390625, 0.0994873046875, 0.0035114288330078125, 0.0548095703125, 0.031646728515625, -0.0157623291015625, 0.037933349609375, 0.0262908935546875, -0.03546142578125, 0.0066375732421875, 0.042724609375, 0.01541900634765625, -0.01250457763671875, -0.01552581787109375, -0.0164947509765625, 0.033538818359375, 0.0059967041015625, -0.0406494140625, -0.017822265625, 0.002422332763671875, -0.005809783935546875, 
0.034698486328125, -0.01520538330078125, 0.06756591796875, 0.0035419464111328125, -0.042266845703125, 0.040496826171875, 0.0347900390625, 0.038665771484375, -0.04144287109375, -0.01317596435546875, 0.00021409988403320312, 0.01515960693359375, -0.01837158203125, -0.049560546875, 0.022857666015625, 0.0014781951904296875, -0.0102081298828125, -0.02447509765625, 0.034088134765625, -0.037261962890625, -0.038665771484375, 0.0291595458984375, 0.0288543701171875, 0.0289306640625, 0.002971649169921875, -0.105712890625, -0.01511383056640625, -0.0158233642578125, 0.0006208419799804688, -0.0011301040649414062, 0.038360595703125, -0.005970001220703125, 0.0504150390625, 0.053009033203125, 0.02008056640625, 0.01039886474609375, 0.01134490966796875, 0.054840087890625, -0.04315185546875, -0.060882568359375, -0.0384521484375, 0.054412841796875, -0.010833740234375, -0.0246429443359375, 0.035430908203125, 0.0615234375, 0.06732177734375, -0.0122222900390625, 0.0518798828125, -0.004810333251953125, 0.0299072265625, -0.060699462890625, 0.06719970703125, -0.0199127197265625, 0.01092529296875, -0.0032596588134765625, -0.057464599609375, -0.040985107421875, 0.043426513671875, -0.0304412841796875, -0.00872039794921875, 0.04754638671875, 0.0538330078125, -0.0019464492797851562, -0.0207977294921875, 0.034759521484375, 0.0205535888671875, 0.033477783203125, 0.04107666015625, 0.0347900390625, -0.055389404296875, 0.0634765625, -0.01499176025390625, -0.0088043212890625, -0.0187530517578125, -0.0606689453125, -0.07000732421875, -0.05548095703125, -0.0452880859375, -0.051544189453125, 0.004428863525390625, 0.08026123046875, 0.0380859375, -0.06005859375, -0.027496337890625, -0.028167724609375, 0.0009112358093261719, -0.0175628662109375, -0.014556884765625, 0.040985107421875, -0.049346923828125, -0.043975830078125, 0.010833740234375, -0.006732940673828125, 0.0174102783203125, -0.0027008056640625, -0.03631591796875, -0.0408935546875, -0.00382232666015625, 0.02252197265625, 0.0296478271484375, 
-0.059814453125, 0.01617431640625, -0.0200042724609375, -0.028656005859375, 0.01399993896484375, 0.060394287109375, -0.036865234375, 0.025604248046875, 0.0264739990234375, 0.05804443359375, 0.04461669921875, -0.0153045654296875, 0.03741455078125, -0.0254669189453125, 0.0181884765625, 0.020721435546875, 0.04522705078125, 0.005596160888671875, -0.051544189453125, 0.052734375, 0.00028252601623535156, -0.0255126953125, -0.0246429443359375, 0.006839752197265625, -0.08575439453125, -0.031524658203125, 0.0755615234375, -0.036529541015625, -0.033294677734375, -0.03924560546875, -0.035675048828125, 0.04022216796875, -0.0200347900390625, 0.049041748046875, 0.045562744140625, 0.01523590087890625, -0.017181396484375, -0.0290069580078125, 0.03253173828125, 0.03033447265625, -0.07049560546875, -0.00308990478515625, 0.0310516357421875, 0.01428985595703125, 0.0158233642578125, 0.044158935546875, -0.0248870849609375, 0.034881591796875, -0.00850677490234375, 0.0231475830078125, -0.059295654296875, -0.00925445556640625, -0.045257568359375, 0.0220794677734375, 0.0223846435546875, -0.018310546875 ] ]
tsmatz/xlm-roberta-ner-japanese
2023-09-12T00:26:01.000Z
[ "transformers", "pytorch", "xlm-roberta", "token-classification", "generated_from_trainer", "ner", "bert", "ja", "license:mit", "autotrain_compatible", "endpoints_compatible", "region:us" ]
token-classification
tsmatz
null
null
tsmatz/xlm-roberta-ner-japanese
7
212,817
transformers
2022-10-24T02:08:37
--- language: - ja license: mit tags: - generated_from_trainer - ner - bert metrics: - f1 widget: - text: 鈴木は4月の陽気の良い日に、鈴をつけて熊本県の阿蘇山に登った - text: 中国では、中国共産党による一党統治が続く base_model: xlm-roberta-base model-index: - name: xlm-roberta-ner-ja results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # xlm-roberta-ner-japanese (Japanese caption : 日本語の固有表現抽出のモデル) This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) (pre-trained cross-lingual ```RobertaModel```) trained for named entity recognition (NER) token classification. The model is fine-tuned on the NER dataset provided by Stockmark Inc., in which the data is collected from Japanese Wikipedia articles.<br> See [here](https://github.com/stockmarkteam/ner-wikipedia-dataset) for the license of this dataset. Each token is labeled as follows: | Label id | Tag | Tag in Widget | Description | |---|---|---|---| | 0 | O | (None) | others or nothing | | 1 | PER | PER | person | | 2 | ORG | ORG | general corporation organization | | 3 | ORG-P | P | political organization | | 4 | ORG-O | O | other organization | | 5 | LOC | LOC | location | | 6 | INS | INS | institution, facility | | 7 | PRD | PRD | product | | 8 | EVT | EVT | event | ## Intended uses ```python from transformers import pipeline model_name = "tsmatz/xlm-roberta-ner-japanese" classifier = pipeline("token-classification", model=model_name) result = classifier("鈴木は4月の陽気の良い日に、鈴をつけて熊本県の阿蘇山に登った") print(result) ``` ## Training procedure You can download the source code for fine-tuning from [here](https://github.com/tsmatz/huggingface-finetune-japanese/blob/master/01-named-entity.ipynb). 
### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 12 - eval_batch_size: 12 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | F1 | |:-------------:|:-----:|:----:|:---------------:|:------:| | No log | 1.0 | 446 | 0.1510 | 0.8457 | | No log | 2.0 | 892 | 0.0626 | 0.9261 | | No log | 3.0 | 1338 | 0.0366 | 0.9580 | | No log | 4.0 | 1784 | 0.0196 | 0.9792 | | No log | 5.0 | 2230 | 0.0173 | 0.9864 | ### Framework versions - Transformers 4.23.1 - Pytorch 1.12.1+cu102 - Datasets 2.6.1 - Tokenizers 0.13.1
2,616
[ [ -0.0380859375, -0.0411376953125, 0.01548004150390625, -0.0020694732666015625, -0.0236053466796875, -0.0103912353515625, -0.017547607421875, -0.0296478271484375, 0.01519775390625, 0.03167724609375, -0.051849365234375, -0.054168701171875, -0.056884765625, 0.00775909423828125, -0.01116180419921875, 0.09136962890625, -0.00445556640625, 0.0173187255859375, 0.00403594970703125, -0.01288604736328125, -0.0335693359375, -0.045684814453125, -0.0816650390625, -0.03857421875, 0.024017333984375, 0.0237274169921875, 0.048553466796875, 0.051788330078125, 0.0318603515625, 0.0224151611328125, -0.023590087890625, 0.00972747802734375, -0.02252197265625, -0.02545166015625, 0.01317596435546875, -0.048675537109375, -0.052825927734375, -0.00746917724609375, 0.058502197265625, 0.033050537109375, -0.0008921623229980469, 0.031524658203125, 0.0032482147216796875, 0.0306243896484375, -0.0237274169921875, 0.0274505615234375, -0.04290771484375, 0.026336669921875, -0.0258941650390625, -0.0189361572265625, -0.0260772705078125, -0.005466461181640625, -0.01080322265625, -0.05133056640625, 0.024993896484375, 0.01190185546875, 0.104248046875, 0.0205230712890625, -0.020599365234375, 0.0024394989013671875, -0.046630859375, 0.06341552734375, -0.062225341796875, 0.02008056640625, 0.0253143310546875, 0.0169677734375, -0.00214385986328125, -0.05389404296875, -0.045166015625, 0.00519561767578125, -0.00934600830078125, 0.01554107666015625, -0.01444244384765625, -0.00797271728515625, 0.053802490234375, 0.02496337890625, -0.050323486328125, 0.0243988037109375, -0.0364990234375, -0.024322509765625, 0.039306640625, 0.03369140625, 0.018646240234375, -0.0254974365234375, -0.03228759765625, -0.0173797607421875, -0.02862548828125, 0.006793975830078125, 0.0391845703125, 0.03131103515625, -0.0312347412109375, 0.0305633544921875, -0.01157379150390625, 0.049346923828125, 0.0106048583984375, -0.020782470703125, 0.05010986328125, -0.00997161865234375, -0.022979736328125, 0.01313018798828125, 0.07501220703125, 
0.03204345703125, 0.005771636962890625, 0.0048828125, -0.0163726806640625, -0.00794219970703125, -0.0027313232421875, -0.07220458984375, -0.001556396484375, 0.01120758056640625, -0.034393310546875, -0.0257568359375, 0.01326751708984375, -0.050628662109375, 0.0219879150390625, -0.03326416015625, 0.038116455078125, -0.03411865234375, -0.01136016845703125, -0.007381439208984375, -0.00824737548828125, 0.0264129638671875, 0.00231170654296875, -0.060302734375, 0.0244140625, 0.05035400390625, 0.05401611328125, 0.00601959228515625, -0.02978515625, -0.02996826171875, 0.00382232666015625, -0.02264404296875, 0.046142578125, -0.018585205078125, -0.027130126953125, -0.029571533203125, 0.021728515625, -0.017822265625, -0.0455322265625, 0.0469970703125, -0.0294036865234375, 0.033203125, 0.0003991127014160156, -0.048980712890625, -0.0155792236328125, 0.0165252685546875, -0.038055419921875, 0.085205078125, 0.00496673583984375, -0.06500244140625, 0.046142578125, -0.047515869140625, -0.0066375732421875, 0.01065826416015625, -0.013031005859375, -0.063720703125, -0.01457977294921875, 0.01023101806640625, 0.0287933349609375, -0.01378631591796875, 0.0258026123046875, -0.019744873046875, -0.0256500244140625, 0.0147247314453125, -0.028961181640625, 0.0760498046875, 0.02032470703125, -0.035308837890625, 0.005840301513671875, -0.08966064453125, 0.0236053466796875, 0.0166168212890625, -0.022064208984375, -0.0094451904296875, -0.02557373046875, 0.0260009765625, 0.0194854736328125, 0.019439697265625, -0.028961181640625, 0.0023899078369140625, -0.03204345703125, 0.0296478271484375, 0.0445556640625, 0.01540374755859375, 0.0136566162109375, -0.023590087890625, 0.03692626953125, 0.02374267578125, 0.0202484130859375, -0.01018524169921875, -0.0423583984375, -0.06805419921875, -0.00673675537109375, 0.029083251953125, 0.035369873046875, -0.023101806640625, 0.05462646484375, -0.0187835693359375, -0.053466796875, -0.02276611328125, -0.0041351318359375, 0.0316162109375, 0.0556640625, 0.037933349609375, 
-0.017913818359375, -0.0537109375, -0.0750732421875, -0.01371002197265625, -0.0056304931640625, 0.0110321044921875, 0.02459716796875, 0.05047607421875, -0.0205078125, 0.0531005859375, -0.039031982421875, -0.0301666259765625, -0.01385498046875, 0.01641845703125, 0.04339599609375, 0.054351806640625, 0.048980712890625, -0.042816162109375, -0.042724609375, 0.0015287399291992188, -0.053070068359375, 0.01078033447265625, -0.01727294921875, -0.0194854736328125, 0.0297393798828125, 0.0267486572265625, -0.0341796875, 0.042724609375, 0.025177001953125, -0.0265350341796875, 0.039031982421875, -0.026397705078125, -0.0037078857421875, -0.10821533203125, 0.0279693603515625, 0.002452850341796875, -0.0103912353515625, -0.0355224609375, 0.0075531005859375, 0.01399993896484375, -0.0173797607421875, -0.0308837890625, 0.044891357421875, -0.0287017822265625, 0.007289886474609375, -0.01617431640625, -0.00795745849609375, -0.0011577606201171875, 0.0478515625, 0.022186279296875, 0.051422119140625, 0.051605224609375, -0.041961669921875, 0.01245880126953125, 0.0300750732421875, -0.0237274169921875, 0.045440673828125, -0.05291748046875, -0.00014126300811767578, -0.0012178421020507812, 0.0023975372314453125, -0.0496826171875, -0.01265716552734375, 0.035980224609375, -0.039886474609375, 0.039337158203125, -0.031890869140625, -0.042144775390625, -0.0251007080078125, 0.0158538818359375, 0.01374053955078125, 0.049530029296875, -0.033203125, 0.04248046875, 0.01262664794921875, 0.007717132568359375, -0.047943115234375, -0.06768798828125, 0.002582550048828125, -0.0254974365234375, -0.03411865234375, 0.031707763671875, -0.0034580230712890625, 0.005908966064453125, 0.001392364501953125, -0.0110321044921875, -0.023101806640625, -0.0005183219909667969, 0.024017333984375, 0.035491943359375, -0.015777587890625, -0.006534576416015625, -0.004749298095703125, -0.039215087890625, 0.0166015625, -0.01316070556640625, 0.05218505859375, -0.0051727294921875, -0.01349639892578125, -0.06390380859375, 
0.0029315948486328125, 0.0299835205078125, -0.01371002197265625, 0.0694580078125, 0.07025146484375, -0.034698486328125, -0.006122589111328125, -0.02301025390625, -0.01070404052734375, -0.033111572265625, 0.036163330078125, -0.02691650390625, -0.031036376953125, 0.0626220703125, 0.004055023193359375, -0.006687164306640625, 0.0692138671875, 0.02606201171875, 0.0204010009765625, 0.08270263671875, 0.0167388916015625, -0.003692626953125, 0.02423095703125, -0.06280517578125, 0.01032257080078125, -0.06597900390625, -0.0367431640625, -0.04766845703125, -0.0221710205078125, -0.05615234375, -0.0120697021484375, 0.01611328125, -0.0035724639892578125, -0.0406494140625, 0.03326416015625, -0.04339599609375, 0.029998779296875, 0.04437255859375, 0.0152740478515625, 0.004482269287109375, -0.005527496337890625, -0.0279083251953125, -0.0157318115234375, -0.059600830078125, -0.0304412841796875, 0.08526611328125, 0.019012451171875, 0.043853759765625, -0.0115966796875, 0.06805419921875, -0.02142333984375, 0.005153656005859375, -0.051849365234375, 0.03607177734375, -0.001708984375, -0.06280517578125, -0.0074310302734375, -0.046844482421875, -0.06878662109375, 0.005916595458984375, -0.0233154296875, -0.052001953125, 0.018829345703125, 0.003765106201171875, -0.017303466796875, 0.047332763671875, -0.03564453125, 0.080810546875, -0.0257568359375, -0.017486572265625, -0.0073089599609375, -0.041473388671875, 0.007610321044921875, -0.0141754150390625, 0.011749267578125, -0.00390625, -0.0026035308837890625, 0.0704345703125, -0.05126953125, 0.04937744140625, -0.0204925537109375, 0.0209808349609375, 0.00478363037109375, -0.0198211669921875, 0.046539306640625, 0.01861572265625, -0.00916290283203125, 0.033538818359375, -0.00197601318359375, -0.02825927734375, -0.0283966064453125, 0.060699462890625, -0.08392333984375, -0.0300750732421875, -0.052520751953125, -0.027557373046875, 0.005939483642578125, 0.03973388671875, 0.05169677734375, 0.051116943359375, 0.009674072265625, 0.02142333984375, 
0.04339599609375, -0.01468658447265625, 0.0281219482421875, 0.0311431884765625, -0.00970458984375, -0.05133056640625, 0.06427001953125, 0.0266265869140625, 0.007518768310546875, 0.02215576171875, 0.00608062744140625, -0.0200042724609375, -0.042724609375, -0.034423828125, 0.0203857421875, -0.050628662109375, -0.0202484130859375, -0.04510498046875, -0.033477783203125, -0.0197296142578125, 0.00887298583984375, -0.031829833984375, -0.0288238525390625, -0.050933837890625, -0.00559234619140625, 0.027313232421875, 0.026947021484375, 0.006244659423828125, 0.017669677734375, -0.0611572265625, 0.0098724365234375, 0.0024509429931640625, 0.035064697265625, 0.0093536376953125, -0.06787109375, -0.0273895263671875, 0.00701141357421875, -0.02899169921875, -0.05401611328125, 0.05218505859375, 0.01313018798828125, 0.055938720703125, 0.044464111328125, -0.00508880615234375, 0.08087158203125, -0.0265350341796875, 0.05810546875, 0.0203704833984375, -0.0631103515625, 0.0460205078125, -0.01177215576171875, 0.024810791015625, 0.029083251953125, 0.042877197265625, -0.036712646484375, -0.0221710205078125, -0.083740234375, -0.07373046875, 0.08441162109375, 0.00856781005859375, 0.0128326416015625, -0.004283905029296875, 0.031036376953125, 0.00279998779296875, 0.003726959228515625, -0.056793212890625, -0.05975341796875, -0.01441192626953125, -0.00930023193359375, -0.01374053955078125, -0.00797271728515625, -0.0016412734985351562, -0.032562255859375, 0.078857421875, 0.0022449493408203125, 0.01995849609375, 0.01132965087890625, 0.00202178955078125, -0.01088714599609375, 0.006870269775390625, 0.040496826171875, 0.047821044921875, -0.020477294921875, -0.01456451416015625, 0.0272979736328125, -0.0330810546875, -0.00235748291015625, 0.0187835693359375, -0.0271759033203125, 0.0167694091796875, 0.0110931396484375, 0.0906982421875, 0.02386474609375, -0.0234222412109375, 0.0213775634765625, -0.010009765625, -0.029815673828125, -0.054901123046875, 0.01332855224609375, -0.0051727294921875, 
0.00681304931640625, 0.031646728515625, 0.028533935546875, 0.00027561187744140625, -0.0249176025390625, 0.003978729248046875, 0.0227813720703125, -0.034759521484375, -0.017486572265625, 0.0712890625, -0.01107025146484375, -0.0271453857421875, 0.0538330078125, -0.01024627685546875, -0.032928466796875, 0.06671142578125, 0.043701171875, 0.06378173828125, 0.00327301025390625, -0.001163482666015625, 0.07281494140625, 0.00975799560546875, 0.0007753372192382812, 0.025238037109375, 0.0106658935546875, -0.040771484375, -0.01763916015625, -0.044830322265625, -0.02301025390625, 0.033172607421875, -0.0750732421875, 0.03857421875, -0.04229736328125, -0.042388916015625, 0.021820068359375, 0.022979736328125, -0.06671142578125, 0.025665283203125, 0.01200103759765625, 0.07293701171875, -0.053619384765625, 0.058258056640625, 0.055755615234375, -0.0426025390625, -0.0823974609375, -0.014984130859375, -0.01099395751953125, -0.0650634765625, 0.068115234375, 0.0070343017578125, 0.0306549072265625, 0.01088714599609375, -0.031402587890625, -0.07965087890625, 0.08642578125, 0.007610321044921875, -0.045318603515625, 0.0096588134765625, 0.019378662109375, 0.04248046875, -0.0116119384765625, 0.0382080078125, 0.011016845703125, 0.0391845703125, 0.0027618408203125, -0.0703125, 0.004169464111328125, -0.030731201171875, 0.00910186767578125, 0.0289154052734375, -0.0665283203125, 0.0699462890625, -0.007534027099609375, -0.004283905029296875, 0.01163482666015625, 0.035858154296875, 0.0299072265625, 0.0280303955078125, 0.021514892578125, 0.06988525390625, 0.054595947265625, -0.0240936279296875, 0.06011962890625, -0.050018310546875, 0.0496826171875, 0.07464599609375, -0.0013904571533203125, 0.05548095703125, 0.03350830078125, -0.0253143310546875, 0.048126220703125, 0.048980712890625, -0.0367431640625, 0.0151519775390625, 0.0005750656127929688, 0.002960205078125, -0.0264892578125, 0.0149993896484375, -0.042572021484375, 0.0289459228515625, 0.0131072998046875, -0.0491943359375, -0.01233673095703125, 
-0.00701141357421875, 0.01354217529296875, -0.0148162841796875, -0.0276336669921875, 0.060333251953125, -0.00818634033203125, -0.0443115234375, 0.056793212890625, 0.0013303756713867188, 0.038360595703125, -0.049285888671875, 0.0005660057067871094, -0.006458282470703125, 0.023406982421875, -0.028228759765625, -0.048919677734375, 0.018524169921875, 0.005390167236328125, -0.0176544189453125, 0.001983642578125, 0.0242767333984375, -0.0194549560546875, -0.0557861328125, 0.00975799560546875, 0.01404571533203125, 0.0218658447265625, 0.0199432373046875, -0.07098388671875, -0.0037994384765625, 0.0044708251953125, -0.0203857421875, 0.0289764404296875, 0.02703857421875, 0.00441741943359375, 0.04815673828125, 0.055145263671875, 0.01486968994140625, 0.0046234130859375, 0.006046295166015625, 0.051849365234375, -0.05279541015625, -0.046295166015625, -0.0538330078125, 0.03564453125, -0.01503753662109375, -0.05169677734375, 0.05712890625, 0.07183837890625, 0.07794189453125, -0.0063323974609375, 0.0496826171875, 0.0002460479736328125, 0.033935546875, -0.037811279296875, 0.058349609375, -0.047088623046875, 0.007476806640625, -0.01282501220703125, -0.061492919921875, -0.0233306884765625, 0.04681396484375, -0.0274810791015625, 0.019256591796875, 0.042327880859375, 0.0584716796875, -0.003917694091796875, -0.006038665771484375, 0.01517486572265625, 0.0202484130859375, 0.01033782958984375, 0.043365478515625, 0.03271484375, -0.0704345703125, 0.04766845703125, -0.044525146484375, 0.0026760101318359375, -0.012176513671875, -0.057647705078125, -0.06396484375, -0.034820556640625, -0.046844482421875, -0.0235137939453125, -0.01348114013671875, 0.078369140625, 0.05645751953125, -0.067626953125, -0.019378662109375, -0.01416015625, -0.0143280029296875, -0.0225067138671875, -0.02093505859375, 0.032958984375, -0.0296630859375, -0.069580078125, -0.0082855224609375, -0.005481719970703125, 0.02032470703125, -0.0015115737915039062, -0.02508544921875, -0.0022754669189453125, -0.023193359375, 
0.007511138916015625, 0.0159912109375, -0.046295166015625, -0.0089111328125, 0.0024356842041015625, -0.002716064453125, 0.0207672119140625, 0.01213836669921875, -0.04132080078125, 0.0292510986328125, 0.0280914306640625, 0.01482391357421875, 0.05645751953125, -0.015533447265625, 0.018157958984375, -0.048980712890625, 0.0290679931640625, 0.0122528076171875, 0.04022216796875, 0.00853729248046875, -0.02606201171875, 0.0243988037109375, 0.03350830078125, -0.046722412109375, -0.06536865234375, -0.016845703125, -0.08209228515625, -0.0015420913696289062, 0.07196044921875, -0.0093841552734375, -0.032440185546875, 0.010894775390625, -0.01666259765625, 0.03265380859375, -0.021331787109375, 0.037567138671875, 0.04339599609375, 0.00220489501953125, -0.01120758056640625, -0.029754638671875, 0.03131103515625, 0.01265716552734375, -0.049102783203125, -0.0171051025390625, 0.01387786865234375, 0.045623779296875, 0.0114288330078125, 0.0224151611328125, -0.018768310546875, 0.020538330078125, 0.00406646728515625, 0.01995849609375, -0.02093505859375, -0.0004849433898925781, -0.032012939453125, -0.0022907257080078125, 0.004863739013671875, -0.01287841796875 ] ]
stabilityai/stable-diffusion-2-depth
2023-07-05T16:19:06.000Z
[ "diffusers", "stable-diffusion", "arxiv:2112.10752", "arxiv:2202.00512", "arxiv:1910.09700", "license:openrail++", "has_space", "diffusers:StableDiffusionDepth2ImgPipeline", "region:us" ]
null
stabilityai
null
null
stabilityai/stable-diffusion-2-depth
353
211,554
diffusers
2022-11-23T17:41:46
---
license: openrail++
tags:
- stable-diffusion
inference: false
---

# Stable Diffusion v2 Model Card
This model card focuses on the model associated with the Stable Diffusion v2 model, available [here](https://github.com/Stability-AI/stablediffusion).

This `stable-diffusion-2-depth` model is resumed from [stable-diffusion-2-base](https://huggingface.co/stabilityai/stable-diffusion-2-base) (`512-base-ema.ckpt`) and finetuned for 200k steps. An extra input channel was added to process the (relative) depth prediction produced by [MiDaS](https://github.com/isl-org/MiDaS) (`dpt_hybrid`), which is used as additional conditioning.

![image](https://huggingface.co/stabilityai/stable-diffusion-2-depth/resolve/main/depth2image.png)

- Use it with the [`stablediffusion`](https://github.com/Stability-AI/stablediffusion) repository: download the `512-depth-ema.ckpt` [here](https://huggingface.co/stabilityai/stable-diffusion-2-depth/resolve/main/512-depth-ema.ckpt).
- Use it with 🧨 [`diffusers`](#examples)

## Model Details
- **Developed by:** Robin Rombach, Patrick Esser
- **Model type:** Diffusion-based text-to-image generation model
- **Language(s):** English
- **License:** [CreativeML Open RAIL++-M License](https://huggingface.co/stabilityai/stable-diffusion-2/blob/main/LICENSE-MODEL)
- **Model Description:** This is a model that can be used to generate and modify images based on text prompts. It is a [Latent Diffusion Model](https://arxiv.org/abs/2112.10752) that uses a fixed, pretrained text encoder ([OpenCLIP-ViT/H](https://github.com/mlfoundations/open_clip)).
- **Resources for more information:** [GitHub Repository](https://github.com/Stability-AI/).
- **Cite as:**

      @InProceedings{Rombach_2022_CVPR,
          author    = {Rombach, Robin and Blattmann, Andreas and Lorenz, Dominik and Esser, Patrick and Ommer, Bj\"orn},
          title     = {High-Resolution Image Synthesis With Latent Diffusion Models},
          booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
          month     = {June},
          year      = {2022},
          pages     = {10684-10695}
      }

## Examples

Using the [🤗's Diffusers library](https://github.com/huggingface/diffusers) to run Stable Diffusion 2 in a simple and efficient manner.

```bash
pip install -U git+https://github.com/huggingface/transformers.git
pip install diffusers transformers accelerate scipy safetensors
```

Running the pipeline (the example below uses the default scheduler; you can swap in a different one, e.g. `EulerDiscreteScheduler`, before inference):

```python
import torch
import requests
from PIL import Image
from diffusers import StableDiffusionDepth2ImgPipeline

# Load the depth-conditioned pipeline in half precision and move it to the GPU.
pipe = StableDiffusionDepth2ImgPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-depth",
    torch_dtype=torch.float16,
).to("cuda")

url = "http://images.cocodataset.org/val2017/000000039769.jpg"
init_image = Image.open(requests.get(url, stream=True).raw)

prompt = "two tigers"
n_prompt = "bad, deformed, ugly, bad anatomy"
# `strength` controls how much the init image is altered (0 = none, 1 = full).
image = pipe(prompt=prompt, image=init_image, negative_prompt=n_prompt, strength=0.7).images[0]
```

**Notes**:
- Despite not being a dependency, we highly recommend installing [xformers](https://github.com/facebookresearch/xformers) for memory-efficient attention (better performance).
- If you have low GPU RAM available, make sure to add `pipe.enable_attention_slicing()` after sending the pipeline to `cuda` for lower VRAM usage (at the cost of speed).

# Uses

## Direct Use
The model is intended for research purposes only. Possible research areas and tasks include

- Safe deployment of models which have the potential to generate harmful content.
- Probing and understanding the limitations and biases of generative models.
- Generation of artworks and use in design and other artistic processes. - Applications in educational or creative tools. - Research on generative models. Excluded uses are described below. ### Misuse, Malicious Use, and Out-of-Scope Use _Note: This section is originally taken from the [DALLE-MINI model card](https://huggingface.co/dalle-mini/dalle-mini), was used for Stable Diffusion v1, but applies in the same way to Stable Diffusion v2_. The model should not be used to intentionally create or disseminate images that create hostile or alienating environments for people. This includes generating images that people would foreseeably find disturbing, distressing, or offensive; or content that propagates historical or current stereotypes. #### Out-of-Scope Use The model was not trained to be factual or true representations of people or events, and therefore using the model to generate such content is out-of-scope for the abilities of this model. #### Misuse and Malicious Use Using the model to generate content that is cruel to individuals is a misuse of this model. This includes, but is not limited to: - Generating demeaning, dehumanizing, or otherwise harmful representations of people or their environments, cultures, religions, etc. - Intentionally promoting or propagating discriminatory content or harmful stereotypes. - Impersonating individuals without their consent. - Sexual content without consent of the people who might see it. - Mis- and disinformation - Representations of egregious violence and gore - Sharing of copyrighted or licensed material in violation of its terms of use. - Sharing content that is an alteration of copyrighted or licensed material in violation of its terms of use. 
## Limitations and Bias

### Limitations
- The model does not achieve perfect photorealism
- The model cannot render legible text
- The model does not perform well on more difficult tasks which involve compositionality, such as rendering an image corresponding to “A red cube on top of a blue sphere”
- Faces and people in general may not be generated properly.
- The model was trained mainly with English captions and will not work as well in other languages.
- The autoencoding part of the model is lossy
- The model was trained on a subset of the large-scale dataset [LAION-5B](https://laion.ai/blog/laion-5b/), which contains adult, violent and sexual content. To partially mitigate this, we have filtered the dataset using LAION's NSFW detector (see Training section).

### Bias
While the capabilities of image generation models are impressive, they can also reinforce or exacerbate social biases. Stable Diffusion v2 was primarily trained on subsets of [LAION-2B(en)](https://laion.ai/blog/laion-5b/), which consists of images that are limited to English descriptions. Texts and images from communities and cultures that use other languages are likely to be insufficiently accounted for. This affects the overall output of the model, as white and western cultures are often set as the default. Further, the ability of the model to generate content with non-English prompts is significantly worse than with English-language prompts. Stable Diffusion v2 mirrors and exacerbates biases to such a degree that viewer discretion must be advised irrespective of the input or its intent.

## Training

**Training Data**
The model developers used the following dataset for training the model:

- LAION-5B and subsets (details below). The training data is further filtered using LAION's NSFW detector, with a "p_unsafe" score of 0.1 (conservative). For more details, please refer to LAION-5B's [NeurIPS 2022](https://openreview.net/forum?id=M3Y74vmsMcY) paper and reviewer discussions on the topic.
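The p_unsafe filtering described above can be sketched as follows. This is an illustrative assumption about the data layout, not the actual LAION tooling: each metadata record is assumed to carry a `punsafe` field holding the NSFW detector's estimated probability that the image is unsafe, and records at or above the threshold are dropped.

```python
# Hypothetical sketch of the p_unsafe filtering step (field names assumed).
samples = [
    {"url": "a.jpg", "punsafe": 0.02},
    {"url": "b.jpg", "punsafe": 0.55},
    {"url": "c.jpg", "punsafe": 0.09},
]

# Keep only samples the detector considers safe enough (punsafe < 0.1).
PUNSAFE_THRESHOLD = 0.1
filtered = [s for s in samples if s["punsafe"] < PUNSAFE_THRESHOLD]

print([s["url"] for s in filtered])  # ['a.jpg', 'c.jpg']
```

A lower threshold is more conservative: it discards more borderline images at the cost of shrinking the training set.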
**Training Procedure** Stable Diffusion v2 is a latent diffusion model which combines an autoencoder with a diffusion model that is trained in the latent space of the autoencoder. During training, - Images are encoded through an encoder, which turns images into latent representations. The autoencoder uses a relative downsampling factor of 8 and maps images of shape H x W x 3 to latents of shape H/f x W/f x 4 - Text prompts are encoded through the OpenCLIP-ViT/H text-encoder. - The output of the text encoder is fed into the UNet backbone of the latent diffusion model via cross-attention. - The loss is a reconstruction objective between the noise that was added to the latent and the prediction made by the UNet. We also use the so-called _v-objective_, see https://arxiv.org/abs/2202.00512. We currently provide the following checkpoints: - `512-base-ema.ckpt`: 550k steps at resolution `256x256` on a subset of [LAION-5B](https://laion.ai/blog/laion-5b/) filtered for explicit pornographic material, using the [LAION-NSFW classifier](https://github.com/LAION-AI/CLIP-based-NSFW-Detector) with `punsafe=0.1` and an [aesthetic score](https://github.com/christophschuhmann/improved-aesthetic-predictor) >= `4.5`. 850k steps at resolution `512x512` on the same dataset with resolution `>= 512x512`. - `768-v-ema.ckpt`: Resumed from `512-base-ema.ckpt` and trained for 150k steps using a [v-objective](https://arxiv.org/abs/2202.00512) on the same dataset. Resumed for another 140k steps on a `768x768` subset of our dataset. - `512-depth-ema.ckpt`: Resumed from `512-base-ema.ckpt` and finetuned for 200k steps. Added an extra input channel to process the (relative) depth prediction produced by [MiDaS](https://github.com/isl-org/MiDaS) (`dpt_hybrid`) which is used as an additional conditioning. The additional input channels of the U-Net which process this extra information were zero-initialized. 
- `512-inpainting-ema.ckpt`: Resumed from `512-base-ema.ckpt` and trained for another 200k steps. Follows the mask-generation strategy presented in [LAMA](https://github.com/saic-mdal/lama) which, in combination with the latent VAE representations of the masked image, are used as an additional conditioning. The additional input channels of the U-Net which process this extra information were zero-initialized. The same strategy was used to train the [1.5-inpainting checkpoint](https://huggingface.co/runwayml/stable-diffusion-inpainting).
- `x4-upscaling-ema.ckpt`: Trained for 1.25M steps on a 10M subset of LAION containing images `>2048x2048`. The model was trained on crops of size `512x512` and is a text-guided [latent upscaling diffusion model](https://arxiv.org/abs/2112.10752). In addition to the textual input, it receives a `noise_level` as an input parameter, which can be used to add noise to the low-resolution input according to a [predefined diffusion schedule](configs/stable-diffusion/x4-upscaling.yaml).

- **Hardware:** 32 x 8 x A100 GPUs
- **Optimizer:** AdamW
- **Gradient Accumulations**: 1
- **Batch:** 32 x 8 x 2 x 4 = 2048
- **Learning rate:** warmup to 0.0001 for 10,000 steps and then kept constant

## Evaluation Results
Evaluations with different classifier-free guidance scales (1.5, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0) and 50 DDIM sampling steps show the relative improvements of the checkpoints:

![pareto](model-variants.jpg)

Evaluated using 50 DDIM steps and 10000 random prompts from the COCO2017 validation set, evaluated at 512x512 resolution. Not optimized for FID scores.

## Environmental Impact

**Stable Diffusion v1**

**Estimated Emissions**
Based on that information, we estimate the following CO2 emissions using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). The hardware, runtime, cloud provider, and compute region were utilized to estimate the carbon impact.
- **Hardware Type:** A100 PCIe 40GB - **Hours used:** 200000 - **Cloud Provider:** AWS - **Compute Region:** US-east - **Carbon Emitted (Power consumption x Time x Carbon produced based on location of power grid):** 15000 kg CO2 eq. ## Citation @InProceedings{Rombach_2022_CVPR, author = {Rombach, Robin and Blattmann, Andreas and Lorenz, Dominik and Esser, Patrick and Ommer, Bj\"orn}, title = {High-Resolution Image Synthesis With Latent Diffusion Models}, booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)}, month = {June}, year = {2022}, pages = {10684-10695} } *This model card was written by: Robin Rombach, Patrick Esser and David Ha and is based on the [Stable Diffusion v1](https://github.com/CompVis/stable-diffusion/blob/main/Stable_Diffusion_v1_Model_Card.md) and [DALL-E Mini model card](https://huggingface.co/dalle-mini/dalle-mini).*
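The reported emission figure above is consistent with a simple power × time × grid-intensity calculation. The A100 board power and US-east grid carbon intensity below are illustrative assumptions, not values from this card:

```python
# Back-of-the-envelope check of the reported 15,000 kg CO2 eq.
# Assumed (not from the card): ~250 W average draw per A100,
# ~0.3 kg CO2 eq per kWh for the US-east grid.
power_kw = 0.250           # assumed average A100 power draw, in kW
gpu_hours = 200_000        # "Hours used" reported above
grid_kg_per_kwh = 0.3      # assumed grid carbon intensity

energy_kwh = power_kw * gpu_hours            # 50,000 kWh
emissions_kg = energy_kwh * grid_kg_per_kwh
print(emissions_kg)  # 15000.0
```

Under these assumptions the arithmetic reproduces the card's 15,000 kg CO2 eq figure; different power or grid-intensity assumptions would shift it proportionally.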
12,412
[ [ -0.032989501953125, -0.06524658203125, 0.0247650146484375, 0.0155181884765625, -0.016632080078125, -0.025115966796875, 0.006649017333984375, -0.034088134765625, -0.004047393798828125, 0.0292510986328125, -0.0284881591796875, -0.02935791015625, -0.054229736328125, -0.0110015869140625, -0.03173828125, 0.0704345703125, -0.005962371826171875, 0.0001266002655029297, -0.01605224609375, -0.00334930419921875, -0.0247955322265625, -0.00896453857421875, -0.07928466796875, -0.0211334228515625, 0.03424072265625, 0.0107574462890625, 0.05377197265625, 0.039642333984375, 0.03228759765625, 0.0219573974609375, -0.023773193359375, -0.0012035369873046875, -0.053192138671875, -0.0018701553344726562, -0.0028095245361328125, -0.0196990966796875, -0.037689208984375, 0.0091094970703125, 0.0482177734375, 0.0186004638671875, -0.0045318603515625, 0.001842498779296875, 0.00188446044921875, 0.046112060546875, -0.038665771484375, -0.0102691650390625, -0.0241546630859375, 0.0128173828125, -0.010894775390625, 0.01885986328125, -0.0281982421875, -0.007781982421875, 0.0110931396484375, -0.055999755859375, 0.028839111328125, -0.023223876953125, 0.08154296875, 0.029205322265625, -0.0254058837890625, -0.005764007568359375, -0.0562744140625, 0.041778564453125, -0.04437255859375, 0.0205535888671875, 0.02496337890625, 0.00975799560546875, 0.00083160400390625, -0.07330322265625, -0.0478515625, -0.0027599334716796875, 0.0006875991821289062, 0.03802490234375, -0.031097412109375, -0.00521087646484375, 0.035491943359375, 0.016510009765625, -0.047271728515625, -0.0003230571746826172, -0.044158935546875, -0.005451202392578125, 0.0499267578125, 0.00738525390625, 0.0224609375, -0.016357421875, -0.030426025390625, -0.0074615478515625, -0.03729248046875, 0.001529693603515625, 0.03173828125, -0.0258026123046875, -0.036651611328125, 0.035186767578125, 0.00787353515625, 0.033966064453125, 0.0223846435546875, -0.0094146728515625, 0.0288848876953125, -0.0184173583984375, -0.01508331298828125, -0.034820556640625, 
0.06494140625, 0.048583984375, -0.0027866363525390625, 0.01094818115234375, -0.0081939697265625, 0.01552581787109375, 0.0062103271484375, -0.09185791015625, -0.035491943359375, 0.01389312744140625, -0.047119140625, -0.041168212890625, -0.011932373046875, -0.08038330078125, -0.0181732177734375, 0.01355743408203125, 0.032684326171875, -0.0252532958984375, -0.038421630859375, -0.0034160614013671875, -0.0306854248046875, 0.0171966552734375, 0.03424072265625, -0.054656982421875, 0.013580322265625, 0.0016584396362304688, 0.08721923828125, -0.0259552001953125, 0.0003523826599121094, -0.009307861328125, 0.0099639892578125, -0.0196685791015625, 0.053924560546875, -0.0254974365234375, -0.041717529296875, -0.0179443359375, 0.024169921875, 0.01155853271484375, -0.038238525390625, 0.043487548828125, -0.033538818359375, 0.0296173095703125, -0.0026397705078125, -0.0271148681640625, -0.0170135498046875, 0.000896453857421875, -0.055450439453125, 0.0831298828125, 0.0208892822265625, -0.06854248046875, 0.004695892333984375, -0.05499267578125, -0.0208282470703125, -0.007251739501953125, 0.0015783309936523438, -0.052734375, -0.01157379150390625, 0.006275177001953125, 0.0289306640625, -0.01061248779296875, 0.01593017578125, -0.0211944580078125, -0.0208892822265625, -0.004306793212890625, -0.04644775390625, 0.07659912109375, 0.0256805419921875, -0.032989501953125, 0.00498199462890625, -0.0489501953125, -0.0287322998046875, 0.0391845703125, -0.014404296875, -0.01274871826171875, -0.0153656005859375, 0.0225372314453125, 0.027801513671875, 0.005779266357421875, -0.034454345703125, -0.00034427642822265625, -0.0265045166015625, 0.045196533203125, 0.054107666015625, 0.016204833984375, 0.055206298828125, -0.025482177734375, 0.03948974609375, 0.026397705078125, 0.0226287841796875, -0.0107574462890625, -0.06768798828125, -0.052032470703125, -0.0163421630859375, 0.01434326171875, 0.04229736328125, -0.062164306640625, 0.01153564453125, 0.00681304931640625, -0.05206298828125, -0.0203094482421875, 
-0.0059814453125, 0.0194091796875, 0.05523681640625, 0.0258636474609375, -0.032562255859375, -0.02557373046875, -0.057586669921875, 0.02777099609375, -0.0082244873046875, 0.00794219970703125, 0.022003173828125, 0.04864501953125, -0.0340576171875, 0.041778564453125, -0.045166015625, -0.021820068359375, 0.00566864013671875, 0.0089111328125, 0.0015192031860351562, 0.0506591796875, 0.0611572265625, -0.0775146484375, -0.04876708984375, -0.0186920166015625, -0.0633544921875, 0.0009274482727050781, 0.0015459060668945312, -0.027984619140625, 0.034210205078125, 0.03656005859375, -0.060028076171875, 0.04754638671875, 0.048797607421875, -0.025115966796875, 0.03277587890625, -0.0276336669921875, -0.0026683807373046875, -0.08013916015625, 0.01015472412109375, 0.02777099609375, -0.02130126953125, -0.045684814453125, 0.002094268798828125, -0.00030112266540527344, -0.01397705078125, -0.047119140625, 0.06329345703125, -0.0295257568359375, 0.0281524658203125, -0.0290374755859375, 0.00020492076873779297, 0.01374053955078125, 0.024444580078125, 0.0264739990234375, 0.047332763671875, 0.06610107421875, -0.0435791015625, 0.0163116455078125, 0.0156707763671875, -0.0038166046142578125, 0.040618896484375, -0.06732177734375, 0.01418304443359375, -0.033355712890625, 0.0233612060546875, -0.0809326171875, -0.018524169921875, 0.045806884765625, -0.03497314453125, 0.031494140625, -0.0171356201171875, -0.0285797119140625, -0.0367431640625, -0.0193939208984375, 0.0423583984375, 0.07794189453125, -0.03143310546875, 0.040008544921875, 0.03253173828125, 0.01020050048828125, -0.034332275390625, -0.058319091796875, -0.00836181640625, -0.027618408203125, -0.06414794921875, 0.04638671875, -0.0178680419921875, -0.01000213623046875, 0.0140228271484375, 0.0131072998046875, 0.0007243156433105469, -0.007366180419921875, 0.03125, 0.0168914794921875, 0.0018186569213867188, -0.01168060302734375, 0.0200042724609375, -0.01824951171875, -0.0005059242248535156, -0.00868988037109375, 0.0299835205078125, 
0.00939178466796875, -0.00608062744140625, -0.0506591796875, 0.0289306640625, 0.039459228515625, -0.0012645721435546875, 0.05487060546875, 0.08233642578125, -0.044525146484375, 0.0003490447998046875, -0.028350830078125, -0.018951416015625, -0.0374755859375, 0.0299224853515625, -0.01123809814453125, -0.046234130859375, 0.045928955078125, 0.0037021636962890625, 0.0013980865478515625, 0.05230712890625, 0.06207275390625, -0.0172882080078125, 0.0894775390625, 0.0482177734375, 0.0207672119140625, 0.0513916015625, -0.06256103515625, -0.0013017654418945312, -0.0667724609375, -0.0263671875, -0.01166534423828125, -0.0208892822265625, -0.0343017578125, -0.05096435546875, 0.027099609375, 0.01287841796875, -0.00970458984375, 0.0116729736328125, -0.043426513671875, 0.0226898193359375, 0.0234832763671875, 0.01380157470703125, -0.001049041748046875, 0.0141143798828125, 0.006999969482421875, -0.0122222900390625, -0.056884765625, -0.043914794921875, 0.0740966796875, 0.041473388671875, 0.0660400390625, 0.004024505615234375, 0.0377197265625, 0.030853271484375, 0.0284881591796875, -0.03753662109375, 0.037567138671875, -0.0266876220703125, -0.0484619140625, -0.01071929931640625, -0.0164031982421875, -0.0706787109375, 0.01532745361328125, -0.015625, -0.031524658203125, 0.035858154296875, 0.01508331298828125, -0.02325439453125, 0.025421142578125, -0.0589599609375, 0.07318115234375, -0.005092620849609375, -0.057220458984375, -0.0093231201171875, -0.05145263671875, 0.028228759765625, 0.0012044906616210938, 0.01284027099609375, -0.006366729736328125, -0.005847930908203125, 0.0645751953125, -0.021484375, 0.072021484375, -0.03277587890625, 0.0017080307006835938, 0.0330810546875, -0.007701873779296875, 0.025482177734375, 0.0254669189453125, -0.00861358642578125, 0.0282440185546875, 0.0003447532653808594, -0.03204345703125, -0.0280914306640625, 0.058807373046875, -0.074951171875, -0.035247802734375, -0.03302001953125, -0.027801513671875, 0.03912353515625, 0.0144195556640625, 0.060760498046875, 
0.0288848876953125, -0.01399993896484375, -0.003437042236328125, 0.06329345703125, -0.02252197265625, 0.034210205078125, 0.012603759765625, -0.0191497802734375, -0.0399169921875, 0.059967041015625, 0.013519287109375, 0.0343017578125, -0.0045166015625, 0.01207733154296875, -0.02008056640625, -0.039276123046875, -0.049468994140625, 0.024993896484375, -0.0599365234375, -0.017822265625, -0.06207275390625, -0.02996826171875, -0.035675048828125, -0.0118560791015625, -0.0244598388671875, -0.020233154296875, -0.06549072265625, 0.00411224365234375, 0.02374267578125, 0.042388916015625, -0.0234222412109375, 0.0256195068359375, -0.030303955078125, 0.0338134765625, 0.0162506103515625, 0.011993408203125, 0.00579833984375, -0.0599365234375, -0.01015472412109375, 0.0072174072265625, -0.04876708984375, -0.072998046875, 0.0303497314453125, 0.00489044189453125, 0.042694091796875, 0.040679931640625, -0.0059814453125, 0.04022216796875, -0.029571533203125, 0.07586669921875, 0.0168914794921875, -0.045318603515625, 0.049102783203125, -0.026641845703125, 0.01300048828125, 0.014892578125, 0.043853759765625, -0.0249176025390625, -0.027862548828125, -0.06060791015625, -0.06866455078125, 0.049468994140625, 0.0309295654296875, 0.03216552734375, -0.00927734375, 0.05096435546875, -0.0016927719116210938, -0.0079803466796875, -0.0794677734375, -0.03973388671875, -0.0283050537109375, 0.0017633438110351562, 0.00681304931640625, -0.03485107421875, -0.0093841552734375, -0.0401611328125, 0.068603515625, 0.005340576171875, 0.041778564453125, 0.0300140380859375, -0.00045871734619140625, -0.029998779296875, -0.0295257568359375, 0.0390625, 0.026153564453125, -0.00914764404296875, 0.0005640983581542969, -0.0011129379272460938, -0.039764404296875, 0.0181121826171875, 0.0110015869140625, -0.054412841796875, 0.0010251998901367188, 0.0008649826049804688, 0.06817626953125, -0.019439697265625, -0.033355712890625, 0.050506591796875, -0.013763427734375, -0.0295257568359375, -0.034393310546875, 0.0101165771484375, 
0.009429931640625, 0.027557373046875, 0.0055084228515625, 0.037506103515625, 0.016265869140625, -0.0255126953125, 0.0079345703125, 0.03460693359375, -0.0298004150390625, -0.028289794921875, 0.081787109375, 0.0100555419921875, -0.02740478515625, 0.046295166015625, -0.037994384765625, -0.0188140869140625, 0.051116943359375, 0.05609130859375, 0.060546875, -0.0160980224609375, 0.038970947265625, 0.05511474609375, 0.0196685791015625, -0.01812744140625, 0.0155487060546875, 0.018035888671875, -0.055877685546875, -0.00678253173828125, -0.03289794921875, -0.0017032623291015625, 0.016021728515625, -0.03570556640625, 0.038177490234375, -0.040313720703125, -0.0308074951171875, -0.0024623870849609375, -0.0203857421875, -0.045257568359375, 0.012939453125, 0.0278472900390625, 0.062042236328125, -0.083740234375, 0.06268310546875, 0.054656982421875, -0.050994873046875, -0.034332275390625, 0.002685546875, -0.00722503662109375, -0.0301361083984375, 0.03369140625, 0.01166534423828125, 0.0005459785461425781, 0.00920867919921875, -0.060028076171875, -0.0689697265625, 0.0953369140625, 0.02484130859375, -0.020477294921875, -0.00579833984375, -0.020416259765625, 0.049468994140625, -0.033294677734375, 0.0245513916015625, 0.026397705078125, 0.028076171875, 0.03253173828125, -0.032318115234375, 0.01020050048828125, -0.0306549072265625, 0.02880859375, -0.01044464111328125, -0.0692138671875, 0.07275390625, -0.029449462890625, -0.0248260498046875, 0.02154541015625, 0.0550537109375, 0.0175628662109375, 0.0251922607421875, 0.0328369140625, 0.06329345703125, 0.041717529296875, -0.00913238525390625, 0.07501220703125, -0.004024505615234375, 0.02972412109375, 0.0526123046875, -0.005096435546875, 0.049407958984375, 0.031646728515625, -0.00783538818359375, 0.04693603515625, 0.055999755859375, -0.0276031494140625, 0.060028076171875, -0.002002716064453125, -0.0126953125, 0.0009102821350097656, -0.00174713134765625, -0.036773681640625, 0.00843048095703125, 0.0256805419921875, -0.03985595703125, 
-0.01296234130859375, 0.0190887451171875, 0.00106048583984375, -0.01137542724609375, -0.006206512451171875, 0.044036865234375, 0.004852294921875, -0.02947998046875, 0.04541015625, 0.0180511474609375, 0.0677490234375, -0.03125, -0.0146942138671875, -0.0135498046875, 0.006275177001953125, -0.02166748046875, -0.062744140625, 0.036590576171875, -0.00511932373046875, -0.019378662109375, -0.01522064208984375, 0.0692138671875, -0.0277099609375, -0.04742431640625, 0.03350830078125, 0.021270751953125, 0.0249481201171875, 0.005336761474609375, -0.0810546875, 0.01480865478515625, -0.00550079345703125, -0.028045654296875, 0.0170135498046875, 0.018524169921875, 0.0018463134765625, 0.034454345703125, 0.041961669921875, -0.00405120849609375, 0.005931854248046875, 0.001499176025390625, 0.0640869140625, -0.0249786376953125, -0.0243682861328125, -0.0599365234375, 0.05810546875, -0.01041412353515625, -0.0189208984375, 0.049774169921875, 0.04754638671875, 0.06463623046875, -0.008636474609375, 0.058502197265625, -0.0200653076171875, 0.0003070831298828125, -0.0357666015625, 0.0634765625, -0.060302734375, 0.00384521484375, -0.0282440185546875, -0.0704345703125, -0.015960693359375, 0.06756591796875, -0.01971435546875, 0.020965576171875, 0.034942626953125, 0.07391357421875, -0.01003265380859375, -0.0177764892578125, 0.0260772705078125, 0.0224609375, 0.029632568359375, 0.025634765625, 0.058807373046875, -0.05584716796875, 0.0304107666015625, -0.041351318359375, -0.021697998046875, -0.004451751708984375, -0.060302734375, -0.06878662109375, -0.051300048828125, -0.056884765625, -0.054351806640625, -0.007259368896484375, 0.02850341796875, 0.0706787109375, -0.03985595703125, -0.00354766845703125, -0.01495361328125, 0.002269744873046875, -0.0026836395263671875, -0.02117919921875, 0.0233612060546875, 0.00943756103515625, -0.06982421875, -0.0064697265625, 0.0234375, 0.04302978515625, -0.040679931640625, -0.018310546875, -0.01904296875, -0.01010894775390625, 0.0478515625, 0.0107574462890625, 
-0.05145263671875, -0.00319671630859375, -0.003231048583984375, -0.0049896240234375, 0.0117950439453125, 0.02288818359375, -0.046600341796875, 0.03131103515625, 0.04388427734375, 0.016357421875, 0.0670166015625, -0.007633209228515625, 0.0114898681640625, -0.0364990234375, 0.02630615234375, 0.01010894775390625, 0.028289794921875, 0.0285797119140625, -0.043609619140625, 0.037261962890625, 0.04742431640625, -0.0491943359375, -0.05914306640625, 0.01470184326171875, -0.08148193359375, -0.0196685791015625, 0.099853515625, -0.0088348388671875, -0.026763916015625, 0.0041961669921875, -0.032073974609375, 0.02447509765625, -0.0284881591796875, 0.042205810546875, 0.039947509765625, -0.01052093505859375, -0.038604736328125, -0.044952392578125, 0.041046142578125, 0.00951385498046875, -0.046966552734375, -0.022186279296875, 0.043243408203125, 0.049346923828125, 0.0175628662109375, 0.07293701171875, -0.028045654296875, 0.0194854736328125, 0.006359100341796875, 0.0031986236572265625, 0.0052642822265625, -0.02203369140625, -0.036590576171875, 0.0013818740844726562, -0.0120391845703125, -0.0005168914794921875 ] ]
Jean-Baptiste/roberta-large-ner-english
2023-03-22T02:19:36.000Z
[ "transformers", "pytorch", "tf", "onnx", "safetensors", "roberta", "token-classification", "en", "dataset:conll2003", "license:mit", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
token-classification
Jean-Baptiste
null
null
Jean-Baptiste/roberta-large-ner-english
53
211,184
transformers
2022-03-02T23:29:04
---
language: en
datasets:
- conll2003
widget:
- text: "My name is jean-baptiste and I live in montreal"
- text: "My name is clara and I live in berkeley, california."
- text: "My name is wolfgang and I live in berlin"
train-eval-index:
- config: conll2003
  task: token-classification
  task_id: entity_extraction
  splits:
    eval_split: validation
  col_mapping:
    tokens: tokens
    ner_tags: tags
license: mit
---

# roberta-large-ner-english: model fine-tuned from roberta-large for NER task

## Introduction

roberta-large-ner-english is an English NER model that was fine-tuned from roberta-large on the conll2003 dataset. The model was validated on email/chat data and outperformed other models on that type of data specifically. In particular, it seems to work better on entities that don't start with an upper-case letter.

## Training data

Training data was classified as follows:

Abbreviation|Description
-|-
O |Outside of a named entity
MISC |Miscellaneous entity
PER |Person's name
ORG |Organization
LOC |Location

To simplify, the B- and I- prefixes from the original conll2003 tags were removed. The train and test splits of the original conll2003 were used for training, and the "validation" split for validation.

This resulted in a dataset of size:

Train | Validation
-|-
17494 | 3250

## How to use roberta-large-ner-english with HuggingFace

##### Load roberta-large-ner-english and its sub-word tokenizer:

```python
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("Jean-Baptiste/roberta-large-ner-english")
model = AutoModelForTokenClassification.from_pretrained("Jean-Baptiste/roberta-large-ner-english")
```

##### Process a text sample (from Wikipedia):

```python
from transformers import pipeline

nlp = pipeline('ner', model=model, tokenizer=tokenizer, aggregation_strategy="simple")
nlp("Apple was founded in 1976 by Steve Jobs, Steve Wozniak and Ronald Wayne to develop and sell Wozniak's Apple I personal computer")

[{'entity_group': 'ORG', 'score': 0.99381506, 'word': ' Apple', 'start': 0, 'end': 5},
 {'entity_group': 'PER', 'score': 0.99970853, 'word': ' Steve Jobs', 'start': 29, 'end': 39},
 {'entity_group': 'PER', 'score': 0.99981767, 'word': ' Steve Wozniak', 'start': 41, 'end': 54},
 {'entity_group': 'PER', 'score': 0.99956465, 'word': ' Ronald Wayne', 'start': 59, 'end': 71},
 {'entity_group': 'PER', 'score': 0.9997918, 'word': ' Wozniak', 'start': 92, 'end': 99},
 {'entity_group': 'MISC', 'score': 0.99956393, 'word': ' Apple I', 'start': 102, 'end': 109}]
```

## Model performances

Model performance computed on the conll2003 validation dataset (computed on token predictions):

entity|precision|recall|f1
-|-|-|-
PER|0.9914|0.9927|0.9920
ORG|0.9627|0.9661|0.9644
LOC|0.9795|0.9862|0.9828
MISC|0.9292|0.9262|0.9277
Overall|0.9740|0.9766|0.9753

On a private dataset (email, chat, informal discussion), computed on word predictions:

entity|precision|recall|f1
-|-|-|-
PER|0.8823|0.9116|0.8967
ORG|0.7694|0.7292|0.7487
LOC|0.8619|0.7768|0.8171

By comparison, on the same private dataset, Spacy (en_core_web_trf-3.2.0) was giving:

entity|precision|recall|f1
-|-|-|-
PER|0.9146|0.8287|0.8695
ORG|0.7655|0.6437|0.6993
LOC|0.8727|0.6180|0.7236

For those who might be interested, here is a short article on how I used the results of this model to train an LSTM model for signature detection in emails: https://medium.com/@jean-baptiste.polle/lstm-model-for-email-signature-detection-8e990384fefa
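The `start`/`end` character offsets returned with `aggregation_strategy="simple"` can be used directly for downstream tasks such as redaction. A minimal sketch using the example pipeline output above (pure Python, no model call; scores omitted for brevity):

```python
text = ("Apple was founded in 1976 by Steve Jobs, Steve Wozniak and "
        "Ronald Wayne to develop and sell Wozniak's Apple I personal computer")

# Example pipeline output from the card above (score fields omitted).
entities = [
    {"entity_group": "ORG", "word": " Apple", "start": 0, "end": 5},
    {"entity_group": "PER", "word": " Steve Jobs", "start": 29, "end": 39},
    {"entity_group": "PER", "word": " Steve Wozniak", "start": 41, "end": 54},
    {"entity_group": "PER", "word": " Ronald Wayne", "start": 59, "end": 71},
    {"entity_group": "PER", "word": " Wozniak", "start": 92, "end": 99},
    {"entity_group": "MISC", "word": " Apple I", "start": 102, "end": 109},
]

def redact(text, entities):
    """Replace each detected span with its entity label, working right to
    left so earlier character offsets stay valid."""
    out = text
    for ent in sorted(entities, key=lambda e: e["start"], reverse=True):
        out = out[:ent["start"]] + f"[{ent['entity_group']}]" + out[ent["end"]:]
    return out

print(redact(text, entities))
# [ORG] was founded in 1976 by [PER], [PER] and [PER] to develop and sell [PER]'s [MISC] personal computer
```

In a real application the `entities` list would come straight from the pipeline call shown above; the `redact` helper here is an illustrative name, not part of the library.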
3,545
[ [ -0.02734375, -0.0557861328125, 0.017913818359375, 0.0011205673217773438, -0.0126953125, -0.01261138916015625, -0.03485107421875, -0.0294952392578125, 0.0254669189453125, 0.034027099609375, -0.040496826171875, -0.0557861328125, -0.07025146484375, 0.020721435546875, -0.032958984375, 0.09295654296875, 0.00714874267578125, 0.006526947021484375, 0.00945281982421875, -0.0212554931640625, -0.0340576171875, -0.054931640625, -0.062103271484375, -0.0248260498046875, 0.0333251953125, 0.028228759765625, 0.0350341796875, 0.0258636474609375, 0.037994384765625, 0.020111083984375, -0.0214080810546875, 0.0125885009765625, -0.0288543701171875, -0.00882720947265625, -0.0003190040588378906, -0.035430908203125, -0.041839599609375, 0.00753021240234375, 0.04046630859375, 0.03765869140625, -0.011474609375, 0.023956298828125, 0.00283050537109375, 0.04425048828125, -0.0238494873046875, 0.01947021484375, -0.06610107421875, -0.00006502866744995117, -0.016815185546875, 0.00839996337890625, -0.030548095703125, -0.0119781494140625, -0.0018758773803710938, -0.0281982421875, 0.0147247314453125, 0.0269622802734375, 0.0986328125, 0.01270294189453125, -0.021453857421875, -0.0244140625, -0.041778564453125, 0.06591796875, -0.06903076171875, 0.02764892578125, 0.0296783447265625, 0.0095672607421875, -0.018646240234375, -0.052642822265625, -0.06585693359375, -0.0136871337890625, -0.01519012451171875, 0.01139068603515625, -0.022979736328125, 0.005069732666015625, 0.031982421875, 0.035491943359375, -0.05303955078125, 0.0202789306640625, -0.036224365234375, -0.01873779296875, 0.048828125, 0.00600433349609375, 0.00853729248046875, -0.01311492919921875, -0.018829345703125, -0.0185546875, -0.03973388671875, 0.0077056884765625, 0.020904541015625, 0.031585693359375, -0.0192413330078125, 0.060394287109375, -0.0176544189453125, 0.0556640625, 0.01369476318359375, -0.0110626220703125, 0.041229248046875, -0.019195556640625, -0.0209197998046875, 0.0005321502685546875, 0.08447265625, 0.026031494140625, 
0.028350830078125, 0.0026454925537109375, -0.0257568359375, 0.00519561767578125, -0.006839752197265625, -0.05645751953125, -0.01202392578125, 0.0192413330078125, -0.0292816162109375, -0.00568389892578125, 0.0168609619140625, -0.03729248046875, -0.00787353515625, -0.0298919677734375, 0.044281005859375, -0.0438232421875, -0.0006632804870605469, 0.0183868408203125, -0.008819580078125, 0.003856658935546875, 0.0033740997314453125, -0.05670166015625, 0.022247314453125, 0.052825927734375, 0.04852294921875, -0.0012950897216796875, -0.026123046875, -0.035552978515625, -0.0095062255859375, -0.013916015625, 0.03948974609375, -0.02325439453125, -0.0124359130859375, -0.01221466064453125, 0.0184173583984375, -0.016998291015625, -0.0267181396484375, 0.0557861328125, -0.02587890625, 0.044708251953125, 0.003849029541015625, -0.05694580078125, -0.0207672119140625, 0.04248046875, -0.052642822265625, 0.0904541015625, 0.006214141845703125, -0.0643310546875, 0.046844482421875, -0.045745849609375, -0.022216796875, -0.00884246826171875, -0.00897979736328125, -0.044342041015625, 0.0079345703125, 0.0222320556640625, 0.047454833984375, -0.035980224609375, 0.0208892822265625, -0.00992584228515625, -0.0194244384765625, 0.02294921875, -0.0284576416015625, 0.056671142578125, -0.00518035888671875, -0.040435791015625, 0.000029861927032470703, -0.0758056640625, 0.0163116455078125, 0.0031108856201171875, -0.027740478515625, -0.0192413330078125, -0.0208892822265625, 0.005977630615234375, 0.026763916015625, 0.0218505859375, -0.039764404296875, 0.009979248046875, -0.050933837890625, 0.02789306640625, 0.046234130859375, -0.0003254413604736328, 0.0208892822265625, -0.01702880859375, 0.03460693359375, 0.007572174072265625, 0.00798797607421875, 0.00726318359375, -0.043975830078125, -0.056976318359375, -0.01849365234375, 0.04852294921875, 0.043243408203125, -0.0277252197265625, 0.06512451171875, -0.049713134765625, -0.04345703125, -0.04510498046875, -0.0007963180541992188, 0.03582763671875, 
0.041046142578125, 0.033782958984375, -0.0201416015625, -0.046173095703125, -0.0816650390625, -0.0238800048828125, -0.01117706298828125, 0.00601959228515625, 0.0272216796875, 0.059783935546875, -0.0090484619140625, 0.051666259765625, -0.0292816162109375, -0.0391845703125, -0.0207061767578125, 0.015045166015625, 0.061859130859375, 0.038726806640625, 0.02288818359375, -0.04803466796875, -0.04925537109375, -0.000537872314453125, -0.034698486328125, 0.0178680419921875, -0.0029659271240234375, -0.0034580230712890625, 0.03509521484375, 0.034393310546875, -0.054351806640625, 0.0355224609375, 0.049163818359375, -0.03955078125, 0.039825439453125, -0.028350830078125, 0.01029205322265625, -0.09820556640625, 0.01044464111328125, 0.0021114349365234375, 0.0002903938293457031, -0.038604736328125, -0.00733184814453125, 0.0032062530517578125, 0.0137481689453125, -0.0252838134765625, 0.0511474609375, -0.047454833984375, 0.0114898681640625, 0.0024662017822265625, 0.00922393798828125, 0.0009350776672363281, 0.02557373046875, -0.0018224716186523438, 0.052825927734375, 0.033294677734375, -0.042449951171875, 0.0208740234375, 0.0208892822265625, -0.035430908203125, 0.0447998046875, -0.053924560546875, 0.007289886474609375, 0.0017576217651367188, 0.011138916015625, -0.059295654296875, -0.00582122802734375, 0.0028629302978515625, -0.05084228515625, 0.0277099609375, -0.01213836669921875, -0.0565185546875, -0.047027587890625, 0.00039887428283691406, 0.00916290283203125, 0.033477783203125, -0.023773193359375, 0.053985595703125, 0.0254669189453125, 0.0143280029296875, -0.04290771484375, -0.0723876953125, 0.01413726806640625, -0.020751953125, -0.05169677734375, 0.034912109375, -0.00794219970703125, -0.0081634521484375, -0.001590728759765625, 0.0110015869140625, -0.0113372802734375, 0.0020427703857421875, 0.0184173583984375, 0.027008056640625, -0.014251708984375, 0.006664276123046875, -0.011688232421875, -0.0194854736328125, -0.0011119842529296875, -0.019012451171875, 0.054290771484375, 
-0.004245758056640625, -0.01470184326171875, -0.0278778076171875, 0.00905609130859375, 0.04278564453125, -0.01702880859375, 0.0869140625, 0.05633544921875, -0.047760009765625, 0.00110626220703125, -0.053497314453125, -0.0244903564453125, -0.028961181640625, 0.03558349609375, -0.039093017578125, -0.065673828125, 0.0400390625, 0.0240020751953125, 0.00334930419921875, 0.066650390625, 0.038848876953125, 0.01337432861328125, 0.08258056640625, 0.0272979736328125, -0.0230712890625, 0.0190887451171875, -0.037689208984375, 0.028839111328125, -0.06109619140625, -0.026031494140625, -0.0537109375, -0.030792236328125, -0.0552978515625, -0.019134521484375, 0.0116119384765625, 0.007411956787109375, -0.02325439453125, 0.03997802734375, -0.05908203125, 0.0237884521484375, 0.051300048828125, 0.004314422607421875, 0.005035400390625, 0.0022525787353515625, -0.0208587646484375, -0.0216064453125, -0.045989990234375, -0.02935791015625, 0.08447265625, 0.01538848876953125, 0.03515625, 0.0017042160034179688, 0.05816650390625, 0.003997802734375, 0.0169830322265625, -0.053985595703125, 0.043792724609375, -0.0277557373046875, -0.057037353515625, -0.0231781005859375, -0.03045654296875, -0.07489013671875, 0.0005197525024414062, -0.038055419921875, -0.06884765625, -0.0048675537109375, 0.0012340545654296875, -0.01491546630859375, 0.0255889892578125, -0.04034423828125, 0.06671142578125, -0.01934814453125, -0.00345611572265625, -0.0121612548828125, -0.038787841796875, 0.0137939453125, -0.005207061767578125, 0.027801513671875, -0.01148223876953125, 0.0037822723388671875, 0.0645751953125, -0.036956787109375, 0.053497314453125, -0.0030155181884765625, 0.01041412353515625, 0.0061492919921875, 0.00017154216766357422, 0.061248779296875, -0.00926971435546875, -0.0208282470703125, 0.0212249755859375, -0.004917144775390625, -0.007289886474609375, -0.03521728515625, 0.056243896484375, -0.054595947265625, -0.02862548828125, -0.04400634765625, -0.02532958984375, -0.0027675628662109375, 0.0272674560546875, 
0.048004150390625, 0.0277252197265625, -0.003543853759765625, 0.039215087890625, 0.04571533203125, -0.0105743408203125, 0.02362060546875, 0.040771484375, 0.000014841556549072266, -0.049560546875, 0.04974365234375, 0.015625, -0.0009436607360839844, 0.03021240234375, 0.014495849609375, -0.037750244140625, -0.04248046875, -0.01556396484375, 0.0321044921875, -0.039398193359375, -0.0187835693359375, -0.05841064453125, -0.0288543701171875, -0.038970947265625, 0.0141448974609375, -0.0169830322265625, -0.034149169921875, -0.038970947265625, -0.0142974853515625, 0.0345458984375, 0.049530029296875, 0.0126800537109375, 0.01255035400390625, -0.0552978515625, 0.007411956787109375, -0.00513458251953125, 0.0203399658203125, -0.013641357421875, -0.07342529296875, -0.0258331298828125, 0.007152557373046875, -0.012176513671875, -0.06329345703125, 0.0386962890625, 0.0301971435546875, 0.039764404296875, 0.014312744140625, -0.00955963134765625, 0.060455322265625, -0.03607177734375, 0.07867431640625, 0.020782470703125, -0.0667724609375, 0.043731689453125, -0.004543304443359375, 0.00699615478515625, 0.04498291015625, 0.02410888671875, -0.05120849609375, -0.035491943359375, -0.0841064453125, -0.08099365234375, 0.04840087890625, 0.018218994140625, 0.012939453125, -0.01055145263671875, 0.0283355712890625, 0.0010995864868164062, 0.017547607421875, -0.0721435546875, -0.03631591796875, 0.01174163818359375, -0.037322998046875, -0.02337646484375, -0.022125244140625, -0.0035572052001953125, -0.02935791015625, 0.085693359375, -0.002689361572265625, 0.01119232177734375, 0.0121307373046875, -0.009063720703125, 0.020538330078125, 0.02496337890625, 0.041168212890625, 0.0292510986328125, 0.00264739990234375, -0.007617950439453125, 0.03564453125, -0.0364990234375, -0.0092010498046875, 0.0188446044921875, -0.01535797119140625, 0.01189422607421875, 0.0220947265625, 0.06695556640625, 0.01496124267578125, -0.033203125, 0.04571533203125, -0.0113067626953125, -0.0318603515625, -0.0513916015625, 
-0.00397491455078125, -0.00458526611328125, 0.004131317138671875, 0.0227508544921875, -0.0038547515869140625, 0.00569915771484375, -0.0230865478515625, 0.00989532470703125, 0.039764404296875, -0.039703369140625, -0.0235748291015625, 0.051513671875, -0.016571044921875, -0.0234527587890625, 0.0498046875, -0.0201416015625, -0.050079345703125, 0.048431396484375, 0.04083251953125, 0.06304931640625, 0.0018024444580078125, -0.0042266845703125, 0.062103271484375, 0.036102294921875, -0.0194854736328125, 0.034759521484375, 0.02825927734375, -0.053466796875, -0.016204833984375, -0.05450439453125, -0.00007241964340209961, 0.033905029296875, -0.04644775390625, 0.03302001953125, -0.0321044921875, -0.03582763671875, 0.00550079345703125, 0.0137939453125, -0.0675048828125, 0.044281005859375, -0.004425048828125, 0.07366943359375, -0.06109619140625, 0.04949951171875, 0.0601806640625, -0.03759765625, -0.10504150390625, -0.0171051025390625, -0.00738525390625, -0.054351806640625, 0.0587158203125, 0.0197601318359375, 0.02996826171875, -0.004306793212890625, -0.0271148681640625, -0.0843505859375, 0.08258056640625, 0.006427764892578125, -0.051513671875, -0.002826690673828125, 0.01079559326171875, 0.049652099609375, -0.026702880859375, 0.034912109375, 0.03570556640625, 0.041778564453125, -0.0056610107421875, -0.07867431640625, 0.0110015869140625, -0.026153564453125, -0.0087432861328125, 0.022674560546875, -0.044708251953125, 0.0723876953125, 0.0050048828125, 0.0011920928955078125, 0.0127105712890625, 0.047943115234375, 0.0241241455078125, 0.01546478271484375, 0.039642333984375, 0.06494140625, 0.067626953125, -0.0306549072265625, 0.07086181640625, -0.049591064453125, 0.0396728515625, 0.0799560546875, 0.00656890869140625, 0.0704345703125, 0.033721923828125, -0.01451873779296875, 0.050140380859375, 0.0440673828125, -0.0181732177734375, 0.01110076904296875, 0.0137939453125, -0.00225067138671875, -0.00982666015625, 0.00536346435546875, -0.0212249755859375, 0.040924072265625, 0.0223541259765625, 
-0.048187255859375, 0.01300811767578125, -0.01502227783203125, 0.0170745849609375, -0.00020420551300048828, -0.0150604248046875, 0.0565185546875, 0.0029087066650390625, -0.05908203125, 0.04296875, 0.004917144775390625, 0.0616455078125, -0.033599853515625, 0.00502777099609375, -0.0006275177001953125, 0.0229034423828125, -0.016143798828125, -0.058013916015625, 0.011871337890625, -0.0011119842529296875, -0.01947021484375, 0.0019044876098632812, 0.04290771484375, -0.033416748046875, -0.054107666015625, 0.01788330078125, 0.0178985595703125, 0.02227783203125, -0.01399993896484375, -0.0760498046875, -0.01361846923828125, 0.00511932373046875, -0.026123046875, 0.0031280517578125, 0.048248291015625, -0.00007510185241699219, 0.04229736328125, 0.05596923828125, 0.02801513671875, -0.0013074874877929688, 0.01302337646484375, 0.043731689453125, -0.055084228515625, -0.043792724609375, -0.06707763671875, 0.044891357421875, -0.0192718505859375, -0.046722412109375, 0.05657958984375, 0.06536865234375, 0.05255126953125, 0.005931854248046875, 0.042144775390625, -0.019287109375, 0.050933837890625, -0.045013427734375, 0.05780029296875, -0.054046630859375, 0.00868988037109375, -0.01320648193359375, -0.07012939453125, -0.0196533203125, 0.042236328125, -0.0279541015625, 0.02337646484375, 0.052337646484375, 0.0665283203125, -0.0023632049560546875, -0.0111236572265625, 0.0165252685546875, 0.0426025390625, 0.0159454345703125, 0.043426513671875, 0.0291748046875, -0.057037353515625, 0.040740966796875, -0.024749755859375, -0.009124755859375, -0.01556396484375, -0.06689453125, -0.06903076171875, -0.054840087890625, -0.02178955078125, -0.05706787109375, -0.00608062744140625, 0.07403564453125, 0.048919677734375, -0.0692138671875, -0.0015077590942382812, -0.01168060302734375, -0.010955810546875, -0.006072998046875, -0.0217742919921875, 0.0306396484375, -0.005168914794921875, -0.0546875, 0.011199951171875, -0.0021381378173828125, 0.0160369873046875, -0.003284454345703125, -0.01189422607421875, 
-0.01186370849609375, 0.0016393661499023438, 0.03125, 0.01099395751953125, -0.060821533203125, -0.0264129638671875, 0.008209228515625, -0.0223236083984375, 0.005222320556640625, 0.0228271484375, -0.043212890625, 0.013427734375, 0.0227813720703125, 0.0207977294921875, 0.051605224609375, -0.011322021484375, 0.0272064208984375, -0.053985595703125, 0.01029205322265625, 0.0210113525390625, 0.04522705078125, 0.03314208984375, -0.0347900390625, 0.033966064453125, 0.01407623291015625, -0.04571533203125, -0.058197021484375, -0.0080718994140625, -0.0711669921875, -0.002346038818359375, 0.0850830078125, -0.005157470703125, -0.032073974609375, 0.0114593505859375, -0.01137542724609375, 0.0181121826171875, -0.0292816162109375, 0.06158447265625, 0.06390380859375, 0.0158233642578125, -0.004314422607421875, -0.0238494873046875, 0.0291290283203125, 0.0338134765625, -0.047332763671875, -0.01337432861328125, 0.0249786376953125, 0.03680419921875, 0.016845703125, 0.0450439453125, -0.01229095458984375, 0.01293182373046875, -0.01377105712890625, 0.016571044921875, 0.0005116462707519531, -0.0041961669921875, -0.0281982421875, -0.004024505615234375, 0.00228118896484375, -0.006061553955078125 ] ]
neuralmind/bert-base-portuguese-cased
2022-06-14T14:37:09.000Z
[ "transformers", "pytorch", "tf", "jax", "bert", "fill-mask", "pt", "dataset:brWaC", "license:mit", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
fill-mask
neuralmind
null
null
neuralmind/bert-base-portuguese-cased
94
210,757
transformers
2022-03-02T23:29:05
---
language: pt
license: mit
tags:
- bert
- pytorch
datasets:
- brWaC
---

# BERTimbau Base (aka "bert-base-portuguese-cased")

![Bert holding a berimbau](https://imgur.com/JZ7Hynh.jpg)

## Introduction

BERTimbau Base is a pretrained BERT model for Brazilian Portuguese that achieves state-of-the-art performance on three downstream NLP tasks: Named Entity Recognition, Sentence Textual Similarity and Recognizing Textual Entailment. It is available in two sizes: Base and Large.

For further information or requests, please go to the [BERTimbau repository](https://github.com/neuralmind-ai/portuguese-bert/).

## Available models

| Model | Arch. | #Layers | #Params |
| ---------------------------------------- | ---------- | ------- | ------- |
| `neuralmind/bert-base-portuguese-cased` | BERT-Base | 12 | 110M |
| `neuralmind/bert-large-portuguese-cased` | BERT-Large | 24 | 335M |

## Usage

```python
from transformers import AutoTokenizer  # Or BertTokenizer
from transformers import AutoModelForPreTraining  # Or BertForPreTraining for loading pretraining heads
from transformers import AutoModel  # Or BertModel, for BERT without pretraining heads

model = AutoModelForPreTraining.from_pretrained('neuralmind/bert-base-portuguese-cased')
tokenizer = AutoTokenizer.from_pretrained('neuralmind/bert-base-portuguese-cased', do_lower_case=False)
```

### Masked language modeling prediction example

```python
from transformers import pipeline

pipe = pipeline('fill-mask', model=model, tokenizer=tokenizer)

pipe('Tinha uma [MASK] no meio do caminho.')
# [{'score': 0.14287759363651276,
#   'sequence': '[CLS] Tinha uma pedra no meio do caminho. [SEP]',
#   'token': 5028,
#   'token_str': 'pedra'},
#  {'score': 0.06213393807411194,
#   'sequence': '[CLS] Tinha uma árvore no meio do caminho. [SEP]',
#   'token': 7411,
#   'token_str': 'árvore'},
#  {'score': 0.05515013635158539,
#   'sequence': '[CLS] Tinha uma estrada no meio do caminho. [SEP]',
#   'token': 5675,
#   'token_str': 'estrada'},
#  {'score': 0.0299188531935215,
#   'sequence': '[CLS] Tinha uma casa no meio do caminho. [SEP]',
#   'token': 1105,
#   'token_str': 'casa'},
#  {'score': 0.025660505518317223,
#   'sequence': '[CLS] Tinha uma cruz no meio do caminho. [SEP]',
#   'token': 3466,
#   'token_str': 'cruz'}]
```

### For BERT embeddings

```python
import torch

model = AutoModel.from_pretrained('neuralmind/bert-base-portuguese-cased')
input_ids = tokenizer.encode('Tinha uma pedra no meio do caminho.', return_tensors='pt')

with torch.no_grad():
    outs = model(input_ids)
    encoded = outs[0][0, 1:-1]  # Ignore [CLS] and [SEP] special tokens

# encoded.shape: (8, 768)
# tensor([[-0.0398, -0.3057,  0.2431,  ..., -0.5420,  0.1857, -0.5775],
#         [-0.2926, -0.1957,  0.7020,  ..., -0.2843,  0.0530, -0.4304],
#         [ 0.2463, -0.1467,  0.5496,  ...,  0.3781, -0.2325, -0.5469],
#         ...,
#         [ 0.0662,  0.7817,  0.3486,  ..., -0.4131, -0.2852, -0.2819],
#         [ 0.0662,  0.2845,  0.1871,  ..., -0.2542, -0.2933, -0.0661],
#         [ 0.2761, -0.1657,  0.3288,  ..., -0.2102,  0.0029, -0.2009]])
```

## Citation

If you use our work, please cite:

```bibtex
@inproceedings{souza2020bertimbau,
  author    = {F{\'a}bio Souza and Rodrigo Nogueira and Roberto Lotufo},
  title     = {{BERT}imbau: pretrained {BERT} models for {B}razilian {P}ortuguese},
  booktitle = {9th Brazilian Conference on Intelligent Systems, {BRACIS}, Rio Grande do Sul, Brazil, October 20-23 (to appear)},
  year      = {2020}
}
```
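Per-token embeddings like the `(8, 768)` tensor above are commonly pooled into a single sentence vector, with mean pooling as the usual baseline. A minimal sketch of that step in pure Python, using a tiny 3-token, 4-dimensional stand-in for `encoded` so it runs without downloading the model:

```python
# Stand-in for the per-token embeddings; a real run would use the
# (8, 768) `encoded` tensor produced by the snippet above.
encoded = [
    [1.0, 2.0, 3.0, 4.0],
    [2.0, 0.0, 1.0, 3.0],
    [0.0, 4.0, 2.0, 2.0],
]

def mean_pool(token_vecs):
    """Average the token vectors column-wise into one sentence vector."""
    n = len(token_vecs)
    return [sum(col) / n for col in zip(*token_vecs)]

print(mean_pool(encoded))
# [1.0, 2.0, 2.0, 3.0]
```

With torch tensors the same operation is simply `encoded.mean(dim=0)`; the `mean_pool` helper is only an illustrative name for the list-based version.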
3,598
[ [ -0.012939453125, -0.0333251953125, 0.00774383544921875, 0.035400390625, -0.03631591796875, -0.0005998611450195312, -0.0118255615234375, 0.000850677490234375, 0.041168212890625, 0.026519775390625, -0.03521728515625, -0.04815673828125, -0.052032470703125, -0.00380706787109375, -0.025238037109375, 0.0819091796875, 0.01342010498046875, 0.0298614501953125, 0.01995849609375, 0.00396728515625, -0.0242767333984375, -0.04400634765625, -0.06402587890625, -0.0262451171875, 0.038543701171875, 0.01442718505859375, 0.03125, 0.027984619140625, 0.02752685546875, 0.0288543701171875, -0.006458282470703125, -0.0076751708984375, -0.033599853515625, -0.0009708404541015625, 0.007671356201171875, -0.047332763671875, -0.049407958984375, -0.0148773193359375, 0.04669189453125, 0.04473876953125, 0.00351715087890625, 0.01708984375, -0.006805419921875, 0.034423828125, -0.01508331298828125, 0.035369873046875, -0.040252685546875, 0.005481719970703125, -0.0091705322265625, -0.008453369140625, -0.0215606689453125, -0.0304412841796875, 0.015106201171875, -0.042022705078125, 0.0168609619140625, -0.001434326171875, 0.08966064453125, 0.0168304443359375, -0.007656097412109375, -0.004756927490234375, -0.0245819091796875, 0.0616455078125, -0.056732177734375, 0.01102447509765625, 0.03399658203125, 0.015838623046875, -0.020599365234375, -0.05230712890625, -0.0279541015625, 0.006313323974609375, 0.004184722900390625, 0.006683349609375, -0.01221466064453125, -0.0113983154296875, 0.0248565673828125, 0.0133209228515625, -0.035369873046875, -0.00167083740234375, -0.049560546875, -0.0229034423828125, 0.044403076171875, 0.0013523101806640625, 0.0173187255859375, -0.0190887451171875, -0.0234527587890625, -0.0285186767578125, -0.040771484375, 0.00479888916015625, 0.042388916015625, 0.0292816162109375, -0.0240631103515625, 0.046417236328125, -0.0150604248046875, 0.04620361328125, 0.0023345947265625, 0.0082550048828125, 0.050872802734375, 0.0046234130859375, -0.037811279296875, 0.0015573501586914062, 
0.071533203125, 0.012786865234375, 0.031585693359375, -0.0029544830322265625, -0.00670623779296875, -0.0047149658203125, 0.002777099609375, -0.047271728515625, -0.0305023193359375, 0.0177154541015625, -0.038604736328125, -0.0291748046875, 0.030059814453125, -0.052001953125, -0.00616455078125, 0.0005536079406738281, 0.053070068359375, -0.047637939453125, -0.01505279541015625, 0.0168304443359375, -0.029449462890625, 0.038238525390625, 0.00627899169921875, -0.06585693359375, 0.0121612548828125, 0.029510498046875, 0.056396484375, 0.0287628173828125, -0.01279449462890625, -0.00823211669921875, -0.005390167236328125, -0.01435089111328125, 0.04400634765625, -0.00838470458984375, -0.041168212890625, -0.006496429443359375, 0.017852783203125, -0.010589599609375, -0.0151824951171875, 0.048858642578125, -0.026580810546875, 0.0247344970703125, -0.0121307373046875, -0.0196685791015625, -0.035064697265625, 0.0182037353515625, -0.041046142578125, 0.07965087890625, 0.0201416015625, -0.04888916015625, 0.02874755859375, -0.06500244140625, -0.0298309326171875, 0.0068511962890625, -0.004261016845703125, -0.032562255859375, -0.0011091232299804688, 0.0242156982421875, 0.03631591796875, -0.01418304443359375, 0.018829345703125, -0.0285186767578125, -0.0206298828125, 0.020263671875, -0.0262603759765625, 0.0968017578125, 0.0164947509765625, -0.0209808349609375, 0.01611328125, -0.059783935546875, 0.0095977783203125, 0.0199737548828125, -0.01861572265625, 0.001537322998046875, -0.01322174072265625, 0.014129638671875, 0.004749298095703125, 0.03765869140625, -0.0504150390625, 0.0301666259765625, -0.03021240234375, 0.05126953125, 0.059967041015625, -0.006069183349609375, 0.00763702392578125, -0.02752685546875, 0.0295562744140625, 0.00891876220703125, 0.0006022453308105469, 0.00894927978515625, -0.049072265625, -0.060211181640625, -0.042083740234375, 0.042388916015625, 0.044281005859375, -0.04095458984375, 0.06939697265625, -0.02069091796875, -0.05853271484375, -0.03863525390625, 
-0.0177154541015625, 0.01432037353515625, 0.03192138671875, 0.02313232421875, -0.02978515625, -0.06585693359375, -0.05419921875, -0.00231170654296875, -0.01149749755859375, -0.01300048828125, 0.0194854736328125, 0.056182861328125, -0.0234222412109375, 0.057159423828125, -0.0399169921875, -0.0318603515625, -0.00421905517578125, 0.017059326171875, 0.0584716796875, 0.06390380859375, 0.05572509765625, -0.039642333984375, -0.040130615234375, -0.0217132568359375, -0.064453125, 0.003173828125, 0.01322174072265625, -0.0167694091796875, 0.0100250244140625, 0.01255035400390625, -0.046112060546875, 0.044036865234375, 0.01392364501953125, -0.04638671875, 0.0285797119140625, -0.0357666015625, 0.00885772705078125, -0.07666015625, 0.0164031982421875, -0.01416015625, -0.0133209228515625, -0.03363037109375, -0.015655517578125, -0.006565093994140625, -0.00007599592208862305, -0.042236328125, 0.03887939453125, -0.0301666259765625, -0.0014085769653320312, 0.0114898681640625, -0.0266265869140625, -0.0123291015625, 0.04815673828125, 0.00634765625, 0.0296630859375, 0.07208251953125, -0.0301055908203125, 0.04119873046875, 0.04107666015625, -0.0309295654296875, 0.0251617431640625, -0.0704345703125, 0.004024505615234375, -0.004474639892578125, 0.01470947265625, -0.07684326171875, -0.021484375, 0.02838134765625, -0.054351806640625, 0.0122528076171875, -0.0253753662109375, -0.0478515625, -0.03582763671875, -0.0268707275390625, 0.0401611328125, 0.04669189453125, -0.0274810791015625, 0.032867431640625, 0.02227783203125, -0.00494384765625, -0.056915283203125, -0.0611572265625, -0.00933837890625, -0.0202178955078125, -0.041290283203125, 0.0260009765625, -0.0020580291748046875, 0.007648468017578125, -0.00542449951171875, 0.005603790283203125, -0.01751708984375, -0.0034027099609375, 0.024658203125, 0.030548095703125, -0.0203857421875, -0.0102386474609375, -0.005100250244140625, -0.00699615478515625, 0.0157928466796875, -0.0171356201171875, 0.07763671875, -0.016693115234375, -0.0020313262939453125, 
-0.025787353515625, 0.01279449462890625, 0.045013427734375, -0.00429534912109375, 0.05242919921875, 0.059661865234375, -0.0499267578125, 0.00016379356384277344, -0.0200653076171875, -0.007526397705078125, -0.034942626953125, 0.0263824462890625, -0.03985595703125, -0.03369140625, 0.06622314453125, 0.026580810546875, -0.0102691650390625, 0.0599365234375, 0.059112548828125, -0.0180816650390625, 0.06927490234375, 0.0284576416015625, -0.011810302734375, 0.0455322265625, -0.053131103515625, 0.01248931884765625, -0.06024169921875, -0.04412841796875, -0.0296173095703125, -0.030059814453125, -0.0287017822265625, -0.0165252685546875, 0.0157928466796875, 0.005100250244140625, -0.044891357421875, 0.048553466796875, -0.0423583984375, 0.0223388671875, 0.0687255859375, 0.039642333984375, -0.0212249755859375, -0.01079559326171875, -0.018280029296875, 0.002407073974609375, -0.05010986328125, -0.0221405029296875, 0.10821533203125, 0.02740478515625, 0.05328369140625, 0.01190185546875, 0.05511474609375, 0.0223388671875, 0.00574493408203125, -0.046722412109375, 0.037567138671875, -0.0081329345703125, -0.07012939453125, -0.028900146484375, -0.01378631591796875, -0.09210205078125, 0.0192718505859375, -0.0250396728515625, -0.06414794921875, 0.0138092041015625, -0.0101165771484375, -0.04559326171875, 0.0203857421875, -0.04864501953125, 0.07476806640625, -0.0296630859375, -0.0117340087890625, 0.0018301010131835938, -0.055084228515625, 0.0041656494140625, 0.0121612548828125, -0.006954193115234375, -0.0115814208984375, 0.0139312744140625, 0.07818603515625, -0.035675048828125, 0.068359375, -0.0200653076171875, 0.014129638671875, 0.0182342529296875, -0.0092010498046875, 0.01248931884765625, 0.017974853515625, -0.0027637481689453125, 0.019073486328125, 0.00911712646484375, -0.053558349609375, -0.0087890625, 0.04656982421875, -0.07470703125, -0.0190582275390625, -0.057159423828125, -0.0411376953125, 0.00836944580078125, 0.038818359375, 0.0499267578125, 0.0419921875, -0.0102386474609375, 
0.0258331298828125, 0.042572021484375, -0.013580322265625, 0.054168701171875, 0.0194854736328125, -0.006458282470703125, -0.03955078125, 0.05694580078125, 0.0180206298828125, -0.0244140625, 0.0213623046875, 0.005603790283203125, -0.042388916015625, -0.035125732421875, -0.01837158203125, 0.026947021484375, -0.036712646484375, -0.0308380126953125, -0.035064697265625, -0.0260162353515625, -0.05328369140625, -0.0024166107177734375, -0.0192718505859375, -0.032470703125, -0.0472412109375, -0.01120758056640625, 0.03204345703125, 0.035369873046875, -0.0230865478515625, 0.039398193359375, -0.04827880859375, 0.0164947509765625, 0.0118255615234375, 0.03271484375, -0.019683837890625, -0.059295654296875, -0.005100250244140625, -0.002277374267578125, -0.0105743408203125, -0.07476806640625, 0.06512451171875, 0.0073089599609375, 0.0499267578125, 0.035430908203125, -0.00366973876953125, 0.042236328125, -0.025604248046875, 0.04443359375, 0.0105743408203125, -0.06842041015625, 0.04833984375, -0.038055419921875, 0.0009279251098632812, 0.03240966796875, 0.0322265625, -0.01129913330078125, -0.004077911376953125, -0.0859375, -0.07421875, 0.05804443359375, 0.0189208984375, 0.0021839141845703125, 0.003082275390625, 0.0024089813232421875, 0.0073394775390625, 0.030487060546875, -0.07843017578125, -0.037567138671875, -0.0296173095703125, -0.032745361328125, 0.005374908447265625, -0.01000213623046875, -0.016571044921875, -0.044097900390625, 0.06304931640625, 0.0099639892578125, 0.054290771484375, 0.0096588134765625, -0.01383209228515625, 0.01189422607421875, -0.0151824951171875, 0.05145263671875, 0.033050537109375, -0.050689697265625, -0.01190185546875, 0.00409698486328125, -0.0306396484375, -0.001834869384765625, 0.0150299072265625, -0.00371551513671875, 0.01439666748046875, 0.0234832763671875, 0.05511474609375, 0.0168304443359375, -0.028472900390625, 0.03228759765625, 0.00933074951171875, -0.0254364013671875, -0.05694580078125, 0.003757476806640625, -0.009979248046875, 0.0160064697265625, 
0.03411865234375, 0.022003173828125, -0.0009341239929199219, -0.033203125, 0.017578125, 0.0299835205078125, -0.036712646484375, -0.020721435546875, 0.049957275390625, 0.010498046875, -0.036712646484375, 0.050628662109375, -0.0034160614013671875, -0.047119140625, 0.06939697265625, 0.03369140625, 0.0667724609375, 0.0029449462890625, 0.0106201171875, 0.042236328125, 0.019866943359375, -0.00757598876953125, 0.05279541015625, 0.01314544677734375, -0.06707763671875, -0.0078277587890625, -0.037811279296875, -0.005535125732421875, 0.01180267333984375, -0.05609130859375, 0.033416748046875, -0.049835205078125, -0.0260162353515625, 0.0004105567932128906, 0.004055023193359375, -0.05804443359375, 0.030364990234375, 0.0167694091796875, 0.068359375, -0.0728759765625, 0.0845947265625, 0.053070068359375, -0.049072265625, -0.047637939453125, -0.038970947265625, -0.0244598388671875, -0.09051513671875, 0.05035400390625, 0.01183319091796875, 0.019561767578125, 0.014129638671875, -0.05279541015625, -0.0709228515625, 0.0772705078125, 0.0233154296875, -0.01369476318359375, -0.00623321533203125, -0.01146697998046875, 0.038543701171875, -0.016754150390625, 0.04315185546875, 0.03778076171875, 0.037506103515625, -0.00252532958984375, -0.04571533203125, -0.0088043212890625, -0.0265350341796875, -0.0120086669921875, 0.01042938232421875, -0.057281494140625, 0.0802001953125, -0.006870269775390625, 0.0071868896484375, 0.0102996826171875, 0.054168701171875, 0.01116180419921875, -0.01187896728515625, 0.027099609375, 0.05560302734375, 0.0472412109375, -0.0404052734375, 0.04248046875, -0.0182647705078125, 0.05377197265625, 0.0499267578125, 0.01239013671875, 0.057159423828125, 0.044036865234375, -0.0254974365234375, 0.0628662109375, 0.06048583984375, -0.032135009765625, 0.053619384765625, 0.017974853515625, -0.0009202957153320312, -0.004913330078125, 0.0189208984375, -0.039703369140625, 0.0418701171875, 0.03533935546875, -0.034515380859375, -0.0124969482421875, -0.0120086669921875, 0.01306915283203125, 
-0.0099639892578125, -0.03070068359375, 0.033477783203125, -0.000728607177734375, -0.0428466796875, 0.04107666015625, 0.00867462158203125, 0.0655517578125, -0.050933837890625, 0.006656646728515625, -0.004863739013671875, 0.0207366943359375, -0.006870269775390625, -0.061279296875, 0.00904083251953125, -0.005764007568359375, -0.0126800537109375, -0.0165557861328125, 0.041839599609375, -0.0260772705078125, -0.05072021484375, 0.01444244384765625, 0.0195159912109375, 0.0250396728515625, -0.003551483154296875, -0.06646728515625, -0.01251983642578125, 0.007144927978515625, -0.02288818359375, 0.007015228271484375, 0.0279541015625, 0.017364501953125, 0.04486083984375, 0.055206298828125, 0.00835418701171875, 0.0253143310546875, -0.0161895751953125, 0.05126953125, -0.06610107421875, -0.047576904296875, -0.069580078125, 0.037567138671875, -0.0067291259765625, -0.0604248046875, 0.036407470703125, 0.053131103515625, 0.04962158203125, -0.0322265625, 0.049102783203125, -0.03497314453125, 0.02557373046875, -0.023223876953125, 0.06817626953125, -0.020294189453125, -0.00841522216796875, -0.0079345703125, -0.06231689453125, -0.029937744140625, 0.07403564453125, -0.0167999267578125, 0.002872467041015625, 0.038665771484375, 0.043853759765625, 0.00984954833984375, -0.0186004638671875, 0.0148468017578125, 0.0225982666015625, 0.01543426513671875, 0.06201171875, 0.0294189453125, -0.056121826171875, 0.028472900390625, -0.024383544921875, -0.0143280029296875, -0.0281219482421875, -0.06951904296875, -0.071533203125, -0.0309295654296875, -0.0190582275390625, -0.043121337890625, -0.0126190185546875, 0.07550048828125, 0.05126953125, -0.082763671875, -0.029937744140625, -0.00814056396484375, 0.006809234619140625, -0.01165771484375, -0.0168304443359375, 0.042327880859375, -0.0130157470703125, -0.07421875, 0.0099639892578125, -0.00710296630859375, 0.02117919921875, -0.00469207763671875, 0.0027637481689453125, -0.038116455078125, -0.00408172607421875, 0.0215911865234375, 0.03759765625, 
-0.04827880859375, -0.0264892578125, -0.0018901824951171875, -0.0157928466796875, 0.015380859375, 0.01763916015625, -0.05682373046875, 0.02130126953125, 0.05126953125, 0.027099609375, 0.05047607421875, -0.0134735107421875, 0.04632568359375, -0.0595703125, 0.042572021484375, 0.0272674560546875, 0.0546875, 0.0168914794921875, -0.01152801513671875, 0.051361083984375, 0.034515380859375, -0.02923583984375, -0.0670166015625, -0.01708984375, -0.10064697265625, -0.0030002593994140625, 0.06378173828125, -0.01551055908203125, -0.03857421875, 0.0087738037109375, -0.03082275390625, 0.038116455078125, -0.03411865234375, 0.052032470703125, 0.054779052734375, -0.0034942626953125, 0.00818634033203125, -0.01413726806640625, 0.0245819091796875, 0.047515869140625, -0.036865234375, -0.040435791015625, -0.00144195556640625, 0.0283203125, 0.01245880126953125, 0.039581298828125, -0.018218994140625, 0.01708984375, 0.020599365234375, 0.023193359375, -0.003505706787109375, -0.0156707763671875, -0.0238189697265625, 0.0007767677307128906, -0.01097869873046875, -0.0614013671875 ] ]
cointegrated/LaBSE-en-ru
2023-11-04T11:49:30.000Z
[ "transformers", "pytorch", "tf", "safetensors", "bert", "pretraining", "feature-extraction", "embeddings", "sentence-similarity", "ru", "en", "arxiv:2007.01852", "endpoints_compatible", "has_space", "region:us" ]
feature-extraction
cointegrated
null
null
cointegrated/LaBSE-en-ru
25
210,077
transformers
2022-03-02T23:29:05
---
language: ["ru", "en"]
tags:
- feature-extraction
- embeddings
- sentence-similarity
---

# LaBSE for English and Russian

This is a truncated version of [sentence-transformers/LaBSE](https://huggingface.co/sentence-transformers/LaBSE), which is, in turn, a port of [LaBSE](https://tfhub.dev/google/LaBSE/1) by Google.

The current model has only English and Russian tokens left in the vocabulary. Thus, the vocabulary is 10% of the original, and the number of parameters in the whole model is 27% of the original, without any loss in the quality of English and Russian embeddings.

To get the sentence embeddings, you can use the following code:

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("cointegrated/LaBSE-en-ru")
model = AutoModel.from_pretrained("cointegrated/LaBSE-en-ru")

sentences = ["Hello World", "Привет Мир"]
encoded_input = tokenizer(sentences, padding=True, truncation=True, max_length=64, return_tensors='pt')
with torch.no_grad():
    model_output = model(**encoded_input)
embeddings = model_output.pooler_output
embeddings = torch.nn.functional.normalize(embeddings)
print(embeddings)
```

The model has been truncated in [this notebook](https://colab.research.google.com/drive/1dnPRn0-ugj3vZgSpyCC9sgslM2SuSfHy?usp=sharing). You can adapt it for other languages (like [EIStakovskii/LaBSE-fr-de](https://huggingface.co/EIStakovskii/LaBSE-fr-de)), models, or datasets.

## Reference:

Fangxiaoyu Feng, Yinfei Yang, Daniel Cer, Naveen Arivazhagan, Wei Wang. [Language-agnostic BERT Sentence Embedding](https://arxiv.org/abs/2007.01852). July 2020

License: [https://tfhub.dev/google/LaBSE/1](https://tfhub.dev/google/LaBSE/1)
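Because `torch.nn.functional.normalize` in the snippet above makes each embedding unit-length, comparing two sentences reduces to a dot product (i.e., cosine similarity). A minimal, dependency-free sketch of that comparison — the toy vectors here are illustrative stand-ins, not real LaBSE output:

```python
import math

def normalize(v):
    """Scale a vector to unit L2 norm, mirroring torch.nn.functional.normalize."""
    norm = math.sqrt(sum(x * x for x in v))
    return [x / norm for x in v]

def cosine_similarity(a, b):
    """For unit vectors, the dot product equals the cosine of the angle between them."""
    return sum(x * y for x, y in zip(a, b))

# Toy stand-ins for the embeddings of a translation pair such as
# "Hello World" / "Привет Мир" (hypothetical values, for illustration only).
emb_en = normalize([0.1, 0.3, -0.2, 0.4])
emb_ru = normalize([0.12, 0.28, -0.18, 0.41])

print(round(cosine_similarity(emb_en, emb_ru), 4))
```

With real LaBSE embeddings, parallel sentences in English and Russian should likewise score close to 1.0, which is what makes the model useful for cross-lingual retrieval.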
1,705
[ [ -0.006622314453125, -0.05419921875, 0.024627685546875, 0.016876220703125, -0.0167236328125, 0.002696990966796875, -0.0086517333984375, -0.00511932373046875, 0.0273895263671875, 0.013763427734375, -0.036163330078125, -0.0352783203125, -0.040008544921875, 0.006824493408203125, -0.033843994140625, 0.07977294921875, -0.01079559326171875, 0.01220703125, -0.00782012939453125, 0.01473236083984375, -0.021331787109375, -0.01505279541015625, -0.0305938720703125, -0.0255279541015625, 0.01456451416015625, 0.0268707275390625, 0.0404052734375, 0.030487060546875, 0.01412200927734375, 0.023040771484375, 0.0081024169921875, -0.0177764892578125, -0.0276641845703125, -0.0198822021484375, 0.01219940185546875, -0.01485443115234375, -0.0190277099609375, 0.0011854171752929688, 0.0306243896484375, 0.035736083984375, -0.003963470458984375, 0.00154876708984375, -0.0097503662109375, 0.0140228271484375, -0.0229034423828125, 0.0162811279296875, -0.0238494873046875, 0.007781982421875, 0.01751708984375, 0.0158538818359375, -0.04510498046875, -0.00428009033203125, 0.024871826171875, -0.042877197265625, 0.026641845703125, 0.00795745849609375, 0.10162353515625, -0.0026111602783203125, -0.022796630859375, -0.0257415771484375, -0.029296875, 0.0587158203125, -0.05047607421875, 0.03131103515625, 0.01216888427734375, 0.00440216064453125, -0.01323699951171875, -0.09063720703125, -0.031097412109375, -0.0298919677734375, -0.0164031982421875, 0.01062774658203125, -0.0171661376953125, -0.0048065185546875, 0.005664825439453125, 0.02325439453125, -0.0594482421875, -0.007511138916015625, -0.048858642578125, -0.037322998046875, 0.037567138671875, -0.005199432373046875, 0.037811279296875, -0.0355224609375, -0.0254058837890625, -0.0288238525390625, -0.035125732421875, -0.013641357421875, 0.0277252197265625, -0.0035724639892578125, -0.024993896484375, 0.05084228515625, -0.0142822265625, 0.0350341796875, -0.0006155967712402344, 0.0249481201171875, 0.036895751953125, -0.0204315185546875, -0.0221099853515625, 
-0.01641845703125, 0.08282470703125, 0.0074920654296875, 0.02764892578125, -0.0355224609375, -0.0015773773193359375, -0.00498199462890625, 0.01629638671875, -0.07147216796875, -0.03546142578125, 0.01611328125, -0.013702392578125, -0.0148162841796875, 0.010223388671875, -0.043701171875, 0.004741668701171875, 0.016326904296875, 0.056427001953125, -0.054534912109375, 0.003177642822265625, 0.030120849609375, -0.021636962890625, 0.01555633544921875, -0.0228424072265625, -0.0684814453125, 0.03216552734375, 0.0255279541015625, 0.06927490234375, 0.0166473388671875, -0.032257080078125, -0.037811279296875, -0.01580810546875, -0.0001575946807861328, 0.0303955078125, -0.01151275634765625, -0.04248046875, 0.01215362548828125, 0.025848388671875, -0.00684356689453125, -0.01247406005859375, 0.038726806640625, -0.03466796875, 0.0443115234375, 0.007080078125, -0.058624267578125, 0.0036640167236328125, 0.0134429931640625, -0.035675048828125, 0.07928466796875, 0.0102386474609375, -0.0498046875, 0.0111541748046875, -0.06024169921875, -0.0262603759765625, 0.002140045166015625, -0.01007843017578125, -0.038421630859375, -0.0017824172973632812, 0.0276336669921875, 0.03668212890625, 0.003173828125, 0.0273284912109375, -0.01277923583984375, -0.01493072509765625, 0.027313232421875, -0.00542449951171875, 0.0830078125, 0.014739990234375, -0.0146942138671875, 0.02587890625, -0.050140380859375, -0.004619598388671875, 0.01313018798828125, -0.0005307197570800781, 0.00498199462890625, -0.01270294189453125, 0.04656982421875, 0.01486968994140625, 0.0189666748046875, -0.07611083984375, 0.0111541748046875, -0.054962158203125, 0.07470703125, 0.036956787109375, -0.003387451171875, 0.04217529296875, -0.0196990966796875, 0.0343017578125, -0.0049285888671875, 0.0234527587890625, 0.0082550048828125, -0.03466796875, -0.061126708984375, -0.0241851806640625, 0.03912353515625, 0.046600341796875, -0.051116943359375, 0.052001953125, -0.02923583984375, -0.035736083984375, -0.053558349609375, -0.0011892318725585938, 
0.00806427001953125, 0.024658203125, 0.0273284912109375, -0.0004978179931640625, -0.045806884765625, -0.08587646484375, -0.0029315948486328125, 0.00199127197265625, -0.01404571533203125, -0.00864410400390625, 0.07672119140625, -0.0106201171875, 0.0699462890625, -0.04833984375, -0.007843017578125, -0.00975799560546875, 0.0164794921875, 0.0202789306640625, 0.04791259765625, 0.02996826171875, -0.05303955078125, -0.051025390625, -0.0238037109375, -0.035614013671875, 0.01517486572265625, -0.007587432861328125, -0.01458740234375, 0.00936126708984375, 0.0426025390625, -0.060089111328125, 0.0222320556640625, 0.030120849609375, -0.049774169921875, 0.041229248046875, -0.02490234375, -0.0021266937255859375, -0.11004638671875, 0.01055908203125, -0.005401611328125, -0.02606201171875, -0.03857421875, 0.00616455078125, 0.0202178955078125, -0.004520416259765625, -0.0440673828125, 0.060943603515625, -0.036529541015625, 0.01561737060546875, -0.0076141357421875, 0.033050537109375, 0.008209228515625, 0.013916015625, -0.006206512451171875, 0.059051513671875, 0.054107666015625, -0.020172119140625, 0.0367431640625, 0.04119873046875, -0.045928955078125, -0.004955291748046875, -0.06878662109375, 0.005340576171875, 0.01473236083984375, 0.00547027587890625, -0.06658935546875, -0.0217437744140625, 0.01143646240234375, -0.068359375, 0.02178955078125, 0.01239776611328125, -0.07086181640625, -0.02947998046875, -0.030120849609375, 0.01470184326171875, 0.0714111328125, -0.036224365234375, 0.031524658203125, 0.006977081298828125, 0.00624847412109375, -0.060943603515625, -0.0670166015625, 0.0113525390625, -0.0139312744140625, -0.06353759765625, 0.042999267578125, -0.018524169921875, 0.013153076171875, 0.0191802978515625, 0.02239990234375, -0.007312774658203125, 0.00016438961029052734, -0.0007905960083007812, 0.04400634765625, -0.00333404541015625, 0.021575927734375, -0.0048675537109375, 0.00930023193359375, 0.01090240478515625, -0.0142974853515625, 0.0626220703125, -0.023895263671875, 
0.00838470458984375, -0.021392822265625, 0.0019073486328125, 0.0296173095703125, 0.00710296630859375, 0.0736083984375, 0.08270263671875, -0.037811279296875, -0.00444793701171875, -0.03424072265625, -0.035125732421875, -0.0391845703125, 0.0472412109375, -0.0318603515625, -0.06768798828125, 0.054931640625, 0.0225830078125, -0.00644683837890625, 0.05206298828125, 0.035614013671875, 0.000044345855712890625, 0.0631103515625, 0.053558349609375, -0.005397796630859375, 0.023651123046875, -0.054534912109375, 0.0279693603515625, -0.07952880859375, -0.0190582275390625, -0.0196685791015625, 0.001354217529296875, -0.0555419921875, -0.017547607421875, 0.0178375244140625, -0.0045013427734375, -0.0201568603515625, 0.031494140625, -0.03448486328125, 0.026641845703125, 0.03375244140625, 0.004810333251953125, -0.007068634033203125, 0.00891876220703125, -0.0181732177734375, -0.005939483642578125, -0.056793212890625, -0.044281005859375, 0.0714111328125, 0.0279693603515625, 0.033294677734375, -0.0014514923095703125, 0.053192138671875, 0.01052093505859375, 0.0111541748046875, -0.0687255859375, 0.036163330078125, -0.017303466796875, -0.06451416015625, -0.0120391845703125, -0.025787353515625, -0.08154296875, 0.0022029876708984375, -0.01503753662109375, -0.07733154296875, -0.003421783447265625, -0.02484130859375, -0.018829345703125, 0.0169525146484375, -0.05035400390625, 0.08795166015625, 0.00936126708984375, -0.0240325927734375, -0.01468658447265625, -0.041290283203125, 0.012969970703125, 0.01605224609375, 0.0182647705078125, 0.0015802383422851562, 0.003551483154296875, 0.07568359375, -0.0335693359375, 0.060089111328125, 0.0100250244140625, 0.0168609619140625, 0.0136566162109375, -0.0131683349609375, 0.019500732421875, 0.0234375, -0.0005879402160644531, 0.00003427267074584961, 0.0109405517578125, -0.048065185546875, -0.0226593017578125, 0.06793212890625, -0.07135009765625, -0.034027099609375, -0.051055908203125, -0.055816650390625, -0.002101898193359375, 0.0302276611328125, 
0.044036865234375, 0.033721923828125, -0.03857421875, 0.0189361572265625, 0.0274810791015625, -0.03326416015625, 0.038665771484375, 0.0281982421875, -0.020111083984375, -0.0301666259765625, 0.047698974609375, -0.00598907470703125, 0.01226043701171875, 0.0184173583984375, 0.01538848876953125, -0.0328369140625, -0.014434814453125, -0.036346435546875, 0.047393798828125, -0.059112548828125, -0.0006432533264160156, -0.0596923828125, -0.02874755859375, -0.059600830078125, -0.0292510986328125, -0.016082763671875, -0.03955078125, -0.0430908203125, -0.0218505859375, 0.036590576171875, 0.036895751953125, -0.01126861572265625, 0.03509521484375, -0.049468994140625, 0.01409912109375, 0.00231170654296875, 0.01352691650390625, -0.02276611328125, -0.056610107421875, -0.0321044921875, -0.0014743804931640625, -0.02081298828125, -0.07232666015625, 0.051116943359375, 0.0087432861328125, 0.0257415771484375, -0.0011806488037109375, -0.0036163330078125, 0.028045654296875, -0.039398193359375, 0.0736083984375, 0.0184478759765625, -0.07879638671875, 0.0252685546875, -0.006755828857421875, 0.0292816162109375, 0.0207977294921875, 0.0186767578125, -0.04437255859375, -0.032684326171875, -0.06494140625, -0.07452392578125, 0.0465087890625, 0.033203125, 0.040771484375, -0.021087646484375, 0.0272064208984375, -0.0005536079406738281, -0.0015554428100585938, -0.07171630859375, -0.0290374755859375, -0.0239105224609375, -0.039886474609375, -0.005767822265625, -0.011505126953125, -0.0023632049560546875, -0.0238037109375, 0.07244873046875, 0.0098419189453125, 0.0261993408203125, 0.0274810791015625, -0.021575927734375, 0.0169677734375, 0.026519775390625, 0.030609130859375, 0.02532958984375, -0.03582763671875, -0.00710296630859375, 0.01377105712890625, -0.04266357421875, 0.00232696533203125, 0.026336669921875, -0.021575927734375, 0.016082763671875, 0.01641845703125, 0.060791015625, -0.0018472671508789062, -0.052825927734375, 0.042205810546875, -0.006649017333984375, -0.028045654296875, -0.05120849609375, 
-0.01398468017578125, 0.01322174072265625, 0.00865936279296875, 0.0168609619140625, -0.017242431640625, 0.0267791748046875, -0.04620361328125, 0.0196685791015625, 0.0198211669921875, -0.0292510986328125, -0.029144287109375, 0.05035400390625, 0.0037250518798828125, -0.0194549560546875, 0.06982421875, -0.037109375, -0.03759765625, 0.06317138671875, 0.04669189453125, 0.07977294921875, 0.005222320556640625, 0.036895751953125, 0.0511474609375, 0.037994384765625, -0.0021266937255859375, 0.024871826171875, 0.03265380859375, -0.043701171875, -0.041961669921875, -0.044158935546875, 0.002681732177734375, 0.02069091796875, -0.04815673828125, 0.037994384765625, -0.01183319091796875, -0.01324462890625, -0.0037403106689453125, 0.00165557861328125, -0.059906005859375, 0.0157470703125, -0.006832122802734375, 0.06634521484375, -0.076416015625, 0.07574462890625, 0.06451416015625, -0.036712646484375, -0.042999267578125, -0.0189056396484375, -0.0308380126953125, -0.060089111328125, 0.05694580078125, 0.01103973388671875, 0.007472991943359375, 0.0170135498046875, -0.033050537109375, -0.0716552734375, 0.075439453125, 0.01404571533203125, -0.017181396484375, -0.0009627342224121094, 0.0010967254638671875, 0.04278564453125, -0.0293426513671875, 0.0218658447265625, 0.0287322998046875, 0.0245361328125, -0.004848480224609375, -0.07012939453125, 0.0117340087890625, -0.036285400390625, -0.005519866943359375, 0.00699615478515625, -0.04150390625, 0.0628662109375, -0.01316070556640625, -0.019927978515625, 0.0170135498046875, 0.055877685546875, -0.0018939971923828125, -0.001171112060546875, 0.0241851806640625, 0.034820556640625, 0.03143310546875, -0.0067291259765625, 0.07464599609375, -0.02423095703125, 0.055755615234375, 0.058563232421875, 0.01378631591796875, 0.0594482421875, 0.043975830078125, -0.0097198486328125, 0.057220458984375, 0.047515869140625, -0.028564453125, 0.047027587890625, 0.02203369140625, 0.0025634765625, 0.0015764236450195312, 0.004123687744140625, -0.01216888427734375, 
0.021026611328125, 0.0257568359375, -0.02978515625, 0.0018634796142578125, 0.017913818359375, 0.0157470703125, -0.0100555419921875, 0.0235137939453125, 0.022735595703125, 0.00643157958984375, -0.03204345703125, 0.042633056640625, 0.00901031494140625, 0.05364990234375, -0.037322998046875, 0.01629638671875, -0.0015020370483398438, 0.00632476806640625, -0.003063201904296875, -0.052825927734375, 0.024566650390625, 0.004123687744140625, -0.004573822021484375, -0.00563812255859375, 0.02984619140625, -0.04034423828125, -0.05364990234375, 0.0218505859375, 0.0355224609375, 0.0193328857421875, 0.027435302734375, -0.057708740234375, -0.01033782958984375, 0.0008802413940429688, -0.0272064208984375, 0.009307861328125, 0.00197601318359375, 0.0262603759765625, 0.039703369140625, 0.01462554931640625, -0.00431060791015625, 0.01548004150390625, 0.0019969940185546875, 0.04119873046875, -0.03985595703125, -0.0360107421875, -0.0653076171875, 0.047332763671875, -0.0202484130859375, -0.025238037109375, 0.0543212890625, 0.0611572265625, 0.07012939453125, -0.035247802734375, 0.05487060546875, -0.0278167724609375, -0.0017671585083007812, -0.043701171875, 0.05487060546875, -0.054290771484375, -0.006847381591796875, 0.01145172119140625, -0.0489501953125, -0.0156707763671875, 0.060943603515625, -0.020904541015625, 0.0031185150146484375, 0.07904052734375, 0.055755615234375, -0.02001953125, -0.017852783203125, 0.01800537109375, 0.03472900390625, 0.03497314453125, 0.038421630859375, 0.034393310546875, -0.0684814453125, 0.05792236328125, -0.04180908203125, 0.005992889404296875, -0.0134124755859375, -0.05792236328125, -0.07440185546875, -0.055145263671875, -0.049468994140625, -0.041656494140625, -0.01148223876953125, 0.0635986328125, 0.0526123046875, -0.056640625, -0.0243072509765625, -0.01641845703125, -0.0176849365234375, 0.0028095245361328125, -0.0229949951171875, 0.05694580078125, -0.035369873046875, -0.0687255859375, 0.0244598388671875, -0.0076141357421875, 0.0106048583984375, 
-0.01702880859375, 0.0004761219024658203, -0.036346435546875, 0.00984954833984375, 0.0307769775390625, 0.004108428955078125, -0.05731201171875, -0.0292510986328125, 0.0386962890625, -0.0163116455078125, -0.002872467041015625, 0.02496337890625, -0.03515625, 0.041961669921875, 0.0377197265625, 0.025848388671875, 0.06982421875, -0.0394287109375, 0.034912109375, -0.0667724609375, 0.019989013671875, 0.01517486572265625, 0.0521240234375, 0.03985595703125, -0.019439697265625, 0.034423828125, 0.007694244384765625, -0.04833984375, -0.055023193359375, 0.01425933837890625, -0.08355712890625, -0.0330810546875, 0.08599853515625, -0.01849365234375, -0.031829833984375, 0.0029754638671875, -0.01357269287109375, 0.052825927734375, -0.0266265869140625, 0.08203125, 0.08831787109375, 0.01058197021484375, -0.0037975311279296875, -0.054931640625, 0.02276611328125, 0.05523681640625, -0.04315185546875, -0.020782470703125, 0.0255584716796875, 0.037872314453125, 0.022369384765625, 0.03155517578125, -0.00360107421875, 0.0191802978515625, 0.00595855712890625, 0.024566650390625, 0.001277923583984375, 0.00783538818359375, -0.01174163818359375, -0.00872802734375, -0.0028743743896484375, -0.0312042236328125 ] ]
google/flan-t5-small
2023-10-10T18:01:54.000Z
[ "transformers", "pytorch", "tf", "jax", "safetensors", "t5", "text2text-generation", "en", "fr", "ro", "de", "multilingual", "dataset:svakulenk0/qrecc", "dataset:taskmaster2", "dataset:djaym7/wiki_dialog", "dataset:deepmind/code_contests", "dataset:lambada", "dataset:gsm8k", "dataset:aqua_rat", "dataset:esnli", "dataset:quasc", "dataset:qed", "arxiv:2210.11416", "arxiv:1910.09700", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
text2text-generation
google
null
null
google/flan-t5-small
136
208,387
transformers
2022-10-21T09:59:24
--- language: - en - fr - ro - de - multilingual tags: - text2text-generation widget: - text: "Translate to German: My name is Arthur" example_title: "Translation" - text: "Please answer to the following question. Who is going to be the next Ballon d'or?" example_title: "Question Answering" - text: "Q: Can Geoffrey Hinton have a conversation with George Washington? Give the rationale before answering." example_title: "Logical reasoning" - text: "Please answer the following question. What is the boiling point of Nitrogen?" example_title: "Scientific knowledge" - text: "Answer the following yes/no question. Can you write a whole Haiku in a single tweet?" example_title: "Yes/no question" - text: "Answer the following yes/no question by reasoning step-by-step. Can you write a whole Haiku in a single tweet?" example_title: "Reasoning task" - text: "Q: ( False or not False or False ) is? A: Let's think step by step" example_title: "Boolean Expressions" - text: "The square root of x is the cube root of y. What is y to the power of 2, if x = 4?" example_title: "Math reasoning" - text: "Premise: At my age you will probably have learnt one lesson. Hypothesis: It's not certain how many lessons you'll learn by your thirties. Does the premise entail the hypothesis?" example_title: "Premise and hypothesis" datasets: - svakulenk0/qrecc - taskmaster2 - djaym7/wiki_dialog - deepmind/code_contests - lambada - gsm8k - aqua_rat - esnli - quasc - qed license: apache-2.0 --- # Model Card for FLAN-T5 small <img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/flan2_architecture.jpg" alt="drawing" width="600"/> # Table of Contents 0. [TL;DR](#TL;DR) 1. [Model Details](#model-details) 2. [Usage](#usage) 3. [Uses](#uses) 4. [Bias, Risks, and Limitations](#bias-risks-and-limitations) 5. [Training Details](#training-details) 6. [Evaluation](#evaluation) 7. [Environmental Impact](#environmental-impact) 8. [Citation](#citation) 9. 
[Model Card Authors](#model-card-authors) # TL;DR If you already know T5, FLAN-T5 is just better at everything. For the same number of parameters, these models have been fine-tuned on more than 1000 additional tasks covering also more languages. As mentioned in the first few lines of the abstract : > Flan-PaLM 540B achieves state-of-the-art performance on several benchmarks, such as 75.2% on five-shot MMLU. We also publicly release Flan-T5 checkpoints,1 which achieve strong few-shot performance even compared to much larger models, such as PaLM 62B. Overall, instruction finetuning is a general method for improving the performance and usability of pretrained language models. **Disclaimer**: Content from **this** model card has been written by the Hugging Face team, and parts of it were copy pasted from the [T5 model card](https://huggingface.co/t5-large). # Model Details ## Model Description - **Model type:** Language model - **Language(s) (NLP):** English, Spanish, Japanese, Persian, Hindi, French, Chinese, Bengali, Gujarati, German, Telugu, Italian, Arabic, Polish, Tamil, Marathi, Malayalam, Oriya, Panjabi, Portuguese, Urdu, Galician, Hebrew, Korean, Catalan, Thai, Dutch, Indonesian, Vietnamese, Bulgarian, Filipino, Central Khmer, Lao, Turkish, Russian, Croatian, Swedish, Yoruba, Kurdish, Burmese, Malay, Czech, Finnish, Somali, Tagalog, Swahili, Sinhala, Kannada, Zhuang, Igbo, Xhosa, Romanian, Haitian, Estonian, Slovak, Lithuanian, Greek, Nepali, Assamese, Norwegian - **License:** Apache 2.0 - **Related Models:** [All FLAN-T5 Checkpoints](https://huggingface.co/models?search=flan-t5) - **Original Checkpoints:** [All Original FLAN-T5 Checkpoints](https://github.com/google-research/t5x/blob/main/docs/models.md#flan-t5-checkpoints) - **Resources for more information:** - [Research paper](https://arxiv.org/pdf/2210.11416.pdf) - [GitHub Repo](https://github.com/google-research/t5x) - [Hugging Face FLAN-T5 Docs (Similar to T5) 
](https://huggingface.co/docs/transformers/model_doc/t5) # Usage Find below some example scripts on how to use the model in `transformers`: ## Using the Pytorch model ### Running the model on a CPU <details> <summary> Click to expand </summary> ```python from transformers import T5Tokenizer, T5ForConditionalGeneration tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-small") model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-small") input_text = "translate English to German: How old are you?" input_ids = tokenizer(input_text, return_tensors="pt").input_ids outputs = model.generate(input_ids) print(tokenizer.decode(outputs[0])) ``` </details> ### Running the model on a GPU <details> <summary> Click to expand </summary> ```python # pip install accelerate from transformers import T5Tokenizer, T5ForConditionalGeneration tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-small") model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-small", device_map="auto") input_text = "translate English to German: How old are you?" input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda") outputs = model.generate(input_ids) print(tokenizer.decode(outputs[0])) ``` </details> ### Running the model on a GPU using different precisions #### FP16 <details> <summary> Click to expand </summary> ```python # pip install accelerate import torch from transformers import T5Tokenizer, T5ForConditionalGeneration tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-small") model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-small", device_map="auto", torch_dtype=torch.float16) input_text = "translate English to German: How old are you?" 
input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda") outputs = model.generate(input_ids) print(tokenizer.decode(outputs[0])) ``` </details> #### INT8 <details> <summary> Click to expand </summary> ```python # pip install bitsandbytes accelerate from transformers import T5Tokenizer, T5ForConditionalGeneration tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-small") model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-small", device_map="auto", load_in_8bit=True) input_text = "translate English to German: How old are you?" input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda") outputs = model.generate(input_ids) print(tokenizer.decode(outputs[0])) ``` </details> # Uses ## Direct Use and Downstream Use The authors write in [the original paper's model card](https://arxiv.org/pdf/2210.11416.pdf) that: > The primary use is research on language models, including: research on zero-shot NLP tasks and in-context few-shot learning NLP tasks, such as reasoning, and question answering; advancing fairness and safety research, and understanding limitations of current large language models See the [research paper](https://arxiv.org/pdf/2210.11416.pdf) for further details. ## Out-of-Scope Use More information needed. # Bias, Risks, and Limitations The information in this section is copied from the model's [official model card](https://arxiv.org/pdf/2210.11416.pdf): > Language models, including Flan-T5, can potentially be used for language generation in a harmful way, according to Rae et al. (2021). Flan-T5 should not be used directly in any application, without a prior assessment of safety and fairness concerns specific to the application. ## Ethical considerations and risks > Flan-T5 is fine-tuned on a large corpus of text data that was not filtered for explicit content or assessed for existing biases. 
As a result, the model itself is potentially vulnerable to generating equivalently inappropriate content or replicating inherent biases in the underlying data. ## Known Limitations > Flan-T5 has not been tested in real world applications. ## Sensitive Use: > Flan-T5 should not be applied for any unacceptable use cases, e.g., generation of abusive speech. # Training Details ## Training Data The model was trained on a mixture of tasks that includes the tasks described in the table below (from the original paper, figure 2): ![table.png](https://s3.amazonaws.com/moonup/production/uploads/1666363265279-62441d1d9fdefb55a0b7d12c.png) ## Training Procedure According to the model card from the [original paper](https://arxiv.org/pdf/2210.11416.pdf): > These models are based on pretrained T5 (Raffel et al., 2020) and fine-tuned with instructions for better zero-shot and few-shot performance. There is one fine-tuned Flan model per T5 model size. The model has been trained on TPU v3 or TPU v4 pods, using the [`t5x`](https://github.com/google-research/t5x) codebase together with [`jax`](https://github.com/google/jax). # Evaluation ## Testing Data, Factors & Metrics The authors evaluated the model on various tasks covering several languages (1,836 tasks in total). See the table below for some quantitative evaluation: ![image.png](https://s3.amazonaws.com/moonup/production/uploads/1668072995230-62441d1d9fdefb55a0b7d12c.png) For full details, please check the [research paper](https://arxiv.org/pdf/2210.11416.pdf). ## Results For full results for FLAN-T5-Small, see the [research paper](https://arxiv.org/pdf/2210.11416.pdf), Table 3. # Environmental Impact Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** Google Cloud TPU Pods - TPU v3 or TPU v4 | Number of chips ≥ 4. 
- **Hours used:** More information needed - **Cloud Provider:** GCP - **Compute Region:** More information needed - **Carbon Emitted:** More information needed # Citation **BibTeX:** ```bibtex @misc{https://doi.org/10.48550/arxiv.2210.11416, doi = {10.48550/ARXIV.2210.11416}, url = {https://arxiv.org/abs/2210.11416}, author = {Chung, Hyung Won and Hou, Le and Longpre, Shayne and Zoph, Barret and Tay, Yi and Fedus, William and Li, Eric and Wang, Xuezhi and Dehghani, Mostafa and Brahma, Siddhartha and Webson, Albert and Gu, Shixiang Shane and Dai, Zhuyun and Suzgun, Mirac and Chen, Xinyun and Chowdhery, Aakanksha and Narang, Sharan and Mishra, Gaurav and Yu, Adams and Zhao, Vincent and Huang, Yanping and Dai, Andrew and Yu, Hongkun and Petrov, Slav and Chi, Ed H. and Dean, Jeff and Devlin, Jacob and Roberts, Adam and Zhou, Denny and Le, Quoc V. and Wei, Jason}, keywords = {Machine Learning (cs.LG), Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences}, title = {Scaling Instruction-Finetuned Language Models}, publisher = {arXiv}, year = {2022}, copyright = {Creative Commons Attribution 4.0 International} } ```
10,818
[ [ -0.034210205078125, -0.0435791015625, 0.022796630859375, -0.001667022705078125, -0.0072174072265625, -0.01319122314453125, -0.0321044921875, -0.047576904296875, -0.01006317138671875, 0.007717132568359375, -0.038604736328125, -0.037322998046875, -0.048614501953125, 0.00550079345703125, -0.017059326171875, 0.07696533203125, -0.00982666015625, 0.003101348876953125, 0.01303863525390625, -0.0070953369140625, -0.012451171875, -0.021270751953125, -0.05322265625, -0.0230712890625, 0.035858154296875, 0.0245208740234375, 0.0340576171875, 0.0396728515625, 0.039459228515625, 0.0250701904296875, -0.01190948486328125, -0.004077911376953125, -0.039093017578125, -0.033447265625, 0.004886627197265625, -0.035797119140625, -0.044830322265625, -0.0022830963134765625, 0.03302001953125, 0.0380859375, 0.0067596435546875, 0.026214599609375, -0.00925445556640625, 0.0199127197265625, -0.03936767578125, 0.02191162109375, -0.0216827392578125, 0.005889892578125, -0.0225677490234375, 0.00904083251953125, -0.0193328857421875, -0.0168914794921875, 0.0045166015625, -0.051177978515625, 0.037872314453125, -0.0081939697265625, 0.107177734375, 0.01120758056640625, -0.0048980712890625, -0.01393890380859375, -0.05712890625, 0.07159423828125, -0.072265625, 0.032470703125, 0.0148468017578125, 0.0269622802734375, 0.00714111328125, -0.062347412109375, -0.051239013671875, -0.0237274169921875, -0.00742340087890625, 0.00959014892578125, -0.003910064697265625, 0.0149383544921875, 0.041107177734375, 0.046539306640625, -0.03485107421875, -0.0032196044921875, -0.05450439453125, -0.01213836669921875, 0.05303955078125, -0.0007729530334472656, 0.038055419921875, -0.007640838623046875, -0.01934814453125, -0.036468505859375, -0.027191162109375, 0.01154327392578125, 0.0206756591796875, 0.03399658203125, -0.03765869140625, 0.031097412109375, -0.00211334228515625, 0.04046630859375, 0.02410888671875, -0.036712646484375, 0.035400390625, -0.026275634765625, -0.0275726318359375, -0.01247406005859375, 0.07177734375, 
0.01384735107421875, 0.016632080078125, -0.0041656494140625, -0.0296173095703125, 0.0008878707885742188, 0.01204681396484375, -0.0732421875, -0.010986328125, 0.0306396484375, -0.031463623046875, -0.037872314453125, 0.011749267578125, -0.062103271484375, -0.0010395050048828125, 0.00006079673767089844, 0.040130615234375, -0.037322998046875, -0.04150390625, -0.0049285888671875, -0.01397705078125, 0.02264404296875, 0.005558013916015625, -0.08197021484375, 0.016510009765625, 0.037689208984375, 0.06488037109375, 0.0097503662109375, -0.0240325927734375, -0.0199432373046875, -0.000056684017181396484, -0.01593017578125, 0.032470703125, -0.0298004150390625, -0.027191162109375, -0.0031871795654296875, 0.015838623046875, -0.01532745361328125, -0.036346435546875, 0.049560546875, -0.0201873779296875, 0.038238525390625, -0.0214080810546875, -0.039031982421875, -0.0284881591796875, -0.00231170654296875, -0.04962158203125, 0.084228515625, 0.0212249755859375, -0.056854248046875, 0.03411865234375, -0.07061767578125, -0.03338623046875, -0.01187896728515625, 0.00954437255859375, -0.052947998046875, 0.005126953125, 0.026214599609375, 0.02667236328125, -0.017059326171875, 0.0132293701171875, -0.037353515625, -0.0261077880859375, -0.01318359375, -0.0084381103515625, 0.07623291015625, 0.0325927734375, -0.0604248046875, 0.0200347900390625, -0.04498291015625, -0.0035533905029296875, 0.0243072509765625, -0.0068206787109375, 0.01052093505859375, -0.0263824462890625, 0.0164642333984375, 0.030609130859375, 0.0186920166015625, -0.0390625, 0.002147674560546875, -0.042144775390625, 0.038726806640625, 0.04132080078125, -0.0136871337890625, 0.032989501953125, -0.040557861328125, 0.036651611328125, 0.023895263671875, 0.0167083740234375, -0.006496429443359375, -0.0293426513671875, -0.08740234375, -0.00031113624572753906, 0.0216522216796875, 0.033935546875, -0.046875, 0.0276641845703125, -0.037689208984375, -0.05352783203125, -0.03521728515625, 0.00885009765625, 0.0275421142578125, 0.035614013671875, 
0.0361328125, -0.00644683837890625, -0.039764404296875, -0.052520751953125, -0.016265869140625, -0.0004172325134277344, -0.001445770263671875, 0.02020263671875, 0.059600830078125, -0.0027141571044921875, 0.039215087890625, -0.0219268798828125, -0.0283355712890625, -0.037322998046875, 0.00827789306640625, 0.01041412353515625, 0.05145263671875, 0.0631103515625, -0.04266357421875, -0.032684326171875, 0.00484466552734375, -0.058868408203125, 0.0033092498779296875, -0.007232666015625, -0.00836181640625, 0.03472900390625, 0.0154266357421875, -0.048187255859375, 0.0308074951171875, 0.031982421875, -0.0164794921875, 0.0214996337890625, -0.00894927978515625, 0.004070281982421875, -0.0888671875, 0.037506103515625, 0.00960540771484375, -0.0135955810546875, -0.057220458984375, 0.009185791015625, 0.002834320068359375, -0.01641845703125, -0.048095703125, 0.060150146484375, -0.0260162353515625, 0.0023193359375, -0.00855255126953125, -0.0006074905395507812, -0.002231597900390625, 0.0428466796875, 0.00913238525390625, 0.061431884765625, 0.0258331298828125, -0.05511474609375, 0.0021839141845703125, 0.007389068603515625, -0.0196075439453125, 0.01552581787109375, -0.0543212890625, 0.0122528076171875, 0.0002493858337402344, 0.01520538330078125, -0.05072021484375, -0.0285797119140625, 0.0213623046875, -0.0350341796875, 0.0343017578125, 0.0026988983154296875, -0.027618408203125, -0.04461669921875, -0.021820068359375, 0.0242767333984375, 0.051116943359375, -0.044769287109375, 0.04962158203125, 0.0159454345703125, 0.0245208740234375, -0.041656494140625, -0.0667724609375, -0.021697998046875, -0.03656005859375, -0.06256103515625, 0.040802001953125, 0.001499176025390625, -0.0014753341674804688, -0.01512908935546875, -0.00885009765625, -0.003787994384765625, 0.002124786376953125, 0.01033782958984375, 0.006343841552734375, -0.020263671875, -0.0116119384765625, -0.01611328125, -0.007648468017578125, -0.0036830902099609375, -0.027435302734375, 0.044403076171875, -0.02203369140625, 
0.01184844970703125, -0.058624267578125, -0.002063751220703125, 0.04119873046875, -0.0189361572265625, 0.068359375, 0.0843505859375, -0.037872314453125, -0.0004215240478515625, -0.0452880859375, -0.0271453857421875, -0.038909912109375, 0.0136260986328125, -0.037750244140625, -0.047515869140625, 0.051177978515625, 0.0159149169921875, 0.02142333984375, 0.05987548828125, 0.038726806640625, -0.003322601318359375, 0.06964111328125, 0.053131103515625, -0.0014209747314453125, 0.0584716796875, -0.052734375, 0.018402099609375, -0.04400634765625, -0.01218414306640625, -0.035400390625, -0.021514892578125, -0.054840087890625, -0.02191162109375, 0.02325439453125, 0.00652313232421875, -0.0428466796875, 0.0276641845703125, -0.0269927978515625, 0.00896453857421875, 0.042724609375, 0.0160675048828125, -0.003971099853515625, 0.00616455078125, -0.0113525390625, -0.005031585693359375, -0.054931640625, -0.040802001953125, 0.0831298828125, 0.03045654296875, 0.03240966796875, 0.0042572021484375, 0.055694580078125, -0.0018911361694335938, 0.022064208984375, -0.040924072265625, 0.0299835205078125, -0.01812744140625, -0.0679931640625, -0.004276275634765625, -0.0309295654296875, -0.061737060546875, 0.004741668701171875, -0.00382232666015625, -0.05810546875, 0.0039043426513671875, 0.01282501220703125, -0.037384033203125, 0.0428466796875, -0.070556640625, 0.0908203125, -0.02716064453125, -0.039459228515625, -0.004543304443359375, -0.03765869140625, 0.040985107421875, 0.012176513671875, 0.00965118408203125, 0.0036773681640625, 0.00760650634765625, 0.061614990234375, -0.057769775390625, 0.06103515625, -0.032623291015625, -0.005828857421875, 0.02508544921875, -0.0177459716796875, 0.0283355712890625, -0.017852783203125, -0.007236480712890625, 0.0262603759765625, 0.00814056396484375, -0.0439453125, -0.036468505859375, 0.052764892578125, -0.07843017578125, -0.041595458984375, -0.03717041015625, -0.0269317626953125, 0.00400543212890625, 0.036163330078125, 0.0299835205078125, 0.02197265625, 
0.0026073455810546875, 0.0006723403930664062, 0.03173828125, -0.028564453125, 0.04864501953125, 0.007137298583984375, -0.0201873779296875, -0.0279388427734375, 0.07135009765625, 0.00926971435546875, 0.034942626953125, 0.02337646484375, 0.024078369140625, -0.0229949951171875, -0.0183563232421875, -0.03662109375, 0.0304718017578125, -0.048736572265625, -0.005649566650390625, -0.041473388671875, -0.00991058349609375, -0.03814697265625, -0.01000213623046875, -0.035369873046875, -0.0299835205078125, -0.02838134765625, -0.003314971923828125, 0.021942138671875, 0.049102783203125, -0.002513885498046875, 0.0295257568359375, -0.045013427734375, 0.025177001953125, 0.004482269287109375, 0.025970458984375, 0.0062713623046875, -0.051971435546875, -0.01392364501953125, 0.0227813720703125, -0.03448486328125, -0.0457763671875, 0.0281829833984375, 0.01806640625, 0.0249786376953125, 0.038116455078125, -0.007312774658203125, 0.06964111328125, -0.01020050048828125, 0.07879638671875, 0.00295257568359375, -0.074462890625, 0.04461669921875, -0.035308837890625, 0.033447265625, 0.02825927734375, 0.0244903564453125, -0.0246734619140625, -0.0177001953125, -0.07781982421875, -0.052703857421875, 0.07403564453125, 0.0211334228515625, 0.00180816650390625, 0.0211639404296875, 0.01708984375, -0.006244659423828125, 0.00565338134765625, -0.06732177734375, -0.0181121826171875, -0.035888671875, -0.0250396728515625, -0.0057830810546875, -0.003814697265625, -0.0077972412109375, -0.02825927734375, 0.060760498046875, 0.0019683837890625, 0.051300048828125, 0.0095367431640625, -0.0188751220703125, -0.0140533447265625, 0.00015604496002197266, 0.06988525390625, 0.034759521484375, -0.0247344970703125, -0.01052093505859375, 0.02960205078125, -0.043853759765625, -0.003814697265625, 0.007843017578125, -0.028839111328125, -0.0024967193603515625, 0.03228759765625, 0.07904052734375, 0.01319122314453125, -0.0269927978515625, 0.03271484375, -0.00589752197265625, -0.02642822265625, -0.03521728515625, 0.02545166015625, 
0.00687408447265625, 0.004150390625, 0.01020050048828125, 0.0053558349609375, -0.0143890380859375, -0.0286102294921875, 0.0005369186401367188, 0.01236724853515625, -0.016571044921875, -0.035491943359375, 0.08221435546875, 0.016021728515625, -0.00922393798828125, 0.041290283203125, -0.006893157958984375, -0.036163330078125, 0.051116943359375, 0.031951904296875, 0.07098388671875, -0.00916290283203125, 0.00008893013000488281, 0.06884765625, 0.026702880859375, -0.0094146728515625, 0.0279541015625, 0.00579071044921875, -0.039276123046875, -0.01020050048828125, -0.049774169921875, -0.0000021457672119140625, 0.031646728515625, -0.03460693359375, 0.038726806640625, -0.05596923828125, -0.014892578125, 0.0084228515625, 0.0343017578125, -0.0732421875, 0.0308990478515625, 0.0214080810546875, 0.061279296875, -0.056396484375, 0.062744140625, 0.046875, -0.0743408203125, -0.0836181640625, -0.001171112060546875, -0.00550079345703125, -0.0413818359375, 0.044769287109375, 0.0296478271484375, 0.0010976791381835938, 0.0019683837890625, -0.03656005859375, -0.06573486328125, 0.09808349609375, 0.02984619140625, -0.0309906005859375, -0.009429931640625, 0.0252685546875, 0.044647216796875, -0.0203094482421875, 0.056365966796875, 0.04302978515625, 0.050872802734375, 0.0038318634033203125, -0.0799560546875, 0.01468658447265625, -0.019378662109375, 0.01047515869140625, -0.0030460357666015625, -0.077392578125, 0.06927490234375, -0.023468017578125, -0.022674560546875, 0.0007410049438476562, 0.0648193359375, 0.017059326171875, 0.00794219970703125, 0.0406494140625, 0.045166015625, 0.058929443359375, -0.01983642578125, 0.0966796875, -0.041748046875, 0.045166015625, 0.049957275390625, 0.01557159423828125, 0.049896240234375, 0.021514892578125, -0.0210723876953125, 0.0335693359375, 0.054290771484375, -0.008392333984375, 0.023651123046875, -0.00547027587890625, -0.016693115234375, -0.005615234375, -0.0031909942626953125, -0.037567138671875, 0.0242156982421875, 0.027984619140625, -0.0328369140625, 
-0.0093231201171875, -0.002750396728515625, 0.027435302734375, -0.0251312255859375, -0.0113983154296875, 0.037506103515625, 0.01171875, -0.05645751953125, 0.07879638671875, 0.01282501220703125, 0.0626220703125, -0.04095458984375, 0.0189361572265625, -0.0238494873046875, 0.0304107666015625, -0.03265380859375, -0.0254669189453125, 0.0209503173828125, 0.00262451171875, -0.0005130767822265625, -0.01300811767578125, 0.038116455078125, -0.036468505859375, -0.054443359375, 0.016632080078125, 0.01204681396484375, 0.0119476318359375, 0.0198822021484375, -0.0645751953125, 0.0177001953125, 0.00913238525390625, -0.0268707275390625, 0.01027679443359375, 0.01079559326171875, 0.0003917217254638672, 0.042938232421875, 0.039398193359375, -0.01078033447265625, 0.0207672119140625, 0.00980377197265625, 0.05255126953125, -0.05023193359375, -0.024993896484375, -0.049041748046875, 0.04901123046875, -0.004425048828125, -0.0391845703125, 0.052825927734375, 0.045806884765625, 0.08740234375, -0.01309967041015625, 0.07037353515625, -0.0309906005859375, 0.023223876953125, -0.031768798828125, 0.05474853515625, -0.061126708984375, 0.0033111572265625, -0.0279541015625, -0.058502197265625, -0.01561737060546875, 0.0662841796875, -0.038909912109375, 0.04864501953125, 0.058502197265625, 0.06695556640625, -0.0277862548828125, 0.0038471221923828125, 0.01227569580078125, 0.0207977294921875, 0.047943115234375, 0.052734375, 0.0187530517578125, -0.06732177734375, 0.043701171875, -0.0601806640625, 0.00891876220703125, -0.01751708984375, -0.0498046875, -0.08160400390625, -0.03997802734375, -0.0213165283203125, -0.0340576171875, -0.004505157470703125, 0.06427001953125, 0.05718994140625, -0.078125, -0.02587890625, -0.021270751953125, -0.00705718994140625, -0.01812744140625, -0.0187530517578125, 0.03466796875, -0.03912353515625, -0.08380126953125, 0.0085296630859375, -0.01715087890625, 0.0191192626953125, -0.0240020751953125, -0.01434326171875, -0.0251312255859375, -0.0216217041015625, 0.020904541015625, 
0.0296478271484375, -0.0634765625, -0.0286407470703125, 0.0008730888366699219, -0.01009368896484375, 0.009765625, 0.03765869140625, -0.0335693359375, 0.027923583984375, 0.03875732421875, 0.03656005859375, 0.0611572265625, -0.0060882568359375, 0.048858642578125, -0.03662109375, 0.03472900390625, 0.0035724639892578125, 0.0205841064453125, 0.0299530029296875, -0.0179290771484375, 0.04302978515625, 0.0255889892578125, -0.03033447265625, -0.061187744140625, -0.01262664794921875, -0.06744384765625, 0.00196075439453125, 0.09033203125, -0.0193328857421875, -0.038726806640625, 0.0186920166015625, -0.00034499168395996094, 0.043701171875, -0.0292510986328125, 0.050872802734375, 0.05303955078125, 0.006435394287109375, -0.0269622802734375, -0.05792236328125, 0.052947998046875, 0.040924072265625, -0.0577392578125, -0.01763916015625, 0.010711669921875, 0.041900634765625, 0.01354217529296875, 0.033111572265625, -0.003299713134765625, 0.01512908935546875, 0.01309967041015625, 0.0174102783203125, -0.011199951171875, -0.007198333740234375, -0.022308349609375, 0.0021648406982421875, -0.004932403564453125, -0.0092010498046875 ] ]
microsoft/deberta-v2-xlarge
2022-09-26T08:59:06.000Z
[ "transformers", "pytorch", "tf", "deberta-v2", "deberta", "fill-mask", "en", "arxiv:2006.03654", "license:mit", "endpoints_compatible", "has_space", "region:us" ]
fill-mask
microsoft
null
null
microsoft/deberta-v2-xlarge
18
206,776
transformers
2022-03-02T23:29:05
--- language: en tags: - deberta - fill-mask thumbnail: https://huggingface.co/front/thumbnails/microsoft.png license: mit --- ## DeBERTa: Decoding-enhanced BERT with Disentangled Attention [DeBERTa](https://arxiv.org/abs/2006.03654) improves the BERT and RoBERTa models using disentangled attention and an enhanced mask decoder. It outperforms BERT and RoBERTa on a majority of NLU tasks with 80GB of training data. Please check the [official repository](https://github.com/microsoft/DeBERTa) for more details and updates. This is the DeBERTa V2 xlarge model with 24 layers and a hidden size of 1536. It has 900M parameters in total and was trained with 160GB of raw data. ### Fine-tuning on NLU tasks We present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks. | Model | SQuAD 1.1 | SQuAD 2.0 | MNLI-m/mm | SST-2 | QNLI | CoLA | RTE | MRPC | QQP |STS-B | |---------------------------|-----------|-----------|-------------|-------|------|------|--------|-------|-------|------| | | F1/EM | F1/EM | Acc | Acc | Acc | MCC | Acc |Acc/F1 |Acc/F1 |P/S | | BERT-Large | 90.9/84.1 | 81.8/79.0 | 86.6/- | 93.2 | 92.3 | 60.6 | 70.4 | 88.0/- | 91.3/- |90.0/- | | RoBERTa-Large | 94.6/88.9 | 89.4/86.5 | 90.2/- | 96.4 | 93.9 | 68.0 | 86.6 | 90.9/- | 92.2/- |92.4/- | | XLNet-Large | 95.1/89.7 | 90.6/87.9 | 90.8/- | 97.0 | 94.9 | 69.0 | 85.9 | 90.8/- | 92.3/- |92.5/- | | [DeBERTa-Large](https://huggingface.co/microsoft/deberta-large)<sup>1</sup> | 95.5/90.1 | 90.7/88.0 | 91.3/91.1| 96.5|95.3| 69.5| 91.0| 92.6/94.6| 92.3/- |92.8/92.5 | | [DeBERTa-XLarge](https://huggingface.co/microsoft/deberta-xlarge)<sup>1</sup> | -/- | -/- | 91.5/91.2| 97.0 | - | - | 93.1 | 92.1/94.3 | - |92.9/92.7| | [DeBERTa-V2-XLarge](https://huggingface.co/microsoft/deberta-v2-xlarge)<sup>1</sup>|95.8/90.8| 91.4/88.9|91.7/91.6| **97.5**| 95.8|71.1|**93.9**|92.0/94.2|92.3/89.8|92.9/92.9| 
|**[DeBERTa-V2-XXLarge](https://huggingface.co/microsoft/deberta-v2-xxlarge)<sup>1,2</sup>**|**96.1/91.4**|**92.2/89.7**|**91.7/91.9**|97.2|**96.0**|**72.0**| 93.5| **93.1/94.9**|**92.7/90.3** |**93.2/93.1** | -------- #### Notes. - <sup>1</sup> Following RoBERTa, for RTE, MRPC, STS-B, we fine-tune the tasks based on [DeBERTa-Large-MNLI](https://huggingface.co/microsoft/deberta-large-mnli), [DeBERTa-XLarge-MNLI](https://huggingface.co/microsoft/deberta-xlarge-mnli), [DeBERTa-V2-XLarge-MNLI](https://huggingface.co/microsoft/deberta-v2-xlarge-mnli), [DeBERTa-V2-XXLarge-MNLI](https://huggingface.co/microsoft/deberta-v2-xxlarge-mnli). The results on SST-2/QQP/QNLI/SQuAD v2.0 also improve slightly when starting from MNLI fine-tuned models; however, we only report the numbers fine-tuned from pretrained base models for those four tasks. - <sup>2</sup> To try the **XXLarge** model with **[HF transformers](https://huggingface.co/transformers/main_classes/trainer.html)**, you need to specify **--sharded_ddp** ```bash cd transformers/examples/text-classification/ export TASK_NAME=mrpc python -m torch.distributed.launch --nproc_per_node=8 run_glue.py --model_name_or_path microsoft/deberta-v2-xxlarge \ --task_name $TASK_NAME --do_train --do_eval --max_seq_length 128 --per_device_train_batch_size 4 \ --learning_rate 3e-6 --num_train_epochs 3 --output_dir /tmp/$TASK_NAME/ --overwrite_output_dir --sharded_ddp --fp16 ``` ### Citation If you find DeBERTa useful for your work, please cite the following paper: ```latex @inproceedings{ he2021deberta, title={DEBERTA: DECODING-ENHANCED BERT WITH DISENTANGLED ATTENTION}, author={Pengcheng He and Xiaodong Liu and Jianfeng Gao and Weizhu Chen}, booktitle={International Conference on Learning Representations}, year={2021}, url={https://openreview.net/forum?id=XPZIaotutsD} } ```
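The card above gives only a fine-tuning command. For completeness, a minimal inference sketch (not from the official card; it assumes `transformers` with `sentencepiece`, plus `torch`, are installed) that encodes a sentence with the pretrained checkpoint and exposes the hidden size of 1536 mentioned in the model description:

```python
# Hypothetical usage sketch, not taken from the official card: extract
# contextual embeddings from the pretrained DeBERTa-V2-XLarge encoder.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("microsoft/deberta-v2-xlarge")
model = AutoModel.from_pretrained("microsoft/deberta-v2-xlarge")

inputs = tokenizer("DeBERTa improves BERT with disentangled attention.",
                   return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# The last dimension of the hidden states matches the checkpoint's
# hidden size of 1536 stated in the card.
print(outputs.last_hidden_state.shape)
```

These per-token embeddings are the usual starting point when adding a task head on top of the encoder for the NLU tasks listed in the table.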
3,910
[ [ -0.03387451171875, -0.0465087890625, 0.02056884765625, 0.034149169921875, -0.01348114013671875, 0.0140838623046875, 0.0011548995971679688, -0.0491943359375, 0.0202484130859375, 0.0131378173828125, -0.06182861328125, -0.02532958984375, -0.069580078125, -0.00833892822265625, -0.0019664764404296875, 0.064697265625, -0.005260467529296875, -0.0153350830078125, -0.0100555419921875, -0.01438140869140625, -0.04327392578125, -0.03363037109375, -0.039154052734375, -0.03338623046875, 0.0231781005859375, 0.0244598388671875, 0.050689697265625, 0.0126190185546875, 0.034759521484375, 0.025146484375, -0.031524658203125, 0.02691650390625, -0.03875732421875, -0.00646209716796875, 0.01202392578125, -0.0253753662109375, -0.0692138671875, 0.008270263671875, 0.0245819091796875, 0.0298309326171875, 0.0167388916015625, 0.02447509765625, 0.0299072265625, 0.07745361328125, -0.03546142578125, 0.01157379150390625, -0.034423828125, 0.005916595458984375, 0.01043701171875, -0.0006694793701171875, -0.01480865478515625, -0.0018215179443359375, 0.0082244873046875, -0.0290679931640625, 0.0015306472778320312, -0.0123748779296875, 0.09393310546875, 0.0380859375, -0.00920867919921875, -0.00774383544921875, -0.03271484375, 0.0855712890625, -0.0545654296875, 0.026092529296875, 0.024749755859375, 0.002330780029296875, -0.0136566162109375, -0.0322265625, -0.02777099609375, -0.0121307373046875, -0.0161590576171875, 0.0239105224609375, -0.058135986328125, -0.01049041748046875, 0.0305633544921875, 0.01026153564453125, -0.053192138671875, 0.0144195556640625, -0.0263214111328125, 0.0006856918334960938, 0.053131103515625, 0.003978729248046875, 0.01377105712890625, 0.00693511962890625, -0.038482666015625, -0.01132965087890625, -0.040283203125, 0.0189208984375, 0.0111236572265625, 0.0009765625, -0.02313232421875, 0.0179443359375, -0.0177001953125, 0.06634521484375, 0.030609130859375, -0.0013179779052734375, 0.0533447265625, -0.0131072998046875, -0.0341796875, 0.0011157989501953125, 0.048583984375, 
0.0229949951171875, -0.0041046142578125, -0.00765228271484375, -0.015625, 0.004970550537109375, 0.0069122314453125, -0.0711669921875, -0.033660888671875, 0.04150390625, -0.045074462890625, -0.01922607421875, -0.003864288330078125, -0.03887939453125, 0.00165557861328125, -0.0447998046875, 0.02947998046875, -0.042724609375, -0.0239715576171875, 0.005222320556640625, -0.0034503936767578125, 0.006397247314453125, 0.03704833984375, -0.062164306640625, 0.0034885406494140625, 0.03485107421875, 0.0546875, -0.00614166259765625, -0.01251983642578125, -0.04425048828125, -0.0162506103515625, -0.004291534423828125, 0.0227508544921875, -0.01378631591796875, 0.009765625, -0.01415252685546875, 0.0122833251953125, -0.0235443115234375, -0.025726318359375, 0.0156402587890625, -0.0394287109375, -0.0007472038269042969, -0.0293426513671875, -0.0287628173828125, -0.022216796875, 0.0280914306640625, -0.040435791015625, 0.08416748046875, 0.03265380859375, -0.0673828125, 0.013885498046875, -0.042144775390625, -0.005619049072265625, -0.016082763671875, -0.0011243820190429688, -0.0416259765625, -0.00775146484375, 0.019622802734375, 0.043853759765625, -0.005413055419921875, -0.004703521728515625, -0.015777587890625, -0.0308990478515625, 0.004062652587890625, -0.006076812744140625, 0.09564208984375, 0.0264434814453125, -0.0692138671875, 0.0016717910766601562, -0.06756591796875, 0.019287109375, 0.0178985595703125, -0.021148681640625, -0.0129547119140625, -0.01474761962890625, 0.003917694091796875, 0.041839599609375, 0.043914794921875, -0.0435791015625, 0.021453857421875, -0.029998779296875, 0.0428466796875, 0.042449951171875, -0.0244903564453125, 0.0171661376953125, -0.01837158203125, 0.031646728515625, 0.03143310546875, 0.0286712646484375, 0.0204010009765625, -0.04620361328125, -0.05633544921875, -0.0440673828125, 0.0275726318359375, 0.054779052734375, -0.04583740234375, 0.056488037109375, -0.00687408447265625, -0.048095703125, -0.03900146484375, 0.0191192626953125, 0.045684814453125, 
0.0244598388671875, 0.037261962890625, -0.005313873291015625, -0.041473388671875, -0.08489990234375, 0.006439208984375, 0.0003352165222167969, -0.0006499290466308594, 0.015899658203125, 0.052947998046875, -0.0248565673828125, 0.0645751953125, -0.0352783203125, -0.03558349609375, -0.01513671875, 0.004909515380859375, 0.032440185546875, 0.055816650390625, 0.08050537109375, -0.05401611328125, -0.048797607421875, -0.01654052734375, -0.050140380859375, 0.0150299072265625, -0.0004737377166748047, -0.021240234375, 0.044586181640625, 0.0178375244140625, -0.044830322265625, 0.036376953125, 0.05462646484375, -0.035369873046875, 0.0177001953125, -0.025665283203125, 0.0122833251953125, -0.07598876953125, 0.0175323486328125, -0.0008463859558105469, -0.02166748046875, -0.042083740234375, -0.00682830810546875, 0.013092041015625, 0.0232086181640625, -0.0258941650390625, 0.0249176025390625, -0.050323486328125, 0.00679779052734375, -0.0166168212890625, 0.0205230712890625, 0.00997161865234375, 0.0631103515625, -0.0032482147216796875, 0.048858642578125, 0.043731689453125, -0.03350830078125, 0.019622802734375, 0.04327392578125, -0.024658203125, 0.032501220703125, -0.06585693359375, 0.01593017578125, -0.01532745361328125, 0.01528167724609375, -0.08160400390625, 0.0102691650390625, 0.025634765625, -0.038238525390625, 0.044189453125, -0.01108551025390625, -0.038970947265625, -0.041717529296875, -0.0288238525390625, -0.0008120536804199219, 0.057952880859375, -0.052490234375, 0.0183563232421875, 0.0294036865234375, 0.0083465576171875, -0.0555419921875, -0.060791015625, -0.00846099853515625, -0.01486968994140625, -0.06634521484375, 0.05438232421875, -0.0142974853515625, -0.00772857666015625, -0.004730224609375, -0.0070953369140625, -0.0147857666015625, 0.0227203369140625, 0.0255889892578125, 0.033843994140625, -0.004940032958984375, -0.00516510009765625, 0.0083160400390625, 0.003040313720703125, -0.01013946533203125, -0.00046539306640625, 0.039581298828125, -0.0240020751953125, 
-0.002437591552734375, -0.0290679931640625, 0.0202178955078125, 0.043914794921875, -0.027862548828125, 0.06182861328125, 0.069580078125, -0.0218048095703125, 0.00048089027404785156, -0.0384521484375, -0.01486968994140625, -0.0347900390625, 0.0201416015625, -0.03155517578125, -0.058074951171875, 0.052490234375, 0.016326904296875, 0.0207061767578125, 0.047760009765625, 0.04345703125, -0.00934600830078125, 0.0908203125, 0.048675537109375, -0.025299072265625, 0.04229736328125, -0.058349609375, -0.0027484893798828125, -0.07537841796875, -0.0189971923828125, -0.033355712890625, -0.0506591796875, -0.037445068359375, -0.017059326171875, 0.0160980224609375, 0.03326416015625, -0.0230560302734375, 0.06036376953125, -0.079833984375, 0.0017032623291015625, 0.055206298828125, 0.03912353515625, -0.004421234130859375, 0.006656646728515625, 0.01091766357421875, -0.006687164306640625, -0.057525634765625, -0.03167724609375, 0.0576171875, 0.029998779296875, 0.039093017578125, 0.0142059326171875, 0.06329345703125, 0.0108489990234375, -0.00945281982421875, -0.0255889892578125, 0.031341552734375, -0.01087188720703125, -0.041259765625, -0.0128021240234375, -0.026092529296875, -0.0863037109375, 0.0159454345703125, -0.01025390625, -0.0865478515625, 0.0306549072265625, 0.0304107666015625, -0.03582763671875, 0.01415252685546875, -0.039947509765625, 0.06805419921875, -0.01277923583984375, -0.029327392578125, -0.0192413330078125, -0.052978515625, 0.016387939453125, 0.01727294921875, -0.01245880126953125, -0.019927978515625, 0.005580902099609375, 0.063720703125, -0.024200439453125, 0.059112548828125, -0.029144287109375, -0.022735595703125, 0.0297393798828125, -0.0023860931396484375, 0.05548095703125, -0.0036334991455078125, -0.000728607177734375, 0.0178070068359375, 0.022247314453125, -0.03350830078125, -0.033660888671875, 0.060699462890625, -0.06671142578125, -0.026092529296875, -0.03448486328125, -0.0433349609375, -0.0189208984375, -0.001163482666015625, 0.0247955322265625, 0.033935546875, 
0.00485992431640625, 0.016815185546875, 0.06378173828125, -0.01055908203125, 0.036346435546875, 0.0377197265625, 0.0148773193359375, -0.010101318359375, 0.059478759765625, 0.010040283203125, 0.005580902099609375, 0.035400390625, -0.0224761962890625, -0.0243377685546875, -0.0399169921875, -0.037078857421875, 0.007381439208984375, -0.04052734375, -0.0287628173828125, -0.054443359375, -0.00830078125, -0.0269622802734375, 0.00530242919921875, -0.029571533203125, -0.043609619140625, -0.05548095703125, 0.0189056396484375, 0.049774169921875, 0.041168212890625, -0.00423431396484375, 0.011077880859375, -0.0675048828125, 0.01161956787109375, 0.0081939697265625, 0.017425537109375, 0.0002598762512207031, -0.04376220703125, -0.01861572265625, 0.02471923828125, -0.04339599609375, -0.06048583984375, 0.035003662109375, 0.0014734268188476562, 0.047271728515625, 0.0020999908447265625, 0.006397247314453125, 0.047149658203125, -0.030853271484375, 0.058502197265625, 0.0244598388671875, -0.06121826171875, 0.0526123046875, -0.018280029296875, 0.0192413330078125, 0.043731689453125, 0.034698486328125, -0.0014505386352539062, -0.02154541015625, -0.060882568359375, -0.05804443359375, 0.07720947265625, 0.0390625, -0.0097503662109375, 0.0091400146484375, 0.01474761962890625, -0.0135040283203125, 0.0153656005859375, -0.0294036865234375, -0.0362548828125, -0.01433563232421875, -0.0211181640625, -0.00130462646484375, -0.0240478515625, -0.00675201416015625, -0.034454345703125, 0.06805419921875, -0.00118255615234375, 0.0430908203125, 0.03546142578125, -0.019500732421875, -0.000522613525390625, -0.00218963623046875, 0.06256103515625, 0.062286376953125, -0.032440185546875, -0.0162506103515625, 0.015045166015625, -0.032012939453125, -0.0014352798461914062, 0.0169525146484375, 0.0023403167724609375, 0.01324462890625, 0.0187225341796875, 0.0723876953125, 0.0015897750854492188, -0.0369873046875, 0.0280303955078125, 0.00457000732421875, -0.03411865234375, -0.0186004638671875, -0.00036406517028808594, 
-0.0021038055419921875, 0.042755126953125, 0.0209197998046875, 0.01148223876953125, 0.0118560791015625, -0.02825927734375, 0.01446533203125, 0.04833984375, -0.042510986328125, -0.0227203369140625, 0.051910400390625, 0.00682830810546875, 0.0023651123046875, 0.03912353515625, -0.0171356201171875, -0.05059814453125, 0.0643310546875, 0.0260162353515625, 0.059234619140625, -0.011077880859375, 0.0058135986328125, 0.05133056640625, 0.0253448486328125, 0.00641632080078125, 0.0426025390625, 0.005886077880859375, -0.0258026123046875, -0.019500732421875, -0.04833984375, -0.0021877288818359375, 0.022918701171875, -0.051483154296875, 0.0026092529296875, -0.01020050048828125, -0.0247650146484375, 0.01331329345703125, 0.029754638671875, -0.06488037109375, 0.01482391357421875, 0.0108489990234375, 0.0731201171875, -0.041015625, 0.06256103515625, 0.05316162109375, -0.032501220703125, -0.050933837890625, -0.0220947265625, -0.00934600830078125, -0.063720703125, 0.07733154296875, 0.01390838623046875, 0.00592803955078125, -0.0006418228149414062, -0.0296630859375, -0.0728759765625, 0.09552001953125, 0.025299072265625, -0.06890869140625, -0.005168914794921875, 0.0011959075927734375, 0.034271240234375, -0.0175323486328125, 0.0202178955078125, 0.04248046875, 0.03564453125, -0.004039764404296875, -0.08123779296875, 0.02728271484375, -0.02490234375, 0.007274627685546875, 0.0144195556640625, -0.07061767578125, 0.08050537109375, -0.010101318359375, 0.0143280029296875, 0.0119476318359375, 0.04376220703125, 0.0196990966796875, 0.004268646240234375, 0.04400634765625, 0.05035400390625, 0.044830322265625, -0.01483917236328125, 0.068359375, -0.037017822265625, 0.048095703125, 0.06695556640625, 0.01042938232421875, 0.04864501953125, 0.032623291015625, -0.03265380859375, 0.03302001953125, 0.049713134765625, -0.0139312744140625, 0.034027099609375, 0.01136016845703125, 0.006267547607421875, -0.0174560546875, 0.026092529296875, -0.0347900390625, 0.033447265625, 0.00884246826171875, -0.03790283203125, 
-0.015289306640625, 0.005741119384765625, 0.008056640625, -0.011505126953125, -0.0192108154296875, 0.047698974609375, -0.002777099609375, -0.05230712890625, 0.08251953125, -0.0155792236328125, 0.060791015625, -0.0380859375, -0.008758544921875, -0.005054473876953125, 0.038604736328125, -0.025299072265625, -0.056396484375, 0.0167999267578125, -0.006763458251953125, -0.022796630859375, -0.00821685791015625, 0.04864501953125, -0.0274810791015625, -0.0318603515625, 0.0301055908203125, 0.028900146484375, 0.0111236572265625, -0.0224609375, -0.0902099609375, 0.026092529296875, 0.018524169921875, -0.040802001953125, 0.036773681640625, 0.01129150390625, 0.01299285888671875, 0.035369873046875, 0.014251708984375, -0.0301361083984375, 0.0005497932434082031, -0.01837158203125, 0.07501220703125, -0.02197265625, -0.02008056640625, -0.062042236328125, 0.0472412109375, -0.0156707763671875, -0.027099609375, 0.06878662109375, 0.0350341796875, 0.03900146484375, -0.018951416015625, 0.041015625, -0.0312042236328125, 0.026153564453125, -0.034423828125, 0.057342529296875, -0.06884765625, -0.008819580078125, -0.035247802734375, -0.06884765625, 0.0026340484619140625, 0.0546875, -0.0017690658569335938, 0.01082611083984375, 0.016632080078125, 0.049102783203125, -0.00913238525390625, -0.019561767578125, 0.011444091796875, 0.013153076171875, 0.0189971923828125, 0.075927734375, 0.035980224609375, -0.0611572265625, 0.037933349609375, -0.039306640625, -0.033172607421875, -0.0286407470703125, -0.057281494140625, -0.08349609375, -0.05413818359375, -0.052978515625, -0.037933349609375, -0.005054473876953125, 0.0667724609375, 0.07080078125, -0.0633544921875, 0.015838623046875, -0.0141143798828125, -0.01056671142578125, -0.038421630859375, -0.016632080078125, 0.041656494140625, -0.029815673828125, -0.07763671875, 0.0225830078125, -0.00957489013671875, 0.0248260498046875, -0.009246826171875, -0.01690673828125, -0.0225067138671875, -0.0012311935424804688, 0.058135986328125, 0.0180511474609375, 
-0.047698974609375, -0.017486572265625, 0.007755279541015625, -0.01065826416015625, 0.007843017578125, 0.0066986083984375, -0.05621337890625, 0.00366973876953125, 0.04241943359375, 0.01505279541015625, 0.044342041015625, -0.0164794921875, 0.0102081298828125, -0.05804443359375, 0.031524658203125, 0.0160064697265625, 0.032928466796875, 0.0031414031982421875, -0.035400390625, 0.044769287109375, -0.010772705078125, -0.045440673828125, -0.0655517578125, 0.007083892822265625, -0.109375, -0.0233154296875, 0.07293701171875, -0.0254669189453125, -0.0211334228515625, 0.00865936279296875, -0.0285186767578125, 0.01129150390625, -0.0301361083984375, 0.05419921875, 0.036285400390625, -0.01934814453125, 0.0009121894836425781, -0.0361328125, 0.05401611328125, 0.039398193359375, -0.043487548828125, 0.0012712478637695312, 0.0244293212890625, 0.019989013671875, 0.040679931640625, 0.040924072265625, -0.00292205810546875, 0.027435302734375, -0.009185791015625, 0.00015163421630859375, -0.0263214111328125, -0.0178070068359375, -0.01137542724609375, -0.0142974853515625, -0.0095672607421875, -0.04248046875 ] ]
sentence-transformers/nli-mpnet-base-v2
2022-06-15T20:14:17.000Z
[ "sentence-transformers", "pytorch", "tf", "mpnet", "feature-extraction", "sentence-similarity", "transformers", "arxiv:1908.10084", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
sentence-similarity
sentence-transformers
null
null
sentence-transformers/nli-mpnet-base-v2
7
206,550
sentence-transformers
2022-03-02T23:29:05
--- pipeline_tag: sentence-similarity license: apache-2.0 tags: - sentence-transformers - feature-extraction - sentence-similarity - transformers --- # sentence-transformers/nli-mpnet-base-v2 This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for tasks like clustering or semantic search. ## Usage (Sentence-Transformers) Using this model is easy once you have [sentence-transformers](https://www.SBERT.net) installed: ``` pip install -U sentence-transformers ``` Then you can use the model like this: ```python from sentence_transformers import SentenceTransformer sentences = ["This is an example sentence", "Each sentence is converted"] model = SentenceTransformer('sentence-transformers/nli-mpnet-base-v2') embeddings = model.encode(sentences) print(embeddings) ``` ## Usage (HuggingFace Transformers) Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first, pass your input through the transformer model, then apply the right pooling operation on top of the contextualized word embeddings. 
```python from transformers import AutoTokenizer, AutoModel import torch #Mean Pooling - Take attention mask into account for correct averaging def mean_pooling(model_output, attention_mask): token_embeddings = model_output[0] #First element of model_output contains all token embeddings input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float() return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9) # Sentences we want sentence embeddings for sentences = ['This is an example sentence', 'Each sentence is converted'] # Load model from HuggingFace Hub tokenizer = AutoTokenizer.from_pretrained('sentence-transformers/nli-mpnet-base-v2') model = AutoModel.from_pretrained('sentence-transformers/nli-mpnet-base-v2') # Tokenize sentences encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt') # Compute token embeddings with torch.no_grad(): model_output = model(**encoded_input) # Perform pooling. In this case, mean pooling. sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask']) print("Sentence embeddings:") print(sentence_embeddings) ``` ## Evaluation Results For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=sentence-transformers/nli-mpnet-base-v2) ## Full Model Architecture ``` SentenceTransformer( (0): Transformer({'max_seq_length': 75, 'do_lower_case': False}) with Transformer model: MPNetModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False}) ) ``` ## Citing & Authors This model was trained by [sentence-transformers](https://www.sbert.net/). 
If you find this model helpful, feel free to cite our publication [Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks](https://arxiv.org/abs/1908.10084): ```bibtex @inproceedings{reimers-2019-sentence-bert, title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks", author = "Reimers, Nils and Gurevych, Iryna", booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing", month = "11", year = "2019", publisher = "Association for Computational Linguistics", url = "http://arxiv.org/abs/1908.10084", } ```
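The embeddings produced by the snippets above are plain dense vectors, so downstream similarity scoring reduces to cosine similarity between pairs of them. A minimal sketch of that step, using NumPy on two short toy vectors that stand in for the model's 768-dimensional sentence embeddings (the vectors here are illustrative, not real model outputs):

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine similarity: dot product of the vectors divided by the
    # product of their Euclidean norms; 1.0 means identical direction.
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy stand-ins for two sentence embeddings (real ones have 768 dims)
emb_a = [0.1, 0.3, -0.2, 0.5]
emb_b = [0.1, 0.25, -0.1, 0.4]

print(cosine_similarity(emb_a, emb_b))
```

The same function applies unchanged to the rows of the `embeddings` arrays returned by `model.encode(...)` above.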
3,663
[ [ -0.024200439453125, -0.04071044921875, 0.0206451416015625, 0.03131103515625, -0.0199432373046875, -0.037841796875, -0.015899658203125, 0.0017299652099609375, 0.0135498046875, 0.0285491943359375, -0.044952392578125, -0.021759033203125, -0.056365966796875, 0.00634765625, -0.026153564453125, 0.05828857421875, -0.003936767578125, -0.00400543212890625, -0.0230560302734375, -0.0125732421875, -0.012420654296875, -0.031402587890625, -0.034759521484375, -0.0286102294921875, 0.0301361083984375, 0.00914764404296875, 0.03167724609375, 0.037506103515625, 0.027801513671875, 0.031768798828125, -0.01084136962890625, 0.01505279541015625, -0.018890380859375, -0.019256591796875, 0.0008826255798339844, -0.0294342041015625, -0.007080078125, 0.0252227783203125, 0.05108642578125, 0.03564453125, -0.01152801513671875, 0.01100921630859375, 0.00037550926208496094, 0.026458740234375, -0.04241943359375, 0.032318115234375, -0.045684814453125, 0.0248260498046875, 0.0061492919921875, -0.00251007080078125, -0.0478515625, 0.0012235641479492188, 0.0199432373046875, -0.02532958984375, -0.0014133453369140625, 0.0079345703125, 0.08734130859375, 0.03167724609375, -0.030242919921875, -0.01568603515625, -0.019927978515625, 0.063720703125, -0.07769775390625, 0.01136016845703125, 0.0218353271484375, -0.0020427703857421875, -0.0014486312866210938, -0.07757568359375, -0.060394287109375, -0.0072174072265625, -0.04083251953125, 0.022308349609375, -0.03253173828125, 0.002780914306640625, 0.0183258056640625, 0.01151275634765625, -0.048919677734375, 0.0005855560302734375, -0.032073974609375, -0.0118408203125, 0.0391845703125, 0.00240325927734375, 0.025634765625, -0.0467529296875, -0.031280517578125, -0.02294921875, -0.015838623046875, -0.005405426025390625, 0.019134521484375, 0.01265716552734375, -0.024688720703125, 0.0704345703125, -0.006290435791015625, 0.045318603515625, -0.005390167236328125, 0.00605010986328125, 0.047271728515625, -0.0281829833984375, -0.02044677734375, -0.003475189208984375, 
0.08551025390625, 0.034332275390625, 0.028717041015625, -0.00778961181640625, -0.01025390625, -0.0004401206970214844, 0.0180816650390625, -0.06640625, -0.027557373046875, 0.0164947509765625, -0.0379638671875, -0.026763916015625, 0.020721435546875, -0.050994873046875, -0.0110015869140625, -0.00396728515625, 0.04962158203125, -0.040435791015625, -0.0052947998046875, 0.01071929931640625, -0.0222320556640625, 0.0275421142578125, -0.035736083984375, -0.053985595703125, 0.01302337646484375, 0.0345458984375, 0.079345703125, 0.0022869110107421875, -0.041778564453125, -0.0194854736328125, -0.01204681396484375, 0.0103607177734375, 0.04876708984375, -0.021575927734375, -0.002490997314453125, -0.000023484230041503906, 0.0173492431640625, -0.053131103515625, -0.0298004150390625, 0.043701171875, -0.01351165771484375, 0.052734375, 0.01184844970703125, -0.056854248046875, -0.012908935546875, 0.01029205322265625, -0.03369140625, 0.07373046875, 0.01220703125, -0.07757568359375, 0.00225067138671875, -0.06341552734375, -0.0238800048828125, -0.0179901123046875, 0.0095672607421875, -0.05572509765625, 0.0098419189453125, 0.0305633544921875, 0.050201416015625, 0.008819580078125, 0.0305633544921875, -0.01406097412109375, -0.03277587890625, 0.03863525390625, -0.034881591796875, 0.08770751953125, 0.00978851318359375, -0.031402587890625, 0.015777587890625, -0.0404052734375, -0.006313323974609375, 0.02008056640625, -0.0160675048828125, -0.0010461807250976562, 0.01071929931640625, 0.01421356201171875, 0.027587890625, 0.01385498046875, -0.05303955078125, 0.012603759765625, -0.048980712890625, 0.07232666015625, 0.0419921875, 0.0038471221923828125, 0.0418701171875, -0.021575927734375, 0.005153656005859375, 0.0272216796875, 0.0029392242431640625, -0.0155487060546875, -0.03790283203125, -0.08258056640625, -0.0137939453125, 0.027313232421875, 0.052032470703125, -0.05682373046875, 0.07806396484375, -0.03607177734375, -0.0300445556640625, -0.053985595703125, -0.01326751708984375, 0.005725860595703125, 
0.031463623046875, 0.0372314453125, -0.006359100341796875, -0.05609130859375, -0.0811767578125, 0.00193023681640625, -0.007434844970703125, 0.0009746551513671875, 0.0296630859375, 0.055511474609375, -0.036651611328125, 0.0750732421875, -0.0399169921875, -0.0274200439453125, -0.0264129638671875, 0.0261077880859375, 0.0231170654296875, 0.045318603515625, 0.035186767578125, -0.05853271484375, -0.0196380615234375, -0.038360595703125, -0.04962158203125, -0.009307861328125, -0.0265045166015625, -0.01308441162109375, 0.01496124267578125, 0.035614013671875, -0.0633544921875, 0.0217132568359375, 0.048736572265625, -0.042694091796875, 0.02923583984375, -0.0124053955078125, -0.02044677734375, -0.1060791015625, 0.01276397705078125, 0.002330780029296875, -0.01087188720703125, -0.031494140625, 0.0047149658203125, 0.002025604248046875, -0.010833740234375, -0.033935546875, 0.0265655517578125, -0.0295257568359375, 0.002819061279296875, -0.00787353515625, 0.0239410400390625, -0.00592803955078125, 0.06304931640625, -0.0077362060546875, 0.06396484375, 0.036865234375, -0.03466796875, 0.0220489501953125, 0.034210205078125, -0.03179931640625, 0.00916290283203125, -0.064453125, -0.00421905517578125, -0.00424957275390625, 0.0239715576171875, -0.0816650390625, -0.0026988983154296875, 0.0295562744140625, -0.04510498046875, 0.00432586669921875, 0.01097869873046875, -0.050048828125, -0.045135498046875, -0.020172119140625, 0.003246307373046875, 0.041229248046875, -0.0416259765625, 0.045562744140625, 0.022003173828125, -0.00261688232421875, -0.04522705078125, -0.08966064453125, 0.01029205322265625, -0.0098724365234375, -0.04742431640625, 0.037811279296875, -0.00531768798828125, 0.01454925537109375, 0.021484375, 0.0194244384765625, -0.004619598388671875, 0.0020751953125, 0.0108184814453125, 0.00992584228515625, -0.013702392578125, 0.00749969482421875, 0.00687408447265625, -0.01532745361328125, 0.01416015625, -0.02581787109375, 0.0587158203125, -0.01558685302734375, -0.00934600830078125, 
-0.0364990234375, 0.0195465087890625, 0.037017822265625, -0.0259857177734375, 0.09222412109375, 0.08233642578125, -0.02545166015625, -0.0023555755615234375, -0.0364990234375, -0.0144805908203125, -0.034149169921875, 0.043670654296875, -0.01490020751953125, -0.076416015625, 0.0211029052734375, 0.0094146728515625, -0.004558563232421875, 0.054046630859375, 0.04052734375, -0.00860595703125, 0.0654296875, 0.04315185546875, -0.01372528076171875, 0.035125732421875, -0.04608154296875, 0.0292510986328125, -0.0765380859375, -0.01279449462890625, -0.0178680419921875, -0.0292816162109375, -0.0548095703125, -0.03009033203125, 0.0117340087890625, -0.005504608154296875, -0.020263671875, 0.045501708984375, -0.044097900390625, 0.01898193359375, 0.050048828125, 0.0104522705078125, -0.017364501953125, 0.006256103515625, -0.0360107421875, -0.00290679931640625, -0.056488037109375, -0.03338623046875, 0.053466796875, 0.0307464599609375, 0.01213836669921875, 0.0011720657348632812, 0.044464111328125, -0.004878997802734375, 0.0019855499267578125, -0.05743408203125, 0.04937744140625, -0.0290069580078125, -0.0203704833984375, -0.0169677734375, -0.038787841796875, -0.0633544921875, 0.03289794921875, -0.0196990966796875, -0.06591796875, 0.0160064697265625, -0.024169921875, -0.0194244384765625, 0.03173828125, -0.0679931640625, 0.080322265625, 0.01039886474609375, 0.0019273757934570312, -0.0018644332885742188, -0.05340576171875, 0.0181732177734375, 0.0199737548828125, -0.00847625732421875, -0.0061492919921875, -0.00881195068359375, 0.060638427734375, -0.0223236083984375, 0.072021484375, -0.01348876953125, 0.026763916015625, 0.025848388671875, -0.02911376953125, 0.0186309814453125, -0.01015472412109375, -0.0095672607421875, 0.005336761474609375, -0.01523590087890625, -0.0258941650390625, -0.0430908203125, 0.050872802734375, -0.07293701171875, -0.02227783203125, -0.0364990234375, -0.0447998046875, 0.002773284912109375, 0.01189422607421875, 0.0283203125, 0.0312347412109375, -0.0005040168762207031, 
0.03680419921875, 0.036407470703125, -0.0264739990234375, 0.0657958984375, 0.007965087890625, 0.002716064453125, -0.03790283203125, 0.0528564453125, 0.005901336669921875, -0.0011014938354492188, 0.037628173828125, 0.0178985595703125, -0.029022216796875, -0.0238037109375, -0.0256195068359375, 0.02783203125, -0.033721923828125, -0.01415252685546875, -0.08087158203125, -0.051177978515625, -0.045196533203125, 0.00888824462890625, -0.013885498046875, -0.031280517578125, -0.0413818359375, -0.016357421875, 0.02252197265625, 0.0272979736328125, -0.00484466552734375, 0.0364990234375, -0.04522705078125, 0.00859832763671875, 0.0203094482421875, 0.0192718505859375, 0.0000782012939453125, -0.050445556640625, -0.0139923095703125, 0.00003075599670410156, -0.018951416015625, -0.0643310546875, 0.048828125, 0.0193634033203125, 0.041595458984375, 0.00971221923828125, -0.0002841949462890625, 0.03863525390625, -0.046051025390625, 0.06903076171875, 0.006046295166015625, -0.072998046875, 0.03167724609375, -0.00815582275390625, 0.0401611328125, 0.035552978515625, 0.0240325927734375, -0.038818359375, -0.012420654296875, -0.0482177734375, -0.0758056640625, 0.05181884765625, 0.039947509765625, 0.0352783203125, -0.0195770263671875, 0.0267333984375, -0.034423828125, 0.0104522705078125, -0.08929443359375, -0.032135009765625, -0.0265045166015625, -0.03863525390625, -0.022491455078125, -0.03515625, 0.01326751708984375, -0.0299072265625, 0.0604248046875, -0.0006098747253417969, 0.06689453125, 0.0269317626953125, -0.0287017822265625, 0.0222015380859375, 0.0195159912109375, 0.0472412109375, 0.020050048828125, -0.005840301513671875, 0.0187530517578125, 0.0195465087890625, -0.02392578125, 0.0017080307006835938, 0.037841796875, -0.01136016845703125, 0.008697509765625, 0.02642822265625, 0.0806884765625, 0.0292510986328125, -0.031280517578125, 0.061553955078125, -0.004169464111328125, -0.019287109375, -0.035003662109375, -0.00736236572265625, 0.0265045166015625, 0.0197296142578125, 0.015838623046875, 
0.01250457763671875, 0.003887176513671875, -0.0263671875, 0.033172607421875, 0.0096588134765625, -0.03271484375, 0.003742218017578125, 0.03973388671875, 0.0006732940673828125, -0.02325439453125, 0.0736083984375, -0.029449462890625, -0.057159423828125, 0.035064697265625, 0.048187255859375, 0.07476806640625, 0.009490966796875, 0.0263671875, 0.041290283203125, 0.0355224609375, 0.0016374588012695312, -0.0036468505859375, 0.00843048095703125, -0.06829833984375, -0.035430908203125, -0.050201416015625, 0.0079345703125, -0.0006966590881347656, -0.03790283203125, 0.00799560546875, -0.008758544921875, -0.006526947021484375, -0.00649261474609375, -0.0020198822021484375, -0.03570556640625, 0.00007474422454833984, 0.012481689453125, 0.0665283203125, -0.0684814453125, 0.06024169921875, 0.047576904296875, -0.04425048828125, -0.059326171875, -0.003387451171875, -0.01837158203125, -0.04815673828125, 0.0301513671875, 0.036407470703125, 0.0168304443359375, 0.016510009765625, -0.0421142578125, -0.058868408203125, 0.10406494140625, 0.018341064453125, -0.0277862548828125, -0.00926971435546875, 0.01476287841796875, 0.0335693359375, -0.040985107421875, 0.0280609130859375, 0.034393310546875, 0.02947998046875, -0.0070343017578125, -0.047698974609375, 0.01123809814453125, -0.03289794921875, 0.0159454345703125, -0.004802703857421875, -0.04205322265625, 0.0694580078125, 0.0039825439453125, -0.0135345458984375, 0.0195770263671875, 0.06463623046875, 0.018157958984375, -0.005901336669921875, 0.03680419921875, 0.0672607421875, 0.03912353515625, -0.02020263671875, 0.08123779296875, -0.0160064697265625, 0.048828125, 0.0771484375, 0.00823211669921875, 0.0872802734375, 0.035980224609375, -0.0028820037841796875, 0.059356689453125, 0.035430908203125, -0.033721923828125, 0.04351806640625, 0.020538330078125, -0.0038509368896484375, 0.0036468505859375, -0.0031375885009765625, -0.00817108154296875, 0.042572021484375, 0.0091705322265625, -0.05419921875, -0.00008243322372436523, 0.0096588134765625, 
0.006443023681640625, -0.0017080307006835938, 0.0000438690185546875, 0.0517578125, 0.01097869873046875, -0.0413818359375, 0.0190277099609375, 0.0226898193359375, 0.07867431640625, -0.0306549072265625, 0.01276397705078125, -0.00217437744140625, 0.024505615234375, -0.00182342529296875, -0.048187255859375, 0.03851318359375, -0.01236724853515625, -0.012298583984375, -0.0152130126953125, 0.0533447265625, -0.04644775390625, -0.048980712890625, 0.0234375, 0.03167724609375, 0.00347137451171875, 0.0011358261108398438, -0.08270263671875, -0.006603240966796875, 0.0019083023071289062, -0.0308380126953125, 0.01432037353515625, 0.0148773193359375, 0.039398193359375, 0.04058837890625, 0.0308074951171875, -0.00992584228515625, 0.01256561279296875, 0.01092529296875, 0.052886962890625, -0.039459228515625, -0.037200927734375, -0.072265625, 0.05560302734375, -0.0239105224609375, -0.03204345703125, 0.04974365234375, 0.0404052734375, 0.061798095703125, -0.01416015625, 0.032501220703125, -0.01396942138671875, 0.01021575927734375, -0.04241943359375, 0.0726318359375, -0.029754638671875, -0.00855255126953125, -0.0081787109375, -0.07318115234375, -0.0264739990234375, 0.07794189453125, -0.021759033203125, 0.0086669921875, 0.06463623046875, 0.06695556640625, -0.01213836669921875, -0.004520416259765625, 0.00720977783203125, 0.034881591796875, 0.0069580078125, 0.039886474609375, 0.034698486328125, -0.060546875, 0.047271728515625, -0.04388427734375, -0.00014710426330566406, 0.0035228729248046875, -0.06707763671875, -0.07159423828125, -0.060302734375, -0.03326416015625, -0.0211029052734375, 0.00010722875595092773, 0.07318115234375, 0.05963134765625, -0.05755615234375, -0.010406494140625, -0.02435302734375, -0.0146942138671875, -0.00691986083984375, -0.02569580078125, 0.033111572265625, -0.03271484375, -0.051513671875, 0.019805908203125, -0.0013141632080078125, 0.00637054443359375, -0.0270233154296875, 0.0095367431640625, -0.058746337890625, 0.01088714599609375, 0.046875, -0.0179290771484375, 
-0.059600830078125, -0.0197296142578125, -0.00173187255859375, -0.0252227783203125, -0.0025196075439453125, 0.035980224609375, -0.0517578125, 0.01043701171875, 0.0308074951171875, 0.0478515625, 0.049774169921875, -0.0210723876953125, 0.0355224609375, -0.06683349609375, 0.02313232421875, 0.002079010009765625, 0.05377197265625, 0.03179931640625, -0.02032470703125, 0.047637939453125, 0.022247314453125, -0.036102294921875, -0.046295166015625, -0.0078125, -0.08135986328125, -0.0236358642578125, 0.086181640625, -0.0267486572265625, -0.027740478515625, 0.0236053466796875, -0.01641845703125, 0.03533935546875, -0.0105743408203125, 0.0408935546875, 0.060455322265625, 0.007389068603515625, -0.042510986328125, -0.0241241455078125, 0.0095672607421875, 0.038787841796875, -0.046875, -0.0220184326171875, 0.01007843017578125, 0.0140533447265625, 0.0225830078125, 0.0218048095703125, -0.0045013427734375, -0.0020961761474609375, 0.00904083251953125, 0.007221221923828125, -0.0217437744140625, -0.0034122467041015625, -0.037384033203125, 0.01495361328125, -0.0369873046875, -0.020263671875 ] ]
Meina/MeinaMix_V10
2023-05-25T11:22:20.000Z
[ "diffusers", "art", "anime", "stable diffusion", "text-to-image", "en", "license:creativeml-openrail-m", "endpoints_compatible", "diffusers:StableDiffusionPipeline", "region:us" ]
text-to-image
Meina
null
null
Meina/MeinaMix_V10
27
203,911
diffusers
2023-05-24T04:44:20
--- license: creativeml-openrail-m language: - en library_name: diffusers pipeline_tag: text-to-image tags: - art - anime - stable diffusion --- MeinaMix's objective is to produce good art with little prompting. For examples and prompts, please check out: https://civitai.com/models/7240/meinamix I have a Discord server where you can post images you generated, discuss prompts, and/or ask for help: https://discord.gg/XC9nGZNDUd If you like one of my models and want to support its updates, I've made a Ko-fi page (https://ko-fi.com/meina) where you can buy me a coffee <3, and a Patreon page (https://www.patreon.com/MeinaMix) where you can support me and get access to betas of my models! You may also try this model using Sinkin.ai: https://sinkin.ai/m/vln8Nwr MeinaMix and the other Meina models will ALWAYS be FREE. Recommendations of use: Enable Quantization in K samplers. Hires.fix is needed for prompts where the character is far away in order to make decent images; it drastically improves the quality of faces and eyes! Recommended parameters: Sampler: Euler a: 40 to 60 steps. Sampler: DPM++ SDE Karras: 30 to 60 steps. CFG Scale: 7. Resolutions: 512x768, 512x1024 for Portrait! Resolutions: 768x512, 1024x512, 1536x512 for Landscape! Hires.fix: R-ESRGAN 4x+Anime6b, with 10 steps at 0.1 up to 0.3 denoising. Clip Skip: 2. Negatives: ' (worst quality:2, low quality:2), (zombie, sketch, interlocked fingers, comic), '
1,445
[ [ -0.058990478515625, -0.0406494140625, 0.05535888671875, 0.027923583984375, -0.03961181640625, -0.0259857177734375, 0.005733489990234375, -0.049224853515625, 0.0322265625, 0.031280517578125, -0.0509033203125, -0.051239013671875, -0.03192138671875, 0.01409149169921875, -0.00684356689453125, 0.0638427734375, 0.00634765625, 0.016357421875, 0.003635406494140625, -0.00090789794921875, -0.06207275390625, 0.00838470458984375, -0.0938720703125, -0.0023040771484375, 0.0662841796875, 0.060882568359375, 0.056549072265625, 0.032470703125, 0.023223876953125, 0.02435302734375, -0.004337310791015625, 0.002166748046875, -0.0382080078125, 0.0182037353515625, -0.00978851318359375, -0.04254150390625, -0.04791259765625, -0.002445220947265625, 0.0169677734375, 0.026458740234375, 0.004871368408203125, 0.004474639892578125, -0.0232086181640625, 0.05328369140625, -0.0494384765625, -0.00447845458984375, 0.019134521484375, -0.0025920867919921875, -0.02752685546875, 0.0153656005859375, -0.020721435546875, -0.044464111328125, -0.00522613525390625, -0.05426025390625, 0.011383056640625, -0.00626373291015625, 0.06671142578125, 0.0139007568359375, -0.01324462890625, -0.0220184326171875, -0.051544189453125, 0.04302978515625, -0.07861328125, 0.0078277587890625, 0.0272064208984375, 0.06494140625, -0.0006632804870605469, -0.03887939453125, -0.028778076171875, -0.01436614990234375, 0.0109100341796875, 0.0240325927734375, -0.013885498046875, -0.000782012939453125, 0.045501708984375, 0.03497314453125, -0.0509033203125, 0.0023956298828125, -0.039520263671875, -0.00399017333984375, 0.062103271484375, 0.0167236328125, 0.062286376953125, -0.008636474609375, -0.0297088623046875, -0.03826904296875, -0.052276611328125, 0.01364898681640625, 0.043243408203125, -0.00984954833984375, -0.034698486328125, 0.038360595703125, -0.0210113525390625, 0.0246124267578125, 0.01282501220703125, 0.00995635986328125, -0.0041656494140625, -0.00922393798828125, 0.0219879150390625, -0.0018310546875, 0.046142578125, 
0.0682373046875, 0.0242767333984375, 0.032073974609375, -0.0240325927734375, -0.00714111328125, 0.029541015625, -0.09027099609375, -0.033233642578125, 0.0164642333984375, -0.06536865234375, -0.0221099853515625, -0.01520538330078125, -0.07403564453125, -0.0298004150390625, -0.0175628662109375, 0.047515869140625, -0.051849365234375, -0.02142333984375, -0.0129547119140625, -0.042022705078125, 0.021240234375, 0.058441162109375, -0.049591064453125, 0.0026493072509765625, -0.003948211669921875, 0.053955078125, 0.00981903076171875, -0.005359649658203125, -0.011260986328125, -0.01438140869140625, -0.042083740234375, 0.07470703125, -0.01242828369140625, -0.054412841796875, -0.0212554931640625, -0.004871368408203125, 0.022216796875, -0.055694580078125, 0.054412841796875, -0.032196044921875, 0.00791168212890625, -0.00817108154296875, -0.0298309326171875, -0.0291290283203125, -0.0146331787109375, -0.06512451171875, 0.021270751953125, 0.020904541015625, -0.0213470458984375, -0.0169830322265625, -0.0665283203125, -0.00909423828125, 0.0222015380859375, 0.0042877197265625, -0.004703521728515625, 0.0430908203125, 0.0079193115234375, 0.0238189697265625, -0.004730224609375, -0.007656097412109375, -0.05078125, -0.040283203125, 0.0256500244140625, -0.03515625, 0.0726318359375, 0.031341552734375, -0.021087646484375, -0.0007166862487792969, -0.07281494140625, 0.01387786865234375, 0.0252838134765625, 0.0123138427734375, -0.0177001953125, -0.005092620849609375, 0.0207061767578125, 0.01477813720703125, 0.0183258056640625, -0.016632080078125, 0.003421783447265625, -0.02899169921875, 0.01079559326171875, 0.07763671875, 0.00330352783203125, 0.021148681640625, -0.046478271484375, 0.044281005859375, 0.004077911376953125, 0.017669677734375, -0.037109375, -0.0484619140625, -0.0736083984375, -0.008026123046875, 0.0169830322265625, 0.031036376953125, -0.047271728515625, 0.0194549560546875, 0.007659912109375, -0.048370361328125, -0.045745849609375, -0.007144927978515625, 0.01111602783203125, 
0.020416259765625, 0.01294708251953125, -0.056182861328125, -0.024749755859375, -0.09844970703125, 0.035675048828125, 0.0025920867919921875, -0.02899169921875, 0.021453857421875, 0.0214996337890625, -0.0189056396484375, 0.05145263671875, -0.0291595458984375, -0.004962921142578125, 0.00011360645294189453, 0.0162811279296875, 0.035736083984375, 0.05145263671875, 0.048919677734375, -0.06024169921875, -0.0501708984375, -0.015106201171875, -0.054107666015625, -0.0181121826171875, 0.0088958740234375, -0.0286865234375, -0.01178741455078125, 0.0158233642578125, -0.066162109375, 0.034942626953125, 0.01568603515625, -0.02337646484375, 0.042633056640625, 0.0039825439453125, 0.0301666259765625, -0.0943603515625, 0.023773193359375, -0.006435394287109375, -0.0211029052734375, -0.052459716796875, 0.032470703125, -0.032745361328125, -0.0509033203125, -0.05517578125, 0.0552978515625, -0.0121002197265625, 0.0196075439453125, -0.0462646484375, 0.0018491744995117188, 0.012847900390625, 0.045867919921875, 0.00722503662109375, 0.05609130859375, 0.040740966796875, -0.05072021484375, 0.030517578125, 0.0218658447265625, -0.032745361328125, 0.0762939453125, -0.09417724609375, 0.0272674560546875, -0.0160369873046875, 0.005184173583984375, -0.0723876953125, -0.035064697265625, 0.0703125, -0.033599853515625, 0.0281829833984375, 0.0108489990234375, -0.016754150390625, -0.0236663818359375, -0.0243072509765625, 0.048858642578125, 0.065185546875, -0.0304107666015625, 0.047882080078125, -0.0021686553955078125, -0.0294647216796875, 0.0112457275390625, -0.01221466064453125, 0.00624847412109375, -0.0258636474609375, -0.046966552734375, 0.0343017578125, -0.041259765625, -0.032470703125, 0.008026123046875, 0.02191162109375, -0.0241241455078125, -0.01421356201171875, 0.01413726806640625, 0.0191192626953125, -0.03765869140625, -0.0199127197265625, 0.01160430908203125, -0.02899169921875, 0.00284576416015625, 0.0279388427734375, 0.0284271240234375, -0.0092315673828125, -0.037872314453125, -0.08575439453125, 
0.053619384765625, 0.05145263671875, 0.0279693603515625, -0.01117706298828125, 0.050048828125, -0.0291290283203125, 0.003093719482421875, -0.0311431884765625, -0.03802490234375, -0.037353515625, -0.009185791015625, -0.0360107421875, -0.0272674560546875, 0.053955078125, -0.012542724609375, 0.006561279296875, 0.043609619140625, 0.034881591796875, -0.0513916015625, 0.10906982421875, 0.039215087890625, 0.00818634033203125, 0.0224761962890625, -0.0245819091796875, -0.000579833984375, -0.058685302734375, 0.0075531005859375, -0.026641845703125, -0.05487060546875, -0.033782958984375, -0.0208587646484375, 0.03271484375, 0.018280029296875, -0.0035953521728515625, 0.0301513671875, -0.0107574462890625, 0.04351806640625, 0.041107177734375, 0.034210205078125, -0.005588531494140625, -0.0037708282470703125, 0.0201568603515625, -0.0031681060791015625, -0.046478271484375, -0.03271484375, 0.0455322265625, 0.0163116455078125, 0.06292724609375, 0.016815185546875, 0.04815673828125, 0.0264892578125, 0.00405120849609375, -0.056488037109375, 0.04595947265625, -0.01375579833984375, -0.057037353515625, -0.008270263671875, 0.0011768341064453125, -0.043701171875, -0.001918792724609375, 0.0018434524536132812, -0.03765869140625, 0.0516357421875, 0.0142822265625, -0.043548583984375, -0.0058441162109375, -0.049041748046875, 0.051788330078125, -0.005138397216796875, -0.01422882080078125, -0.0225677490234375, -0.04486083984375, 0.0275115966796875, 0.0139617919921875, 0.0030364990234375, -0.044219970703125, 0.02056884765625, 0.025360107421875, -0.042633056640625, 0.0902099609375, -0.01352691650390625, -0.004604339599609375, 0.04559326171875, 0.01190185546875, 0.0245513916015625, 0.01125335693359375, -0.012420654296875, 0.027923583984375, 0.005706787109375, -0.034759521484375, -0.06024169921875, 0.0428466796875, -0.056915283203125, -0.053619384765625, -0.00015842914581298828, -0.0245208740234375, 0.0103912353515625, 0.01226043701171875, 0.050048828125, 0.05316162109375, -0.0205535888671875, 
0.019287109375, 0.033447265625, -0.004314422607421875, 0.0435791015625, -0.00738525390625, -0.00507354736328125, -0.046417236328125, 0.0797119140625, -0.01477813720703125, 0.007305145263671875, -0.004146575927734375, 0.0252532958984375, -0.017791748046875, -0.003971099853515625, -0.0760498046875, 0.01232147216796875, -0.058807373046875, -0.0174713134765625, -0.001094818115234375, -0.0173797607421875, -0.0169830322265625, -0.0218658447265625, -0.02679443359375, -0.031005859375, -0.049407958984375, 0.041717529296875, 0.03814697265625, 0.04730224609375, -0.0271453857421875, 0.035675048828125, -0.052154541015625, 0.0157012939453125, 0.0263671875, 0.0193023681640625, 0.01290130615234375, -0.0255126953125, -0.0190277099609375, 0.015777587890625, -0.0288238525390625, -0.066650390625, 0.0212249755859375, -0.0004673004150390625, 0.026885986328125, 0.068115234375, 0.0088958740234375, 0.0653076171875, -0.05291748046875, 0.056854248046875, 0.0197296142578125, -0.059478759765625, 0.0609130859375, -0.039459228515625, 0.04193115234375, 0.06329345703125, 0.047882080078125, -0.04412841796875, -0.035675048828125, -0.05377197265625, -0.037078857421875, 0.045928955078125, 0.028289794921875, 0.0361328125, 0.00887298583984375, 0.044097900390625, 0.003551483154296875, 0.00685882568359375, -0.0440673828125, -0.0177154541015625, -0.032562255859375, -0.01148223876953125, 0.005870819091796875, -0.0340576171875, 0.0018568038940429688, -0.035980224609375, 0.0693359375, -0.005462646484375, 0.03302001953125, 0.020263671875, 0.0199127197265625, -0.0121307373046875, -0.0004878044128417969, 0.06146240234375, 0.0484619140625, -0.004039764404296875, -0.0081634521484375, 0.006481170654296875, -0.0151519775390625, 0.00881195068359375, -0.01418304443359375, -0.037261962890625, 0.0281524658203125, 0.017242431640625, 0.08685302734375, 0.0201416015625, -0.058349609375, 0.0128631591796875, 0.007472991943359375, 0.006275177001953125, -0.0164947509765625, 0.03961181640625, 0.0009398460388183594, 0.0302734375, 
0.001895904541015625, -0.0012912750244140625, 0.038970947265625, -0.0187225341796875, -0.0016956329345703125, -0.0008130073547363281, -0.0209808349609375, -0.0321044921875, 0.0462646484375, 0.013519287109375, -0.051544189453125, 0.06024169921875, -0.00998687744140625, -0.024169921875, 0.0703125, 0.046295166015625, 0.06146240234375, -0.0330810546875, 0.040283203125, 0.054443359375, -0.0088958740234375, 0.0219573974609375, 0.0272216796875, -0.00762939453125, -0.02899169921875, 0.0025768280029296875, -0.0210113525390625, -0.052154541015625, 0.0191192626953125, -0.03363037109375, 0.0631103515625, -0.033538818359375, -0.005828857421875, 0.004688262939453125, -0.01081085205078125, -0.0186004638671875, 0.019500732421875, 0.01221466064453125, 0.06414794921875, -0.032440185546875, 0.02667236328125, 0.03021240234375, -0.05426025390625, -0.0513916015625, -0.0152740478515625, 0.0090789794921875, -0.03631591796875, 0.031890869140625, 0.031951904296875, 0.0020275115966796875, 0.0179901123046875, -0.06292724609375, -0.05609130859375, 0.0657958984375, 0.00983428955078125, -0.0467529296875, -0.005283355712890625, -0.0184173583984375, 0.02532958984375, -0.0305328369140625, 0.0186920166015625, -0.0089111328125, 0.039825439453125, 0.013702392578125, -0.00850677490234375, -0.00710296630859375, -0.05389404296875, 0.03167724609375, -0.0107574462890625, -0.05035400390625, 0.056976318359375, -0.01291656494140625, -0.04095458984375, 0.05291748046875, 0.03912353515625, 0.0218658447265625, 0.03460693359375, 0.057647705078125, 0.040924072265625, 0.0208282470703125, 0.0123748779296875, 0.07745361328125, -0.0258636474609375, -0.0133514404296875, 0.0699462890625, -0.0093841552734375, 0.041412353515625, 0.00916290283203125, 0.006740570068359375, 0.057861328125, 0.08868408203125, -0.03497314453125, 0.04052734375, 0.002178192138671875, -0.0386962890625, -0.0211639404296875, -0.0240020751953125, -0.03656005859375, 0.0159149169921875, 0.0067596435546875, -0.011199951171875, -0.041046142578125, 
0.044036865234375, -0.029083251953125, -0.0056304931640625, -0.01544189453125, 0.047271728515625, 0.0196533203125, -0.0192718505859375, 0.035430908203125, 0.003292083740234375, 0.023406982421875, -0.0423583984375, -0.0271759033203125, -0.0196380615234375, -0.0150604248046875, 0.0060272216796875, -0.044647216796875, 0.016937255859375, -0.019927978515625, -0.0267333984375, -0.01065826416015625, 0.06353759765625, -0.0080413818359375, -0.031494140625, 0.00121307373046875, 0.02581787109375, 0.053466796875, 0.002765655517578125, -0.0604248046875, 0.0167236328125, -0.014373779296875, -0.0028362274169921875, -0.0117340087890625, 0.0096588134765625, -0.0142974853515625, 0.01727294921875, 0.040679931640625, -0.001613616943359375, -0.033050537109375, 0.02655029296875, 0.043731689453125, -0.03875732421875, -0.017242431640625, -0.057037353515625, 0.0408935546875, -0.0040435791015625, -0.0308380126953125, 0.04486083984375, 0.02099609375, 0.062347412109375, -0.049713134765625, 0.0162353515625, -0.007843017578125, 0.0063629150390625, -0.043975830078125, 0.0806884765625, -0.051605224609375, -0.038330078125, 0.002437591552734375, -0.07147216796875, -0.01065826416015625, 0.058990478515625, 0.0109710693359375, 0.007396697998046875, 0.023101806640625, 0.06756591796875, -0.00028014183044433594, 0.00951385498046875, 0.043182373046875, -0.0010585784912109375, 0.0016956329345703125, 0.03607177734375, 0.08489990234375, -0.035736083984375, -0.0014057159423828125, -0.06317138671875, -0.000054836273193359375, -0.030975341796875, -0.03900146484375, -0.064697265625, -0.04412841796875, -0.0335693359375, -0.0099639892578125, -0.002841949462890625, 0.05389404296875, 0.0743408203125, -0.053680419921875, -0.0236968994140625, -0.0015230178833007812, 0.0172576904296875, -0.00577545166015625, -0.018280029296875, 0.0121612548828125, -0.0090789794921875, -0.101318359375, 0.039215087890625, 0.00673675537109375, 0.0384521484375, -0.021240234375, 0.00551605224609375, -0.00780487060546875, 
0.01119232177734375, 0.051666259765625, 0.029022216796875, -0.048370361328125, -0.01384735107421875, -0.0134735107421875, -0.00756072998046875, 0.0105743408203125, 0.04766845703125, -0.0144195556640625, 0.035552978515625, 0.0401611328125, 0.0193939208984375, 0.00894927978515625, -0.0037689208984375, 0.043365478515625, -0.0252685546875, -0.0021495819091796875, 0.02191162109375, 0.0113372802734375, 0.0036830902099609375, -0.03680419921875, 0.039764404296875, 0.0175018310546875, -0.025115966796875, -0.04638671875, 0.0184173583984375, -0.068359375, -0.0282135009765625, 0.047149658203125, 0.010589599609375, -0.039794921875, 0.0229644775390625, -0.04376220703125, 0.0020427703857421875, -0.0211944580078125, 0.04302978515625, 0.0711669921875, -0.019439697265625, -0.0231781005859375, -0.052734375, 0.022125244140625, 0.0181884765625, -0.0711669921875, -0.034515380859375, 0.0660400390625, 0.00691986083984375, 0.03759765625, 0.059600830078125, -0.0186004638671875, 0.06256103515625, 0.0044708251953125, 0.0225067138671875, 0.0200347900390625, -0.041595458984375, -0.043792724609375, 0.005580902099609375, -0.0047760009765625, -0.0238494873046875 ] ]
rizvandwiki/gender-classification
2023-05-18T11:16:33.000Z
[ "transformers", "pytorch", "tensorboard", "safetensors", "vit", "image-classification", "huggingpics", "model-index", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
image-classification
rizvandwiki
null
null
rizvandwiki/gender-classification
4
201,226
transformers
2022-12-06T08:53:43
---
tags:
- image-classification
- pytorch
- huggingpics
metrics:
- accuracy
model-index:
- name: gender-classification
  results:
  - task:
      name: Image Classification
      type: image-classification
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.9244444370269775
---

# gender-classification

Autogenerated by HuggingPics🤗🖼️

Create your own image classifier for **anything** by running [the demo on Google Colab](https://colab.research.google.com/github/nateraw/huggingpics/blob/main/HuggingPics.ipynb).

Report any issues with the demo at the [github repo](https://github.com/nateraw/huggingpics).

## Example Images

#### female

![female](images/female.jpg)

#### male

![male](images/male.jpg)
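The card reports only an accuracy metric and two labels. As a minimal sketch of the classification step, the snippet below applies a plain softmax to a pair of logits, which is how a ViT head's per-label scores become the probabilities a user sees; the logit values are made up for illustration, and the commented `pipeline` call at the end is standard `transformers` usage rather than anything this card documents.

```python
from math import exp

# Labels taken from the card's "Example Images" section.
LABELS = ["female", "male"]

# Hypothetical raw scores from the classifier head (one per label);
# these numbers are invented for the example.
logits = [-1.2, 2.7]

def softmax(scores):
    """Turn raw logits into probabilities that sum to 1."""
    m = max(scores)                      # subtract the max for numerical stability
    exps = [exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax(logits)
best = max(range(len(LABELS)), key=lambda i: probs[i])
print(LABELS[best], round(probs[best], 4))

# Equivalent end-to-end call (downloads the checkpoint, so not run here):
# from transformers import pipeline
# clf = pipeline("image-classification", model="rizvandwiki/gender-classification")
# clf("images/male.jpg")
```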
739
[ [ -0.034423828125, -0.04052734375, 0.0038661956787109375, 0.0394287109375, -0.0164642333984375, 0.0161285400390625, 0.0184478759765625, -0.0301513671875, 0.031768798828125, 0.00991058349609375, -0.040924072265625, -0.048583984375, -0.045928955078125, 0.0161590576171875, -0.03173828125, 0.0693359375, 0.0037975311279296875, 0.0012874603271484375, -0.0035552978515625, 0.0035152435302734375, -0.04315185546875, -0.016937255859375, -0.043426513671875, -0.0270233154296875, 0.0223846435546875, 0.036346435546875, 0.039825439453125, 0.043212890625, 0.031524658203125, 0.035888671875, 0.006656646728515625, -0.0004744529724121094, -0.0164947509765625, 0.0019235610961914062, -0.002044677734375, -0.0367431640625, -0.038787841796875, 0.031890869140625, 0.0305328369140625, 0.0210723876953125, -0.0132293701171875, 0.0240478515625, -0.0200347900390625, 0.030120849609375, -0.03192138671875, 0.025177001953125, -0.02960205078125, 0.0294036865234375, 0.0011034011840820312, -0.00878143310546875, -0.00919342041015625, -0.0604248046875, 0.006366729736328125, -0.03558349609375, 0.03515625, 0.032318115234375, 0.078125, -0.009185791015625, -0.031463623046875, -0.0269317626953125, -0.01114654541015625, 0.0263824462890625, 0.006694793701171875, 0.007415771484375, 0.0176544189453125, 0.035919189453125, -0.01041412353515625, -0.042633056640625, -0.0233612060546875, -0.00490570068359375, 0.002384185791015625, -0.01407623291015625, -0.0308837890625, -0.00452423095703125, 0.004390716552734375, 0.042572021484375, -0.045867919921875, -0.01184844970703125, -0.049224853515625, -0.0218963623046875, 0.048675537109375, -0.01351165771484375, 0.0870361328125, -0.019012451171875, -0.0264739990234375, -0.001552581787109375, -0.0259857177734375, -0.003154754638671875, 0.038421630859375, 0.007259368896484375, -0.023101806640625, 0.0472412109375, 0.0210418701171875, 0.0269927978515625, 0.058929443359375, -0.000034928321838378906, 0.05621337890625, 0.00989532470703125, -0.0196685791015625, 0.012847900390625, 
0.07476806640625, 0.040985107421875, 0.01091766357421875, -0.00807952880859375, 0.0116119384765625, 0.006244659423828125, -0.00107574462890625, -0.05859375, -0.04095458984375, -0.012603759765625, -0.044342041015625, -0.051025390625, -0.003292083740234375, -0.047943115234375, -0.0245208740234375, -0.006786346435546875, 0.016754150390625, -0.0214691162109375, -0.0296783447265625, -0.011627197265625, -0.0369873046875, 0.024658203125, 0.0243682861328125, -0.0640869140625, 0.013641357421875, 0.0212860107421875, 0.051055908203125, 0.037078857421875, -0.022186279296875, -0.01255035400390625, -0.01345062255859375, -0.036529541015625, 0.057098388671875, -0.006313323974609375, -0.033111572265625, -0.01520538330078125, 0.023223876953125, -0.01172637939453125, -0.033050537109375, 0.043365478515625, -0.012481689453125, 0.048004150390625, -0.01541900634765625, -0.03375244140625, -0.021148681640625, 0.0017976760864257812, -0.039764404296875, 0.06463623046875, 0.022552490234375, -0.07208251953125, 0.0309600830078125, -0.06341552734375, 0.0022735595703125, 0.030517578125, -0.0257720947265625, -0.049560546875, -0.0098876953125, -0.0164947509765625, 0.035858154296875, -0.010498046875, 0.019744873046875, -0.048553466796875, -0.00870513916015625, 0.01358795166015625, 0.0284423828125, 0.07598876953125, 0.021392822265625, -0.0267333984375, 0.022674560546875, -0.02789306640625, -0.006893157958984375, 0.037872314453125, 0.016143798828125, -0.01824951171875, -0.03662109375, 0.00989532470703125, 0.01146697998046875, 0.00958251953125, -0.058502197265625, 0.049041748046875, 0.0165252685546875, 0.02191162109375, 0.04376220703125, -0.0200653076171875, 0.0537109375, -0.016998291015625, 0.05780029296875, -0.016082763671875, 0.0214996337890625, 0.014312744140625, -0.042327880859375, -0.0469970703125, -0.04937744140625, 0.0196533203125, 0.035003662109375, -0.040924072265625, 0.083740234375, 0.008453369140625, -0.047088623046875, -0.0205535888671875, 0.0135650634765625, 0.0308074951171875, 
0.037811279296875, -0.00026416778564453125, -0.0391845703125, -0.058685302734375, -0.0654296875, 0.021392822265625, -0.041748046875, 0.00823974609375, 0.0176239013671875, 0.054351806640625, -0.034423828125, 0.058319091796875, -0.04473876953125, -0.0268096923828125, 0.0035877227783203125, 0.04193115234375, -0.0004508495330810547, 0.061431884765625, 0.07366943359375, -0.051483154296875, -0.0042724609375, -0.0249786376953125, -0.05963134765625, -0.027374267578125, 0.02496337890625, -0.0421142578125, 0.007266998291015625, 0.04254150390625, -0.0247802734375, 0.042999267578125, 0.035552978515625, -0.05950927734375, 0.041412353515625, -0.0179290771484375, 0.01367950439453125, -0.0625, 0.0014629364013671875, 0.01125335693359375, -0.0462646484375, -0.0208740234375, 0.01346588134765625, 0.020599365234375, -0.014556884765625, -0.041015625, 0.03717041015625, -0.038177490234375, -0.007740020751953125, -0.023040771484375, -0.0249481201171875, -0.026458740234375, -0.006412506103515625, 0.007709503173828125, 0.0197906494140625, 0.06463623046875, -0.032073974609375, 0.0594482421875, 0.055267333984375, -0.01401519775390625, 0.04522705078125, -0.054168701171875, 0.0137786865234375, -0.020050048828125, 0.039825439453125, -0.077392578125, -0.043182373046875, 0.061920166015625, -0.048828125, 0.037506103515625, -0.020843505859375, -0.03619384765625, -0.047149658203125, -0.0325927734375, 0.040679931640625, 0.0654296875, -0.061004638671875, 0.0308380126953125, 0.028076171875, 0.003665924072265625, -0.031585693359375, -0.04931640625, 0.01043701171875, -0.027069091796875, -0.03961181640625, 0.00690460205078125, -0.01560211181640625, 0.00965118408203125, 0.0180206298828125, 0.0181884765625, -0.0214996337890625, -0.00952911376953125, 0.0439453125, 0.041748046875, 0.01210784912109375, 0.0177459716796875, 0.00015997886657714844, -0.0128936767578125, -0.0029239654541015625, -0.0036716461181640625, 0.0450439453125, -0.053253173828125, -0.01450347900390625, -0.040557861328125, 0.01462554931640625, 
0.032867431640625, 0.0127410888671875, 0.0228424072265625, 0.06781005859375, -0.032562255859375, -0.025054931640625, -0.031768798828125, 0.0018472671508789062, -0.035064697265625, -0.0014810562133789062, -0.035003662109375, -0.047943115234375, 0.035919189453125, 0.005584716796875, -0.025543212890625, 0.0380859375, 0.0340576171875, -0.02783203125, 0.07159423828125, 0.05865478515625, -0.01433563232421875, 0.026336669921875, -0.0227813720703125, -0.019775390625, -0.04437255859375, -0.0186309814453125, -0.0304412841796875, -0.037384033203125, -0.07867431640625, -0.0186309814453125, 0.005859375, -0.0019989013671875, -0.02850341796875, 0.05426025390625, -0.057769775390625, 0.0465087890625, 0.052703857421875, 0.0234222412109375, -0.006256103515625, -0.019287109375, 0.01125335693359375, 0.005950927734375, -0.04010009765625, -0.0220947265625, 0.05352783203125, 0.046417236328125, 0.07415771484375, -0.008697509765625, 0.08251953125, 0.00983428955078125, 0.03375244140625, -0.051788330078125, 0.0494384765625, -0.0191497802734375, -0.078125, 0.006999969482421875, -0.0291290283203125, -0.085205078125, -0.01291656494140625, -0.007709503173828125, -0.04705810546875, 0.0249786376953125, 0.02783203125, 0.0142059326171875, 0.0176544189453125, -0.0460205078125, 0.06585693359375, -0.0169677734375, 0.009124755859375, 0.0214080810546875, -0.04840087890625, 0.044281005859375, 0.0178070068359375, -0.002262115478515625, -0.03411865234375, -0.00641632080078125, 0.036468505859375, -0.0160369873046875, 0.0738525390625, -0.0242156982421875, 0.005828857421875, 0.0023860931396484375, 0.00757598876953125, -0.0264739990234375, -0.0118255615234375, 0.042510986328125, 0.0176544189453125, -0.0008592605590820312, -0.033172607421875, -0.0247802734375, 0.046539306640625, -0.0576171875, -0.0022373199462890625, -0.05499267578125, -0.01079559326171875, 0.01763916015625, -0.00469970703125, 0.046295166015625, 0.0074310302734375, -0.01079559326171875, 0.002376556396484375, 0.039581298828125, -0.024261474609375, 
0.0251312255859375, 0.00234222412109375, -0.05517578125, -0.043121337890625, 0.048126220703125, -0.0046539306640625, -0.015869140625, -0.00148773193359375, 0.037322998046875, -0.043670654296875, -0.01102447509765625, -0.038848876953125, 0.0211181640625, -0.045928955078125, -0.01110076904296875, -0.01261138916015625, -0.005207061767578125, -0.03955078125, -0.024566650390625, -0.0171356201171875, -0.0169219970703125, -0.0374755859375, -0.00946807861328125, 0.0280609130859375, 0.0239410400390625, -0.0137786865234375, 0.00920867919921875, -0.04461669921875, 0.05523681640625, 0.0465087890625, 0.027862548828125, -0.03466796875, -0.02880859375, 0.04278564453125, -0.0350341796875, -0.02984619140625, -0.06402587890625, 0.051971435546875, 0.040191650390625, 0.03125, 0.04071044921875, -0.005840301513671875, 0.05792236328125, -0.02264404296875, 0.0256500244140625, 0.0577392578125, -0.07958984375, 0.0574951171875, -0.0341796875, -0.0029048919677734375, 0.04071044921875, 0.03228759765625, -0.026123046875, -0.0220794677734375, -0.06964111328125, -0.043365478515625, 0.037811279296875, 0.025146484375, 0.00690460205078125, 0.0030517578125, 0.03948974609375, 0.01409912109375, 0.0222015380859375, -0.0655517578125, -0.025726318359375, -0.0098114013671875, -0.042816162109375, 0.015655517578125, -0.00494384765625, -0.0213470458984375, -0.044403076171875, 0.046722412109375, -0.0227508544921875, 0.032684326171875, 0.0164337158203125, -0.00909423828125, -0.032623291015625, -0.017333984375, 0.00948333740234375, 0.0175018310546875, -0.0268707275390625, 0.0083465576171875, -0.00701904296875, -0.02117919921875, 0.036224365234375, 0.01422119140625, -0.028961181640625, 0.0164031982421875, 0.008636474609375, 0.047943115234375, 0.006130218505859375, -0.0174102783203125, 0.0386962890625, -0.050445556640625, -0.0312042236328125, -0.040802001953125, 0.0201416015625, 0.0020809173583984375, 0.0204620361328125, 0.0144805908203125, 0.00794219970703125, 0.02960205078125, -0.058746337890625, 
0.005855560302734375, 0.0242919921875, -0.009552001953125, -0.0176239013671875, 0.046875, 0.027557373046875, -0.020721435546875, 0.04229736328125, -0.06280517578125, -0.059173583984375, 0.06634521484375, 0.03436279296875, 0.08905029296875, -0.02740478515625, 0.0450439453125, 0.042999267578125, 0.0162811279296875, -0.006191253662109375, 0.045379638671875, -0.00904083251953125, -0.055999755859375, -0.0017232894897460938, -0.034912109375, -0.0338134765625, 0.004283905029296875, -0.0308837890625, 0.047393798828125, -0.0469970703125, -0.022674560546875, 0.01059722900390625, -0.007049560546875, -0.048583984375, 0.032135009765625, 0.045745849609375, 0.055938720703125, -0.07464599609375, 0.033721923828125, 0.06488037109375, -0.0268096923828125, -0.036956787109375, -0.01294708251953125, 0.020538330078125, -0.05877685546875, 0.07501220703125, 0.04156494140625, -0.0084381103515625, -0.0128173828125, -0.07269287109375, -0.0338134765625, 0.069580078125, 0.0306549072265625, -0.024627685546875, -0.003810882568359375, -0.046630859375, 0.023651123046875, -0.038299560546875, 0.0303955078125, 0.0122833251953125, 0.0277862548828125, 0.01004791259765625, -0.08453369140625, -0.007472991943359375, -0.042327880859375, 0.0188140869140625, 0.01279449462890625, -0.044189453125, 0.049713134765625, -0.024078369140625, -0.0142669677734375, 0.0189666748046875, 0.0489501953125, 0.0184173583984375, 0.0389404296875, 0.0758056640625, 0.045135498046875, 0.044342041015625, -0.028839111328125, 0.042144775390625, 0.00560760498046875, 0.041168212890625, 0.0845947265625, 0.01248931884765625, 0.01146697998046875, -0.00029468536376953125, -0.020477294921875, 0.053009033203125, 0.07958984375, -0.041961669921875, 0.0082550048828125, 0.026336669921875, -0.019561767578125, -0.00482177734375, 0.004451751708984375, -0.033172607421875, 0.0513916015625, 0.006580352783203125, -0.048187255859375, -0.0025234222412109375, 0.01532745361328125, 0.00269317626953125, -0.005290985107421875, -0.021636962890625, 
0.0540771484375, -0.0142974853515625, -0.0160980224609375, 0.052215576171875, -0.018310546875, 0.06805419921875, -0.0355224609375, 0.005634307861328125, 0.0166473388671875, -0.0008788108825683594, -0.03326416015625, -0.0498046875, 0.0051422119140625, -0.007476806640625, 0.005924224853515625, -0.021820068359375, 0.0819091796875, -0.046051025390625, -0.04376220703125, 0.0205535888671875, 0.0238800048828125, 0.0232391357421875, 0.01483154296875, -0.07562255859375, -0.007175445556640625, -0.0164794921875, 0.0012025833129882812, 0.016754150390625, 0.01519775390625, 0.01013946533203125, 0.07000732421875, 0.030487060546875, 0.00750732421875, 0.005054473876953125, 0.003505706787109375, 0.03997802734375, -0.041259765625, -0.031402587890625, -0.04449462890625, 0.0027370452880859375, -0.0121917724609375, -0.0241546630859375, 0.057830810546875, 0.036346435546875, 0.05413818359375, -0.04461669921875, 0.05389404296875, -0.052764892578125, 0.0185089111328125, 0.0109710693359375, 0.03955078125, -0.040130615234375, -0.047821044921875, -0.040069580078125, -0.037506103515625, -0.022186279296875, 0.05755615234375, 0.000022351741790771484, 0.006679534912109375, 0.03887939453125, 0.039703369140625, -0.00771331787109375, -0.012451171875, -0.0004780292510986328, -0.02294921875, 0.0058441162109375, 0.02294921875, 0.07720947265625, -0.0272216796875, 0.011322021484375, -0.052520751953125, -0.041961669921875, -0.019378662109375, -0.08612060546875, -0.065185546875, -0.051788330078125, -0.07220458984375, -0.025665283203125, -0.0169677734375, 0.0889892578125, 0.0970458984375, -0.05523681640625, -0.0216827392578125, -0.031646728515625, 0.004436492919921875, 0.0169525146484375, -0.0198211669921875, 0.0208282470703125, -0.00949859619140625, -0.08123779296875, -0.0154266357421875, -0.004364013671875, 0.044830322265625, -0.0005855560302734375, 0.00974273681640625, 0.007762908935546875, -0.03515625, 0.026763916015625, 0.04486083984375, -0.033843994140625, -0.0174102783203125, -0.0288543701171875, 
0.00565338134765625, 0.00937652587890625, 0.03350830078125, -0.0277099609375, 0.038421630859375, 0.0592041015625, 0.0096588134765625, 0.0062103271484375, 0.0172119140625, 0.02862548828125, -0.0521240234375, 0.0115814208984375, 0.00888824462890625, 0.042144775390625, 0.035888671875, -0.032623291015625, 0.0457763671875, 0.044891357421875, -0.06072998046875, -0.053009033203125, 0.01003265380859375, -0.0897216796875, -0.01270294189453125, 0.0643310546875, -0.007312774658203125, -0.032989501953125, 0.0065460205078125, -0.05291748046875, 0.03729248046875, -0.04449462890625, 0.07025146484375, 0.0275115966796875, -0.029388427734375, -0.0240478515625, -0.0018711090087890625, 0.018829345703125, -0.01203155517578125, -0.0828857421875, -0.032073974609375, 0.0433349609375, 0.037811279296875, 0.033599853515625, 0.031524658203125, -0.0203704833984375, 0.03900146484375, -0.005466461181640625, 0.0589599609375, -0.004016876220703125, -0.004146575927734375, -0.032745361328125, 0.007801055908203125, -0.006000518798828125, -0.06488037109375 ] ]
allenai/wmt19-de-en-6-6-big
2023-01-24T16:28:51.000Z
[ "transformers", "pytorch", "fsmt", "text2text-generation", "translation", "wmt19", "allenai", "de", "en", "dataset:wmt19", "arxiv:2006.10369", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
translation
allenai
null
null
allenai/wmt19-de-en-6-6-big
3
200,429
transformers
2022-03-02T23:29:05
---
language:
- de
- en
thumbnail:
tags:
- translation
- wmt19
- allenai
license: apache-2.0
datasets:
- wmt19
metrics:
- bleu
---

# FSMT

## Model description

This is a ported version of the fairseq-based [wmt19 transformer](https://github.com/jungokasai/deep-shallow/) for de-en.

For more details, please see [Deep Encoder, Shallow Decoder: Reevaluating the Speed-Quality Tradeoff in Machine Translation](https://arxiv.org/abs/2006.10369).

Two models are available:

* [wmt19-de-en-6-6-big](https://huggingface.co/allenai/wmt19-de-en-6-6-big)
* [wmt19-de-en-6-6-base](https://huggingface.co/allenai/wmt19-de-en-6-6-base)

## Intended uses & limitations

#### How to use

```python
from transformers import FSMTForConditionalGeneration, FSMTTokenizer

mname = "allenai/wmt19-de-en-6-6-big"
tokenizer = FSMTTokenizer.from_pretrained(mname)
model = FSMTForConditionalGeneration.from_pretrained(mname)

input = "Maschinelles Lernen ist großartig, nicht wahr?"
input_ids = tokenizer.encode(input, return_tensors="pt")
outputs = model.generate(input_ids)
decoded = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(decoded)  # Machine learning is great, isn't it?
```

#### Limitations and bias

## Training data

Pretrained weights were left identical to the original model released by allenai. For more details, please see the [paper](https://arxiv.org/abs/2006.10369).
## Eval results

Here are the BLEU scores:

model | transformers
------|-------------
wmt19-de-en-6-6-big | 39.9

The score was calculated using this code:

```bash
git clone https://github.com/huggingface/transformers
cd transformers
export PAIR=de-en
export DATA_DIR=data/$PAIR
export SAVE_DIR=data/$PAIR
export BS=8
export NUM_BEAMS=5
mkdir -p $DATA_DIR
sacrebleu -t wmt19 -l $PAIR --echo src > $DATA_DIR/val.source
sacrebleu -t wmt19 -l $PAIR --echo ref > $DATA_DIR/val.target
echo $PAIR
PYTHONPATH="src:examples/seq2seq" python examples/seq2seq/run_eval.py allenai/wmt19-de-en-6-6-big $DATA_DIR/val.source $SAVE_DIR/test_translations.txt --reference_path $DATA_DIR/val.target --score_path $SAVE_DIR/test_bleu.json --bs $BS --task translation --num_beams $NUM_BEAMS
```

## Data Sources

- [training, etc.](http://www.statmt.org/wmt19/)
- [test set](http://matrix.statmt.org/test_sets/newstest2019.tgz?1556572561)

### BibTeX entry and citation info

```
@misc{kasai2020deep,
    title={Deep Encoder, Shallow Decoder: Reevaluating the Speed-Quality Tradeoff in Machine Translation},
    author={Jungo Kasai and Nikolaos Pappas and Hao Peng and James Cross and Noah A. Smith},
    year={2020},
    eprint={2006.10369},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```
2,667
[ [ -0.028289794921875, -0.0472412109375, 0.01442718505859375, 0.0235443115234375, -0.020721435546875, -0.01043701171875, -0.0306243896484375, -0.022735595703125, 0.0058441162109375, 0.007648468017578125, -0.047607421875, -0.0206451416015625, -0.066162109375, 0.0254058837890625, -0.0191802978515625, 0.06396484375, -0.017425537109375, 0.0187225341796875, 0.0169830322265625, -0.0028743743896484375, -0.0197296142578125, -0.01129913330078125, -0.038055419921875, -0.037353515625, 0.018310546875, 0.01312255859375, 0.059234619140625, 0.0264129638671875, 0.059234619140625, 0.031768798828125, -0.0160064697265625, 0.0024776458740234375, -0.032745361328125, -0.01348876953125, -0.0051116943359375, -0.0222320556640625, -0.03973388671875, -0.00678253173828125, 0.04754638671875, 0.050994873046875, 0.0092620849609375, 0.0293426513671875, 0.002841949462890625, 0.044677734375, -0.0243988037109375, -0.0004627704620361328, -0.041717529296875, 0.01297760009765625, -0.00791168212890625, -0.0219268798828125, -0.040191650390625, -0.0153350830078125, -0.00458526611328125, -0.0225067138671875, 0.01873779296875, -0.0115966796875, 0.10150146484375, 0.034576416015625, -0.0416259765625, 0.00838470458984375, -0.049652099609375, 0.06573486328125, -0.075439453125, 0.05780029296875, 0.016693115234375, 0.0228729248046875, -0.004962921142578125, -0.05596923828125, -0.0231170654296875, -0.005428314208984375, -0.0193023681640625, 0.0235748291015625, -0.0250396728515625, -0.0089263916015625, 0.043304443359375, 0.052001953125, -0.0576171875, -0.003398895263671875, -0.054290771484375, -0.0240478515625, 0.060272216796875, 0.005889892578125, -0.005474090576171875, -0.0157318115234375, -0.03424072265625, -0.0241241455078125, -0.025909423828125, 0.006694793701171875, 0.01526641845703125, 0.01470947265625, -0.0277099609375, 0.050994873046875, -0.027923583984375, 0.038604736328125, 0.0236663818359375, 0.00098419189453125, 0.06719970703125, -0.0302581787109375, -0.020416259765625, -0.0013275146484375, 
0.08648681640625, 0.027923583984375, 0.01033782958984375, -0.0129852294921875, -0.03173828125, -0.01119232177734375, 0.0148162841796875, -0.09503173828125, 0.0060577392578125, 0.00942230224609375, -0.04412841796875, -0.00429534912109375, 0.0239105224609375, -0.052032470703125, 0.016387939453125, -0.0200653076171875, 0.04974365234375, -0.039794921875, -0.0080718994140625, 0.012420654296875, -0.0143585205078125, 0.0330810546875, 0.004314422607421875, -0.044525146484375, 0.0093841552734375, 0.0184783935546875, 0.0723876953125, 0.0083770751953125, -0.038543701171875, -0.0296173095703125, -0.01033782958984375, -0.0213623046875, 0.0240020751953125, -0.0145263671875, -0.0230712890625, -0.0168304443359375, 0.033416748046875, -0.005146026611328125, -0.0167083740234375, 0.055694580078125, -0.0328369140625, 0.0257415771484375, -0.01165008544921875, -0.030731201171875, -0.012603759765625, 0.004894256591796875, -0.037078857421875, 0.08233642578125, 0.03631591796875, -0.0528564453125, 0.0110321044921875, -0.044586181640625, -0.043670654296875, -0.0033588409423828125, 0.018463134765625, -0.031494140625, 0.022796630859375, 0.01357269287109375, 0.03253173828125, -0.02880859375, 0.030303955078125, -0.01274871826171875, -0.06158447265625, 0.0222625732421875, -0.058807373046875, 0.06536865234375, 0.0309906005859375, -0.036712646484375, 0.004425048828125, -0.0526123046875, -0.0023746490478515625, 0.0212860107421875, -0.035858154296875, -0.0012407302856445312, -0.030792236328125, 0.0021038055419921875, 0.03228759765625, 0.029052734375, -0.0421142578125, 0.0098419189453125, -0.053131103515625, 0.035888671875, 0.051849365234375, -0.00907135009765625, 0.02813720703125, -0.0184326171875, 0.0282440185546875, 0.004375457763671875, 0.0211334228515625, 0.00220489501953125, -0.039459228515625, -0.0576171875, -0.011932373046875, 0.031463623046875, 0.0345458984375, -0.057525634765625, 0.055206298828125, -0.042572021484375, -0.06396484375, -0.06488037109375, -0.015472412109375, 0.026580810546875, 
0.0236663818359375, 0.061981201171875, -0.0174407958984375, -0.04876708984375, -0.07330322265625, -0.018280029296875, 0.019683837890625, -0.008056640625, -0.00757598876953125, 0.05560302734375, -0.049072265625, 0.027923583984375, -0.03826904296875, -0.016876220703125, -0.0171661376953125, -0.00738525390625, 0.051513671875, 0.060272216796875, 0.037139892578125, -0.042266845703125, -0.03302001953125, -0.01003265380859375, -0.052764892578125, -0.0044403076171875, 0.00392913818359375, -0.028045654296875, 0.0137939453125, 0.018157958984375, -0.07098388671875, 0.02252197265625, 0.036865234375, -0.045074462890625, 0.03570556640625, -0.00992584228515625, 0.03369140625, -0.10308837890625, 0.0221405029296875, 0.004161834716796875, -0.0097503662109375, -0.036712646484375, 0.0012521743774414062, 0.00807952880859375, -0.0029296875, -0.04638671875, 0.042999267578125, -0.02532958984375, -0.0008859634399414062, -0.0007853507995605469, -0.00955963134765625, 0.0168304443359375, 0.05517578125, -0.0142974853515625, 0.0460205078125, 0.0287628173828125, -0.0297088623046875, 0.01776123046875, 0.039459228515625, -0.015228271484375, 0.0216064453125, -0.06396484375, -0.0007061958312988281, 0.007038116455078125, 0.019775390625, -0.05242919921875, 0.003681182861328125, 0.0271453857421875, -0.052001953125, 0.03338623046875, -0.02508544921875, -0.03411865234375, -0.043060302734375, -0.0234375, 0.0287628173828125, 0.059051513671875, -0.0265655517578125, 0.027313232421875, -0.001430511474609375, 0.0112152099609375, -0.05438232421875, -0.08758544921875, -0.0205230712890625, -0.006183624267578125, -0.03411865234375, 0.037353515625, -0.01311492919921875, 0.0124053955078125, 0.007232666015625, -0.02508544921875, -0.000020682811737060547, -0.001983642578125, 0.0175018310546875, 0.01549530029296875, -0.0230712890625, -0.01861572265625, 0.0233612060546875, -0.0237579345703125, 0.01149749755859375, -0.025421142578125, 0.05108642578125, -0.024322509765625, -0.01080322265625, -0.04693603515625, 
0.009185791015625, 0.040679931640625, -0.03204345703125, 0.055145263671875, 0.0908203125, -0.034942626953125, 0.007724761962890625, -0.0179290771484375, -0.0246429443359375, -0.042694091796875, 0.03759765625, -0.04888916015625, -0.06146240234375, 0.02734375, 0.010406494140625, 0.0031986236572265625, 0.06549072265625, 0.0526123046875, 0.00948333740234375, 0.084228515625, 0.007404327392578125, 0.007450103759765625, 0.03125, -0.0645751953125, 0.01495361328125, -0.05914306640625, -0.0141448974609375, -0.032318115234375, -0.048828125, -0.052093505859375, -0.037933349609375, -0.004444122314453125, 0.0205078125, -0.044525146484375, 0.042572021484375, -0.045654296875, 0.01490020751953125, 0.036407470703125, 0.0003008842468261719, 0.0073394775390625, -0.00665283203125, -0.0087127685546875, -0.01470184326171875, -0.038604736328125, -0.0191650390625, 0.083251953125, 0.03668212890625, 0.056640625, 0.009429931640625, 0.055511474609375, 0.0117950439453125, 0.009613037109375, -0.058258056640625, 0.034820556640625, -0.0160369873046875, -0.030853271484375, -0.00977325439453125, -0.06341552734375, -0.07464599609375, 0.030975341796875, -0.008270263671875, -0.03955078125, 0.00777435302734375, -0.004352569580078125, -0.00003737211227416992, 0.0178985595703125, -0.05316162109375, 0.08465576171875, -0.01351165771484375, -0.0193023681640625, -0.0050811767578125, -0.045257568359375, 0.0233001708984375, 0.004474639892578125, 0.0079803466796875, -0.00891876220703125, 0.020233154296875, 0.06512451171875, -0.03094482421875, 0.0361328125, -0.0273895263671875, -0.00772857666015625, 0.0140228271484375, 0.015472412109375, 0.035858154296875, 0.01104736328125, -0.0170135498046875, 0.042633056640625, 0.0110321044921875, -0.03424072265625, -0.028839111328125, 0.060791015625, -0.0692138671875, -0.0292510986328125, -0.032073974609375, -0.0457763671875, 0.00826263427734375, 0.035125732421875, 0.040496826171875, 0.04412841796875, 0.00623321533203125, 0.033111572265625, 0.0251312255859375, 
-0.00457000732421875, 0.04046630859375, 0.030181884765625, 0.0003135204315185547, -0.04144287109375, 0.07220458984375, 0.00688934326171875, 0.005176544189453125, 0.04632568359375, 0.022430419921875, -0.038818359375, -0.038726806640625, -0.038726806640625, 0.0254058837890625, -0.045989990234375, -0.04473876953125, -0.040985107421875, -0.03204345703125, -0.029083251953125, -0.00830078125, -0.045135498046875, -0.037078857421875, -0.0018596649169921875, -0.0119171142578125, 0.0292205810546875, 0.0333251953125, -0.002349853515625, 0.01113128662109375, -0.07177734375, 0.017303466796875, -0.0079345703125, 0.029693603515625, -0.0022258758544921875, -0.0723876953125, -0.036895751953125, 0.02630615234375, -0.028564453125, -0.0667724609375, 0.03173828125, 0.0032978057861328125, 0.04547119140625, 0.0125885009765625, 0.01506805419921875, 0.0413818359375, -0.040740966796875, 0.06304931640625, 0.0009570121765136719, -0.07965087890625, 0.041168212890625, -0.005092620849609375, 0.0379638671875, 0.0384521484375, 0.0347900390625, -0.0303497314453125, -0.0281219482421875, -0.044586181640625, -0.0660400390625, 0.06842041015625, 0.031097412109375, 0.0035305023193359375, 0.01085662841796875, 0.006877899169921875, 0.0130157470703125, 0.00441741943359375, -0.05548095703125, -0.026031494140625, -0.0428466796875, -0.039031982421875, -0.024810791015625, -0.0030498504638671875, -0.00086212158203125, -0.03936767578125, 0.06390380859375, -0.01126861572265625, 0.049896240234375, 0.02813720703125, -0.03570556640625, -0.0092315673828125, 0.017669677734375, 0.040283203125, 0.032958984375, -0.02545166015625, 0.0142822265625, 0.032379150390625, -0.036773681640625, 0.01067352294921875, 0.0309906005859375, -0.0159149169921875, 0.01371002197265625, 0.03558349609375, 0.07781982421875, 0.0205841064453125, -0.033599853515625, 0.047149658203125, -0.003398895263671875, -0.03729248046875, -0.007171630859375, -0.0124359130859375, -0.001979827880859375, 0.032745361328125, 0.0289764404296875, 0.0229644775390625, 
0.01461029052734375, -0.00963592529296875, 0.0207672119140625, 0.0165252685546875, -0.045013427734375, -0.046142578125, 0.0616455078125, 0.01187896728515625, -0.025909423828125, 0.03472900390625, -0.03314208984375, -0.03466796875, 0.031005859375, 0.043304443359375, 0.0623779296875, -0.01425933837890625, 0.00524139404296875, 0.056396484375, 0.03851318359375, -0.00853729248046875, 0.026824951171875, 0.0016326904296875, -0.0399169921875, -0.039947509765625, -0.072509765625, 0.0029811859130859375, 0.0148162841796875, -0.05499267578125, 0.0201263427734375, -0.005016326904296875, -0.017059326171875, -0.01456451416015625, 0.021270751953125, -0.071044921875, 0.0174102783203125, 0.0005717277526855469, 0.07708740234375, -0.055328369140625, 0.06683349609375, 0.0546875, -0.042938232421875, -0.04888916015625, 0.004367828369140625, -0.0219879150390625, -0.04010009765625, 0.032135009765625, 0.029541015625, 0.0019369125366210938, 0.01397705078125, -0.009124755859375, -0.07171630859375, 0.0927734375, 0.033935546875, -0.03936767578125, -0.00101470947265625, 0.00949859619140625, 0.044586181640625, 0.006591796875, 0.037261962890625, 0.038421630859375, 0.052459716796875, -0.0159759521484375, -0.06402587890625, 0.039703369140625, -0.03558349609375, 0.006561279296875, 0.022064208984375, -0.060455322265625, 0.0621337890625, -0.00931549072265625, -0.0015707015991210938, 0.003276824951171875, 0.046539306640625, 0.01385498046875, 0.007598876953125, 0.017791748046875, 0.04107666015625, 0.0277099609375, -0.028106689453125, 0.0673828125, -0.0228271484375, 0.057769775390625, 0.0517578125, 0.0119171142578125, 0.051300048828125, 0.038726806640625, -0.027984619140625, 0.02691650390625, 0.04046630859375, -0.03143310546875, 0.04302978515625, 0.0030841827392578125, 0.0298919677734375, -0.004291534423828125, 0.0097198486328125, -0.05889892578125, 0.032562255859375, 0.00733184814453125, -0.02972412109375, -0.0074005126953125, -0.005153656005859375, 0.00032258033752441406, -0.00616455078125, 
-0.004749298095703125, 0.0255889892578125, 0.0157012939453125, -0.038909912109375, 0.07666015625, 0.0179595947265625, 0.06768798828125, -0.05169677734375, 0.0040740966796875, -0.0292510986328125, 0.0261383056640625, -0.015869140625, -0.06134033203125, 0.036468505859375, -0.0029163360595703125, -0.012298583984375, -0.0223846435546875, 0.04052734375, -0.039520263671875, -0.045928955078125, 0.039398193359375, 0.03668212890625, 0.016754150390625, 0.0025882720947265625, -0.064453125, 0.01385498046875, 0.00844573974609375, -0.050079345703125, 0.013641357421875, 0.033935546875, 0.014190673828125, 0.027069091796875, 0.051116943359375, -0.01114654541015625, 0.008544921875, 0.01311492919921875, 0.05047607421875, -0.05816650390625, -0.01824951171875, -0.06768798828125, 0.04791259765625, 0.0024356842041015625, -0.04107666015625, 0.05694580078125, 0.05963134765625, 0.07244873046875, -0.0265655517578125, 0.0352783203125, -0.017486572265625, 0.004669189453125, -0.04364013671875, 0.06097412109375, -0.056884765625, -0.005218505859375, -0.0278472900390625, -0.0836181640625, 0.0001436471939086914, 0.045196533203125, -0.00727081298828125, 0.01483154296875, 0.05926513671875, 0.05657958984375, -0.0135955810546875, -0.01233673095703125, 0.01531219482421875, 0.031494140625, 0.024444580078125, 0.067138671875, 0.0513916015625, -0.0867919921875, 0.054840087890625, -0.037750244140625, 0.0001360177993774414, -0.024993896484375, -0.029052734375, -0.06787109375, -0.045379638671875, -0.0207366943359375, -0.03338623046875, -0.0092926025390625, 0.06524658203125, 0.04718017578125, -0.05670166015625, 0.0005803108215332031, 0.0015974044799804688, 0.005062103271484375, -0.038909912109375, -0.022735595703125, 0.0318603515625, -0.013641357421875, -0.0738525390625, 0.04541015625, -0.01024627685546875, 0.01305389404296875, 0.0012941360473632812, -0.043212890625, -0.0218963623046875, -0.005298614501953125, 0.04364013671875, -0.0089874267578125, -0.048828125, -0.0176849365234375, 0.0097198486328125, 
-0.020416259765625, 0.002079010009765625, 0.026885986328125, -0.040618896484375, 0.0006580352783203125, 0.05328369140625, 0.049285888671875, 0.054931640625, -0.00954437255859375, 0.0303497314453125, -0.06060791015625, 0.0296783447265625, 0.007213592529296875, 0.042510986328125, 0.025604248046875, -0.017059326171875, 0.045928955078125, 0.0347900390625, -0.03717041015625, -0.0765380859375, -0.0026912689208984375, -0.07666015625, -0.0190582275390625, 0.08258056640625, -0.0021533966064453125, -0.0139923095703125, 0.03192138671875, -0.0192718505859375, 0.042449951171875, -0.034576416015625, 0.027923583984375, 0.0231781005859375, 0.0211639404296875, -0.0022029876708984375, -0.0380859375, 0.02655029296875, 0.02545166015625, -0.039886474609375, -0.023193359375, 0.005950927734375, 0.0219573974609375, 0.0146484375, 0.03631591796875, -0.01462554931640625, 0.01026153564453125, 0.008544921875, 0.0139617919921875, -0.024871826171875, -0.00848388671875, -0.0233917236328125, -0.0115509033203125, -0.01165008544921875, -0.0244598388671875 ] ]
microsoft/layoutxlm-base
2022-09-16T03:41:38.000Z
[ "transformers", "pytorch", "layoutlmv2", "arxiv:2104.08836", "license:cc-by-nc-sa-4.0", "endpoints_compatible", "region:us" ]
null
microsoft
null
null
microsoft/layoutxlm-base
47
198,564
transformers
2022-03-02T23:29:05
--- license: cc-by-nc-sa-4.0 --- # LayoutXLM **Multimodal (text + layout/format + image) pre-training for document AI** LayoutXLM is a multilingual variant of LayoutLMv2. The documentation of this model in the Transformers library can be found [here](https://huggingface.co/docs/transformers/model_doc/layoutxlm). [Microsoft Document AI](https://www.microsoft.com/en-us/research/project/document-ai/) | [GitHub](https://github.com/microsoft/unilm/tree/master/layoutxlm) ## Introduction LayoutXLM is a multimodal pre-trained model for multilingual document understanding, which aims to bridge the language barriers for visually-rich document understanding. Experimental results show that it significantly outperforms the existing SOTA cross-lingual pre-trained models on the XFUND dataset. [LayoutXLM: Multimodal Pre-training for Multilingual Visually-rich Document Understanding](https://arxiv.org/abs/2104.08836) Yiheng Xu, Tengchao Lv, Lei Cui, Guoxin Wang, Yijuan Lu, Dinei Florencio, Cha Zhang, Furu Wei, arXiv Preprint 2021
1,038
[ [ -0.0186767578125, -0.037322998046875, 0.037322998046875, 0.027740478515625, -0.0178680419921875, 0.0212249755859375, 0.01459503173828125, -0.0210113525390625, -0.00133514404296875, 0.031219482421875, -0.0516357421875, -0.045562744140625, -0.038604736328125, -0.0157623291015625, -0.0189971923828125, 0.06707763671875, -0.0267486572265625, 0.042236328125, -0.0298309326171875, -0.0146636962890625, -0.024444580078125, -0.0443115234375, -0.026824951171875, -0.01715087890625, 0.0347900390625, 0.00428009033203125, 0.049774169921875, 0.0299224853515625, 0.03851318359375, 0.024871826171875, 0.0028324127197265625, 0.01087188720703125, -0.022186279296875, -0.01262664794921875, 0.0038623809814453125, -0.0308380126953125, -0.044036865234375, 0.00579833984375, 0.05450439453125, 0.043121337890625, 0.0010204315185546875, -0.0003440380096435547, -0.0014371871948242188, 0.034423828125, -0.035369873046875, 0.0105438232421875, -0.01084136962890625, 0.02545166015625, -0.0220184326171875, -0.011383056640625, -0.042236328125, -0.01186370849609375, -0.0006885528564453125, -0.06219482421875, -0.0028247833251953125, 0.037933349609375, 0.0645751953125, 0.005001068115234375, -0.050689697265625, -0.0181427001953125, -0.0341796875, 0.057525634765625, -0.034088134765625, 0.0634765625, 0.03460693359375, 0.0166168212890625, 0.012237548828125, -0.07177734375, -0.04034423828125, -0.0114593505859375, -0.037628173828125, 0.0240325927734375, 0.0021991729736328125, -0.0007443428039550781, 0.0261688232421875, 0.01995849609375, -0.086181640625, 0.0022678375244140625, -0.0276336669921875, -0.03192138671875, 0.043792724609375, 0.008453369140625, 0.05029296875, -0.0147705078125, -0.044769287109375, -0.0230865478515625, -0.016357421875, -0.0006303787231445312, 0.0243682861328125, -0.00191497802734375, -0.03607177734375, 0.01540374755859375, 0.0298919677734375, 0.04693603515625, -0.032806396484375, -0.025604248046875, 0.040924072265625, -0.03851318359375, -0.0300750732421875, -0.0074462890625, 
0.0540771484375, 0.00554656982421875, -0.007610321044921875, -0.00928497314453125, -0.0222320556640625, -0.005870819091796875, 0.0158538818359375, -0.04095458984375, -0.0200042724609375, 0.00472259521484375, -0.054779052734375, 0.0036296844482421875, 0.0033111572265625, -0.035186767578125, -0.017974853515625, -0.037078857421875, 0.038330078125, -0.0552978515625, -0.024200439453125, 0.002010345458984375, -0.0176544189453125, 0.021087646484375, 0.04583740234375, -0.032806396484375, 0.0027027130126953125, 0.02984619140625, 0.059051513671875, -0.0264129638671875, -0.05224609375, -0.029754638671875, 0.0037136077880859375, -0.0205841064453125, 0.0635986328125, -0.0281219482421875, -0.04791259765625, 0.00327301025390625, 0.006862640380859375, -0.01666259765625, -0.0214996337890625, 0.05108642578125, -0.045806884765625, 0.055755615234375, 0.005252838134765625, -0.025115966796875, -0.0126800537109375, 0.033050537109375, -0.057342529296875, 0.0732421875, 0.02490234375, -0.05926513671875, 0.02850341796875, -0.060028076171875, -0.037506103515625, 0.0003561973571777344, -0.01194000244140625, -0.039337158203125, -0.00936126708984375, -0.0033855438232421875, 0.00687408447265625, 0.01068878173828125, -0.0003688335418701172, -0.01084136962890625, -0.0006847381591796875, -0.007213592529296875, -0.0285186767578125, 0.069580078125, 0.032623291015625, -0.0021514892578125, 0.039764404296875, -0.07781982421875, 0.0121612548828125, 0.0202484130859375, -0.0219268798828125, -0.040374755859375, -0.0246124267578125, 0.04608154296875, 0.043243408203125, 0.0220794677734375, -0.027252197265625, 0.014190673828125, -0.01214599609375, 0.005413055419921875, 0.043304443359375, -0.0438232421875, 0.053497314453125, -0.02020263671875, 0.051300048828125, -0.002155303955078125, 0.0211181640625, -0.05206298828125, -0.0234832763671875, -0.035980224609375, -0.0237579345703125, 0.009185791015625, 0.044525146484375, -0.06402587890625, 0.00637054443359375, -0.01169586181640625, -0.032012939453125, 
-0.0264739990234375, 0.0103912353515625, 0.06719970703125, 0.0240325927734375, 0.028900146484375, -0.010223388671875, -0.0506591796875, -0.051483154296875, -0.0080718994140625, 0.00787353515625, 0.01617431640625, 0.01129913330078125, 0.038909912109375, -0.024688720703125, 0.04632568359375, -0.0245819091796875, -0.034210205078125, -0.026641845703125, 0.0185699462890625, -0.0073089599609375, 0.03741455078125, 0.05865478515625, -0.0831298828125, -0.060302734375, 0.007640838623046875, -0.05780029296875, 0.01129150390625, -0.007965087890625, -0.024566650390625, 0.0345458984375, 0.037109375, -0.045562744140625, 0.050079345703125, 0.055755615234375, -0.04736328125, 0.044921875, -0.039459228515625, -0.00733184814453125, -0.0933837890625, -0.00322723388671875, -0.011688232421875, -0.0296783447265625, -0.041961669921875, 0.0118865966796875, 0.0430908203125, -0.016021728515625, -0.047607421875, 0.068359375, -0.061553955078125, -0.0178680419921875, -0.00757598876953125, 0.0036106109619140625, 0.033721923828125, 0.04010009765625, 0.01490020751953125, 0.054351806640625, 0.0233306884765625, -0.0081939697265625, 0.007610321044921875, 0.048797607421875, -0.029754638671875, 0.04010009765625, -0.0225830078125, 0.01326751708984375, -0.016387939453125, 0.035491943359375, -0.08673095703125, -0.0095062255859375, 0.00109100341796875, -0.01393890380859375, 0.038116455078125, 0.01496124267578125, -0.040863037109375, -0.035552978515625, -0.0258026123046875, 0.033660888671875, 0.0186614990234375, -0.040008544921875, 0.0625, 0.01226043701171875, -0.0008730888366699219, -0.032989501953125, -0.05419921875, -0.00626373291015625, 0.00637054443359375, -0.06353759765625, 0.0308990478515625, -0.0194549560546875, -0.01290130615234375, -0.007843017578125, 0.006710052490234375, -0.00417327880859375, -0.0007829666137695312, 0.0201416015625, 0.02947998046875, -0.021209716796875, 0.005889892578125, -0.000020384788513183594, -0.01540374755859375, -0.00682830810546875, -0.0166778564453125, 0.0657958984375, 
0.0023441314697265625, -0.04913330078125, -0.033599853515625, 0.05438232421875, 0.043487548828125, -0.035675048828125, 0.050506591796875, 0.06591796875, -0.01447296142578125, 0.006099700927734375, -0.0248565673828125, 0.01125335693359375, -0.032867431640625, 0.042236328125, -0.0300750732421875, -0.04608154296875, 0.030364990234375, 0.01274871826171875, 0.0091705322265625, 0.026885986328125, 0.0343017578125, -0.01837158203125, 0.0955810546875, 0.059906005859375, 0.01611328125, 0.04425048828125, -0.0258941650390625, 0.007236480712890625, -0.052032470703125, -0.04718017578125, -0.0345458984375, -0.024444580078125, -0.0170745849609375, -0.0350341796875, 0.0155792236328125, -0.00493621826171875, -0.021728515625, 0.010040283203125, -0.021728515625, 0.0125274658203125, 0.056640625, -0.0175933837890625, 0.0249176025390625, 0.008056640625, -0.024169921875, -0.010894775390625, -0.03765869140625, -0.038604736328125, 0.053375244140625, 0.020172119140625, 0.07061767578125, 0.012237548828125, 0.035369873046875, 0.02105712890625, 0.020599365234375, -0.042755126953125, 0.02349853515625, -0.0126190185546875, -0.0445556640625, -0.0233612060546875, -0.005420684814453125, -0.07275390625, 0.01409149169921875, -0.0118255615234375, -0.0421142578125, 0.0017766952514648438, 0.012847900390625, -0.009521484375, 0.0300750732421875, -0.07440185546875, 0.07666015625, -0.056365966796875, -0.00994110107421875, 0.0021381378173828125, -0.05078125, 0.01371002197265625, -0.0198516845703125, 0.0286712646484375, 0.00951385498046875, 0.01190185546875, 0.0484619140625, -0.03387451171875, 0.0421142578125, -0.0158233642578125, -0.0223541259765625, -0.016510009765625, 0.0038852691650390625, 0.042816162109375, -0.00146484375, 0.0005674362182617188, 0.0156707763671875, 0.007762908935546875, -0.032379150390625, -0.048370361328125, 0.035919189453125, -0.0821533203125, -0.0350341796875, -0.02996826171875, -0.0589599609375, -0.007732391357421875, 0.041534423828125, 0.039276123046875, 0.0350341796875, 
-0.0168304443359375, 0.007709503173828125, 0.0411376953125, -0.032196044921875, 0.031646728515625, 0.0380859375, -0.050079345703125, -0.0222320556640625, 0.058746337890625, 0.014678955078125, 0.0012760162353515625, 0.055908203125, 0.001598358154296875, -0.041473388671875, -0.033721923828125, -0.03802490234375, 0.007488250732421875, -0.054229736328125, -0.0004508495330810547, -0.082275390625, -0.034881591796875, -0.041168212890625, -0.0219879150390625, -0.01611328125, -0.019775390625, -0.0152435302734375, -0.004711151123046875, -0.004642486572265625, 0.04913330078125, 0.00791168212890625, 0.033355712890625, -0.059295654296875, 0.024566650390625, 0.01541900634765625, 0.0230255126953125, -0.0136871337890625, -0.04266357421875, -0.0160369873046875, 0.0008788108825683594, -0.032470703125, -0.05029296875, 0.037078857421875, 0.005329132080078125, 0.064208984375, 0.034271240234375, -0.015716552734375, 0.033599853515625, -0.044921875, 0.05517578125, 0.033721923828125, -0.056671142578125, 0.037567138671875, -0.0135345458984375, 0.0296173095703125, 0.0037441253662109375, 0.0357666015625, -0.03631591796875, 0.0037059783935546875, -0.04400634765625, -0.0531005859375, 0.07415771484375, 0.008636474609375, 0.0181884765625, 0.027252197265625, -0.00836181640625, 0.0235137939453125, 0.00923919677734375, -0.062408447265625, -0.031280517578125, -0.05010986328125, -0.016693115234375, -0.01336669921875, -0.040008544921875, -0.00628662109375, -0.0208740234375, 0.046630859375, -0.0038051605224609375, 0.0214691162109375, -0.001865386962890625, -0.037139892578125, 0.0115814208984375, 0.003955841064453125, 0.0802001953125, 0.05206298828125, -0.01172637939453125, 0.00701904296875, 0.0037021636962890625, -0.04644775390625, 0.011871337890625, 0.023468017578125, 0.00007128715515136719, 0.012115478515625, 0.05511474609375, 0.0966796875, 0.002635955810546875, -0.041473388671875, 0.04229736328125, -0.0225067138671875, -0.040435791015625, -0.036651611328125, -0.017242431640625, 
-0.0036487579345703125, 0.00904083251953125, 0.022430419921875, 0.005496978759765625, -0.001789093017578125, -0.03289794921875, 0.00858306884765625, 0.0367431640625, -0.0400390625, -0.02606201171875, 0.0587158203125, 0.01491546630859375, -0.0445556640625, 0.03656005859375, -0.016326904296875, -0.032501220703125, 0.03277587890625, 0.0638427734375, 0.047698974609375, -0.0181884765625, 0.0305633544921875, 0.00743865966796875, 0.017059326171875, 0.0274658203125, 0.0340576171875, -0.0120849609375, -0.056121826171875, -0.025787353515625, -0.051666259765625, -0.01190185546875, 0.008087158203125, -0.035980224609375, 0.0222015380859375, -0.028900146484375, 0.007762908935546875, -0.0032253265380859375, 0.0023441314697265625, -0.056549072265625, 0.025482177734375, 0.029632568359375, 0.10064697265625, -0.047607421875, 0.07757568359375, 0.085693359375, -0.0194091796875, -0.06646728515625, -0.00525665283203125, 0.01137542724609375, -0.08001708984375, 0.06610107421875, 0.00940704345703125, 0.00551605224609375, 0.0038051605224609375, -0.0266265869140625, -0.07269287109375, 0.07598876953125, 0.022308349609375, -0.0275421142578125, -0.0225982666015625, 0.0194549560546875, 0.034759521484375, -0.018585205078125, 0.035736083984375, 0.0073394775390625, 0.052703857421875, 0.003292083740234375, -0.058868408203125, -0.0181884765625, -0.0556640625, 0.0189208984375, 0.006504058837890625, -0.0506591796875, 0.0732421875, -0.0028018951416015625, -0.00879669189453125, 0.01233673095703125, 0.040771484375, 0.022796630859375, 0.0394287109375, 0.0418701171875, 0.042236328125, 0.05865478515625, -0.01041412353515625, 0.09466552734375, -0.0188446044921875, 0.01207733154296875, 0.09442138671875, -0.02935791015625, 0.029998779296875, 0.027313232421875, -0.00970458984375, 0.044921875, 0.049835205078125, 0.01178741455078125, 0.037506103515625, -0.018035888671875, 0.0134735107421875, -0.0174713134765625, -0.0017175674438476562, -0.045745849609375, 0.034942626953125, 0.0151214599609375, -0.036895751953125, 
-0.0233154296875, 0.039398193359375, 0.014984130859375, 0.006420135498046875, -0.00933074951171875, 0.053009033203125, -0.0019664764404296875, -0.0305328369140625, 0.024749755859375, -0.0017242431640625, 0.03924560546875, -0.07012939453125, 0.004940032958984375, -0.0308990478515625, 0.01302337646484375, -0.0159759521484375, -0.068603515625, 0.0183258056640625, -0.0304412841796875, -0.031097412109375, -0.043426513671875, 0.051483154296875, -0.0308074951171875, -0.052154541015625, 0.038909912109375, 0.0482177734375, -0.0094757080078125, -0.0013217926025390625, -0.06634521484375, 0.01093292236328125, 0.00250244140625, -0.019195556640625, 0.03851318359375, 0.03411865234375, -0.037139892578125, 0.04022216796875, 0.056121826171875, -0.0194549560546875, 0.01513671875, 0.025115966796875, 0.05950927734375, -0.0289154052734375, -0.058380126953125, -0.0513916015625, 0.04864501953125, -0.0223388671875, -0.02630615234375, 0.07452392578125, 0.061187744140625, 0.07452392578125, -0.020111083984375, 0.06341552734375, 0.0111083984375, 0.020477294921875, -0.04180908203125, 0.07281494140625, -0.082275390625, -0.0176544189453125, -0.0252227783203125, -0.0648193359375, -0.03619384765625, 0.03857421875, -0.018463134765625, 0.00899505615234375, 0.060882568359375, 0.05615234375, -0.0181121826171875, -0.01158905029296875, 0.05743408203125, 0.00797271728515625, 0.0264892578125, 0.00909423828125, 0.0743408203125, -0.0226287841796875, 0.050933837890625, -0.0179443359375, -0.012237548828125, -0.0195465087890625, -0.048797607421875, -0.07684326171875, -0.05987548828125, -0.010284423828125, -0.019775390625, -0.006977081298828125, 0.053466796875, 0.07843017578125, -0.050201416015625, 0.002040863037109375, 0.0126190185546875, 0.012115478515625, -0.0044403076171875, -0.0113983154296875, 0.042449951171875, -0.0173187255859375, -0.0567626953125, 0.00627899169921875, 0.0360107421875, 0.0230255126953125, -0.0308990478515625, -0.028289794921875, -0.020355224609375, -0.0010700225830078125, 
0.05096435546875, 0.006381988525390625, -0.038360595703125, 0.01107025146484375, 0.00677490234375, -0.031646728515625, 0.013824462890625, 0.059814453125, -0.0293731689453125, 0.03216552734375, 0.050018310546875, 0.0239105224609375, 0.031524658203125, -0.01468658447265625, 0.029754638671875, -0.05810546875, 0.045166015625, -0.0108489990234375, 0.05670166015625, 0.022979736328125, -0.0304412841796875, 0.0287628173828125, 0.0181427001953125, -0.02032470703125, -0.0517578125, 0.01416778564453125, -0.07183837890625, -0.027740478515625, 0.0804443359375, -0.016326904296875, -0.02459716796875, -0.01409149169921875, -0.05950927734375, 0.0081787109375, -0.01000213623046875, 0.0285491943359375, 0.033782958984375, 0.01214599609375, -0.037811279296875, -0.022491455078125, 0.033966064453125, 0.02557373046875, -0.06805419921875, -0.0297088623046875, 0.0214691162109375, -0.00557708740234375, 0.04315185546875, 0.055694580078125, -0.01123809814453125, 0.0236358642578125, -0.00820159912109375, 0.0229034423828125, -0.015380859375, -0.021575927734375, 0.005222320556640625, 0.0128021240234375, -0.01453399658203125, -0.002918243408203125 ] ]
busecarik/berturk-sunlp-ner-turkish
2023-01-09T19:38:54.000Z
[ "transformers", "pytorch", "tf", "bert", "token-classification", "tr", "dataset:SUNLP-NER-Twitter", "autotrain_compatible", "endpoints_compatible", "region:us" ]
token-classification
busecarik
null
null
busecarik/berturk-sunlp-ner-turkish
3
198,294
transformers
2022-08-26T12:34:48
--- language: tr datasets: - SUNLP-NER-Twitter --- # berturk-sunlp-ner-turkish ## Introduction **berturk-sunlp-ner-turkish** is a NER model fine-tuned from the BERTurk-cased model on the SUNLP-NER-Twitter dataset. ## Training data The model was trained on the SUNLP-NER-Twitter dataset (5,000 tweets). The dataset can be found at https://github.com/SU-NLP/SUNLP-Twitter-NER-Dataset Named entity types are as follows: Person, Location, Organization, Time, Money, Product, TV-Show ## How to use berturk-sunlp-ner-turkish with HuggingFace ```python from transformers import AutoTokenizer, AutoModelForTokenClassification tokenizer = AutoTokenizer.from_pretrained("busecarik/berturk-sunlp-ner-turkish") model = AutoModelForTokenClassification.from_pretrained("busecarik/berturk-sunlp-ner-turkish") ``` ## Model performance on the SUNLP-NER-Twitter test set (metric: seqeval) Precision|Recall|F1 -|-|- 82.96|82.42|82.69 Classification Report Entity|Precision|Recall|F1 -|-|-|- LOCATION|0.70|0.80|0.74 MONEY|0.80|0.71|0.75 ORGANIZATION|0.78|0.86|0.78 PERSON|0.90|0.91|0.91 PRODUCT|0.44|0.47|0.45 TIME|0.94|0.85|0.89 TVSHOW|0.61|0.35|0.45 If you use this model, please cite the following [paper](http://www.lrec-conf.org/proceedings/lrec2022/pdf/2022.lrec-1.484.pdf): ```bibtex @InProceedings{ark-yeniterzi:2022:LREC, author = {\c{C}ar\i k, Buse and Yeniterzi, Reyyan}, title = {A Twitter Corpus for Named Entity Recognition in Turkish}, booktitle = {Proceedings of the Language Resources and Evaluation Conference}, month = {June}, year = {2022}, address = {Marseille, France}, publisher = {European Language Resources Association}, pages = {4546--4551}, url = {https://aclanthology.org/2022.lrec-1.484} } ```
1,794
[ [ -0.033477783203125, -0.0469970703125, 0.00559234619140625, 0.01456451416015625, -0.02862548828125, -0.003841400146484375, -0.0180816650390625, -0.04644775390625, 0.03790283203125, 0.02301025390625, -0.041229248046875, -0.043670654296875, -0.05194091796875, 0.0180206298828125, -0.0244140625, 0.085205078125, -0.0039043426513671875, 0.0185089111328125, 0.0241241455078125, -0.01763916015625, 0.0127716064453125, -0.0261993408203125, -0.0699462890625, -0.025970458984375, 0.048583984375, 0.0219268798828125, 0.0258026123046875, 0.016845703125, 0.038421630859375, 0.0205230712890625, -0.00786590576171875, -0.00004303455352783203, -0.0133819580078125, -0.005794525146484375, -0.0013980865478515625, 0.004154205322265625, -0.031494140625, -0.0035457611083984375, 0.054931640625, 0.034942626953125, -0.01097869873046875, 0.030853271484375, 0.0117034912109375, 0.046417236328125, -0.026947021484375, 0.008697509765625, -0.0270233154296875, -0.0115509033203125, -0.0169677734375, 0.0086517333984375, -0.01216888427734375, -0.033416748046875, 0.0199737548828125, -0.0300750732421875, 0.0338134765625, 0.00771331787109375, 0.11029052734375, 0.0020923614501953125, -0.006031036376953125, -0.01568603515625, -0.0472412109375, 0.05615234375, -0.0748291015625, 0.040008544921875, 0.03466796875, 0.0250244140625, -0.014556884765625, -0.04058837890625, -0.055908203125, -0.0155487060546875, -0.01056671142578125, 0.001895904541015625, -0.00649261474609375, -0.00589752197265625, 0.016510009765625, 0.020660400390625, -0.03564453125, -0.00881195068359375, -0.0245208740234375, -0.0040435791015625, 0.04449462890625, -0.0106048583984375, 0.009674072265625, -0.034423828125, -0.0293426513671875, -0.016021728515625, -0.0287933349609375, 0.006084442138671875, 0.0304107666015625, 0.045257568359375, -0.0220489501953125, 0.055450439453125, -0.016510009765625, 0.047332763671875, 0.0161895751953125, -0.0082855224609375, 0.0235443115234375, -0.018096923828125, -0.0251617431640625, 0.00484466552734375, 
0.0718994140625, 0.01172637939453125, 0.005615234375, -0.00543975830078125, -0.0144195556640625, -0.034423828125, -0.00421142578125, -0.059478759765625, -0.0094146728515625, 0.0101776123046875, -0.042144775390625, -0.00836944580078125, 0.02374267578125, -0.05047607421875, -0.0129241943359375, -0.00882720947265625, 0.042022705078125, -0.04156494140625, -0.0253753662109375, -0.00003063678741455078, 0.004337310791015625, 0.0506591796875, 0.022369384765625, -0.06365966796875, 0.037994384765625, 0.040435791015625, 0.053436279296875, 0.0161285400390625, -0.0096893310546875, -0.01519012451171875, 0.00021398067474365234, -0.01111602783203125, 0.05877685546875, -0.015838623046875, -0.004253387451171875, 0.002349853515625, 0.0239715576171875, -0.011749267578125, -0.0181732177734375, 0.05072021484375, -0.03497314453125, 0.03759765625, -0.0439453125, -0.058837890625, -0.0164031982421875, 0.01247406005859375, -0.05645751953125, 0.0867919921875, 0.0189056396484375, -0.09033203125, 0.05194091796875, -0.0477294921875, -0.0210723876953125, 0.00902557373046875, -0.00313568115234375, -0.037933349609375, -0.01363372802734375, 0.021087646484375, 0.04522705078125, -0.0186614990234375, 0.0163421630859375, -0.0252532958984375, -0.00212860107421875, 0.006626129150390625, 0.00797271728515625, 0.075439453125, 0.02001953125, -0.0201416015625, -0.007366180419921875, -0.0531005859375, -0.02734375, 0.017486572265625, -0.038421630859375, -0.0246429443359375, -0.00719451904296875, 0.03997802734375, 0.0148773193359375, 0.027801513671875, -0.067626953125, 0.01157379150390625, -0.03076171875, 0.00891876220703125, 0.046356201171875, -0.0010881423950195312, 0.018524169921875, -0.031524658203125, 0.0227508544921875, -0.0157928466796875, 0.0014562606811523438, 0.022918701171875, -0.036956787109375, -0.07879638671875, -0.0208740234375, 0.036346435546875, 0.036041259765625, -0.04901123046875, 0.03875732421875, -0.0199737548828125, -0.04217529296875, -0.044525146484375, -0.01129150390625, 
0.01355743408203125, 0.056060791015625, 0.0247039794921875, -0.0027523040771484375, -0.06134033203125, -0.068603515625, -0.0287628173828125, -0.019012451171875, 0.01373291015625, 0.03521728515625, 0.05224609375, 0.0026912689208984375, 0.058074951171875, 0.005504608154296875, -0.029510498046875, -0.0290374755859375, 0.01201629638671875, 0.050750732421875, 0.04046630859375, 0.03582763671875, -0.048828125, -0.058929443359375, -0.006267547607421875, -0.0408935546875, 0.00463104248046875, 0.003566741943359375, 0.00208282470703125, 0.0545654296875, 0.023956298828125, -0.0380859375, 0.0310211181640625, 0.02490234375, -0.05450439453125, 0.046722412109375, -0.005615234375, 0.0172119140625, -0.108642578125, 0.02203369140625, 0.021087646484375, -0.01629638671875, -0.03656005859375, -0.00652313232421875, -0.016265869140625, 0.0147552490234375, -0.0298004150390625, 0.062408447265625, -0.0249786376953125, 0.0011262893676757812, 0.0032958984375, -0.0100555419921875, -0.018768310546875, 0.01332855224609375, 0.01055145263671875, 0.040313720703125, 0.029144287109375, -0.0357666015625, 0.03668212890625, 0.0228729248046875, -0.045013427734375, 0.034393310546875, -0.0491943359375, -0.0104827880859375, 0.00540924072265625, 0.01285552978515625, -0.05877685546875, -0.0195770263671875, 0.03973388671875, -0.0423583984375, 0.029388427734375, -0.0313720703125, -0.065185546875, -0.0230712890625, -0.0012197494506835938, 0.0138397216796875, 0.0284881591796875, -0.0304107666015625, 0.057159423828125, 0.020843505859375, -0.01551055908203125, -0.0626220703125, -0.054595947265625, 0.00589752197265625, -0.0184478759765625, -0.054107666015625, 0.029144287109375, 0.008514404296875, -0.00518798828125, 0.0246734619140625, -0.00772857666015625, -0.00948333740234375, -0.0224151611328125, -0.0008983612060546875, 0.020355224609375, -0.0206298828125, 0.006488800048828125, -0.019622802734375, -0.006801605224609375, -0.01172637939453125, -0.0232391357421875, 0.046600341796875, -0.02532958984375, 
-0.01242828369140625, -0.04443359375, 0.0160675048828125, 0.01727294921875, 0.00036144256591796875, 0.08734130859375, 0.0704345703125, -0.049591064453125, 0.017333984375, -0.0703125, -0.01493072509765625, -0.029296875, 0.0153350830078125, -0.0233001708984375, -0.06256103515625, 0.0606689453125, 0.02001953125, 0.017303466796875, 0.0657958984375, 0.044769287109375, -0.0171356201171875, 0.036041259765625, 0.060760498046875, -0.0200347900390625, 0.038238525390625, -0.02685546875, 0.011932373046875, -0.061187744140625, -0.03680419921875, -0.052337646484375, -0.0209197998046875, -0.05914306640625, -0.01383209228515625, -0.003753662109375, -0.0103302001953125, -0.032257080078125, 0.054595947265625, -0.03546142578125, 0.015167236328125, 0.04327392578125, 0.00643157958984375, 0.00598907470703125, 0.010955810546875, -0.015655517578125, -0.005535125732421875, -0.049163818359375, -0.0435791015625, 0.065673828125, 0.035858154296875, 0.05755615234375, 0.015899658203125, 0.07293701171875, 0.00766754150390625, 0.0027484893798828125, -0.0367431640625, 0.038818359375, -0.0182952880859375, -0.07330322265625, -0.0250396728515625, -0.032440185546875, -0.07196044921875, -0.01280975341796875, -0.0260467529296875, -0.07147216796875, 0.039215087890625, -0.01081085205078125, -0.0267486572265625, 0.04644775390625, -0.0285797119140625, 0.0604248046875, -0.0133514404296875, -0.00902557373046875, -0.0102081298828125, -0.03753662109375, 0.00681304931640625, 0.01226043701171875, 0.0173492431640625, -0.0196075439453125, -0.00023734569549560547, 0.084716796875, -0.0220489501953125, 0.053436279296875, -0.0204620361328125, 0.00736236572265625, 0.00577545166015625, 0.0034122467041015625, 0.03643798828125, 0.00927734375, -0.0103302001953125, 0.0063934326171875, 0.0099639892578125, -0.040313720703125, -0.0224761962890625, 0.048675537109375, -0.07257080078125, -0.0175323486328125, -0.055419921875, -0.0295257568359375, 0.005687713623046875, 0.0217742919921875, 0.03277587890625, 0.019989013671875, 
-0.031951904296875, 0.01557159423828125, 0.05120849609375, -0.0243377685546875, 0.044036865234375, 0.0382080078125, -0.007274627685546875, -0.035552978515625, 0.049957275390625, 0.01061248779296875, -0.008056640625, 0.03143310546875, 0.01035308837890625, -0.0182952880859375, -0.02227783203125, -0.0106353759765625, 0.04034423828125, -0.04150390625, -0.00876617431640625, -0.06414794921875, -0.040985107421875, -0.04595947265625, 0.0002846717834472656, -0.03857421875, -0.039947509765625, -0.0263519287109375, 0.0004048347473144531, 0.038818359375, 0.04730224609375, -0.0196990966796875, 0.0175628662109375, -0.033416748046875, 0.02313232421875, 0.0171051025390625, 0.04022216796875, -0.00913238525390625, -0.06683349609375, -0.030975341796875, -0.008575439453125, 0.00789642333984375, -0.064697265625, 0.04217529296875, 0.0185546875, 0.047454833984375, 0.03643798828125, 0.0004279613494873047, 0.04193115234375, -0.0213470458984375, 0.048980712890625, 0.00814056396484375, -0.043853759765625, 0.043975830078125, -0.0333251953125, 0.00443267822265625, 0.040679931640625, 0.028717041015625, -0.049896240234375, -0.01364898681640625, -0.0797119140625, -0.074951171875, 0.07086181640625, 0.034423828125, 0.004100799560546875, 0.01045989990234375, 0.0262908935546875, 0.006977081298828125, 0.00543975830078125, -0.0693359375, -0.031402587890625, -0.01568603515625, -0.01169586181640625, -0.0009975433349609375, -0.03839111328125, -0.0080108642578125, -0.035247802734375, 0.08984375, 0.0235748291015625, 0.038543701171875, 0.0313720703125, -0.00429534912109375, -0.01094818115234375, 0.0127716064453125, 0.04632568359375, 0.03680419921875, -0.05316162109375, -0.005603790283203125, 0.01233673095703125, -0.033843994140625, -0.016326904296875, 0.0477294921875, -0.00394439697265625, 0.0214691162109375, 0.028350830078125, 0.054779052734375, -0.003002166748046875, -0.00867462158203125, 0.02276611328125, -0.01317596435546875, -0.0196075439453125, -0.05352783203125, -0.0273895263671875, 
0.0027256011962890625, 0.0305938720703125, 0.04827880859375, -0.01241302490234375, 0.0001881122589111328, -0.0138397216796875, 0.025665283203125, 0.0300750732421875, -0.028228759765625, -0.01416778564453125, 0.02685546875, 0.0011854171752929688, -0.01727294921875, 0.0462646484375, -0.0176544189453125, -0.0321044921875, 0.04052734375, 0.0222015380859375, 0.08404541015625, -0.01263427734375, 0.01360321044921875, 0.061767578125, 0.046356201171875, 0.0049285888671875, 0.0308837890625, 0.00933837890625, -0.0670166015625, -0.024871826171875, -0.07135009765625, -0.0114288330078125, 0.033294677734375, -0.049591064453125, 0.029296875, -0.03985595703125, -0.025482177734375, 0.01351165771484375, 0.03515625, -0.06170654296875, 0.007785797119140625, 0.021453857421875, 0.08660888671875, -0.07318115234375, 0.068115234375, 0.0718994140625, -0.041229248046875, -0.0687255859375, -0.0167236328125, -0.01006317138671875, -0.050506591796875, 0.05352783203125, -0.001224517822265625, 0.01873779296875, -0.01043701171875, -0.048492431640625, -0.08233642578125, 0.0716552734375, 0.0099334716796875, -0.0301513671875, 0.000015079975128173828, -0.0019283294677734375, 0.0308837890625, -0.03717041015625, 0.01488494873046875, 0.02777099609375, 0.04193115234375, 0.0194091796875, -0.074462890625, 0.007904052734375, -0.030853271484375, -0.0076141357421875, 0.009002685546875, -0.049530029296875, 0.0606689453125, -0.004901885986328125, -0.002391815185546875, 0.00814056396484375, 0.053253173828125, 0.023406982421875, 0.01100921630859375, 0.031768798828125, 0.06744384765625, 0.035125732421875, -0.016143798828125, 0.06561279296875, -0.0142059326171875, 0.0506591796875, 0.07928466796875, 0.015716552734375, 0.054443359375, 0.04705810546875, -0.019989013671875, 0.06182861328125, 0.07012939453125, -0.0272979736328125, 0.039031982421875, -0.00902557373046875, -0.01294708251953125, -0.01776123046875, -0.006855010986328125, -0.0259857177734375, 0.0167694091796875, 0.0224151611328125, -0.0328369140625, 
-0.0341796875, -0.0194091796875, 0.021270751953125, -0.025848388671875, 0.0024662017822265625, 0.058837890625, 0.0124359130859375, -0.019134521484375, 0.036163330078125, 0.004974365234375, 0.038665771484375, -0.03765869140625, 0.004741668701171875, -0.006023406982421875, 0.00580596923828125, -0.009490966796875, -0.04974365234375, 0.021697998046875, 0.0017824172973632812, -0.01184844970703125, -0.0007939338684082031, 0.05706787109375, -0.0285797119140625, -0.07159423828125, 0.014892578125, 0.030029296875, 0.0299835205078125, 0.0067596435546875, -0.0833740234375, -0.0196990966796875, -0.00798797607421875, -0.036590576171875, 0.01070404052734375, 0.036041259765625, 0.01012420654296875, 0.05328369140625, 0.04541015625, 0.0173492431640625, -0.0020236968994140625, 0.01132965087890625, 0.0577392578125, -0.054931640625, -0.0258941650390625, -0.0631103515625, 0.036102294921875, -0.01050567626953125, -0.021575927734375, 0.052947998046875, 0.04632568359375, 0.056640625, 0.006168365478515625, 0.0509033203125, -0.05059814453125, 0.05126953125, -0.019256591796875, 0.07318115234375, -0.045501708984375, 0.0001150369644165039, -0.039031982421875, -0.0716552734375, -0.019744873046875, 0.06640625, -0.0181732177734375, 0.0184173583984375, 0.0379638671875, 0.0506591796875, -0.006153106689453125, -0.0226593017578125, -0.0025501251220703125, 0.026519775390625, 0.00846099853515625, 0.02984619140625, 0.03460693359375, -0.0465087890625, 0.02288818359375, -0.04449462890625, -0.0169219970703125, -0.031341552734375, -0.07598876953125, -0.07952880859375, -0.0633544921875, -0.0284576416015625, -0.0577392578125, 0.0067596435546875, 0.08294677734375, 0.046600341796875, -0.0906982421875, -0.0289154052734375, -0.00133514404296875, 0.0014505386352539062, -0.005458831787109375, -0.015777587890625, 0.05633544921875, -0.0198974609375, -0.039581298828125, -0.01123809814453125, -0.01324462890625, 0.0195465087890625, -0.00447845458984375, -0.00476837158203125, -0.046600341796875, 0.0033397674560546875, 
0.0206146240234375, 0.033843994140625, -0.07061767578125, -0.01198577880859375, 0.00736236572265625, -0.0135040283203125, -0.006313323974609375, 0.0221099853515625, -0.045654296875, 0.0228424072265625, 0.034271240234375, 0.0234832763671875, 0.052490234375, -0.00785064697265625, 0.0263519287109375, -0.035675048828125, 0.014007568359375, 0.0213470458984375, 0.0391845703125, 0.032806396484375, -0.0223541259765625, 0.032562255859375, 0.01812744140625, -0.041839599609375, -0.04791259765625, -0.01172637939453125, -0.08697509765625, -0.0030002593994140625, 0.0855712890625, -0.00028777122497558594, -0.021270751953125, -0.0034999847412109375, 0.000579833984375, 0.04583740234375, -0.05914306640625, 0.06610107421875, 0.060943603515625, 0.0017032623291015625, -0.0305023193359375, -0.04168701171875, 0.050811767578125, 0.012451171875, -0.030029296875, -0.006832122802734375, 0.0028972625732421875, 0.039031982421875, 0.010589599609375, 0.042755126953125, -0.007602691650390625, 0.00939178466796875, -0.0196990966796875, 0.0404052734375, 0.006534576416015625, -0.0023632049560546875, -0.0144805908203125, -0.0245819091796875, -0.008026123046875, -0.0141143798828125 ] ]
kk08/CryptoBERT
2023-09-12T06:37:34.000Z
[ "transformers", "pytorch", "safetensors", "bert", "text-classification", "generated_from_trainer", "crypto", "sentiment", "analysis", "en", "endpoints_compatible", "region:us" ]
text-classification
kk08
null
null
kk08/CryptoBERT
7
193,483
transformers
2023-04-13T17:52:32
---
language:
- en
tags:
- generated_from_trainer
- crypto
- sentiment
- analysis
pipeline_tag: text-classification
base_model: ProsusAI/finbert
model-index:
- name: CryptoBERT
  results: []
---

# CryptoBERT

This model is a fine-tuned version of [ProsusAI/finbert](https://huggingface.co/ProsusAI/finbert) on the Custom Crypto Market Sentiment dataset. It achieves the following results on the evaluation set:
- Loss: 0.3823

```python
from transformers import BertTokenizer, BertForSequenceClassification, pipeline

tokenizer = BertTokenizer.from_pretrained("kk08/CryptoBERT")
model = BertForSequenceClassification.from_pretrained("kk08/CryptoBERT")
classifier = pipeline("sentiment-analysis", model=model, tokenizer=tokenizer)

text = "Bitcoin (BTC) touches $29k, Ethereum (ETH) Set To Explode, RenQ Finance (RENQ) Crosses Massive Milestone"
result = classifier(text)
print(result)
```
```
[{'label': 'LABEL_1', 'score': 0.9678454399108887}]
```

## Model description

This model fine-tunes [ProsusAI/finbert](https://huggingface.co/ProsusAI/finbert), a model pre-trained for sentiment analysis of financial text. CryptoBERT adapts it in a downstream task on custom crypto sentiment data, predicting whether a given text about the crypto market is positive (LABEL_1) or negative (LABEL_0).

## Intended uses & limitations

The model performs well on crypto-related text.
The main limitation is that fine-tuning was performed on only a small corpus of data.

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 0.4077 | 1.0 | 27 | 0.4257 |
| 0.2048 | 2.0 | 54 | 0.2479 |
| 0.0725 | 3.0 | 81 | 0.3068 |
| 0.0028 | 4.0 | 108 | 0.4120 |
| 0.0014 | 5.0 | 135 | 0.3566 |
| 0.0007 | 6.0 | 162 | 0.3495 |
| 0.0006 | 7.0 | 189 | 0.3645 |
| 0.0005 | 8.0 | 216 | 0.3754 |
| 0.0004 | 9.0 | 243 | 0.3804 |
| 0.0004 | 10.0 | 270 | 0.3823 |

### Framework versions

- Transformers 4.28.0
- Pytorch 2.0.0+cu118
- Datasets 2.11.0
- Tokenizers 0.13.3
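Since the pipeline returns opaque `LABEL_0`/`LABEL_1` names, a small helper — a sketch, not part of the released model — can map the raw output to human-readable sentiment, using the label convention stated above (LABEL_1 = positive, LABEL_0 = negative):

```python
# Hypothetical post-processing helper (not an official CryptoBERT API):
# convert pipeline output dicts into readable (sentiment, score) tuples.
LABEL_MAP = {"LABEL_0": "negative", "LABEL_1": "positive"}

def readable_sentiment(results):
    """Map a list of {'label', 'score'} dicts to (sentiment, rounded score)."""
    return [
        (LABEL_MAP.get(r["label"], "unknown"), round(r["score"], 4))
        for r in results
    ]

# Applied to the example output shown above:
print(readable_sentiment([{"label": "LABEL_1", "score": 0.9678454399108887}]))
# → [('positive', 0.9678)]
```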
2,753
[ [ -0.036865234375, -0.039154052734375, 0.0006775856018066406, 0.004154205322265625, -0.0283660888671875, -0.006473541259765625, -0.00926971435546875, -0.0274200439453125, 0.02301025390625, 0.02886962890625, -0.045379638671875, -0.058074951171875, -0.055816650390625, -0.0084381103515625, -0.0285491943359375, 0.107177734375, 0.0206756591796875, 0.01265716552734375, 0.0024394989013671875, 0.0031280517578125, -0.011932373046875, -0.0487060546875, -0.0699462890625, -0.040496826171875, 0.00472259521484375, 0.0214080810546875, 0.0675048828125, 0.034759521484375, 0.04931640625, 0.028839111328125, -0.0182952880859375, -0.01023101806640625, -0.0278167724609375, -0.0254058837890625, 0.01546478271484375, -0.041015625, -0.045135498046875, 0.006084442138671875, 0.0244140625, 0.0313720703125, -0.0162506103515625, 0.0290985107421875, 0.00434112548828125, 0.03515625, -0.034332275390625, 0.01543426513671875, -0.0440673828125, 0.0123138427734375, -0.0066680908203125, -0.01898193359375, -0.01316070556640625, -0.01861572265625, 0.0243377685546875, -0.0455322265625, 0.040557861328125, 0.0042877197265625, 0.087158203125, 0.015625, -0.0145263671875, -0.0179901123046875, -0.04547119140625, 0.061065673828125, -0.07293701171875, 0.0245819091796875, 0.040924072265625, 0.0018262863159179688, 0.021453857421875, -0.058746337890625, -0.02813720703125, -0.005580902099609375, 0.004119873046875, 0.0179443359375, -0.00565338134765625, -0.0009717941284179688, 0.037139892578125, 0.046875, -0.03350830078125, -0.005069732666015625, -0.0450439453125, -0.027557373046875, 0.047698974609375, 0.0292816162109375, -0.01702880859375, -0.0294342041015625, -0.0341796875, -0.0214996337890625, -0.0161590576171875, 0.0247650146484375, 0.04327392578125, 0.0191497802734375, -0.0253448486328125, 0.0308837890625, -0.0062255859375, 0.04815673828125, 0.0172271728515625, -0.0211181640625, 0.05029296875, -0.006847381591796875, -0.0283050537109375, 0.0179290771484375, 0.07855224609375, 0.040618896484375, 
0.01171112060546875, 0.00920867919921875, -0.0299835205078125, 0.002414703369140625, 0.0198974609375, -0.0789794921875, -0.0280914306640625, 0.0231781005859375, -0.035888671875, -0.058441162109375, 0.01055908203125, -0.055938720703125, -0.004871368408203125, -0.0321044921875, 0.037872314453125, -0.047454833984375, -0.0163726806640625, 0.0261077880859375, -0.0191802978515625, 0.0114593505859375, -0.0004825592041015625, -0.07659912109375, 0.03057861328125, 0.06304931640625, 0.0533447265625, 0.01306915283203125, -0.01837158203125, -0.01053619384765625, -0.01404571533203125, -0.027496337890625, 0.036346435546875, -0.0165252685546875, -0.0233917236328125, -0.0085601806640625, 0.0013370513916015625, -0.004962921142578125, -0.033782958984375, 0.0458984375, -0.0322265625, 0.02508544921875, -0.0081787109375, -0.052947998046875, -0.01271820068359375, 0.0321044921875, -0.03448486328125, 0.0860595703125, 0.0122833251953125, -0.0885009765625, 0.0379638671875, -0.0411376953125, -0.0192718505859375, 0.00040149688720703125, -0.006587982177734375, -0.06298828125, -0.01361846923828125, 0.00005352497100830078, 0.030029296875, -0.01390838623046875, 0.021636962890625, -0.034332275390625, -0.040924072265625, 0.036529541015625, -0.0256805419921875, 0.0692138671875, -0.0018854141235351562, -0.053131103515625, 0.0143280029296875, -0.08428955078125, 0.005527496337890625, 0.01122283935546875, -0.01299285888671875, -0.0160064697265625, -0.01418304443359375, 0.0304718017578125, 0.017852783203125, 0.02923583984375, -0.039337158203125, 0.00670623779296875, -0.05517578125, 0.05499267578125, 0.058197021484375, 0.0010890960693359375, 0.021026611328125, -0.020782470703125, 0.029022216796875, 0.0141143798828125, 0.0369873046875, 0.0156707763671875, -0.0274810791015625, -0.0672607421875, -0.0219573974609375, 0.0248565673828125, 0.053192138671875, -0.0085296630859375, 0.060882568359375, -0.01390838623046875, -0.045379638671875, -0.0352783203125, 0.003376007080078125, 0.006420135498046875, 
0.047637939453125, 0.0333251953125, -0.0106048583984375, -0.043670654296875, -0.07244873046875, -0.00016248226165771484, -0.009307861328125, 0.0086517333984375, 0.00540924072265625, 0.0576171875, -0.0300140380859375, 0.0628662109375, -0.04205322265625, -0.018096923828125, -0.005435943603515625, 0.0208740234375, 0.03668212890625, 0.04327392578125, 0.044708251953125, -0.049163818359375, -0.0194244384765625, -0.0084686279296875, -0.047027587890625, 0.0285491943359375, -0.0155792236328125, -0.01303863525390625, 0.0001323223114013672, 0.0345458984375, -0.046722412109375, 0.037506103515625, 0.0323486328125, -0.03533935546875, 0.055572509765625, -0.00738525390625, 0.000484466552734375, -0.09881591796875, 0.020294189453125, 0.0092010498046875, -0.01513671875, -0.0300445556640625, -0.012542724609375, 0.0004115104675292969, -0.01690673828125, -0.017730712890625, 0.03240966796875, 0.01027679443359375, 0.0161590576171875, -0.01499176025390625, -0.006038665771484375, -0.0016603469848632812, 0.0477294921875, 0.0169525146484375, 0.050048828125, 0.05364990234375, -0.034759521484375, 0.025604248046875, 0.0244140625, -0.02685546875, 0.024078369140625, -0.07196044921875, 0.0025482177734375, 0.010528564453125, -0.014892578125, -0.067138671875, -0.00588226318359375, 0.050506591796875, -0.0435791015625, 0.01568603515625, -0.0079193115234375, -0.024444580078125, -0.02691650390625, -0.0380859375, 0.01088714599609375, 0.03729248046875, -0.0178985595703125, 0.037872314453125, -0.005489349365234375, 0.0086212158203125, -0.0677490234375, -0.057373046875, -0.0124969482421875, -0.025115966796875, -0.04010009765625, 0.032745361328125, -0.01151275634765625, -0.00807952880859375, -0.0038356781005859375, -0.01003265380859375, 0.00039386749267578125, -0.00313568115234375, 0.03997802734375, 0.0367431640625, -0.007717132568359375, 0.006923675537109375, -0.007747650146484375, -0.0283203125, 0.0286102294921875, -0.00873565673828125, 0.053985595703125, -0.03424072265625, -0.00836944580078125, 
-0.05609130859375, -0.0090789794921875, 0.034332275390625, -0.01666259765625, 0.0782470703125, 0.059112548828125, -0.0267181396484375, 0.0046844482421875, -0.0277862548828125, -0.028045654296875, -0.04193115234375, 0.033172607421875, -0.0298004150390625, -0.05450439453125, 0.060516357421875, 0.00688934326171875, 0.012298583984375, 0.068115234375, 0.029388427734375, 0.00424957275390625, 0.0789794921875, 0.03472900390625, -0.018402099609375, 0.030609130859375, -0.065185546875, 0.0222320556640625, -0.048736572265625, -0.0224761962890625, -0.037811279296875, -0.01303863525390625, -0.05902099609375, 0.00608062744140625, -0.0011262893676757812, 0.02764892578125, -0.046600341796875, 0.017822265625, -0.047637939453125, -0.0017185211181640625, 0.052459716796875, 0.02362060546875, 0.0034694671630859375, 0.006984710693359375, -0.01467132568359375, -0.0184173583984375, -0.0543212890625, -0.032989501953125, 0.09588623046875, 0.0372314453125, 0.0528564453125, -0.0252532958984375, 0.0655517578125, 0.00797271728515625, 0.0194854736328125, -0.061767578125, 0.030426025390625, 0.0030231475830078125, -0.060455322265625, -0.016815185546875, -0.044708251953125, -0.050506591796875, 0.006381988525390625, -0.0345458984375, -0.036102294921875, 0.0282135009765625, 0.01427459716796875, -0.0330810546875, 0.025848388671875, -0.0229644775390625, 0.0919189453125, -0.01522064208984375, -0.0204620361328125, -0.0159149169921875, -0.0386962890625, 0.015960693359375, -0.00646209716796875, 0.0097198486328125, -0.003345489501953125, -0.0023670196533203125, 0.07659912109375, -0.044403076171875, 0.06536865234375, -0.004398345947265625, 0.01100921630859375, 0.022186279296875, -0.0136260986328125, 0.044464111328125, 0.02764892578125, -0.014495849609375, 0.0172882080078125, 0.01235198974609375, -0.030914306640625, -0.0197601318359375, 0.042572021484375, -0.0863037109375, -0.018768310546875, -0.053436279296875, -0.0194244384765625, -0.004734039306640625, 0.014801025390625, 0.059173583984375, 0.0445556640625, 
-0.0103759765625, 0.020660400390625, 0.0513916015625, 0.00121307373046875, 0.01708984375, 0.01201629638671875, -0.009246826171875, -0.06549072265625, 0.0723876953125, -0.0142059326171875, 0.004749298095703125, 0.012176513671875, 0.01806640625, -0.028778076171875, -0.0313720703125, -0.0205230712890625, 0.0204620361328125, -0.05682373046875, -0.024261474609375, -0.044952392578125, -0.03094482421875, -0.043304443359375, -0.01262664794921875, -0.039215087890625, -0.02484130859375, -0.0435791015625, -0.021575927734375, 0.043914794921875, 0.021240234375, -0.00022900104522705078, 0.039306640625, -0.053619384765625, 0.003826141357421875, 0.01788330078125, 0.01678466796875, 0.0014200210571289062, -0.0596923828125, -0.020843505859375, 0.01061248779296875, -0.03057861328125, -0.06072998046875, 0.037078857421875, 0.00952911376953125, 0.02630615234375, 0.06390380859375, 0.00197601318359375, 0.055389404296875, 0.0007615089416503906, 0.049041748046875, 0.033233642578125, -0.0701904296875, 0.04156494140625, -0.0212249755859375, 0.0206756591796875, 0.05364990234375, 0.04925537109375, -0.040496826171875, -0.0275421142578125, -0.07763671875, -0.06781005859375, 0.056915283203125, 0.0177764892578125, 0.0058135986328125, -0.003925323486328125, 0.03900146484375, -0.006618499755859375, 0.031280517578125, -0.07025146484375, -0.03570556640625, -0.03973388671875, -0.032867431640625, -0.01629638671875, -0.0196075439453125, -0.01245880126953125, -0.040924072265625, 0.07110595703125, -0.000029385089874267578, 0.0238800048828125, 0.018096923828125, 0.005367279052734375, -0.003200531005859375, 0.007457733154296875, 0.031402587890625, 0.0596923828125, -0.040679931640625, -0.00814056396484375, 0.01617431640625, -0.045684814453125, -0.01267242431640625, 0.026214599609375, -0.0248565673828125, 0.00026535987854003906, 0.03363037109375, 0.072265625, 0.0217742919921875, -0.038177490234375, 0.04638671875, -0.007904052734375, -0.04315185546875, -0.048431396484375, -0.0020656585693359375, 
0.00505828857421875, 0.0222015380859375, 0.0279083251953125, 0.043487548828125, 0.018768310546875, -0.02484130859375, 0.006031036376953125, 0.02545166015625, -0.051788330078125, -0.024932861328125, 0.060089111328125, 0.00402069091796875, -0.01416015625, 0.07080078125, -0.0124664306640625, -0.05303955078125, 0.05474853515625, 0.0271453857421875, 0.0621337890625, -0.01004791259765625, 0.0145721435546875, 0.056732177734375, 0.03106689453125, -0.0035190582275390625, 0.0325927734375, 0.00916290283203125, -0.05023193359375, -0.024322509765625, -0.059539794921875, -0.010589599609375, 0.0338134765625, -0.07025146484375, 0.0255584716796875, -0.049530029296875, -0.0552978515625, 0.012451171875, 0.020294189453125, -0.060516357421875, 0.034881591796875, 0.00015437602996826172, 0.07177734375, -0.0631103515625, 0.03936767578125, 0.048492431640625, -0.0250701904296875, -0.054473876953125, -0.0294342041015625, -0.02130126953125, -0.052215576171875, 0.03936767578125, 0.0169525146484375, 0.006954193115234375, 0.002788543701171875, -0.03790283203125, -0.0518798828125, 0.07049560546875, 0.0053253173828125, -0.052825927734375, 0.013153076171875, 0.036224365234375, 0.0438232421875, -0.010894775390625, 0.021240234375, 0.0184173583984375, 0.0224456787109375, -0.00504302978515625, -0.041595458984375, 0.0003848075866699219, -0.0280914306640625, -0.0114898681640625, 0.0287017822265625, -0.052490234375, 0.06610107421875, -0.0000960230827331543, 0.0255279541015625, -0.0001506805419921875, 0.050506591796875, 0.0246734619140625, 0.0267486572265625, 0.0400390625, 0.0731201171875, 0.046844482421875, -0.01025390625, 0.05517578125, -0.057098388671875, 0.05908203125, 0.0625, 0.0208587646484375, 0.06744384765625, 0.0236358642578125, -0.0209808349609375, 0.03802490234375, 0.07281494140625, -0.021087646484375, 0.0355224609375, 0.003101348876953125, -0.01395416259765625, -0.04437255859375, 0.0240020751953125, -0.048248291015625, 0.023284912109375, 0.0181884765625, -0.048828125, 0.01451873779296875, 
-0.0034656524658203125, -0.0009045600891113281, -0.0254364013671875, -0.0283050537109375, 0.0384521484375, -0.0006985664367675781, -0.0173187255859375, 0.037200927734375, 0.015869140625, 0.0614013671875, -0.05780029296875, 0.01090240478515625, -0.00620269775390625, 0.035858154296875, -0.022216796875, -0.0299835205078125, 0.015869140625, -0.018341064453125, -0.0050201416015625, 0.004749298095703125, 0.047027587890625, -0.01316070556640625, -0.0662841796875, 0.01319122314453125, 0.002208709716796875, -0.0011463165283203125, -0.0160980224609375, -0.083984375, -0.0079345703125, 0.0027217864990234375, -0.042144775390625, 0.01107025146484375, 0.0283203125, 0.0194854736328125, 0.041778564453125, 0.0540771484375, -0.0125885009765625, 0.002140045166015625, 0.002941131591796875, 0.0643310546875, -0.058929443359375, -0.053619384765625, -0.061737060546875, 0.027008056640625, -0.0206451416015625, -0.061981201171875, 0.050628662109375, 0.07379150390625, 0.0648193359375, -0.0080718994140625, 0.0458984375, 0.005031585693359375, 0.022613525390625, -0.0208740234375, 0.046356201171875, -0.037994384765625, -0.0025272369384765625, -0.016632080078125, -0.0552978515625, -0.0009026527404785156, 0.06536865234375, -0.036102294921875, 0.01123809814453125, 0.036590576171875, 0.058624267578125, -0.0026874542236328125, 0.02117919921875, -0.004238128662109375, 0.0018701553344726562, 0.0019626617431640625, 0.0214691162109375, 0.052520751953125, -0.0679931640625, 0.05096435546875, -0.06005859375, -0.00905609130859375, -0.0195770263671875, -0.044891357421875, -0.07867431640625, -0.029449462890625, -0.050811767578125, -0.04217529296875, -0.003330230712890625, 0.073486328125, 0.05975341796875, -0.05718994140625, -0.021575927734375, -0.01384735107421875, -0.0236358642578125, -0.0330810546875, -0.020782470703125, 0.037261962890625, -0.0212249755859375, -0.041961669921875, -0.01434326171875, -0.01175689697265625, 0.013397216796875, -0.018096923828125, -0.01605224609375, 0.005634307861328125, 
-0.004962921142578125, 0.0310211181640625, -0.0012989044189453125, -0.0308685302734375, -0.003040313720703125, 0.01177978515625, -0.0117645263671875, 0.01337432861328125, 0.041290283203125, -0.029937744140625, 0.0065765380859375, 0.03436279296875, 0.0239715576171875, 0.06695556640625, -0.00571441650390625, 0.00965118408203125, -0.0369873046875, 0.0205078125, 0.02294921875, 0.0175933837890625, 0.0161590576171875, -0.03277587890625, 0.0188446044921875, 0.036346435546875, -0.045379638671875, -0.0439453125, -0.0236968994140625, -0.08087158203125, -0.017578125, 0.06500244140625, 0.0045318603515625, -0.0258026123046875, 0.01415252685546875, -0.011199951171875, 0.016326904296875, -0.0296783447265625, 0.06085205078125, 0.05462646484375, -0.0005130767822265625, -0.00882720947265625, -0.04168701171875, 0.05419921875, 0.0214080810546875, -0.039154052734375, -0.0100860595703125, 0.011474609375, 0.042236328125, 0.0175323486328125, 0.0418701171875, -0.00017344951629638672, 0.0218658447265625, 0.01461029052734375, 0.020172119140625, -0.0086212158203125, -0.003925323486328125, -0.0199127197265625, 0.009674072265625, 0.0059661865234375, -0.02178955078125 ] ]
nlpaueb/legal-bert-base-uncased
2022-04-28T14:42:50.000Z
[ "transformers", "pytorch", "tf", "jax", "bert", "pretraining", "legal", "fill-mask", "en", "license:cc-by-sa-4.0", "endpoints_compatible", "has_space", "region:us" ]
fill-mask
nlpaueb
null
null
nlpaueb/legal-bert-base-uncased
84
192,891
transformers
2022-03-02T23:29:05
--- language: en pipeline_tag: fill-mask license: cc-by-sa-4.0 thumbnail: https://i.ibb.co/p3kQ7Rw/Screenshot-2020-10-06-at-12-16-36-PM.png tags: - legal widget: - text: "The applicant submitted that her husband was subjected to treatment amounting to [MASK] whilst in the custody of police." --- # LEGAL-BERT: The Muppets straight out of Law School <img align="left" src="https://i.ibb.co/p3kQ7Rw/Screenshot-2020-10-06-at-12-16-36-PM.png" width="100"/> LEGAL-BERT is a family of BERT models for the legal domain, intended to assist legal NLP research, computational law, and legal technology applications. To pre-train the different variations of LEGAL-BERT, we collected 12 GB of diverse English legal text from several fields (e.g., legislation, court cases, contracts) scraped from publicly available resources. Sub-domain variants (CONTRACTS-, EURLEX-, ECHR-) and/or general LEGAL-BERT perform better than using BERT out of the box for domain-specific tasks. A light-weight model (33% the size of BERT-BASE) pre-trained from scratch on legal data with competitive performance is also available. <br/><br/> --- I. Chalkidis, M. Fergadiotis, P. Malakasiotis, N. Aletras and I. Androutsopoulos. "LEGAL-BERT: The Muppets straight out of Law School". In Findings of Empirical Methods in Natural Language Processing (EMNLP 2020) (Short Papers), to be held online, 2020. (https://aclanthology.org/2020.findings-emnlp.261) --- ## Pre-training corpora The pre-training corpora of LEGAL-BERT include: * 116,062 documents of EU legislation, publicly available from EURLEX (http://eur-lex.europa.eu), the repository of EU Law running under the EU Publication Office. * 61,826 documents of UK legislation, publicly available from the UK legislation portal (http://www.legislation.gov.uk). * 19,867 cases from the European Court of Justice (ECJ), also available from EURLEX. * 12,554 cases from HUDOC, the repository of the European Court of Human Rights (ECHR) (http://hudoc.echr.coe.int/eng). 
* 164,141 cases from various courts across the USA, hosted in the Case Law Access Project portal (https://case.law).
* 76,366 US contracts from EDGAR, the database of the US Securities and Exchange Commission (SEC) (https://www.sec.gov/edgar.shtml).

## Pre-training details

* We trained BERT using the official code provided in Google BERT's GitHub repository (https://github.com/google-research/bert).
* We released a model similar to the English BERT-BASE model (12-layer, 768-hidden, 12-heads, 110M parameters).
* We followed the same training set-up: 1 million training steps with batches of 256 sequences of length 512 and an initial learning rate of 1e-4.
* We were able to use a single Google Cloud TPU v3-8 provided for free from [TensorFlow Research Cloud (TFRC)](https://www.tensorflow.org/tfrc), while also utilizing [GCP research credits](https://edu.google.com/programs/credits/research). Huge thanks to both Google programs for supporting us!
* Part of LEGAL-BERT is a light-weight model pre-trained from scratch on legal data, which achieves performance comparable to larger models while being much more efficient (approximately 4 times faster) with a smaller environmental footprint.

## Models list

| Model name | Model Path | Training corpora |
| ------------------- | ------------------------------------- | ------------------- |
| CONTRACTS-BERT-BASE | `nlpaueb/bert-base-uncased-contracts` | US contracts |
| EURLEX-BERT-BASE | `nlpaueb/bert-base-uncased-eurlex` | EU legislation |
| ECHR-BERT-BASE | `nlpaueb/bert-base-uncased-echr` | ECHR cases |
| LEGAL-BERT-BASE * | `nlpaueb/legal-bert-base-uncased` | All |
| LEGAL-BERT-SMALL | `nlpaueb/legal-bert-small-uncased` | All |

\* LEGAL-BERT-BASE is the model referred to as LEGAL-BERT-SC in Chalkidis et al. (2020); a model trained from scratch on the legal corpora listed above, using a newly created vocabulary from a SentencePiece tokenizer trained on the very same corpora.
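The models list above maps each sub-domain to a Hub path. As a minimal sketch — the Hub paths are taken from the table, but the selection helper itself is hypothetical, not an official API — choosing the right variant programmatically could look like:

```python
# Sub-domain → Hub model path, per the models list above.
# The lookup helper is illustrative only (not part of the LEGAL-BERT release).
VARIANTS = {
    "contracts": "nlpaueb/bert-base-uncased-contracts",
    "eurlex": "nlpaueb/bert-base-uncased-eurlex",
    "echr": "nlpaueb/bert-base-uncased-echr",
    "all": "nlpaueb/legal-bert-base-uncased",
    "all-small": "nlpaueb/legal-bert-small-uncased",
}

def variant_for(subdomain: str) -> str:
    """Return the Hub path for a sub-domain, defaulting to LEGAL-BERT-BASE."""
    return VARIANTS.get(subdomain.lower(), VARIANTS["all"])

print(variant_for("EURLEX"))  # → nlpaueb/bert-base-uncased-eurlex
```

The returned path can then be passed to `AutoTokenizer.from_pretrained` / `AutoModel.from_pretrained` exactly as in the loading example in this card.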
\*\* As many of you expressed interest in the LEGAL-BERT-FP models (those relying on the original BERT-BASE checkpoint), they have been released on Archive.org (https://archive.org/details/legal_bert_fp). These models are secondary and probably only of interest to those who aim to dig deeper into the open questions of Chalkidis et al. (2020).

## Load Pretrained Model

```python
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("nlpaueb/legal-bert-base-uncased")
model = AutoModel.from_pretrained("nlpaueb/legal-bert-base-uncased")
```

## Use LEGAL-BERT variants as Language Models

| Corpus | Model | Masked token | Predictions |
| --------------------------------- | ---------------------------------- | ------------ | ------------ |
| | **BERT-BASE-UNCASED** | | |
| (Contracts) | This [MASK] Agreement is between General Motors and John Murray . | employment | ('new', '0.09'), ('current', '0.04'), ('proposed', '0.03'), ('marketing', '0.03'), ('joint', '0.02') |
| (ECHR) | The applicant submitted that her husband was subjected to treatment amounting to [MASK] whilst in the custody of Adana Security Directorate | torture | ('torture', '0.32'), ('rape', '0.22'), ('abuse', '0.14'), ('death', '0.04'), ('violence', '0.03') |
| (EURLEX) | Establishing a system for the identification and registration of [MASK] animals and regarding the labelling of beef and beef products . | bovine | ('farm', '0.25'), ('livestock', '0.08'), ('draft', '0.06'), ('domestic', '0.05'), ('wild', '0.05') |
| | **CONTRACTS-BERT-BASE** | | |
| (Contracts) | This [MASK] Agreement is between General Motors and John Murray . | employment | ('letter', '0.38'), ('dealer', '0.04'), ('employment', '0.03'), ('award', '0.03'), ('contribution', '0.02') |
| (ECHR) | The applicant submitted that her husband was subjected to treatment amounting to [MASK] whilst in the custody of Adana Security Directorate | torture | ('death', '0.39'), ('imprisonment', '0.07'), ('contempt', '0.05'), ('being', '0.03'), ('crime', '0.02') |
| (EURLEX) | Establishing a system for the identification and registration of [MASK] animals and regarding the labelling of beef and beef products . | bovine | ('domestic', '0.18'), ('laboratory', '0.07'), ('household', '0.06'), ('personal', '0.06'), ('the', '0.04') |
| | **EURLEX-BERT-BASE** | | |
| (Contracts) | This [MASK] Agreement is between General Motors and John Murray . | employment | ('supply', '0.11'), ('cooperation', '0.08'), ('service', '0.07'), ('licence', '0.07'), ('distribution', '0.05') |
| (ECHR) | The applicant submitted that her husband was subjected to treatment amounting to [MASK] whilst in the custody of Adana Security Directorate | torture | ('torture', '0.66'), ('death', '0.07'), ('imprisonment', '0.07'), ('murder', '0.04'), ('rape', '0.02') |
| (EURLEX) | Establishing a system for the identification and registration of [MASK] animals and regarding the labelling of beef and beef products . | bovine | ('live', '0.43'), ('pet', '0.28'), ('certain', '0.05'), ('fur', '0.03'), ('the', '0.02') |
| | **ECHR-BERT-BASE** | | |
| (Contracts) | This [MASK] Agreement is between General Motors and John Murray . | employment | ('second', '0.24'), ('latter', '0.10'), ('draft', '0.05'), ('bilateral', '0.05'), ('arbitration', '0.04') |
| (ECHR) | The applicant submitted that her husband was subjected to treatment amounting to [MASK] whilst in the custody of Adana Security Directorate | torture | ('torture', '0.99'), ('death', '0.01'), ('inhuman', '0.00'), ('beating', '0.00'), ('rape', '0.00') |
| (EURLEX) | Establishing a system for the identification and registration of [MASK] animals and regarding the labelling of beef and beef products . | bovine | ('pet', '0.17'), ('all', '0.12'), ('slaughtered', '0.10'), ('domestic', '0.07'), ('individual', '0.05') |
| | **LEGAL-BERT-BASE** | | |
| (Contracts) | This [MASK] Agreement is between General Motors and John Murray . | employment | ('settlement', '0.26'), ('letter', '0.23'), ('dealer', '0.04'), ('master', '0.02'), ('supplemental', '0.02') |
| (ECHR) | The applicant submitted that her husband was subjected to treatment amounting to [MASK] whilst in the custody of Adana Security Directorate | torture | ('torture', '1.00'), ('detention', '0.00'), ('arrest', '0.00'), ('rape', '0.00'), ('death', '0.00') |
| (EURLEX) | Establishing a system for the identification and registration of [MASK] animals and regarding the labelling of beef and beef products . | bovine | ('live', '0.67'), ('beef', '0.17'), ('farm', '0.03'), ('pet', '0.02'), ('dairy', '0.01') |
| | **LEGAL-BERT-SMALL** | | |
| (Contracts) | This [MASK] Agreement is between General Motors and John Murray . | employment | ('license', '0.09'), ('transition', '0.08'), ('settlement', '0.04'), ('consent', '0.03'), ('letter', '0.03') |
| (ECHR) | The applicant submitted that her husband was subjected to treatment amounting to [MASK] whilst in the custody of Adana Security Directorate | torture | ('torture', '0.59'), ('pain', '0.05'), ('ptsd', '0.05'), ('death', '0.02'), ('tuberculosis', '0.02') |
| (EURLEX) | Establishing a system for the identification and registration of [MASK] animals and regarding the labelling of beef and beef products . | bovine | ('all', '0.08'), ('live', '0.07'), ('certain', '0.07'), ('the', '0.07'), ('farm', '0.05') |

## Evaluation on downstream tasks

For downstream-task results, see the experiments in the article "LEGAL-BERT: The Muppets straight out of Law School" (Chalkidis et al., 2020, https://aclanthology.org/2020.findings-emnlp.261).

## Author - Publication

```
@inproceedings{chalkidis-etal-2020-legal,
    title = "{LEGAL}-{BERT}: The Muppets straight out of Law School",
    author = "Chalkidis, Ilias and Fergadiotis, Manos and Malakasiotis, Prodromos and Aletras, Nikolaos and Androutsopoulos, Ion",
    booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2020",
    month = nov,
    year = "2020",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    doi = "10.18653/v1/2020.findings-emnlp.261",
    pages = "2898--2904"
}
```

## About Us

[AUEB's Natural Language Processing Group](http://nlp.cs.aueb.gr) develops algorithms, models, and systems that allow computers to process and generate natural language texts.
The group's current research interests include:

* question answering systems for databases, ontologies, document collections, and the Web, especially biomedical question answering,
* natural language generation from databases and ontologies, especially Semantic Web ontologies,
* text classification, including filtering spam and abusive content,
* information extraction and opinion mining, including legal text analytics and sentiment analysis,
* natural language processing tools for Greek, for example parsers and named-entity recognizers,
* machine learning in natural language processing, especially deep learning.

The group is part of the Information Processing Laboratory of the Department of Informatics of the Athens University of Economics and Business.

[Ilias Chalkidis](https://iliaschalkidis.github.io) on behalf of [AUEB's Natural Language Processing Group](http://nlp.cs.aueb.gr)

| Github: [@ilias.chalkidis](https://github.com/iliaschalkidis) | Twitter: [@KiddoThe2B](https://twitter.com/KiddoThe2B) |
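The masked-token predictions in the tables above can be reproduced with the `fill-mask` pipeline from transformers; a minimal sketch (exact scores may vary slightly across library versions):

```python
from transformers import pipeline

# Fill-mask pipeline over LEGAL-BERT-BASE.
fill_mask = pipeline("fill-mask", model="nlpaueb/legal-bert-base-uncased")

sentence = (
    "The applicant submitted that her husband was subjected to treatment "
    "amounting to [MASK] whilst in the custody of Adana Security Directorate"
)

# Print the five most probable fillers for the masked token.
for prediction in fill_mask(sentence, top_k=5):
    print(f"{prediction['token_str']}: {prediction['score']:.2f}")
```

Per the table above, LEGAL-BERT-BASE puts essentially all of its probability mass on *torture* for this example.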
Helsinki-NLP/opus-mt-ar-en
2023-08-16T11:25:35.000Z
[ "transformers", "pytorch", "tf", "rust", "marian", "text2text-generation", "translation", "ar", "en", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
translation
Helsinki-NLP
null
null
Helsinki-NLP/opus-mt-ar-en
18
190,964
transformers
2022-03-02T23:29:04
--- tags: - translation license: apache-2.0 --- ### opus-mt-ar-en * source languages: ar * target languages: en * OPUS readme: [ar-en](https://github.com/Helsinki-NLP/OPUS-MT-train/blob/master/models/ar-en/README.md) * dataset: opus * model: transformer-align * pre-processing: normalization + SentencePiece * download original weights: [opus-2019-12-18.zip](https://object.pouta.csc.fi/OPUS-MT-models/ar-en/opus-2019-12-18.zip) * test set translations: [opus-2019-12-18.test.txt](https://object.pouta.csc.fi/OPUS-MT-models/ar-en/opus-2019-12-18.test.txt) * test set scores: [opus-2019-12-18.eval.txt](https://object.pouta.csc.fi/OPUS-MT-models/ar-en/opus-2019-12-18.eval.txt) ## Benchmarks | testset | BLEU | chr-F | |-----------------------|-------|-------| | Tatoeba.ar.en | 49.4 | 0.661 |
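A minimal usage sketch with the transformers `translation` pipeline (the Arabic input below is an illustrative example, not drawn from the test sets above):

```python
from transformers import pipeline

# Arabic -> English translation with the Marian model above.
translator = pipeline("translation", model="Helsinki-NLP/opus-mt-ar-en")

result = translator("مرحبا بالعالم")  # Arabic for "Hello, world"
print(result[0]["translation_text"])
```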
0.0199127197265625, 0.00959014892578125, -0.04254150390625, 0.0034084320068359375, 0.010711669921875, -0.0022640228271484375, 0.0272369384765625, 0.023223876953125, -0.0200653076171875, 0.018829345703125, 0.05828857421875, 0.031280517578125, 0.029022216796875, -0.008148193359375, 0.03851318359375, -0.051239013671875, 0.02880859375, 0.01898193359375, 0.0438232421875, 0.027069091796875, -0.006404876708984375, 0.061309814453125, 0.0088653564453125, -0.053680419921875, -0.08575439453125, 0.004993438720703125, -0.0933837890625, 0.004734039306640625, 0.07269287109375, -0.0173187255859375, -0.0201568603515625, 0.02862548828125, -0.0155181884765625, 0.0182647705078125, -0.029144287109375, 0.03399658203125, 0.0645751953125, 0.0246124267578125, -0.0019521713256835938, -0.05078125, 0.0278167724609375, 0.0423583984375, -0.048431396484375, -0.0190582275390625, 0.0129547119140625, 0.008575439453125, 0.031341552734375, 0.02978515625, -0.0192413330078125, 0.00876617431640625, -0.0225372314453125, 0.0355224609375, -0.01161956787109375, -0.01445770263671875, -0.025726318359375, 0.00014913082122802734, -0.0021419525146484375, -0.0284423828125 ] ]
laion/CLIP-ViT-L-14-DataComp.XL-s13B-b90K
2023-05-16T16:59:39.000Z
[ "open_clip", "pytorch", "clip", "zero-shot-image-classification", "dataset:mlfoundations/datacomp_pools", "arxiv:2304.14108", "license:mit", "has_space", "region:us" ]
zero-shot-image-classification
laion
null
null
laion/CLIP-ViT-L-14-DataComp.XL-s13B-b90K
89
190,462
open_clip
2023-04-26T01:41:18
---
license: mit
widget:
- src: >-
    https://huggingface.co/datasets/mishig/sample_images/resolve/main/cat-dog-music.png
  candidate_labels: playing music, playing sports
  example_title: Cat & Dog
library_name: open_clip
datasets:
- mlfoundations/datacomp_pools
pipeline_tag: zero-shot-image-classification
---
# Model card for CLIP ViT-L-14 trained on DataComp-1B

# Table of Contents

1. [Model Details](#model-details)
2. [Uses](#uses)
3. [Training Details](#training-details)
4. [Evaluation](#evaluation)
5. [Acknowledgements](#acknowledgements)
6. [Citation](#citation)
7. [How To Get Started With the Model](#how-to-get-started-with-the-model)

# Model Details

## Model Description

A CLIP ViT-L/14 model trained on the DataComp-1B dataset (https://github.com/mlfoundations/datacomp) using OpenCLIP (https://github.com/mlfoundations/open_clip).

Model training was done on the [stability.ai](https://stability.ai/) cluster.

# Uses

As per the original [OpenAI CLIP model card](https://github.com/openai/CLIP/blob/d50d76daa670286dd6cacf3bcd80b5e4823fc8e1/model-card.md), this model is intended as a research output for research communities. We hope that this model will enable researchers to better understand and explore zero-shot, arbitrary image classification. We also hope it can be used for interdisciplinary studies of the potential impact of such models.

The OpenAI CLIP paper includes a discussion of potential downstream impacts to provide an example for this sort of analysis. Additionally, the DataComp paper (https://arxiv.org/abs/2304.14108) includes additional discussion as it relates specifically to the training dataset.

## Direct Use

Zero-shot image classification, image and text retrieval, among others.

## Downstream Use

Image classification and other image-task fine-tuning, linear-probe image classification, image generation guiding and conditioning, among others.
## Out-of-Scope Use

As per the OpenAI models, **any** deployed use case of the model - whether commercial or not - is currently out of scope. Non-deployed use cases, such as image search in a constrained environment, are also not recommended unless there is thorough in-domain testing of the model with a specific, fixed class taxonomy. This is because our safety assessment demonstrated a high need for task-specific testing, especially given the variability of CLIP's performance with different class taxonomies. This makes untested and unconstrained deployment of the model in any use case currently potentially harmful.

Certain use cases which would fall under the domain of surveillance and facial recognition are always out of scope regardless of the performance of the model. This is because the use of artificial intelligence for such tasks is currently premature, given the lack of testing norms and checks to ensure its fair use.

# Training Details

## Training Data

This model was trained on the 1.4 billion samples of the DataComp-1B dataset (https://arxiv.org/abs/2304.14108).

**IMPORTANT NOTE:** The motivation behind dataset creation is to democratize research and experimentation around large-scale multi-modal model training and the handling of uncurated, large-scale datasets crawled from the publicly available internet. Our recommendation is therefore to use the dataset for research purposes.

Be aware that this large-scale dataset is uncurated. Keep in mind that its uncurated nature means that collected links may lead to strongly discomforting and disturbing content for a human viewer. Therefore, please use the demo links with caution and at your own risk. It is possible to extract a "safe" subset by filtering out samples based on the safety tags (using a customized NSFW classifier that we trained).
While this strongly reduces the chance of encountering potentially harmful content when viewing, we cannot entirely exclude the possibility that harmful content is still present in safe mode, so the warning holds there as well.

We think that providing the dataset openly to broad research and other interested communities will allow for transparent investigation of the benefits that come with training large-scale models, as well as of the pitfalls and dangers that may stay unreported or unnoticed when working with closed large datasets that remain restricted to a small community. However, while we provide our dataset openly, we do not recommend using it to create ready-to-go industrial products, as the basic research about the general properties and safety of such large-scale models, which we would like to encourage with this release, is still in progress.

## Training Procedure

Please see https://arxiv.org/abs/2304.14108.

# Evaluation

Evaluation was done on 38 datasets, using the [DataComp repo](https://github.com/mlfoundations/datacomp) and the [LAION CLIP Benchmark](https://github.com/LAION-AI/CLIP_benchmark).

## Testing Data, Factors & Metrics

### Testing Data

The testing is performed on a suite of 38 datasets. See our paper for more details (https://arxiv.org/abs/2304.14108).

## Results

The model achieves 79.2% zero-shot top-1 accuracy on ImageNet-1k. See our paper for more details and results (https://arxiv.org/abs/2304.14108).

# Acknowledgements

We acknowledge [stability.ai](https://stability.ai/) for the compute used to train this model.
# Citation

**BibTeX:**

DataComp
```bibtex
@article{datacomp,
  title={DataComp: In search of the next generation of multimodal datasets},
  author={Samir Yitzhak Gadre and Gabriel Ilharco and Alex Fang and Jonathan Hayase and Georgios Smyrnis and Thao Nguyen and Ryan Marten and Mitchell Wortsman and Dhruba Ghosh and Jieyu Zhang and Eyal Orgad and Rahim Entezari and Giannis Daras and Sarah Pratt and Vivek Ramanujan and Yonatan Bitton and Kalyani Marathe and Stephen Mussmann and Richard Vencu and Mehdi Cherti and Ranjay Krishna and Pang Wei Koh and Olga Saukh and Alexander Ratner and Shuran Song and Hannaneh Hajishirzi and Ali Farhadi and Romain Beaumont and Sewoong Oh and Alex Dimakis and Jenia Jitsev and Yair Carmon and Vaishaal Shankar and Ludwig Schmidt},
  journal={arXiv preprint arXiv:2304.14108},
  year={2023}
}
```

OpenAI CLIP paper
```bibtex
@inproceedings{Radford2021LearningTV,
  title={Learning Transferable Visual Models From Natural Language Supervision},
  author={Alec Radford and Jong Wook Kim and Chris Hallacy and A. Ramesh and Gabriel Goh and Sandhini Agarwal and Girish Sastry and Amanda Askell and Pamela Mishkin and Jack Clark and Gretchen Krueger and Ilya Sutskever},
  booktitle={ICML},
  year={2021}
}
```

OpenCLIP software
```bibtex
@software{ilharco_gabriel_2021_5143773,
  author       = {Ilharco, Gabriel and Wortsman, Mitchell and Wightman, Ross and Gordon, Cade and Carlini, Nicholas and Taori, Rohan and Dave, Achal and Shankar, Vaishaal and Namkoong, Hongseok and Miller, John and Hajishirzi, Hannaneh and Farhadi, Ali and Schmidt, Ludwig},
  title        = {OpenCLIP},
  month        = jul,
  year         = 2021,
  note         = {If you use this software, please cite it as below.},
  publisher    = {Zenodo},
  version      = {0.1},
  doi          = {10.5281/zenodo.5143773},
  url          = {https://doi.org/10.5281/zenodo.5143773}
}
```

# How to Get Started with the Model

See https://github.com/mlfoundations/open_clip
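At inference time, CLIP's zero-shot classification reduces to a softmax over scaled image-text cosine similarities. Below is a minimal NumPy sketch of that scoring step only, using small dummy feature vectors; in practice the features would come from OpenCLIP's image and text encoders (768-dimensional for this ViT-L/14 model), and the logit scale of 100.0 is an illustrative assumption, not this checkpoint's learned value.

```python
import numpy as np

def zero_shot_probs(image_feat, text_feats, logit_scale=100.0):
    """Softmax over scaled cosine similarities (CLIP's zero-shot scoring step)."""
    # Normalize to unit length so the dot product equals cosine similarity.
    image_feat = image_feat / np.linalg.norm(image_feat)
    text_feats = text_feats / np.linalg.norm(text_feats, axis=1, keepdims=True)
    logits = logit_scale * text_feats @ image_feat
    # Numerically stable softmax over the candidate labels.
    exps = np.exp(logits - logits.max())
    return exps / exps.sum()

# Dummy 4-dim features stand in for the model's real CLIP embeddings.
rng = np.random.default_rng(0)
image_feat = rng.normal(size=4)
text_feats = rng.normal(size=(3, 4))  # one row per candidate label

probs = zero_shot_probs(image_feat, text_feats)
print(probs.shape)  # (3,) - one probability per candidate label, summing to ~1
```

The same pattern applies unchanged to real `encode_image`/`encode_text` outputs; only the feature dimensionality and the learned logit scale differ.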
7,405
[ [ -0.033233642578125, -0.047607421875, 0.012725830078125, 0.0025997161865234375, -0.0294647216796875, -0.034912109375, -0.01422119140625, -0.04278564453125, 0.0028171539306640625, 0.031951904296875, -0.043060302734375, -0.04608154296875, -0.048248291015625, -0.005626678466796875, -0.033355712890625, 0.067138671875, -0.0115203857421875, -0.00457763671875, -0.0261993408203125, -0.026885986328125, -0.039459228515625, -0.03912353515625, -0.025665283203125, 0.0093994140625, 0.0145721435546875, 0.0247955322265625, 0.05078125, 0.058441162109375, 0.058837890625, 0.0166473388671875, -0.0010662078857421875, 0.00862884521484375, -0.044097900390625, -0.032318115234375, -0.0063629150390625, -0.02191162109375, -0.03900146484375, 0.0166473388671875, 0.044952392578125, 0.033294677734375, -0.006214141845703125, 0.017608642578125, -0.004547119140625, 0.040130615234375, -0.0576171875, 0.0177154541015625, -0.047088623046875, 0.003787994384765625, -0.01453399658203125, -0.0019445419311523438, -0.023101806640625, -0.012725830078125, 0.0149078369140625, -0.0582275390625, 0.012847900390625, -0.0036106109619140625, 0.091064453125, 0.0194091796875, -0.0236358642578125, 0.0135498046875, -0.048858642578125, 0.060211181640625, -0.05401611328125, 0.02349853515625, 0.0265045166015625, 0.0362548828125, 0.006191253662109375, -0.057861328125, -0.03802490234375, -0.00843048095703125, 0.01128387451171875, 0.0108795166015625, -0.0289764404296875, 0.00371551513671875, 0.03466796875, 0.018402099609375, -0.0328369140625, 0.0022449493408203125, -0.0479736328125, 0.00167083740234375, 0.048126220703125, 0.01076507568359375, 0.027252197265625, -0.018951416015625, -0.058441162109375, -0.0350341796875, -0.04595947265625, 0.0288848876953125, 0.0132904052734375, 0.01325225830078125, -0.036102294921875, 0.038238525390625, 0.0006012916564941406, 0.03680419921875, -0.009307861328125, -0.0198211669921875, 0.039031982421875, -0.03753662109375, -0.024932861328125, -0.01104736328125, 0.07769775390625, 
0.05389404296875, 0.01508331298828125, 0.007099151611328125, 0.004467010498046875, -0.0006499290466308594, 0.0245819091796875, -0.0762939453125, -0.00811004638671875, 0.0056915283203125, -0.04345703125, -0.02178955078125, 0.034576416015625, -0.06683349609375, -0.0034084320068359375, -0.022125244140625, 0.040771484375, -0.035125732421875, -0.023284912109375, 0.005748748779296875, -0.00452423095703125, 0.015228271484375, 0.026123046875, -0.049530029296875, 0.017822265625, 0.0281219482421875, 0.08642578125, -0.021759033203125, -0.0272674560546875, -0.0198974609375, 0.007167816162109375, -0.0238189697265625, 0.03485107421875, -0.0217742919921875, -0.0258331298828125, -0.00494384765625, 0.0297698974609375, -0.01270294189453125, -0.04010009765625, 0.036651611328125, -0.01493072509765625, 0.0009531974792480469, -0.01312255859375, -0.0176849365234375, -0.041595458984375, 0.01763916015625, -0.055572509765625, 0.06689453125, 0.00206756591796875, -0.0657958984375, 0.0291748046875, -0.052734375, -0.007122039794921875, -0.022674560546875, -0.007534027099609375, -0.0491943359375, -0.0235443115234375, 0.038543701171875, 0.038299560546875, -0.0243988037109375, 0.035400390625, -0.051666259765625, -0.0238800048828125, 0.015716552734375, -0.0322265625, 0.07305908203125, 0.004878997802734375, -0.024139404296875, 0.01186370849609375, -0.053558349609375, -0.00586700439453125, 0.0184478759765625, 0.0019855499267578125, -0.0195159912109375, -0.0179595947265625, 0.004863739013671875, 0.023101806640625, 0.0025386810302734375, -0.048797607421875, 0.006130218505859375, -0.006694793701171875, 0.03271484375, 0.056640625, 0.0093231201171875, 0.0203399658203125, -0.032012939453125, 0.04632568359375, 0.0175018310546875, 0.046295166015625, -0.0244293212890625, -0.036407470703125, -0.0494384765625, -0.045166015625, 0.0237884521484375, 0.03851318359375, -0.05413818359375, 0.032196044921875, -0.0213165283203125, -0.0479736328125, -0.0271453857421875, -0.0012140274047851562, 0.03558349609375, 
0.04327392578125, 0.0390625, -0.03912353515625, -0.042388916015625, -0.067138671875, 0.0215301513671875, 0.00525665283203125, 0.004184722900390625, 0.037261962890625, 0.05560302734375, -0.01480865478515625, 0.0810546875, -0.049041748046875, -0.040435791015625, -0.00926971435546875, 0.0029430389404296875, 0.009918212890625, 0.044830322265625, 0.070556640625, -0.0709228515625, -0.0287628173828125, -0.0166015625, -0.08062744140625, 0.0110626220703125, 0.0004222393035888672, -0.02142333984375, 0.00568389892578125, 0.034423828125, -0.043182373046875, 0.05633544921875, 0.03472900390625, -0.000021398067474365234, 0.0364990234375, -0.005573272705078125, 0.0076141357421875, -0.0828857421875, 0.03424072265625, 0.018524169921875, -0.01030731201171875, -0.040802001953125, -0.0046844482421875, -0.0009708404541015625, -0.0282745361328125, -0.05810546875, 0.035858154296875, -0.0235748291015625, 0.00376129150390625, -0.00484466552734375, -0.002750396728515625, 0.0037860870361328125, 0.05572509765625, 0.00676727294921875, 0.06658935546875, 0.061004638671875, -0.049560546875, 0.0072021484375, 0.034088134765625, -0.03582763671875, 0.0255126953125, -0.06903076171875, -0.0059051513671875, -0.007007598876953125, 0.01480865478515625, -0.044464111328125, -0.022735595703125, 0.032318115234375, -0.03143310546875, 0.0252838134765625, -0.0218353271484375, -0.022735595703125, -0.0262908935546875, -0.04296875, 0.03997802734375, 0.037506103515625, -0.047637939453125, 0.0198974609375, 0.037933349609375, 0.004718780517578125, -0.05242919921875, -0.05938720703125, -0.0200042724609375, -0.019317626953125, -0.049560546875, 0.0248870849609375, -0.01031494140625, 0.0009102821350097656, 0.002178192138671875, 0.006610870361328125, -0.01505279541015625, -0.00926971435546875, 0.0506591796875, 0.0479736328125, -0.005401611328125, -0.0040130615234375, -0.01206207275390625, -0.006130218505859375, -0.0031757354736328125, 0.00067138671875, 0.01367950439453125, -0.00982666015625, -0.0347900390625, 
-0.03924560546875, 0.0213623046875, 0.050140380859375, -0.0305328369140625, 0.05133056640625, 0.047576904296875, -0.0279998779296875, 0.001251220703125, -0.0240631103515625, -0.00388336181640625, -0.0347900390625, 0.034454345703125, 0.006633758544921875, -0.0513916015625, 0.041412353515625, 0.00946807861328125, -0.0032444000244140625, 0.038726806640625, 0.027618408203125, 0.0090789794921875, 0.072265625, 0.06463623046875, 0.0007419586181640625, 0.055389404296875, -0.052581787109375, 0.00811004638671875, -0.069091796875, -0.0296173095703125, -0.016326904296875, -0.012847900390625, -0.044952392578125, -0.044189453125, 0.0546875, 0.01352691650390625, -0.01727294921875, 0.032806396484375, -0.034942626953125, 0.016204833984375, 0.044189453125, 0.035369873046875, 0.007442474365234375, -0.0016832351684570312, 0.00006109476089477539, -0.01053619384765625, -0.052093505859375, -0.0273590087890625, 0.09002685546875, 0.042877197265625, 0.06658935546875, -0.00672149658203125, 0.0265045166015625, 0.0133209228515625, -0.004810333251953125, -0.05279541015625, 0.046295166015625, -0.02337646484375, -0.05328369140625, -0.0207061767578125, -0.027618408203125, -0.06463623046875, 0.005096435546875, -0.01360321044921875, -0.05462646484375, 0.04425048828125, 0.00830841064453125, -0.0235443115234375, 0.0386962890625, -0.044952392578125, 0.08050537109375, -0.0239105224609375, -0.0241851806640625, 0.007122039794921875, -0.058441162109375, 0.040924072265625, 0.01033782958984375, 0.006237030029296875, -0.015838623046875, 0.00426483154296875, 0.068115234375, -0.051116943359375, 0.07745361328125, -0.0177001953125, 0.0164947509765625, 0.046142578125, -0.0162200927734375, 0.0151824951171875, 0.00391387939453125, 0.00959014892578125, 0.045013427734375, 0.007381439208984375, -0.01055145263671875, -0.033416748046875, 0.03607177734375, -0.06512451171875, -0.0184478759765625, -0.03057861328125, -0.040252685546875, 0.01108551025390625, 0.0242156982421875, 0.0352783203125, 0.05487060546875, 
-0.005462646484375, 0.032135009765625, 0.051177978515625, -0.026824951171875, 0.0350341796875, 0.02239990234375, -0.015960693359375, -0.050506591796875, 0.0762939453125, 0.027099609375, 0.0269775390625, 0.01476287841796875, 0.000007569789886474609, -0.004337310791015625, -0.032623291015625, -0.033416748046875, 0.01031494140625, -0.05938720703125, -0.03192138671875, -0.036224365234375, -0.022674560546875, -0.0254974365234375, -0.0036220550537109375, -0.04193115234375, -0.022674560546875, -0.043975830078125, -0.0039043426513671875, 0.036773681640625, 0.034271240234375, -0.016845703125, 0.01360321044921875, -0.059051513671875, 0.023651123046875, 0.0225677490234375, 0.036529541015625, 0.0030574798583984375, -0.0426025390625, -0.0205078125, 0.0173797607421875, -0.04058837890625, -0.041412353515625, 0.027496337890625, 0.0220794677734375, 0.04400634765625, 0.04290771484375, 0.0178680419921875, 0.051483154296875, -0.0255584716796875, 0.07452392578125, 0.0302581787109375, -0.053863525390625, 0.0494384765625, -0.040740966796875, 0.0191192626953125, 0.05267333984375, 0.056243896484375, -0.01428985595703125, 0.0003268718719482422, -0.045135498046875, -0.07220458984375, 0.06927490234375, 0.00850677490234375, -0.00203704833984375, 0.0030975341796875, 0.01244354248046875, 0.0038738250732421875, 0.0169677734375, -0.064453125, -0.0130615234375, -0.037353515625, 0.0012178421020507812, 0.0102691650390625, -0.01227569580078125, -0.0157318115234375, -0.036285400390625, 0.053436279296875, -0.016082763671875, 0.046142578125, 0.02423095703125, -0.0035724639892578125, -0.0145721435546875, -0.001918792724609375, 0.040191650390625, 0.050506591796875, -0.03759765625, -0.01214599609375, -0.0017518997192382812, -0.04937744140625, -0.007568359375, 0.01128387451171875, -0.032135009765625, -0.006011962890625, 0.031280517578125, 0.08935546875, 0.023101806640625, -0.057281494140625, 0.07183837890625, 0.00435638427734375, -0.031005859375, -0.0275421142578125, 0.0087738037109375, -0.03143310546875, 
0.01496124267578125, 0.0183258056640625, 0.0172576904296875, 0.01067352294921875, -0.04351806640625, 0.020965576171875, 0.037139892578125, -0.03729248046875, -0.0251922607421875, 0.061248779296875, -0.0017805099487304688, 0.0086669921875, 0.046112060546875, -0.0008788108825683594, -0.0316162109375, 0.05572509765625, 0.0277862548828125, 0.06646728515625, 0.006317138671875, 0.020416259765625, 0.054046630859375, 0.0200958251953125, -0.015045166015625, 0.00646209716796875, 0.0104827880859375, -0.037017822265625, -0.01125335693359375, -0.0272674560546875, -0.03240966796875, 0.0165252685546875, -0.07183837890625, 0.040374755859375, -0.049041748046875, -0.025146484375, -0.00966644287109375, -0.02783203125, -0.0416259765625, 0.0105743408203125, 0.01360321044921875, 0.0767822265625, -0.0648193359375, 0.053955078125, 0.04437255859375, -0.06292724609375, -0.05615234375, -0.00534820556640625, -0.0031986236572265625, -0.0419921875, 0.0322265625, 0.03704833984375, 0.0020008087158203125, -0.0213470458984375, -0.076904296875, -0.07440185546875, 0.11199951171875, 0.037506103515625, -0.023040771484375, -0.005062103271484375, 0.00738525390625, 0.027099609375, -0.0216217041015625, 0.0283355712890625, 0.0137786865234375, 0.01666259765625, 0.0207366943359375, -0.08026123046875, 0.0006928443908691406, -0.0301971435546875, 0.0135498046875, 0.0056610107421875, -0.06842041015625, 0.07586669921875, -0.022979736328125, -0.0203704833984375, 0.00498199462890625, 0.03936767578125, 0.0085906982421875, 0.0296478271484375, 0.028350830078125, 0.055755615234375, 0.032501220703125, 0.00010538101196289062, 0.07501220703125, -0.0096435546875, 0.0310211181640625, 0.0838623046875, -0.0009636878967285156, 0.07275390625, 0.026611328125, -0.019378662109375, 0.0262451171875, 0.0335693359375, -0.033538818359375, 0.060516357421875, -0.020538330078125, 0.01242828369140625, -0.01470947265625, -0.0307464599609375, -0.03802490234375, 0.038238525390625, 0.0037708282470703125, -0.037933349609375, -0.0143280029296875, 
0.031158447265625, -0.0004000663757324219, -0.01448822021484375, -0.0157318115234375, 0.040496826171875, 0.006572723388671875, -0.03082275390625, 0.06329345703125, -0.00897979736328125, 0.055572509765625, -0.0596923828125, -0.01245880126953125, 0.0003414154052734375, 0.0165252685546875, -0.017059326171875, -0.058746337890625, 0.0177154541015625, -0.0007581710815429688, -0.0163726806640625, -0.008148193359375, 0.048919677734375, -0.017578125, -0.035400390625, 0.0299835205078125, 0.004207611083984375, 0.0102081298828125, 0.0001697540283203125, -0.05181884765625, 0.0169830322265625, -0.0004527568817138672, -0.016021728515625, 0.033660888671875, 0.016815185546875, -0.005084991455078125, 0.054718017578125, 0.043365478515625, -0.0185699462890625, 0.01172637939453125, -0.01189422607421875, 0.08819580078125, -0.033477783203125, -0.033233642578125, -0.04327392578125, 0.050994873046875, -0.00623321533203125, -0.036773681640625, 0.057281494140625, 0.04083251953125, 0.07940673828125, -0.01210784912109375, 0.05828857421875, -0.0244598388671875, 0.0298614501953125, -0.046142578125, 0.044891357421875, -0.0499267578125, 0.0050811767578125, -0.044189453125, -0.05316162109375, -0.019073486328125, 0.03961181640625, -0.0232086181640625, 0.0041961669921875, 0.044158935546875, 0.06719970703125, -0.0241851806640625, -0.0009613037109375, 0.0160064697265625, 0.005771636962890625, 0.019805908203125, 0.039703369140625, 0.03521728515625, -0.055694580078125, 0.05120849609375, -0.0582275390625, -0.0236968994140625, -0.01264190673828125, -0.0687255859375, -0.0810546875, -0.048583984375, -0.032989501953125, -0.0062408447265625, -0.005512237548828125, 0.070556640625, 0.07275390625, -0.055816650390625, -0.0169677734375, 0.0048370361328125, -0.0185394287109375, -0.02471923828125, -0.01654052734375, 0.030364990234375, 0.01255035400390625, -0.041595458984375, 0.01464080810546875, 0.00958251953125, 0.0195770263671875, -0.0129547119140625, -0.00432586669921875, -0.037994384765625, -0.01271820068359375, 
0.034515380859375, 0.0357666015625, -0.04107666015625, -0.01499176025390625, 0.0008759498596191406, 0.005123138427734375, 0.0241851806640625, 0.04205322265625, -0.04229736328125, 0.049102783203125, 0.033721923828125, 0.034759521484375, 0.04205322265625, 0.01483154296875, 0.01788330078125, -0.048919677734375, 0.0299530029296875, 0.0004153251647949219, 0.0252685546875, 0.0221710205078125, -0.02410888671875, 0.054534912109375, 0.02935791015625, -0.034088134765625, -0.0693359375, -0.01039886474609375, -0.09747314453125, -0.01010894775390625, 0.0836181640625, -0.033782958984375, -0.0301666259765625, 0.021240234375, -0.0171051025390625, 0.025115966796875, -0.032073974609375, 0.033721923828125, 0.027313232421875, 0.00222015380859375, -0.03411865234375, -0.0657958984375, 0.0241851806640625, 0.0014524459838867188, -0.06512451171875, -0.0038661956787109375, 0.0355224609375, 0.022247314453125, 0.021484375, 0.03143310546875, -0.0262908935546875, 0.0279998779296875, -0.0012454986572265625, 0.0234832763671875, -0.0377197265625, -0.04583740234375, -0.031280517578125, 0.0035457611083984375, -0.014556884765625, -0.042938232421875 ] ]
hiiamsid/sentence_similarity_spanish_es
2023-08-02T12:43:34.000Z
[ "sentence-transformers", "pytorch", "bert", "feature-extraction", "sentence-similarity", "transformers", "es", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
sentence-similarity
hiiamsid
null
null
hiiamsid/sentence_similarity_spanish_es
30
189,946
sentence-transformers
2022-03-02T23:29:05
---
pipeline_tag: sentence-similarity
language:
- es
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
license: apache-2.0
---

# hiiamsid/sentence_similarity_spanish_es

This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for tasks like clustering or semantic search.

## Usage (Sentence-Transformers)

Using this model is easy when you have [sentence-transformers](https://www.SBERT.net) installed:

```
pip install -U sentence-transformers
```

Then you can use the model like this:

```python
from sentence_transformers import SentenceTransformer
sentences = ['Mi nombre es Siddhartha', 'Mis amigos me llamaron por mi nombre Siddhartha']

model = SentenceTransformer('hiiamsid/sentence_similarity_spanish_es')
embeddings = model.encode(sentences)
print(embeddings)
```

## Usage (HuggingFace Transformers)

Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first, pass your input through the transformer model, then apply the right pooling operation on top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch

# Mean pooling - take the attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
    token_embeddings = model_output[0]  # First element of model_output contains all token embeddings
    input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)

# Sentences we want sentence embeddings for
sentences = ['Mi nombre es Siddhartha', 'Mis amigos me llamaron por mi nombre Siddhartha']

# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('hiiamsid/sentence_similarity_spanish_es')
model = AutoModel.from_pretrained('hiiamsid/sentence_similarity_spanish_es')

# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')

# Compute token embeddings
with torch.no_grad():
    model_output = model(**encoded_input)

# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])

print("Sentence embeddings:")
print(sentence_embeddings)
```

## Evaluation Results

```
cosine_pearson    : 0.8280372842978689
cosine_spearman   : 0.8232689765056079
euclidean_pearson : 0.81021993884437
euclidean_spearman: 0.8087904592393836
manhattan_pearson : 0.809645390126291
manhattan_spearman: 0.8077035464970413
dot_pearson       : 0.7803662255836028
dot_spearman      : 0.7699607641618339
```

For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=hiiamsid/sentence_similarity_spanish_es)

## Training

The model was trained with the parameters:

**DataLoader**:

`torch.utils.data.dataloader.DataLoader` of length 360 with parameters:
```
{'batch_size': 16, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
```

**Loss**:

`sentence_transformers.losses.CosineSimilarityLoss.CosineSimilarityLoss`

Parameters of the fit()-Method:
```
{
    "callback": null,
    "epochs": 4,
    "evaluation_steps": 1000,
    "evaluator": "sentence_transformers.evaluation.EmbeddingSimilarityEvaluator.EmbeddingSimilarityEvaluator",
    "max_grad_norm": 1,
    "optimizer_class": "<class 'transformers.optimization.AdamW'>",
    "optimizer_params": {
        "lr": 2e-05
    },
    "scheduler": "WarmupLinear",
    "steps_per_epoch": null,
    "warmup_steps": 144,
    "weight_decay": 0.01
}
```

## Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```

## Citing & Authors

- Datasets : [stsb_multi_mt](https://huggingface.co/datasets/stsb_multi_mt)
- Model : [dccuchile/bert-base-spanish-wwm-cased](https://huggingface.co/dccuchile/bert-base-spanish-wwm-cased)
- Sentence Transformers [Semantic Textual Similarity](https://www.sbert.net/examples/training/sts/README.html)
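The mean pooling used in the card's snippet can be sketched standalone in plain NumPy. The toy tensors below are illustrative assumptions, not real model outputs; in the real pipeline the same reduction is applied to the transformer's `last_hidden_state` using the tokenizer's `attention_mask`.

```python
import numpy as np

def mean_pooling(token_embeddings, attention_mask):
    # Zero out padding positions, then average over the real tokens only
    mask = attention_mask[..., None].astype(float)   # (batch, seq, 1)
    summed = (token_embeddings * mask).sum(axis=1)
    counts = np.clip(mask.sum(axis=1), 1e-9, None)   # guard against empty masks
    return summed / counts

# Toy batch: one sentence, three tokens (the last is padding), embedding dim 2
emb = np.array([[[1.0, 2.0], [3.0, 4.0], [9.0, 9.0]]])
mask = np.array([[1, 1, 0]])
pooled = mean_pooling(emb, mask)  # averages only the two unmasked tokens
print(pooled)
```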
4,509
[ [ -0.0211334228515625, -0.06268310546875, 0.02130126953125, 0.031463623046875, -0.01824951171875, -0.02203369140625, -0.0243377685546875, -0.00705718994140625, 0.0269775390625, 0.022674560546875, -0.05108642578125, -0.053802490234375, -0.05072021484375, 0.0102081298828125, -0.0246429443359375, 0.0576171875, -0.01255035400390625, 0.0015974044799804688, -0.01372528076171875, -0.01837158203125, -0.015411376953125, -0.0222625732421875, -0.03173828125, -0.020263671875, 0.016632080078125, 0.0127410888671875, 0.044219970703125, 0.04168701171875, 0.026763916015625, 0.0299530029296875, -0.0007219314575195312, 0.004467010498046875, -0.03094482421875, -0.006359100341796875, -0.0007419586181640625, -0.031890869140625, -0.00396728515625, 0.0126953125, 0.038848876953125, 0.032196044921875, -0.00922393798828125, 0.01303863525390625, 0.00630950927734375, 0.03314208984375, -0.023529052734375, 0.0255584716796875, -0.040771484375, 0.01277923583984375, 0.003307342529296875, -0.0011529922485351562, -0.02374267578125, -0.02716064453125, 0.0228424072265625, -0.0299835205078125, 0.02825927734375, 0.0213623046875, 0.09796142578125, 0.034576416015625, -0.0127716064453125, -0.034881591796875, -0.01320648193359375, 0.06390380859375, -0.06610107421875, 0.0218963623046875, 0.02276611328125, -0.006622314453125, -0.006992340087890625, -0.0592041015625, -0.06298828125, -0.0098724365234375, -0.0267181396484375, 0.0162811279296875, -0.025970458984375, -0.002513885498046875, 0.01129150390625, 0.0222930908203125, -0.06109619140625, 0.005817413330078125, -0.03125, -0.0225982666015625, 0.039886474609375, 0.0016880035400390625, 0.02630615234375, -0.046661376953125, -0.0330810546875, -0.035797119140625, -0.0172576904296875, 0.006412506103515625, 0.0224609375, 0.0113372802734375, -0.027557373046875, 0.055511474609375, -0.005596160888671875, 0.04034423828125, -0.0027904510498046875, 0.0075531005859375, 0.059417724609375, -0.01413726806640625, -0.026153564453125, -0.0019931793212890625, 0.09100341796875, 
0.032012939453125, 0.026031494140625, 0.00202178955078125, -0.0011749267578125, 0.01309967041015625, 0.0024433135986328125, -0.061279296875, -0.023529052734375, 0.0191802978515625, -0.0318603515625, -0.020355224609375, 0.022064208984375, -0.06011962890625, 0.004852294921875, 0.0026798248291015625, 0.051422119140625, -0.04522705078125, 0.00746917724609375, 0.0286865234375, -0.0167999267578125, 0.0198822021484375, -0.0245513916015625, -0.048187255859375, 0.017913818359375, 0.0212860107421875, 0.07379150390625, -0.006359100341796875, -0.04345703125, -0.0181884765625, -0.010009765625, 0.00817108154296875, 0.04730224609375, -0.0216064453125, -0.0147552490234375, 0.0011281967163085938, 0.025909423828125, -0.050994873046875, -0.0247344970703125, 0.048614501953125, -0.015106201171875, 0.062469482421875, 0.0014057159423828125, -0.056884765625, -0.0185089111328125, 0.018829345703125, -0.047698974609375, 0.10125732421875, 0.0161285400390625, -0.0711669921875, 0.00908660888671875, -0.05023193359375, -0.0224609375, -0.016632080078125, -0.0087738037109375, -0.057403564453125, -0.001495361328125, 0.036529541015625, 0.05499267578125, -0.00580596923828125, 0.014129638671875, -0.0164794921875, -0.0306396484375, 0.01617431640625, -0.0254974365234375, 0.0816650390625, 0.005619049072265625, -0.03790283203125, 0.014007568359375, -0.04742431640625, -0.00452423095703125, 0.0259552001953125, -0.0148162841796875, -0.020263671875, -0.0145416259765625, 0.0237579345703125, 0.0283355712890625, 0.0197906494140625, -0.052581787109375, 0.0026569366455078125, -0.0504150390625, 0.048004150390625, 0.044647216796875, 0.002666473388671875, 0.02264404296875, -0.0290985107421875, 0.0301666259765625, 0.011810302734375, -0.0083770751953125, -0.00572967529296875, -0.033355712890625, -0.06353759765625, -0.02838134765625, 0.023590087890625, 0.04840087890625, -0.042633056640625, 0.07696533203125, -0.038848876953125, -0.04229736328125, -0.06890869140625, 0.0007882118225097656, 0.024383544921875, 
0.0367431640625, 0.044647216796875, -0.00853729248046875, -0.0438232421875, -0.0684814453125, -0.00839996337890625, 0.005870819091796875, -0.00420379638671875, 0.0209197998046875, 0.06024169921875, -0.03375244140625, 0.06927490234375, -0.052154541015625, -0.039886474609375, -0.021881103515625, 0.00577545166015625, 0.023773193359375, 0.040374755859375, 0.04852294921875, -0.04541015625, -0.041656494140625, -0.031341552734375, -0.052520751953125, 0.002044677734375, -0.00844573974609375, -0.01062774658203125, 0.01446533203125, 0.050262451171875, -0.0523681640625, 0.0242156982421875, 0.04693603515625, -0.042572021484375, 0.0273590087890625, -0.0280303955078125, -0.00122833251953125, -0.11309814453125, 0.005298614501953125, 0.0068511962890625, -0.00690460205078125, -0.0192413330078125, -0.0015058517456054688, 0.005527496337890625, -0.00037288665771484375, -0.036376953125, 0.04351806640625, -0.033294677734375, 0.0151519775390625, 0.0007691383361816406, 0.0308990478515625, 0.0021533966064453125, 0.057586669921875, -0.006122589111328125, 0.053375244140625, 0.04852294921875, -0.037322998046875, 0.0288238525390625, 0.04510498046875, -0.0361328125, 0.0191802978515625, -0.06982421875, -0.00441741943359375, -0.00897979736328125, 0.0169830322265625, -0.086669921875, -0.005847930908203125, 0.01250457763671875, -0.03826904296875, -0.002208709716796875, 0.0266876220703125, -0.049285888671875, -0.05224609375, -0.04327392578125, 0.007904052734375, 0.040252685546875, -0.041717529296875, 0.037078857421875, 0.0140838623046875, -0.004055023193359375, -0.043975830078125, -0.0726318359375, -0.01146697998046875, -0.02197265625, -0.0528564453125, 0.04345703125, -0.0074005126953125, 0.01473236083984375, 0.019134521484375, 0.0104217529296875, 0.00865936279296875, -0.00386810302734375, 0.00897979736328125, 0.0188446044921875, -0.004638671875, 0.0059051513671875, 0.01375579833984375, -0.0028553009033203125, 0.006397247314453125, -0.003086090087890625, 0.06866455078125, -0.0190887451171875, 
-0.011444091796875, -0.039093017578125, 0.006114959716796875, 0.034759521484375, -0.01739501953125, 0.08428955078125, 0.07586669921875, -0.0294036865234375, 0.00331878662109375, -0.044036865234375, -0.0167694091796875, -0.0350341796875, 0.049530029296875, -0.02850341796875, -0.0684814453125, 0.045684814453125, 0.0182952880859375, -0.00377655029296875, 0.05841064453125, 0.05072021484375, -0.005718231201171875, 0.059173583984375, 0.0400390625, -0.017364501953125, 0.039581298828125, -0.043243408203125, 0.01297760009765625, -0.06536865234375, -0.0209503173828125, -0.034454345703125, -0.0257110595703125, -0.0567626953125, -0.03143310546875, 0.0135498046875, -0.0108642578125, -0.0146026611328125, 0.0523681640625, -0.041717529296875, 0.025848388671875, 0.046356201171875, 0.01494598388671875, -0.0006170272827148438, 0.005496978759765625, -0.036376953125, -0.011871337890625, -0.051910400390625, -0.03363037109375, 0.064453125, 0.032562255859375, 0.0345458984375, -0.01149749755859375, 0.053802490234375, -0.00312042236328125, -0.0032444000244140625, -0.053802490234375, 0.039520263671875, -0.01271820068359375, -0.023223876953125, -0.025665283203125, -0.029815673828125, -0.08148193359375, 0.035797119140625, -0.0211334228515625, -0.05322265625, 0.003749847412109375, -0.0216217041015625, -0.024993896484375, 0.01398468017578125, -0.05950927734375, 0.0836181640625, 0.009185791015625, -0.0019378662109375, -0.01004791259765625, -0.05413818359375, 0.006938934326171875, 0.01439666748046875, 0.01309967041015625, -0.01000213623046875, -0.00020825862884521484, 0.07861328125, -0.0198974609375, 0.0570068359375, 0.0009813308715820312, 0.0146484375, 0.0164794921875, -0.0169677734375, 0.0193023681640625, -0.01221466064453125, -0.01059722900390625, 0.007762908935546875, 0.00585174560546875, -0.0330810546875, -0.03338623046875, 0.05987548828125, -0.07110595703125, -0.0205535888671875, -0.040374755859375, -0.044891357421875, -0.002410888671875, 0.017791748046875, 0.039825439453125, 
0.0232391357421875, -0.0204925537109375, 0.0301666259765625, 0.03863525390625, -0.0279083251953125, 0.0518798828125, 0.01499176025390625, 0.0002410411834716797, -0.03570556640625, 0.0498046875, -0.003208160400390625, -0.0026302337646484375, 0.023406982421875, 0.0207977294921875, -0.039398193359375, -0.01340484619140625, -0.0211944580078125, 0.04388427734375, -0.0323486328125, -0.01155853271484375, -0.0745849609375, -0.0273284912109375, -0.050445556640625, -0.006191253662109375, -0.0194854736328125, -0.023223876953125, -0.03289794921875, -0.0241851806640625, 0.0278778076171875, 0.02935791015625, 0.01081085205078125, 0.029937744140625, -0.047515869140625, 0.0178680419921875, 0.00074005126953125, 0.0135955810546875, -0.0075531005859375, -0.060699462890625, -0.02374267578125, -0.0017042160034179688, -0.0260162353515625, -0.0703125, 0.05413818359375, 0.0202789306640625, 0.04010009765625, 0.0152587890625, -0.0006923675537109375, 0.04852294921875, -0.0369873046875, 0.05731201171875, 0.008941650390625, -0.07269287109375, 0.04296875, -0.00380706787109375, 0.0362548828125, 0.042388916015625, 0.03509521484375, -0.040771484375, -0.034759521484375, -0.0628662109375, -0.0736083984375, 0.04833984375, 0.030853271484375, 0.026763916015625, -0.0184783935546875, 0.0169219970703125, -0.01458740234375, 0.011260986328125, -0.0760498046875, -0.031402587890625, -0.0160064697265625, -0.052978515625, -0.0166473388671875, -0.0060882568359375, 0.00771331787109375, -0.033966064453125, 0.0587158203125, 0.0011348724365234375, 0.037872314453125, 0.0281524658203125, -0.0262603759765625, 0.01274871826171875, 0.0118560791015625, 0.03375244140625, 0.0128936767578125, -0.011077880859375, 0.005001068115234375, 0.024078369140625, -0.042449951171875, -0.005603790283203125, 0.03082275390625, -0.0080108642578125, 0.0161285400390625, 0.0299835205078125, 0.0701904296875, 0.0286102294921875, -0.04742431640625, 0.057220458984375, -0.00562286376953125, -0.0243988037109375, -0.0265045166015625, 
-0.004505157470703125, 0.0158538818359375, 0.0170440673828125, 0.0135040283203125, -0.0021820068359375, -0.004039764404296875, -0.033905029296875, 0.020599365234375, 0.019561767578125, -0.0283203125, -0.00533294677734375, 0.04998779296875, -0.0034160614013671875, -0.01605224609375, 0.06298828125, -0.01971435546875, -0.055938720703125, 0.043426513671875, 0.04327392578125, 0.064697265625, -0.00743865966796875, 0.02044677734375, 0.049102783203125, 0.0238037109375, -0.012603759765625, 0.01442718505859375, 0.017669677734375, -0.06591796875, -0.01309967041015625, -0.05224609375, 0.0044097900390625, 0.0006771087646484375, -0.055938720703125, 0.027069091796875, -0.0083465576171875, 0.0007100105285644531, -0.0169219970703125, 0.010345458984375, -0.061614990234375, 0.01137542724609375, -0.01123046875, 0.056304931640625, -0.0787353515625, 0.06475830078125, 0.048248291015625, -0.04620361328125, -0.06268310546875, 0.00012159347534179688, -0.018280029296875, -0.06903076171875, 0.0224151611328125, 0.0362548828125, 0.00939178466796875, 0.0186309814453125, -0.027313232421875, -0.0631103515625, 0.10406494140625, 0.0220184326171875, -0.0384521484375, -0.0178070068359375, 0.004913330078125, 0.04705810546875, -0.0289459228515625, 0.0236968994140625, 0.046722412109375, 0.0209197998046875, 0.002197265625, -0.06268310546875, 0.0194091796875, -0.02166748046875, 0.00836181640625, -0.005435943603515625, -0.0443115234375, 0.061309814453125, -0.00983428955078125, -0.01111602783203125, 0.006137847900390625, 0.062042236328125, 0.02801513671875, 0.004169464111328125, 0.029632568359375, 0.06719970703125, 0.05670166015625, -0.0070953369140625, 0.06982421875, -0.0244903564453125, 0.06298828125, 0.069580078125, 0.01342010498046875, 0.07769775390625, 0.037078857421875, -0.01233673095703125, 0.059112548828125, 0.042572021484375, -0.0299530029296875, 0.041961669921875, 0.01314544677734375, 0.00690460205078125, 0.01342010498046875, 0.012359619140625, -0.0224151611328125, 0.033233642578125, 
0.01210784912109375, -0.045166015625, -0.0034084320068359375, 0.012451171875, 0.019439697265625, 0.0140228271484375, 0.0044097900390625, 0.04229736328125, 0.00418853759765625, -0.03558349609375, 0.0350341796875, 0.013427734375, 0.0811767578125, -0.0261993408203125, 0.0231781005859375, 0.0032215118408203125, 0.0175323486328125, -0.008697509765625, -0.04736328125, 0.0305938720703125, -0.0106964111328125, -0.0025234222412109375, -0.01470947265625, 0.028045654296875, -0.05169677734375, -0.04791259765625, 0.028167724609375, 0.041961669921875, 0.0088043212890625, -0.00311279296875, -0.082763671875, 0.005786895751953125, 0.0088043212890625, -0.0396728515625, 0.002941131591796875, 0.0223236083984375, 0.021942138671875, 0.039825439453125, 0.029296875, -0.0034694671630859375, 0.01277923583984375, 0.014129638671875, 0.052734375, -0.0548095703125, -0.041534423828125, -0.077392578125, 0.0312042236328125, -0.01544952392578125, -0.037933349609375, 0.06103515625, 0.049957275390625, 0.07110595703125, -0.01654052734375, 0.033599853515625, -0.0167999267578125, 0.0306549072265625, -0.04290771484375, 0.06011962890625, -0.036041259765625, -0.0011243820190429688, -0.0282135009765625, -0.08001708984375, -0.0212860107421875, 0.0775146484375, -0.0289459228515625, 0.007511138916015625, 0.0771484375, 0.07147216796875, -0.0114593505859375, -0.01027679443359375, 0.00453948974609375, 0.0301666259765625, 0.015228271484375, 0.0416259765625, 0.018707275390625, -0.0672607421875, 0.0460205078125, -0.0352783203125, -0.00501251220703125, -0.007663726806640625, -0.04876708984375, -0.079345703125, -0.06280517578125, -0.03369140625, -0.031585693359375, -0.0149688720703125, 0.08709716796875, 0.0369873046875, -0.05670166015625, -0.01320648193359375, -0.01340484619140625, -0.01143646240234375, -0.0126190185546875, -0.025238037109375, 0.056884765625, -0.0228729248046875, -0.06878662109375, 0.007965087890625, -0.00839996337890625, 0.010162353515625, -0.01751708984375, 0.002338409423828125, -0.037506103515625, 
0.01058197021484375, 0.03814697265625, -0.00708770751953125, -0.0587158203125, -0.0167694091796875, 0.010772705078125, -0.034576416015625, 0.00244140625, 0.0205078125, -0.05206298828125, 0.013702392578125, 0.0338134765625, 0.043914794921875, 0.061614990234375, -0.010223388671875, 0.0218505859375, -0.056671142578125, 0.0229034423828125, 0.008819580078125, 0.048187255859375, 0.03204345703125, -0.0234527587890625, 0.039093017578125, 0.0210418701171875, -0.0323486328125, -0.0457763671875, -0.006626129150390625, -0.088134765625, -0.016571044921875, 0.0809326171875, -0.01473236083984375, -0.033355712890625, 0.01555633544921875, -0.025909423828125, 0.032501220703125, -0.02032470703125, 0.06329345703125, 0.063232421875, -0.004825592041015625, -0.0124359130859375, -0.0267181396484375, 0.019317626953125, 0.0443115234375, -0.05230712890625, -0.025299072265625, 0.0151214599609375, 0.03753662109375, 0.0110931396484375, 0.034088134765625, -0.00868988037109375, -0.001056671142578125, -0.005809783935546875, 0.008758544921875, -0.0009703636169433594, 0.00506591796875, -0.0299530029296875, 0.0107269287109375, -0.031768798828125, -0.0312042236328125 ] ]
stabilityai/sd-vae-ft-mse
2023-06-06T11:39:15.000Z
[ "diffusers", "stable-diffusion", "stable-diffusion-diffusers", "license:mit", "has_space", "diffusers:AutoencoderKL", "region:us" ]
null
stabilityai
null
null
stabilityai/sd-vae-ft-mse
214
189,411
diffusers
2022-10-13T12:50:55
---
license: mit
tags:
- stable-diffusion
- stable-diffusion-diffusers
inference: false
---

# Improved Autoencoders

## Utilizing

These weights are intended to be used with the [🧨 diffusers library](https://github.com/huggingface/diffusers). If you are looking for the model to use with the original [CompVis Stable Diffusion codebase](https://github.com/CompVis/stable-diffusion), [come here](https://huggingface.co/stabilityai/sd-vae-ft-mse-original).

#### How to use with 🧨 diffusers

You can integrate this fine-tuned VAE decoder into your existing `diffusers` workflows by passing a `vae` argument to the `StableDiffusionPipeline`:

```py
from diffusers.models import AutoencoderKL
from diffusers import StableDiffusionPipeline

model = "CompVis/stable-diffusion-v1-4"
vae = AutoencoderKL.from_pretrained("stabilityai/sd-vae-ft-mse")
pipe = StableDiffusionPipeline.from_pretrained(model, vae=vae)
```

## Decoder Finetuning

We publish two kl-f8 autoencoder versions, finetuned from the original [kl-f8 autoencoder](https://github.com/CompVis/latent-diffusion#pretrained-autoencoding-models) on a 1:1 ratio of [LAION-Aesthetics](https://laion.ai/blog/laion-aesthetics/) and LAION-Humans, an unreleased subset containing only SFW images of humans. The intent was to fine-tune on the Stable Diffusion training set (the autoencoder was originally trained on OpenImages) while also enriching the dataset with images of humans to improve the reconstruction of faces.

The first, _ft-EMA_, was resumed from the original checkpoint, trained for 313198 steps, and uses EMA weights. It uses the same loss configuration as the original checkpoint (L1 + LPIPS).
The second, _ft-MSE_, was resumed from _ft-EMA_, also uses EMA weights, and was trained for another 280k steps with a different loss that places more emphasis on MSE reconstruction (MSE + 0.1 * LPIPS). It produces somewhat "smoother" outputs. The batch size for both versions was 192 (16 A100s, batch size 12 per GPU).
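The two objectives described above differ only in how the pixel term and the perceptual (LPIPS) term are weighted. Below is a minimal sketch of the ft-MSE objective; the LPIPS distance is stubbed out as a plain scalar because it requires a pretrained network, so `lpips_distance` here is an illustrative assumption, not the real computation.

```python
import numpy as np

def ft_mse_objective(x, x_hat, lpips_distance):
    # Sketch of "MSE + 0.1 * LPIPS": pixel-wise MSE plus a down-weighted
    # perceptual term. `lpips_distance` is a stand-in scalar for illustration.
    mse = np.mean((x - x_hat) ** 2)
    return mse + 0.1 * lpips_distance

# Toy values: reconstruction off by 0.1 everywhere, stubbed LPIPS of 0.5
loss = ft_mse_objective(np.zeros(4), np.full(4, 0.1), lpips_distance=0.5)
print(loss)
```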
To keep compatibility with existing models, only the decoder part was finetuned; the checkpoints can be used as a drop-in replacement for the existing autoencoder.

_Original kl-f8 VAE vs f8-ft-EMA vs f8-ft-MSE_

## Evaluation

### COCO 2017 (256x256, val, 5000 images)

| Model | train steps | rFID | PSNR | SSIM | PSIM | Link | Comments |
|----------|-------------|------|--------------|---------------|---------------|------|----------|
| original | 246803 | 4.99 | 23.4 +/- 3.8 | 0.69 +/- 0.14 | 1.01 +/- 0.28 | https://ommer-lab.com/files/latent-diffusion/kl-f8.zip | as used in SD |
| ft-EMA | 560001 | 4.42 | 23.8 +/- 3.9 | 0.69 +/- 0.13 | 0.96 +/- 0.27 | https://huggingface.co/stabilityai/sd-vae-ft-ema-original/resolve/main/vae-ft-ema-560000-ema-pruned.ckpt | slightly better overall, with EMA |
| ft-MSE | 840001 | 4.70 | 24.5 +/- 3.7 | 0.71 +/- 0.13 | 0.92 +/- 0.27 | https://huggingface.co/stabilityai/sd-vae-ft-mse-original/resolve/main/vae-ft-mse-840000-ema-pruned.ckpt | resumed with EMA from ft-EMA, emphasis on MSE (rec. loss = MSE + 0.1 * LPIPS), smoother outputs |

### LAION-Aesthetics 5+ (256x256, subset, 10000 images)

| Model | train steps | rFID | PSNR | SSIM | PSIM | Link | Comments |
|----------|-------------|------|--------------|---------------|---------------|------|----------|
| original | 246803 | 2.61 | 26.0 +/- 4.4 | 0.81 +/- 0.12 | 0.75 +/- 0.36 | https://ommer-lab.com/files/latent-diffusion/kl-f8.zip | as used in SD |
| ft-EMA | 560001 | 1.77 | 26.7 +/- 4.8 | 0.82 +/- 0.12 | 0.67 +/- 0.34 | https://huggingface.co/stabilityai/sd-vae-ft-ema-original/resolve/main/vae-ft-ema-560000-ema-pruned.ckpt | slightly better overall, with EMA |
| ft-MSE | 840001 | 1.88 | 27.3 +/- 4.7 | 0.83 +/- 0.11 | 0.65 +/- 0.34 | https://huggingface.co/stabilityai/sd-vae-ft-mse-original/resolve/main/vae-ft-mse-840000-ema-pruned.ckpt | resumed with EMA from ft-EMA, emphasis on MSE (rec. loss = MSE + 0.1 * LPIPS), smoother outputs |

### Visual

_Visualization of reconstructions on 256x256 images from the COCO2017 validation dataset._

<p align="center">
<br>
<b> 256x256: ft-EMA (left), ft-MSE (middle), original (right)</b>
</p>
<p align="center">
<img src=https://huggingface.co/stabilityai/stable-diffusion-decoder-finetune/resolve/main/eval/ae-decoder-tuning-reconstructions/merged/00025_merged.png />
</p>
<p align="center">
<img src=https://huggingface.co/stabilityai/stable-diffusion-decoder-finetune/resolve/main/eval/ae-decoder-tuning-reconstructions/merged/00011_merged.png />
</p>
<p align="center">
<img src=https://huggingface.co/stabilityai/stable-diffusion-decoder-finetune/resolve/main/eval/ae-decoder-tuning-reconstructions/merged/00037_merged.png />
</p>
<p align="center">
<img src=https://huggingface.co/stabilityai/stable-diffusion-decoder-finetune/resolve/main/eval/ae-decoder-tuning-reconstructions/merged/00043_merged.png />
</p>
<p align="center">
<img src=https://huggingface.co/stabilityai/stable-diffusion-decoder-finetune/resolve/main/eval/ae-decoder-tuning-reconstructions/merged/00053_merged.png />
</p>
<p align="center">
<img src=https://huggingface.co/stabilityai/stable-diffusion-decoder-finetune/resolve/main/eval/ae-decoder-tuning-reconstructions/merged/00029_merged.png />
</p>
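The PSNR figures reported in the evaluation tables follow the standard definition of peak signal-to-noise ratio. A minimal sketch, assuming images scaled to [0, 1] (the actual evaluation pipeline is not reproduced here):

```python
import numpy as np

def psnr(original, reconstruction, max_val=1.0):
    # Peak signal-to-noise ratio in dB; higher means a closer reconstruction
    mse = np.mean((original - reconstruction) ** 2)
    return 10.0 * np.log10(max_val ** 2 / mse)

# Toy images: reconstruction uniformly off by 0.1 from the original
value = psnr(np.zeros((8, 8)), np.full((8, 8), 0.1))
print(value)
```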
6,838
[ [ -0.054443359375, -0.02984619140625, 0.0118560791015625, 0.0176239013671875, -0.01029205322265625, -0.0120697021484375, -0.001285552978515625, -0.0037174224853515625, 0.039764404296875, 0.0184326171875, -0.025787353515625, -0.0325927734375, -0.045623779296875, 0.0104522705078125, -0.002758026123046875, 0.048583984375, -0.00677490234375, 0.011383056640625, 0.00653076171875, -0.0194244384765625, -0.032501220703125, -0.0241546630859375, -0.054595947265625, -0.015289306640625, 0.0308990478515625, 0.01058197021484375, 0.0269622802734375, 0.044281005859375, 0.0233001708984375, 0.0216217041015625, -0.0197906494140625, -0.0002117156982421875, -0.028564453125, -0.0187530517578125, 0.0185546875, -0.01154327392578125, -0.05096435546875, -0.00860595703125, 0.04571533203125, 0.0279998779296875, -0.012664794921875, 0.00682830810546875, 0.0194091796875, 0.060211181640625, -0.046966552734375, 0.006561279296875, -0.0091400146484375, 0.00540924072265625, -0.00373077392578125, -0.0284576416015625, -0.01546478271484375, -0.01418304443359375, -0.01041412353515625, -0.054931640625, 0.0180511474609375, -0.002864837646484375, 0.114501953125, 0.0222625732421875, -0.0235443115234375, -0.004302978515625, -0.039794921875, 0.04925537109375, -0.05810546875, 0.0338134765625, 0.01751708984375, 0.013092041015625, -0.00855255126953125, -0.0455322265625, -0.033294677734375, 0.00452423095703125, -0.02716064453125, 0.034759521484375, -0.01763916015625, -0.0008263587951660156, 0.033050537109375, 0.044830322265625, -0.047210693359375, -0.00714874267578125, -0.0452880859375, -0.0256500244140625, 0.045501708984375, 0.01043701171875, 0.01702880859375, -0.0216522216796875, -0.0199432373046875, -0.033905029296875, -0.04010009765625, 0.01287841796875, 0.01727294921875, -0.0127716064453125, -0.0309906005859375, 0.0258636474609375, -0.018463134765625, 0.0440673828125, 0.0117340087890625, -0.01226806640625, 0.0439453125, -0.0293426513671875, -0.04034423828125, -0.0088348388671875, 0.07415771484375, 
0.032379150390625, -0.01010894775390625, 0.0185546875, -0.011810302734375, 0.0037593841552734375, -0.00439453125, -0.07708740234375, -0.02935791015625, 0.0254058837890625, -0.05413818359375, -0.025238037109375, 0.005718231201171875, -0.07635498046875, 0.0194091796875, -0.01502227783203125, 0.03167724609375, -0.041259765625, -0.0243988037109375, 0.0043487548828125, -0.0196075439453125, 0.05023193359375, 0.034698486328125, -0.052703857421875, 0.0222625732421875, 0.00801849365234375, 0.0633544921875, -0.001438140869140625, 0.004085540771484375, -0.0257568359375, -0.004364013671875, -0.034332275390625, 0.04107666015625, -0.007904052734375, -0.029693603515625, -0.0245513916015625, 0.028472900390625, -0.0149383544921875, -0.0291900634765625, 0.05816650390625, -0.033233642578125, 0.0140533447265625, -0.02783203125, -0.03173828125, -0.0197296142578125, 0.0096588134765625, -0.054443359375, 0.0826416015625, 0.011810302734375, -0.06329345703125, 0.0215301513671875, -0.0467529296875, -0.007415771484375, -0.0266876220703125, -0.01479339599609375, -0.050384521484375, -0.00968170166015625, 0.04010009765625, 0.025665283203125, -0.017578125, 0.0003573894500732422, -0.0165863037109375, -0.022186279296875, 0.00145721435546875, -0.048553466796875, 0.090576171875, 0.0308074951171875, -0.0362548828125, -0.00047206878662109375, -0.069580078125, -0.005462646484375, 0.0275726318359375, -0.025482177734375, -0.0006341934204101562, -0.025787353515625, -0.004192352294921875, 0.03021240234375, 0.0020313262939453125, -0.0266876220703125, 0.004302978515625, -0.03179931640625, 0.034759521484375, 0.0628662109375, 0.0164947509765625, 0.032684326171875, -0.03717041015625, 0.03558349609375, 0.0222625732421875, 0.005825042724609375, -0.02386474609375, -0.047149658203125, -0.07666015625, -0.03564453125, 0.03082275390625, 0.037109375, -0.036102294921875, 0.0443115234375, -0.0247802734375, -0.035491943359375, -0.047882080078125, -0.0032958984375, 0.01012420654296875, 0.0309906005859375, 0.033050537109375, 
-0.03558349609375, -0.042572021484375, -0.06597900390625, 0.01538848876953125, 0.00858306884765625, -0.000644683837890625, 0.0275115966796875, 0.048614501953125, 0.003955841064453125, 0.06463623046875, -0.051422119140625, -0.0291748046875, 0.0161895751953125, -0.0035343170166015625, 0.0284881591796875, 0.060546875, 0.06646728515625, -0.053619384765625, -0.05584716796875, -0.008697509765625, -0.0635986328125, 0.000946044921875, -0.0005116462707519531, -0.023193359375, 0.01058197021484375, 0.0305328369140625, -0.03948974609375, 0.05694580078125, 0.032196044921875, -0.02386474609375, 0.039276123046875, -0.0278472900390625, 0.02911376953125, -0.0872802734375, 0.0282440185546875, 0.006622314453125, -0.0266265869140625, -0.0302276611328125, -0.009063720703125, 0.001087188720703125, -0.0015306472778320312, -0.03546142578125, 0.0406494140625, -0.057586669921875, 0.002155303955078125, 0.005115509033203125, -0.0026702880859375, 0.0168609619140625, 0.056610107421875, 0.003002166748046875, 0.049102783203125, 0.05340576171875, -0.03717041015625, 0.004817962646484375, 0.012237548828125, -0.0206451416015625, 0.03192138671875, -0.0633544921875, 0.0021228790283203125, -0.023345947265625, 0.028961181640625, -0.077880859375, -0.01430511474609375, 0.0487060546875, -0.04254150390625, 0.04376220703125, -0.0233306884765625, -0.021514892578125, -0.036895751953125, -0.031341552734375, 0.0270538330078125, 0.05316162109375, -0.0308074951171875, 0.0445556640625, 0.008941650390625, 0.0211181640625, -0.040008544921875, -0.053192138671875, -0.017730712890625, -0.018463134765625, -0.0460205078125, 0.0309295654296875, -0.016876220703125, 0.01004791259765625, 0.0076141357421875, -0.00501251220703125, 0.005100250244140625, -0.00797271728515625, 0.033905029296875, 0.0241241455078125, -0.0269775390625, -0.034637451171875, 0.0106048583984375, -0.01258087158203125, -0.0032978057861328125, -0.0092926025390625, 0.04583740234375, -0.0081787109375, -0.0196685791015625, -0.05303955078125, 
0.01442718505859375, 0.05621337890625, -0.020751953125, 0.0609130859375, 0.064208984375, -0.0296630859375, 0.01168060302734375, -0.033233642578125, -0.014739990234375, -0.03765869140625, -0.00033974647521972656, -0.03717041015625, -0.050567626953125, 0.0556640625, 0.01058197021484375, 0.002735137939453125, 0.07220458984375, 0.034912109375, -0.015960693359375, 0.0830078125, 0.01177978515625, 0.0006384849548339844, 0.0330810546875, -0.0755615234375, 0.0025577545166015625, -0.0753173828125, -0.022613525390625, -0.03857421875, -0.01702880859375, -0.0297088623046875, -0.05047607421875, 0.03009033203125, 0.0259857177734375, -0.0191650390625, 0.0198211669921875, -0.05615234375, 0.0120697021484375, 0.0226287841796875, 0.0167388916015625, 0.002262115478515625, 0.019622802734375, -0.0191650390625, -0.0026035308837890625, -0.060089111328125, -0.03436279296875, 0.0789794921875, 0.03167724609375, 0.048858642578125, 0.00351715087890625, 0.048858642578125, 0.0169219970703125, 0.02337646484375, -0.033935546875, 0.030487060546875, -0.004772186279296875, -0.043304443359375, 0.0003943443298339844, -0.0229034423828125, -0.0731201171875, 0.028594970703125, -0.01702880859375, -0.056976318359375, 0.047943115234375, 0.032745361328125, -0.02435302734375, 0.03466796875, -0.0537109375, 0.07757568359375, -0.006191253662109375, -0.029052734375, 0.00370025634765625, -0.046905517578125, 0.01861572265625, 0.02447509765625, 0.0123748779296875, -0.0038909912109375, 0.00525665283203125, 0.06988525390625, -0.05780029296875, 0.053863525390625, -0.013031005859375, -0.008697509765625, 0.0478515625, -0.00833892822265625, 0.03887939453125, 0.0037593841552734375, -0.002643585205078125, 0.022491455078125, 0.005031585693359375, -0.040008544921875, -0.029296875, 0.06585693359375, -0.0706787109375, -0.03466796875, -0.048065185546875, -0.007080078125, 0.0341796875, 0.0197906494140625, 0.048736572265625, 0.0491943359375, -0.0065155029296875, 0.0164947509765625, 0.058349609375, -0.01763916015625, 
0.039703369140625, 0.018310546875, -0.004817962646484375, -0.05718994140625, 0.0830078125, 0.0145263671875, 0.022674560546875, 0.0257568359375, 0.005893707275390625, -0.01053619384765625, -0.032135009765625, -0.03857421875, 0.023162841796875, -0.056243896484375, -0.02947998046875, -0.06591796875, -0.029144287109375, -0.03802490234375, -0.0161590576171875, -0.036102294921875, -0.028411865234375, -0.050201416015625, 0.0110931396484375, 0.0276641845703125, 0.030120849609375, -0.018402099609375, 0.01255035400390625, -0.05621337890625, 0.018157958984375, 0.004344940185546875, 0.0180206298828125, 0.012237548828125, -0.04400634765625, -0.00934600830078125, 0.0112152099609375, -0.04058837890625, -0.07745361328125, 0.0501708984375, 0.00902557373046875, 0.0491943359375, 0.0270233154296875, -0.0110931396484375, 0.05975341796875, -0.0167388916015625, 0.05859375, 0.016357421875, -0.051483154296875, 0.049530029296875, -0.0199737548828125, 0.015106201171875, 0.029083251953125, 0.038543701171875, -0.0177459716796875, -0.0046844482421875, -0.055450439453125, -0.06988525390625, 0.055389404296875, 0.038299560546875, -0.025054931640625, 0.006168365478515625, 0.0246429443359375, -0.01227569580078125, -0.0003895759582519531, -0.052734375, -0.05474853515625, -0.035858154296875, -0.01027679443359375, -0.000812530517578125, -0.00516510009765625, -0.00399017333984375, -0.04205322265625, 0.05828857421875, -0.003192901611328125, 0.048553466796875, 0.038055419921875, -0.01039886474609375, -0.01131439208984375, 0.00518798828125, 0.03643798828125, 0.0298614501953125, -0.036529541015625, -0.004230499267578125, 0.01451873779296875, -0.0290374755859375, 0.01605224609375, -0.00222015380859375, -0.033905029296875, 0.002307891845703125, 0.01366424560546875, 0.06964111328125, -0.0058746337890625, -0.0190582275390625, 0.04205322265625, -0.0162200927734375, -0.032196044921875, -0.034454345703125, 0.0089263916015625, 0.00677490234375, -0.00315093994140625, 0.02313232421875, 0.03045654296875, 
0.0143280029296875, -0.0258636474609375, 0.015411376953125, 0.02301025390625, -0.0209197998046875, -0.024017333984375, 0.06549072265625, -0.0004515647888183594, -0.00376129150390625, 0.033416748046875, -0.0271453857421875, -0.03631591796875, 0.06585693359375, 0.0330810546875, 0.0640869140625, -0.0208282470703125, 0.001140594482421875, 0.07904052734375, 0.015777587890625, -0.0035190582275390625, 0.0207672119140625, -0.0024547576904296875, -0.0350341796875, -0.0152587890625, -0.061553955078125, 0.024749755859375, 0.01299285888671875, -0.05755615234375, 0.031768798828125, -0.0306549072265625, -0.019012451171875, 0.0029659271240234375, 0.002536773681640625, -0.06793212890625, 0.0289306640625, -0.0029964447021484375, 0.0743408203125, -0.07379150390625, 0.060943603515625, 0.04205322265625, -0.047027587890625, -0.068359375, -0.00653839111328125, -0.00777435302734375, -0.032867431640625, 0.037109375, 0.00858306884765625, 0.006824493408203125, 0.00994873046875, -0.031402587890625, -0.0723876953125, 0.11151123046875, 0.01885986328125, -0.04351806640625, 0.0013780593872070312, -0.018951416015625, 0.034332275390625, -0.033599853515625, 0.04632568359375, 0.03546142578125, 0.0288848876953125, 0.0207977294921875, -0.050567626953125, 0.023956298828125, -0.031341552734375, 0.0174407958984375, 0.01340484619140625, -0.0736083984375, 0.059661865234375, -0.00820159912109375, -0.0165863037109375, 0.0012655258178710938, 0.06427001953125, 0.0197601318359375, 0.01195526123046875, 0.040435791015625, 0.06781005859375, 0.034576416015625, -0.0172271728515625, 0.07525634765625, -0.024017333984375, 0.041748046875, 0.058837890625, 0.0275726318359375, 0.0462646484375, 0.032928466796875, -0.024505615234375, 0.03338623046875, 0.06585693359375, -0.0120391845703125, 0.041595458984375, 0.00618743896484375, -0.0025424957275390625, -0.00484466552734375, 0.008209228515625, -0.044921875, 0.011566162109375, 0.032318115234375, -0.04168701171875, -0.00392913818359375, -0.00235748291015625, 0.0167388916015625, 
-0.0297088623046875, -0.0131378173828125, 0.03662109375, 0.004878997802734375, -0.0447998046875, 0.07867431640625, -0.0007505416870117188, 0.05322265625, -0.03662109375, -0.0152740478515625, -0.0113372802734375, 0.00449371337890625, -0.02996826171875, -0.054229736328125, 0.02935791015625, -0.0194854736328125, -0.0005121231079101562, -0.005615234375, 0.051483154296875, -0.0263671875, -0.037322998046875, 0.034576416015625, 0.00847625732421875, 0.0214080810546875, 0.0213623046875, -0.066162109375, 0.030242919921875, 0.0160980224609375, -0.03900146484375, 0.0241851806640625, 0.01340484619140625, 0.0165557861328125, 0.0273284912109375, 0.047943115234375, 0.00518798828125, 0.0263519287109375, -0.00656890869140625, 0.08209228515625, -0.0452880859375, -0.034698486328125, -0.04302978515625, 0.0537109375, -0.0186309814453125, -0.024383544921875, 0.06427001953125, 0.05755615234375, 0.05474853515625, -0.00799560546875, 0.042236328125, -0.032928466796875, 0.005168914794921875, -0.0390625, 0.047943115234375, -0.06982421875, 0.0183563232421875, -0.0274658203125, -0.062103271484375, -0.005870819091796875, 0.060150146484375, -0.0091552734375, 0.027191162109375, 0.04522705078125, 0.08087158203125, -0.01751708984375, -0.0212554931640625, 0.00464630126953125, 0.0236053466796875, 0.033782958984375, 0.0484619140625, 0.0239410400390625, -0.08270263671875, 0.041595458984375, -0.0487060546875, -0.0109405517578125, -0.0240631103515625, -0.053375244140625, -0.055938720703125, -0.059173583984375, -0.05023193359375, -0.056915283203125, -0.006877899169921875, 0.05596923828125, 0.07025146484375, -0.05621337890625, -0.01212310791015625, -0.0174713134765625, 0.00025272369384765625, -0.0238189697265625, -0.0211181640625, 0.04461669921875, 0.0090789794921875, -0.05902099609375, 0.012664794921875, 0.01483154296875, 0.0303955078125, -0.00946044921875, -0.03887939453125, -0.029541015625, 0.003490447998046875, 0.0301666259765625, 0.039520263671875, -0.046905517578125, -0.003170013427734375, 
-0.01110076904296875, -0.0127410888671875, 0.037109375, 0.038360595703125, -0.048248291015625, 0.040008544921875, 0.06494140625, 0.0079345703125, 0.06573486328125, 0.00022721290588378906, 0.0233001708984375, -0.03387451171875, 0.01190948486328125, 0.002338409423828125, 0.03570556640625, 0.014984130859375, -0.0276947021484375, 0.038330078125, 0.0367431640625, -0.044921875, -0.05291748046875, -0.02606201171875, -0.09625244140625, -0.01068115234375, 0.07220458984375, -0.019317626953125, -0.05206298828125, 0.007488250732421875, -0.02362060546875, 0.032012939453125, -0.05218505859375, 0.01702880859375, 0.036102294921875, -0.00287628173828125, -0.0269317626953125, -0.048797607421875, 0.0447998046875, 0.0298614501953125, -0.050628662109375, -0.019073486328125, 0.0311737060546875, 0.044281005859375, 0.02374267578125, 0.07275390625, -0.029266357421875, 0.0142822265625, 0.010498046875, 0.0034465789794921875, -0.0003070831298828125, -0.00826263427734375, -0.037506103515625, 0.01142120361328125, -0.0132904052734375, -0.0303497314453125 ] ]
openai/whisper-tiny
2023-09-08T13:08:03.000Z
[ "transformers", "pytorch", "tf", "jax", "safetensors", "whisper", "automatic-speech-recognition", "audio", "hf-asr-leaderboard", "en", "zh", "de", "es", "ru", "ko", "fr", "ja", "pt", "tr", "pl", "ca", "nl", "ar", "sv", "it", "id", "hi", "fi", "vi", "he", "uk", "el", "ms", "cs", "ro", "da", "hu", "ta", "no", "th", "ur", "hr", "bg", "lt", "la", "mi", "ml", "cy", "sk", "te", "fa", "lv", "bn", "sr", "az", "sl", "kn", "et", "mk", "br", "eu", "is", "hy", "ne", "mn", "bs", "kk", "sq", "sw", "gl", "mr", "pa", "si", "km", "sn", "yo", "so", "af", "oc", "ka", "be", "tg", "sd", "gu", "am", "yi", "lo", "uz", "fo", "ht", "ps", "tk", "nn", "mt", "sa", "lb", "my", "bo", "tl", "mg", "as", "tt", "haw", "ln", "ha", "ba", "jw", "su", "arxiv:2212.04356", "license:apache-2.0", "model-index", "endpoints_compatible", "has_space", "region:us" ]
automatic-speech-recognition
openai
null
null
openai/whisper-tiny
117
189,229
transformers
2022-09-26T06:50:30
---
language:
- en
- zh
- de
- es
- ru
- ko
- fr
- ja
- pt
- tr
- pl
- ca
- nl
- ar
- sv
- it
- id
- hi
- fi
- vi
- he
- uk
- el
- ms
- cs
- ro
- da
- hu
- ta
- no
- th
- ur
- hr
- bg
- lt
- la
- mi
- ml
- cy
- sk
- te
- fa
- lv
- bn
- sr
- az
- sl
- kn
- et
- mk
- br
- eu
- is
- hy
- ne
- mn
- bs
- kk
- sq
- sw
- gl
- mr
- pa
- si
- km
- sn
- yo
- so
- af
- oc
- ka
- be
- tg
- sd
- gu
- am
- yi
- lo
- uz
- fo
- ht
- ps
- tk
- nn
- mt
- sa
- lb
- my
- bo
- tl
- mg
- as
- tt
- haw
- ln
- ha
- ba
- jw
- su
tags:
- audio
- automatic-speech-recognition
- hf-asr-leaderboard
widget:
- example_title: Librispeech sample 1
  src: https://cdn-media.huggingface.co/speech_samples/sample1.flac
- example_title: Librispeech sample 2
  src: https://cdn-media.huggingface.co/speech_samples/sample2.flac
model-index:
- name: whisper-tiny
  results:
  - task:
      name: Automatic Speech Recognition
      type: automatic-speech-recognition
    dataset:
      name: LibriSpeech (clean)
      type: librispeech_asr
      config: clean
      split: test
      args:
        language: en
    metrics:
    - name: Test WER
      type: wer
      value: 7.54
  - task:
      name: Automatic Speech Recognition
      type: automatic-speech-recognition
    dataset:
      name: LibriSpeech (other)
      type: librispeech_asr
      config: other
      split: test
      args:
        language: en
    metrics:
    - name: Test WER
      type: wer
      value: 17.15
  - task:
      name: Automatic Speech Recognition
      type: automatic-speech-recognition
    dataset:
      name: Common Voice 11.0
      type: mozilla-foundation/common_voice_11_0
      config: hi
      split: test
      args:
        language: hi
    metrics:
    - name: Test WER
      type: wer
      value: 141
pipeline_tag: automatic-speech-recognition
license: apache-2.0
---

# Whisper

Whisper is a pre-trained model for automatic speech recognition (ASR) and speech translation. Trained on 680k hours of labelled data, Whisper models demonstrate a strong ability to generalise to many datasets and domains **without** the need for fine-tuning.
Whisper was proposed in the paper [Robust Speech Recognition via Large-Scale Weak Supervision](https://arxiv.org/abs/2212.04356) by Alec Radford et al. from OpenAI. The original code repository can be found [here](https://github.com/openai/whisper).

**Disclaimer**: Content for this model card has partly been written by the Hugging Face team, and parts of it were copied and pasted from the original model card.

## Model details

Whisper is a Transformer-based encoder-decoder model, also referred to as a _sequence-to-sequence_ model. It was trained on 680k hours of labelled speech data annotated using large-scale weak supervision.

The models were trained on either English-only data or multilingual data. The English-only models were trained on the task of speech recognition. The multilingual models were trained on both speech recognition and speech translation. For speech recognition, the model predicts transcriptions in the *same* language as the audio. For speech translation, the model predicts transcriptions in a *different* language from the audio.

Whisper checkpoints come in five configurations of varying model sizes. The smallest four are trained on either English-only or multilingual data. The largest checkpoints are multilingual only. All ten of the pre-trained checkpoints are available on the [Hugging Face Hub](https://huggingface.co/models?search=openai/whisper).
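The repo ids of these checkpoints follow a simple naming pattern (`openai/whisper-<size>`, with a `.en` suffix for the English-only variants). As a small illustrative sketch — this helper is not part of `transformers` or the Hub API, it simply encodes the convention used by the table below:

```python
def whisper_repo_id(size: str, english_only: bool = False) -> str:
    """Derive the Hub repo id for a Whisper checkpoint, e.g. "openai/whisper-tiny.en".
    English-only variants exist only for the four smallest sizes."""
    multilingual_only = {"large", "large-v2"}
    if english_only and size in multilingual_only:
        raise ValueError(f"No English-only checkpoint for size {size!r}")
    return f"openai/whisper-{size}" + (".en" if english_only else "")

print(whisper_repo_id("tiny", english_only=True))  # openai/whisper-tiny.en
print(whisper_repo_id("large-v2"))                 # openai/whisper-large-v2
```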
The checkpoints are summarised in the following table with links to the models on the Hub:

| Size     | Parameters | English-only                                         | Multilingual                                        |
|----------|------------|------------------------------------------------------|-----------------------------------------------------|
| tiny     | 39 M       | [✓](https://huggingface.co/openai/whisper-tiny.en)   | [✓](https://huggingface.co/openai/whisper-tiny)     |
| base     | 74 M       | [✓](https://huggingface.co/openai/whisper-base.en)   | [✓](https://huggingface.co/openai/whisper-base)     |
| small    | 244 M      | [✓](https://huggingface.co/openai/whisper-small.en)  | [✓](https://huggingface.co/openai/whisper-small)    |
| medium   | 769 M      | [✓](https://huggingface.co/openai/whisper-medium.en) | [✓](https://huggingface.co/openai/whisper-medium)   |
| large    | 1550 M     | x                                                    | [✓](https://huggingface.co/openai/whisper-large)    |
| large-v2 | 1550 M     | x                                                    | [✓](https://huggingface.co/openai/whisper-large-v2) |

# Usage

To transcribe audio samples, the model has to be used alongside a [`WhisperProcessor`](https://huggingface.co/docs/transformers/model_doc/whisper#transformers.WhisperProcessor).

The `WhisperProcessor` is used to:
1. Pre-process the audio inputs (converting them to log-Mel spectrograms for the model)
2. Post-process the model outputs (converting them from tokens to text)

The model is informed of which task to perform (transcription or translation) by passing the appropriate "context tokens". These context tokens are a sequence of tokens that are given to the decoder at the start of the decoding process, and take the following order:
1. The transcription always starts with the `<|startoftranscript|>` token
2. The second token is the language token (e.g. `<|en|>` for English)
3. The third token is the "task token". It can take one of two values: `<|transcribe|>` for speech recognition or `<|translate|>` for speech translation
4. In addition, a `<|notimestamps|>` token is added if the model should not include timestamp prediction

Thus, a typical sequence of context tokens might look as follows:
```
<|startoftranscript|> <|en|> <|transcribe|> <|notimestamps|>
```
This tells the model to decode in English, under the task of speech recognition, and not to predict timestamps.

These tokens can either be forced or un-forced. If they are forced, the model is made to predict each token at each position. This allows one to control the output language and task for the Whisper model. If they are un-forced, the Whisper model will automatically predict the output language and task itself.

The context tokens can be set accordingly (note that `get_decoder_prompt_ids` is called on a processor *instance*, not on the `WhisperProcessor` class):

```python
model.config.forced_decoder_ids = processor.get_decoder_prompt_ids(language="english", task="transcribe")
```

This forces the model to predict in English under the task of speech recognition.

## Transcription

### English to English

In this example, the context tokens are 'unforced', meaning the model automatically predicts the output language (English) and task (transcribe).
```python
>>> from transformers import WhisperProcessor, WhisperForConditionalGeneration
>>> from datasets import load_dataset

>>> # load model and processor
>>> processor = WhisperProcessor.from_pretrained("openai/whisper-tiny")
>>> model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-tiny")
>>> model.config.forced_decoder_ids = None

>>> # load dummy dataset and read audio files
>>> ds = load_dataset("hf-internal-testing/librispeech_asr_dummy", "clean", split="validation")
>>> sample = ds[0]["audio"]
>>> input_features = processor(sample["array"], sampling_rate=sample["sampling_rate"], return_tensors="pt").input_features

>>> # generate token ids
>>> predicted_ids = model.generate(input_features)
>>> # decode token ids to text
>>> transcription = processor.batch_decode(predicted_ids, skip_special_tokens=False)
['<|startoftranscript|><|en|><|transcribe|><|notimestamps|> Mr. Quilter is the apostle of the middle classes and we are glad to welcome his gospel.<|endoftext|>']

>>> transcription = processor.batch_decode(predicted_ids, skip_special_tokens=True)
[' Mr. Quilter is the apostle of the middle classes and we are glad to welcome his gospel.']
```

The context tokens can be removed from the start of the transcription by setting `skip_special_tokens=True`.

### French to French

The following example demonstrates French to French transcription by setting the decoder ids appropriately.
```python
>>> from transformers import WhisperProcessor, WhisperForConditionalGeneration
>>> from datasets import Audio, load_dataset

>>> # load model and processor
>>> processor = WhisperProcessor.from_pretrained("openai/whisper-tiny")
>>> model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-tiny")
>>> forced_decoder_ids = processor.get_decoder_prompt_ids(language="french", task="transcribe")

>>> # load streaming dataset and read first audio sample
>>> ds = load_dataset("common_voice", "fr", split="test", streaming=True)
>>> ds = ds.cast_column("audio", Audio(sampling_rate=16_000))
>>> input_speech = next(iter(ds))["audio"]
>>> input_features = processor(input_speech["array"], sampling_rate=input_speech["sampling_rate"], return_tensors="pt").input_features

>>> # generate token ids
>>> predicted_ids = model.generate(input_features, forced_decoder_ids=forced_decoder_ids)
>>> # decode token ids to text
>>> transcription = processor.batch_decode(predicted_ids)
['<|startoftranscript|><|fr|><|transcribe|><|notimestamps|> Un vrai travail intéressant va enfin être mené sur ce sujet.<|endoftext|>']

>>> transcription = processor.batch_decode(predicted_ids, skip_special_tokens=True)
[' Un vrai travail intéressant va enfin être mené sur ce sujet.']
```

## Translation

Setting the task to "translate" forces the Whisper model to perform speech translation.
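The only difference between the two modes is the task token in the context-token prefix: `<|transcribe|>` keeps the output in the audio's language, while `<|translate|>` targets English. As a plain-Python illustration of the token order described in the Usage section (this helper is not part of the `transformers` API, just a sketch of what the forced prefix spells out):

```python
def context_token_prefix(language: str, task: str, timestamps: bool = False) -> str:
    """Build the decoder context-token prefix in the order Whisper expects:
    <|startoftranscript|>, language token, task token, optional <|notimestamps|>."""
    assert task in ("transcribe", "translate")
    tokens = ["<|startoftranscript|>", f"<|{language}|>", f"<|{task}|>"]
    if not timestamps:
        # without timestamp prediction, the <|notimestamps|> token is appended
        tokens.append("<|notimestamps|>")
    return " ".join(tokens)

# French audio -> English text (speech translation):
print(context_token_prefix("fr", "translate"))
# -> <|startoftranscript|> <|fr|> <|translate|> <|notimestamps|>
# French audio -> French text (speech recognition):
print(context_token_prefix("fr", "transcribe"))
# -> <|startoftranscript|> <|fr|> <|transcribe|> <|notimestamps|>
```

Note that for translation the language token still names the *source* language of the audio; the target language is always English.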
### French to English

```python
>>> from transformers import WhisperProcessor, WhisperForConditionalGeneration
>>> from datasets import Audio, load_dataset

>>> # load model and processor
>>> processor = WhisperProcessor.from_pretrained("openai/whisper-tiny")
>>> model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-tiny")
>>> forced_decoder_ids = processor.get_decoder_prompt_ids(language="french", task="translate")

>>> # load streaming dataset and read first audio sample
>>> ds = load_dataset("common_voice", "fr", split="test", streaming=True)
>>> ds = ds.cast_column("audio", Audio(sampling_rate=16_000))
>>> input_speech = next(iter(ds))["audio"]
>>> input_features = processor(input_speech["array"], sampling_rate=input_speech["sampling_rate"], return_tensors="pt").input_features

>>> # generate token ids
>>> predicted_ids = model.generate(input_features, forced_decoder_ids=forced_decoder_ids)
>>> # decode token ids to text
>>> transcription = processor.batch_decode(predicted_ids, skip_special_tokens=True)
[' A very interesting work, we will finally be given on this subject.']
```

## Evaluation

This code snippet shows how to evaluate Whisper Tiny on [LibriSpeech test-clean](https://huggingface.co/datasets/librispeech_asr):

```python
>>> from datasets import load_dataset
>>> from transformers import WhisperForConditionalGeneration, WhisperProcessor
>>> import torch
>>> from evaluate import load

>>> librispeech_test_clean = load_dataset("librispeech_asr", "clean", split="test")

>>> processor = WhisperProcessor.from_pretrained("openai/whisper-tiny")
>>> model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-tiny").to("cuda")

>>> def map_to_pred(batch):
>>>     audio = batch["audio"]
>>>     input_features = processor(audio["array"], sampling_rate=audio["sampling_rate"], return_tensors="pt").input_features
>>>     batch["reference"] = processor.tokenizer._normalize(batch['text'])
>>>
>>>     with torch.no_grad():
>>>         predicted_ids = model.generate(input_features.to("cuda"))[0]
>>>     transcription = processor.decode(predicted_ids)
>>>     batch["prediction"] = processor.tokenizer._normalize(transcription)
>>>     return batch

>>> result = librispeech_test_clean.map(map_to_pred)

>>> wer = load("wer")
>>> print(100 * wer.compute(references=result["reference"], predictions=result["prediction"]))
7.547098647858638
```

## Long-Form Transcription

The Whisper model is intrinsically designed to work on audio samples of up to 30s in duration. However, by using a chunking algorithm, it can be used to transcribe audio samples of arbitrary length. This is possible with the Transformers [`pipeline`](https://huggingface.co/docs/transformers/main_classes/pipelines#transformers.AutomaticSpeechRecognitionPipeline) method. Chunking is enabled by setting `chunk_length_s=30` when instantiating the pipeline. With chunking enabled, the pipeline can be run with batched inference. It can also be extended to predict sequence-level timestamps by passing `return_timestamps=True`:

```python
>>> import torch
>>> from transformers import pipeline
>>> from datasets import load_dataset

>>> device = "cuda:0" if torch.cuda.is_available() else "cpu"

>>> pipe = pipeline(
>>>   "automatic-speech-recognition",
>>>   model="openai/whisper-tiny",
>>>   chunk_length_s=30,
>>>   device=device,
>>> )

>>> ds = load_dataset("hf-internal-testing/librispeech_asr_dummy", "clean", split="validation")
>>> sample = ds[0]["audio"]

>>> prediction = pipe(sample.copy(), batch_size=8)["text"]
" Mr. Quilter is the apostle of the middle classes, and we are glad to welcome his gospel."

>>> # we can also return timestamps for the predictions
>>> prediction = pipe(sample.copy(), batch_size=8, return_timestamps=True)["chunks"]
[{'text': ' Mr. Quilter is the apostle of the middle classes and we are glad to welcome his gospel.',
  'timestamp': (0.0, 5.44)}]
```

Refer to the blog post [ASR Chunking](https://huggingface.co/blog/asr-chunking) for more details on the chunking algorithm.

## Fine-Tuning

The pre-trained Whisper model demonstrates a strong ability to generalise to different datasets and domains. However, its predictive capabilities can be improved further for certain languages and tasks through *fine-tuning*. The blog post [Fine-Tune Whisper with 🤗 Transformers](https://huggingface.co/blog/fine-tune-whisper) provides a step-by-step guide to fine-tuning the Whisper model with as little as 5 hours of labelled data.

### Evaluated Use

The primary intended users of these models are AI researchers studying robustness, generalization, capabilities, biases, and constraints of the current model. However, Whisper is also potentially quite useful as an ASR solution for developers, especially for English speech recognition. We recognize that once models are released, it is impossible to restrict access to only "intended" uses or to draw reasonable guidelines around what is or is not research.

The models are primarily trained and evaluated on ASR and speech translation to English tasks. They show strong ASR results in ~10 languages. They may exhibit additional capabilities, particularly if fine-tuned on certain tasks like voice activity detection, speaker classification, or speaker diarization, but have not been robustly evaluated in these areas. We strongly recommend that users perform robust evaluations of the models in a particular context and domain before deploying them.

In particular, we caution against using Whisper models to transcribe recordings of individuals taken without their consent or purporting to use these models for any kind of subjective classification. We recommend against use in high-risk domains like decision-making contexts, where flaws in accuracy can lead to pronounced flaws in outcomes.
The models are intended to transcribe and translate speech; use of the models for classification is not only unevaluated but also inappropriate, particularly for inferring human attributes.

## Training Data

The models are trained on 680,000 hours of audio and the corresponding transcripts collected from the internet. 65% of this data (or 438,000 hours) represents English-language audio and matched English transcripts, roughly 18% (or 126,000 hours) represents non-English audio and English transcripts, while the final 17% (or 117,000 hours) represents non-English audio and the corresponding transcript. This non-English data represents 98 different languages.

As discussed in [the accompanying paper](https://cdn.openai.com/papers/whisper.pdf), we see that performance on transcription in a given language is directly correlated with the amount of training data we employ in that language.

## Performance and Limitations

Our studies show that, compared to many existing ASR systems, the models exhibit improved robustness to accents, background noise, and technical language, as well as zero-shot translation from multiple languages into English, and that accuracy on speech recognition and translation is near state-of-the-art.

However, because the models are trained in a weakly supervised manner using large-scale noisy data, the predictions may include texts that are not actually spoken in the audio input (i.e. hallucination). We hypothesize that this happens because, given their general knowledge of language, the models combine trying to predict the next word in audio with trying to transcribe the audio itself.

Our models perform unevenly across languages, and we observe lower accuracy on low-resource and/or low-discoverability languages or languages where we have less training data.
The models also exhibit disparate performance on different accents and dialects of particular languages, which may include higher word error rates across speakers of different genders, races, ages, or other demographic criteria. Our full evaluation results are presented in [the paper accompanying this release](https://cdn.openai.com/papers/whisper.pdf).

In addition, the sequence-to-sequence architecture of the model makes it prone to generating repetitive texts, which can be mitigated to some degree by beam search and temperature scheduling but not perfectly. Further analysis of these limitations is provided in [the paper](https://cdn.openai.com/papers/whisper.pdf). It is likely that this behavior, including hallucinations, may be worse in lower-resource and/or lower-discoverability languages.

## Broader Implications

We anticipate that Whisper models' transcription capabilities may be used for improving accessibility tools. While Whisper models cannot be used for real-time transcription out of the box, their speed and size suggest that others may be able to build applications on top of them that allow for near-real-time speech recognition and translation. The real value of beneficial applications built on top of Whisper models suggests that the disparate performance of these models may have real economic implications.

There are also potential dual-use concerns that come with releasing Whisper. While we hope the technology will be used primarily for beneficial purposes, making ASR technology more accessible could enable more actors to build capable surveillance technologies or scale up existing surveillance efforts, as the speed and accuracy allow for affordable automatic transcription and translation of large volumes of audio communication. Moreover, these models may have some capabilities to recognize specific individuals out of the box, which in turn presents safety concerns related both to dual use and disparate performance.
In practice, we expect that the cost of transcription is not the limiting factor of scaling up surveillance projects.

### BibTeX entry and citation info

```bibtex
@misc{radford2022whisper,
  doi = {10.48550/ARXIV.2212.04356},
  url = {https://arxiv.org/abs/2212.04356},
  author = {Radford, Alec and Kim, Jong Wook and Xu, Tao and Brockman, Greg and McLeavey, Christine and Sutskever, Ilya},
  title = {Robust Speech Recognition via Large-Scale Weak Supervision},
  publisher = {arXiv},
  year = {2022},
  copyright = {arXiv.org perpetual, non-exclusive license}
}
```
19,750
[ [ -0.0143585205078125, -0.04266357421875, 0.0085601806640625, 0.03216552734375, -0.00887298583984375, -0.0106353759765625, -0.02447509765625, -0.04010009765625, 0.0162811279296875, 0.032684326171875, -0.06231689453125, -0.04156494140625, -0.05609130859375, -0.0003955364227294922, -0.039520263671875, 0.0751953125, 0.009429931640625, -0.00027561187744140625, 0.01256561279296875, -0.0084991455078125, -0.033111572265625, -0.0316162109375, -0.054107666015625, -0.0159759521484375, 0.0106964111328125, 0.0166168212890625, 0.0311737060546875, 0.0390625, 0.0080108642578125, 0.030029296875, -0.031646728515625, -0.0019197463989257812, -0.0301055908203125, -0.007785797119140625, 0.0263671875, -0.040557861328125, -0.04412841796875, 0.0054473876953125, 0.0545654296875, 0.03936767578125, -0.0183563232421875, 0.0316162109375, 0.0191650390625, 0.0238037109375, -0.0168304443359375, 0.0220184326171875, -0.046417236328125, -0.0104522705078125, -0.0208282470703125, -0.0007224082946777344, -0.023834228515625, -0.0199737548828125, 0.038330078125, -0.040679931640625, 0.0222320556640625, 0.003566741943359375, 0.07427978515625, 0.0168914794921875, -0.01236724853515625, -0.0290679931640625, -0.052490234375, 0.0771484375, -0.06451416015625, 0.034423828125, 0.036529541015625, 0.01361846923828125, -0.004535675048828125, -0.06610107421875, -0.052642822265625, -0.00174713134765625, -0.0110015869140625, 0.018768310546875, -0.03533935546875, -0.00023126602172851562, 0.01763916015625, 0.0202178955078125, -0.035614013671875, 0.001125335693359375, -0.04791259765625, -0.052215576171875, 0.0421142578125, 0.0015430450439453125, 0.0264892578125, -0.0227508544921875, -0.012939453125, -0.0234375, -0.0241241455078125, 0.034210205078125, 0.025604248046875, 0.038330078125, -0.050750732421875, 0.03057861328125, -0.0087432861328125, 0.04754638671875, 0.01334381103515625, -0.049835205078125, 0.04693603515625, -0.01715087890625, -0.01317596435546875, 0.0269775390625, 0.07525634765625, 0.0194244384765625, 
0.00983428955078125, 0.00457763671875, -0.01708984375, 0.00927734375, -0.00798797607421875, -0.0552978515625, 0.00295257568359375, 0.035736083984375, -0.04315185546875, -0.02581787109375, -0.0196380615234375, -0.03851318359375, 0.0175018310546875, -0.01715087890625, 0.053009033203125, -0.041839599609375, -0.0247344970703125, 0.0096588134765625, -0.0276641845703125, 0.01617431640625, 0.0023593902587890625, -0.0631103515625, 0.0281829833984375, 0.03045654296875, 0.068603515625, 0.00637054443359375, -0.048187255859375, -0.043304443359375, 0.005649566650390625, 0.0042724609375, 0.03485107421875, -0.0179901123046875, -0.0428466796875, -0.0079193115234375, 0.01319122314453125, -0.0304412841796875, -0.0384521484375, 0.0511474609375, -0.0075531005859375, 0.034454345703125, -0.003448486328125, -0.034637451171875, -0.0180511474609375, -0.01285552978515625, -0.033935546875, 0.0701904296875, 0.009918212890625, -0.053558349609375, 0.00905609130859375, -0.041015625, -0.037872314453125, -0.013214111328125, 0.0166015625, -0.032379150390625, 0.0001862049102783203, 0.0335693359375, 0.033966064453125, -0.0087738037109375, 0.004486083984375, 0.00730133056640625, -0.030731201171875, 0.025665283203125, -0.033782958984375, 0.07489013671875, 0.01123809814453125, -0.028228759765625, 0.013916015625, -0.058135986328125, 0.006229400634765625, 0.005558013916015625, -0.01947021484375, 0.00539398193359375, -0.0008068084716796875, 0.019378662109375, 0.0086517333984375, 0.013824462890625, -0.051300048828125, -0.0039520263671875, -0.052703857421875, 0.06585693359375, 0.042572021484375, -0.0019893646240234375, 0.029571533203125, -0.041595458984375, 0.0221710205078125, 0.0098876953125, 0.0257720947265625, -0.0164031982421875, -0.048126220703125, -0.0628662109375, -0.0289306640625, 0.035064697265625, 0.058441162109375, -0.034515380859375, 0.040863037109375, -0.02325439453125, -0.044525146484375, -0.09600830078125, -0.004360198974609375, 0.041900634765625, 0.04644775390625, 0.04833984375, 
-0.0033245086669921875, -0.049468994140625, -0.059295654296875, -0.010986328125, -0.0227813720703125, -0.0137176513671875, 0.0269775390625, 0.0285797119140625, -0.0306549072265625, 0.053131103515625, -0.0305633544921875, -0.042999267578125, -0.026123046875, 0.00439453125, 0.033721923828125, 0.0477294921875, 0.02569580078125, -0.0555419921875, -0.0303192138671875, -0.014923095703125, -0.039794921875, -0.01123046875, -0.00848388671875, 0.00007098913192749023, 0.0138397216796875, 0.032257080078125, -0.053009033203125, 0.034332275390625, 0.05029296875, -0.0142669677734375, 0.0462646484375, 0.00457000732421875, -0.004238128662109375, -0.0870361328125, 0.0009012222290039062, -0.0183258056640625, -0.0130767822265625, -0.053009033203125, -0.0177001953125, -0.005832672119140625, -0.00753021240234375, -0.044525146484375, 0.04766845703125, -0.0250091552734375, 0.004459381103515625, -0.00433349609375, 0.01157379150390625, -0.0038776397705078125, 0.047760009765625, 0.0180511474609375, 0.0518798828125, 0.0601806640625, -0.042083740234375, 0.01519775390625, 0.044525146484375, -0.019561767578125, 0.0212249755859375, -0.07080078125, 0.00946807861328125, 0.007720947265625, 0.01055145263671875, -0.06610107421875, -0.00814056396484375, 0.006870269775390625, -0.070556640625, 0.03173828125, -0.02734375, -0.0247344970703125, -0.040130615234375, -0.008514404296875, 0.006580352783203125, 0.064453125, -0.037384033203125, 0.051910400390625, 0.032379150390625, -0.01776123046875, -0.038726806640625, -0.054412841796875, -0.007228851318359375, -0.01062774658203125, -0.057464599609375, 0.0364990234375, -0.0016326904296875, 0.004146575927734375, -0.006622314453125, -0.005084991455078125, 0.00921630859375, -0.0162200927734375, 0.03466796875, 0.02960205078125, -0.007205963134765625, -0.0190887451171875, 0.018310546875, -0.01904296875, -0.0009236335754394531, -0.020782470703125, 0.0482177734375, -0.0201263427734375, -0.0003402233123779297, -0.0577392578125, 0.0279541015625, 0.044769287109375, 
-0.02325439453125, 0.049224853515625, 0.05670166015625, -0.0215301513671875, -0.0135498046875, -0.0445556640625, -0.0157928466796875, -0.040069580078125, 0.01348876953125, -0.036712646484375, -0.0606689453125, 0.058441162109375, 0.0174560546875, 0.0104522705078125, 0.049896240234375, 0.039642333984375, -0.012603759765625, 0.07989501953125, 0.039459228515625, -0.019195556640625, 0.0197906494140625, -0.051239013671875, -0.006649017333984375, -0.07440185546875, -0.02923583984375, -0.041351318359375, -0.015625, -0.034027099609375, -0.020782470703125, 0.03472900390625, 0.0137481689453125, -0.00011044740676879883, 0.0391845703125, -0.052398681640625, 0.0015821456909179688, 0.050079345703125, 0.00021445751190185547, 0.0053863525390625, -0.00145721435546875, -0.021697998046875, -0.0031566619873046875, -0.039215087890625, -0.03033447265625, 0.07257080078125, 0.034332275390625, 0.0362548828125, -0.0016002655029296875, 0.054534912109375, 0.0007047653198242188, 0.0016069412231445312, -0.06158447265625, 0.0382080078125, -0.01043701171875, -0.03839111328125, -0.0299530029296875, -0.0201263427734375, -0.0631103515625, 0.0140838623046875, -0.011749267578125, -0.05670166015625, 0.0106658935546875, 0.0008029937744140625, -0.0251007080078125, 0.013824462890625, -0.05517578125, 0.0496826171875, 0.01285552978515625, 0.01064300537109375, 0.00138092041015625, -0.055206298828125, 0.01222991943359375, 0.006061553955078125, 0.01027679443359375, -0.0048675537109375, 0.00986480712890625, 0.07769775390625, -0.0394287109375, 0.07232666015625, -0.0220184326171875, 0.003376007080078125, 0.03411865234375, -0.00637054443359375, 0.028289794921875, -0.01355743408203125, -0.006977081298828125, 0.037353515625, 0.028778076171875, -0.020660400390625, -0.0190582275390625, 0.04107666015625, -0.07965087890625, -0.027801513671875, -0.019287109375, -0.0223846435546875, -0.00714111328125, 0.0197906494140625, 0.06787109375, 0.054107666015625, -0.01049041748046875, -0.0023670196533203125, 0.030731201171875, 
-0.0162353515625, 0.04254150390625, 0.04803466796875, -0.015533447265625, -0.035888671875, 0.069091796875, 0.021331787109375, 0.0166473388671875, 0.0200347900390625, 0.02740478515625, -0.032958984375, -0.049102783203125, -0.04266357421875, 0.02484130859375, -0.038177490234375, -0.01239776611328125, -0.068115234375, -0.040985107421875, -0.051666259765625, 0.0027370452880859375, -0.0298309326171875, -0.0224456787109375, -0.037353515625, 0.00930023193359375, 0.03961181640625, 0.0307159423828125, -0.0009241104125976562, 0.043304443359375, -0.07513427734375, 0.033172607421875, 0.0245513916015625, 0.005336761474609375, 0.001453399658203125, -0.0770263671875, -0.00702667236328125, 0.0167694091796875, -0.01473236083984375, -0.0540771484375, 0.04107666015625, 0.027069091796875, 0.040191650390625, 0.0199127197265625, 0.0007052421569824219, 0.0606689453125, -0.05633544921875, 0.06414794921875, 0.011505126953125, -0.0943603515625, 0.056640625, -0.024566650390625, 0.0246124267578125, 0.030517578125, 0.023956298828125, -0.052947998046875, -0.03460693359375, -0.04833984375, -0.046875, 0.06451416015625, 0.02813720703125, 0.01126861572265625, 0.00823211669921875, 0.020599365234375, 0.005413055419921875, 0.01050567626953125, -0.0357666015625, -0.0325927734375, -0.032623291015625, -0.0198822021484375, -0.01244354248046875, -0.011749267578125, -0.00440216064453125, -0.040313720703125, 0.0556640625, -0.00441741943359375, 0.044036865234375, 0.030426025390625, -0.00445556640625, -0.002758026123046875, 0.007396697998046875, 0.04345703125, 0.0192718505859375, -0.01412200927734375, -0.0260467529296875, 0.0241241455078125, -0.0592041015625, 0.0002853870391845703, 0.0185546875, -0.0234375, 0.01338958740234375, 0.058929443359375, 0.0889892578125, 0.017425537109375, -0.035552978515625, 0.054931640625, -0.01058197021484375, -0.0282135009765625, -0.040740966796875, 0.00356292724609375, 0.0212554931640625, 0.01552581787109375, 0.0245513916015625, 0.01139068603515625, 0.005649566650390625, 
-0.0360107421875, 0.005374908447265625, 0.0200653076171875, -0.03350830078125, -0.040191650390625, 0.061370849609375, 0.01125335693359375, -0.03753662109375, 0.052215576171875, 0.00630950927734375, -0.054931640625, 0.03564453125, 0.051239013671875, 0.07568359375, -0.035400390625, 0.00209808349609375, 0.032196044921875, 0.0204620361328125, -0.004955291748046875, 0.03887939453125, -0.009613037109375, -0.05743408203125, -0.032135009765625, -0.07489013671875, -0.0184478759765625, 0.01129150390625, -0.06890869140625, 0.023101806640625, -0.01788330078125, -0.0222625732421875, 0.023712158203125, 0.0019378662109375, -0.0601806640625, 0.00933074951171875, 0.006877899169921875, 0.0791015625, -0.056427001953125, 0.07965087890625, 0.019500732421875, -0.019775390625, -0.0819091796875, 0.0033397674560546875, 0.004543304443359375, -0.078369140625, 0.03155517578125, 0.02484130859375, -0.0160980224609375, 0.01450347900390625, -0.040863037109375, -0.06353759765625, 0.07342529296875, 0.00934600830078125, -0.052337646484375, -0.0087738037109375, -0.00366973876953125, 0.0394287109375, -0.02392578125, 0.00923919677734375, 0.05657958984375, 0.031280517578125, 0.00528717041015625, -0.103759765625, -0.0070953369140625, -0.01959228515625, -0.01094818115234375, -0.000598907470703125, -0.05419921875, 0.06280517578125, -0.02606201171875, -0.0203704833984375, 0.02166748046875, 0.05047607421875, 0.01529693603515625, 0.0164947509765625, 0.04608154296875, 0.0357666015625, 0.052947998046875, -0.01340484619140625, 0.07470703125, -0.01885986328125, 0.0110321044921875, 0.06732177734375, -0.0017118453979492188, 0.0850830078125, 0.02288818359375, -0.02813720703125, 0.0428466796875, 0.0287017822265625, 0.00008702278137207031, 0.042236328125, -0.00902557373046875, -0.0209503173828125, 0.00800323486328125, -0.0037136077880859375, -0.03131103515625, 0.057464599609375, 0.031890869140625, -0.019927978515625, 0.0250701904296875, 0.02447509765625, 0.006786346435546875, -0.01094818115234375, -0.0213623046875, 
0.0721435546875, 0.012359619140625, -0.04132080078125, 0.06439208984375, 0.0034885406494140625, 0.07330322265625, -0.061859130859375, 0.0181732177734375, 0.0028476715087890625, 0.0131988525390625, -0.01305389404296875, -0.047607421875, 0.0257720947265625, -0.01039886474609375, -0.025238037109375, -0.0142364501953125, 0.0435791015625, -0.054534912109375, -0.039337158203125, 0.04083251953125, 0.027587890625, 0.0233154296875, -0.008087158203125, -0.06634521484375, 0.0300445556640625, 0.0166473388671875, -0.017486572265625, 0.0135498046875, 0.014068603515625, 0.018646240234375, 0.047088623046875, 0.06304931640625, 0.03143310546875, 0.011505126953125, 0.01212310791015625, 0.05999755859375, -0.0477294921875, -0.052490234375, -0.051177978515625, 0.035675048828125, 0.0042877197265625, -0.031890869140625, 0.0604248046875, 0.0360107421875, 0.0517578125, -0.0018396377563476562, 0.0560302734375, 0.0036449432373046875, 0.070556640625, -0.041168212890625, 0.0634765625, -0.033294677734375, 0.0015478134155273438, -0.024169921875, -0.053802490234375, 0.00478363037109375, 0.04351806640625, -0.0059051513671875, -0.00927734375, 0.028472900390625, 0.06707763671875, 0.0040435791015625, 0.0131378173828125, 0.01030731201171875, 0.0307769775390625, 0.01483917236328125, 0.040863037109375, 0.043304443359375, -0.05841064453125, 0.050018310546875, -0.038330078125, -0.01727294921875, 0.0021514892578125, -0.04364013671875, -0.07379150390625, -0.061614990234375, -0.0202789306640625, -0.0419921875, -0.0199127197265625, 0.059417724609375, 0.0667724609375, -0.0645751953125, -0.0237579345703125, 0.0227203369140625, -0.0018777847290039062, -0.0308990478515625, -0.0186920166015625, 0.042724609375, -0.0012416839599609375, -0.06683349609375, 0.0478515625, 0.0028476715087890625, 0.0299530029296875, -0.01268768310546875, -0.0165557861328125, 0.0043182373046875, 0.0078125, 0.04095458984375, 0.0224456787109375, -0.06427001953125, -0.01210784912109375, 0.00624847412109375, 0.0017404556274414062, 
-0.00240325927734375, 0.031951904296875, -0.05364990234375, 0.0261077880859375, 0.0269927978515625, 0.00876617431640625, 0.059661865234375, -0.023162841796875, 0.0259857177734375, -0.058746337890625, 0.03533935546875, 0.0153961181640625, 0.0247344970703125, 0.025726318359375, -0.021759033203125, 0.01125335693359375, 0.0228424072265625, -0.040618896484375, -0.07769775390625, -0.0097503662109375, -0.08306884765625, -0.0102081298828125, 0.0740966796875, 0.0029850006103515625, -0.025726318359375, -0.00695037841796875, -0.0250701904296875, 0.03131103515625, -0.035400390625, 0.02325439453125, 0.041748046875, 0.0036945343017578125, -0.0032196044921875, -0.04388427734375, 0.0562744140625, 0.01568603515625, -0.0164947509765625, -0.00225067138671875, 0.00345611572265625, 0.045379638671875, 0.02044677734375, 0.06451416015625, -0.0147857666015625, 0.01291656494140625, 0.0115509033203125, 0.0100860595703125, -0.0079803466796875, -0.01558685302734375, -0.035736083984375, -0.003215789794921875, -0.0241241455078125, -0.032806396484375 ] ]
hpcai-tech/Colossal-LLaMA-2-7b-base
2023-10-10T06:21:00.000Z
[ "transformers", "pytorch", "llama", "text-generation", "zh", "en", "arxiv:2307.09288", "license:llama2", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
text-generation
hpcai-tech
null
null
hpcai-tech/Colossal-LLaMA-2-7b-base
67
188,722
transformers
2023-09-18T07:51:31
---
license: llama2
language:
- zh
- en
---
<!-- markdownlint-disable first-line-h1 -->
<!-- markdownlint-disable html -->
<div align="center">
<h1>
Colossal-LLaMA-2-7B
</h1>
</div>
<div align="center">
🎉 We released Colossal-LLaMA-2-7B-base, based on LLaMA-2!
</div>
<div align="center">

|<a href="https://github.com/hpcaitech/ColossalAI/tree/main/applications/Colossal-LLaMA-2" target="_blank">🔥 GitHub </a> | <a href="https://modelscope.cn/models/colossalai/Colossal-LLaMA-2-7b-base/summary" target="_blank">👾 Modelscope</a>| <a href="https://github.com/hpcaitech/public_assets/tree/main/colossalai/contact/slack" target="_blank">😊 Slack</a>| <a href="https://raw.githubusercontent.com/hpcaitech/public_assets/main/colossalai/img/WeChat.png" target="_blank">💬 WeChat</a>|

</div>
<div align="center">
<h1>
<img src="https://github.com/hpcaitech/public_assets/blob/main/applications/colossal-llama-2/colossalllam2.jpg?raw=true" width=800/>
</h1>
</div>

# Table of Contents
- [Model Introduction](#model-introduction)
- [Usage](#usage)
- [Performance Evaluation](#performance-evaluation)
- [Technical Insights](#technical-insights)
  - [Data](#data)
  - [Tokenizer](#tokenizer)
  - [Training Logs](#training-logs)
  - [Training Strategy](#training-strategy)
    - [Multi-stage Training](#multi-stage-training)
    - [Bucket-based Training](#bucket-based-training)
- [Limitations](#limitations)
- [Citations](#citations)

# Model Introduction
The [Colossal-AI](https://github.com/hpcaitech/ColossalAI) team has introduced the **open-source** model **Colossal-LLaMA-2-7B-base**. This model, a derivation of LLaMA-2, has undergone continual pre-training on approximately 8.5 billion tokens over 15 hours with 64 A800 GPUs. At a cost of **less than $1,000**, you can achieve results **similar to those that cost millions of dollars to pretrain from scratch**.
It is licensed under the LLaMA-2 license and the [Apache 2.0 License](https://github.com/hpcaitech/ColossalAI/blob/main/LICENSE) **without any additional commercial use restrictions**. This solution can also be used to build models with domain-specific knowledge or for specific tasks.

Colossal-LLaMA-2-7B-base accommodates both Chinese and English and features a context window of 4096 tokens. It has exhibited exceptional performance when benchmarked against models of equivalent scale on standard Chinese and English evaluations, including C-Eval and MMLU, among others.

# Usage
To load the Colossal-LLaMA-2-7B-base model with Transformers, use the following code:
```Python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model and tokenizer from the Hugging Face Hub
model = AutoModelForCausalLM.from_pretrained("hpcai-tech/Colossal-LLaMA-2-7b-base", device_map="auto", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("hpcai-tech/Colossal-LLaMA-2-7b-base", trust_remote_code=True)

# Generate a continuation of a Chinese prompt
prompt = "离离原上草,"
inputs = tokenizer(prompt, return_tensors='pt')
inputs = inputs.to('cuda:0')
pred = model.generate(**inputs,
                      max_new_tokens=256,
                      do_sample=True,
                      top_k=50,
                      top_p=0.95,
                      num_return_sequences=1)
# Strip the prompt from the decoded output before printing
print(tokenizer.decode(pred.cpu()[0], skip_special_tokens=True)[len(prompt):])
```

# Performance Evaluation
We conducted a comprehensive evaluation on 4 datasets and compared our Colossal-LLaMA-2-7b-base model against a range of other models.

* We use 5-shot for MMLU and calculate scores based on the logits of the first predicted token.
* We use 5-shot for CMMLU and calculate scores based on the logits of the first predicted token.
* We use 5-shot for AGIEval and only calculate scores for 4-choice questions, using a combined metric of exact match and the logits of the first predicted token. The model receives the score if either the exact match or the first-token logits are correct.
* We use 0-shot for GAOKAO-Bench and only calculate scores for 4-choice questions, based on the logits of the first predicted token.
* The generation config for all datasets is greedy search.
* We also provide CEval scores, taken from its latest leaderboard or from the official repository of each model.

| | Backbone | Tokens Consumed | | MMLU | CMMLU | AGIEval | GAOKAO | CEval |
| :----------------------------: | :--------: | :-------------: | :------------------: | :-----------: | :-----: | :----: | :----: | :------------------------------: |
| | | - | | 5-shot | 5-shot | 5-shot | 0-shot | 5-shot |
| Baichuan-7B | - | 1.2T | | 42.32 (42.30) | 44.53 (44.02) | 38.72 | 36.74 | 42.80 |
| Baichuan-13B-Base | - | 1.4T | | 50.51 (51.60) | 55.73 (55.30) | 47.20 | 51.41 | 53.60 |
| Baichuan2-7B-Base | - | 2.6T | | 46.97 (54.16) | 57.67 (57.07) | 45.76 | 52.60 | 54.00 |
| Baichuan2-13B-Base | - | 2.6T | | 54.84 (59.17) | 62.62 (61.97) | 52.08 | 58.25 | 58.10 |
| ChatGLM-6B | - | 1.0T | | 39.67 (40.63) | 41.17 (-) | 40.10 | 36.53 | 38.90 |
| ChatGLM2-6B | - | 1.4T | | 44.74 (45.46) | 49.40 (-) | 46.36 | 45.49 | 51.70 |
| InternLM-7B | - | 1.6T | | 46.70 (51.00) | 52.00 (-) | 44.77 | 61.64 | 52.80 |
| Qwen-7B | - | 2.2T | | 54.29 (56.70) | 56.03 (58.80) | 52.47 | 56.42 | 59.60 |
| | | | | | | | | |
| Llama-2-7B | - | 2.0T | | 44.47 (45.30) | 32.97 (-) | 32.60 | 25.46 | - |
| Linly-AI/Chinese-LLaMA-2-7B-hf | Llama-2-7B | 1.0T | | 37.43 | 29.92 | 32.00 | 27.57 | - |
| wenge-research/yayi-7b-llama2 | Llama-2-7B | - | | 38.56 | 31.52 | 30.99 | 25.95 | - |
| ziqingyang/chinese-llama-2-7b | Llama-2-7B | - | | 33.86 | 34.69 | 34.52 | 25.18 | 34.2 |
| TigerResearch/tigerbot-7b-base | Llama-2-7B | 0.3T | | 43.73 | 42.04 | 37.64 | 30.61 | - |
| LinkSoul/Chinese-Llama-2-7b | Llama-2-7B | - | | 48.41 | 38.31 | 38.45 | 27.72 | - |
| FlagAlpha/Atom-7B | Llama-2-7B | 0.1T | | 49.96 | 41.10 | 39.83 | 33.00 | - |
| IDEA-CCNL/Ziya-LLaMA-13B-v1.1 | Llama-13B | 0.11T | | 50.25 | 40.99 | 40.04 | 30.54 | - |
| | | | | | | | | |
| **Colossal-LLaMA-2-7b-base** | Llama-2-7B | **0.0085T** | | 53.06 | 49.89 | 51.48 | 58.82 | 50.2 |

> The score in parentheses corresponds to the score in the official repository of the model.
>
> We use zero-shot for the ChatGLM models.
>
> Qwen-7B is now inaccessible on the Hugging Face Hub; we are using the latest version available before it was made inaccessible. For the MMLU dataset only, the Qwen-7B prompt is "xxx Answer:" (with no space after ":"), and we calculate the logits over " A", " B", " C" and " D". Qwen-7B tends to be much more deterministic than other models; for example, the logit for " A" can be `-inf`, making its softmax probability exactly `0`.
>
> For other models and other datasets, we calculate logits over "A", "B", "C" and "D".

❗️ For more details on the evaluation methods and on reproducing the results, please refer to [ColossalEval](https://github.com/Camille7777/ColossalAI_yt/tree/main/applications/ColossalEval).

# Technical Insights
To enhance LLaMA-2's capabilities for understanding and generating Chinese content, the [Colossal-AI](https://github.com/hpcaitech/ColossalAI) team proposes continuing the pre-training of the LLaMA-2 model on both Chinese and English corpora.

## Data
Large language models such as LLaMA-2 have been trained on a heterogeneous blend of high-quality datasets, yielding promising results. Enhancing LLaMA-2's performance on Chinese corpora while preserving its proficiency in English hinges on two pivotal factors: the composition of the dataset, which must encompass both English and Chinese content, and the quality of each constituent dataset. The following figure shows the data processing pipeline used for Colossal-LLaMA-2.
<p id="Colossal-LLaMA-2-data-processing-pipeline" align="center">
<img src="https://github.com/hpcaitech/public_assets/blob/main/applications/colossal-llama-2/data_processing_pipeline.jpeg?raw=true" width=800/>
</p>

❗️**Important**: We will open-source our data-processing toolkit soon, stay tuned!

## Tokenizer
The original LLaMA-2 vocabulary comprises fewer than a thousand Chinese characters and thus proves inadequate for encoding Chinese texts effectively. In addition, the use of byte tokens makes it difficult for transformer encoders to capture the semantic nuances of Chinese characters.

To address these issues, we extend the LLaMA-2 vocabulary from 32,000 to 69,104 tokens. To adapt the LLaMA-2 model to the Colossal-LLaMA-2 tokenizer, we initialize each new word embedding with the mean of the original LLaMA-2 embeddings and append these new rows to the end of the original embedding matrices.

Advantages of extending the vocabulary:
* Improve the compression rate of string sequence encoding.
* Enhance the integrity of information.
* Enable encoded sequences to contain more valuable information, thereby theoretically enhancing the ability for chapter-level encoding.

Disadvantages of a large vocabulary under low-resource settings:
* Given the limited training dataset, many of the newly added tokens may never be effectively learned.
* Excessive vocabulary expansion increases the number of embedding-related parameters, resulting in higher memory usage, which in turn reduces training efficiency.

To balance both sides, we finally construct our vocabulary with size 69,104. The following table presents a comparison of various models at the 7B level.
| Model | Vocabulary Size | Compression Rate | Average Length of Samples (token-level) |
| :-----------: | :---------: | :----: | :----: |
| **Colossal-LLaMA-2** | **69104** | **0.659** | **73.682** |
| LLaMA-2-7B | 32000 | 1.205 | 134.689 |
| Atom-7B | 65000 | 0.634 | 70.915 |
| Baichuan-7B | 64000 | 0.678 | 75.857 |
| Baichuan2-7B-base | 125696 | 0.570 | 63.761 |
| Chatglm2-6B | 64789 | 0.645 | 72.178 |
| InternLM-7B | 103168 | 0.566 | 63.349 |
| Qwen-7B | 151643 | 0.578 | 64.703 |
| Tigerbot-7B-base | 60515 | 0.630 | 70.515 |
| Yayi-7B-llama2 | 32005 | 1.214 | 135.689 |
| Chinese-llama-2-7b | 55296 | 0.668 | 74.690 |
| Chinese-Falcon-7B | 90046 | 0.669 | 74.858 |
| LinkSoul-Chinese-Llama-2-7b | 40076 | 0.958 | 107.089 |
| Ziya-LLaMA-13B-v1.1 | 39410 | 0.958 | 107.074 |

## Training Logs
Here are the training logs for our experiment:

<p id="Colossal-LLaMA-2-Multi-stage-training" align="center">
<img src="https://github.com/hpcaitech/public_assets/blob/main/applications/colossal-llama-2/trainingLossBySteps.jpeg?raw=true" width=600/>
</p>
<p id="Colossal-LLaMA-2-Multi-stage-training" align="center">
<img src="https://github.com/hpcaitech/public_assets/blob/main/applications/colossal-llama-2/trainingLossByTokens.jpeg?raw=true" width=600/>
</p>

## Training Strategy
### Multi-stage Training
To enhance the model's performance and harness the full potential of the original LLaMA-2, we have developed a multi-stage training strategy designed to systematically unlock the model's capabilities over a series of stages. We divide the training process into three stages:
* Large-scale pre-training stage (conducted by LLaMA-2): This initial stage establishes the model's foundational capabilities from the ground up. It necessitates a substantial dataset comprising no less than 1 trillion tokens.
* Chinese knowledge injection stage: In this stage, we introduce Chinese knowledge into the model.
It requires access to a high-quality dataset rich in comprehensive knowledge relevant to the Chinese language.
* Knowledge replay stage: Knowledge is replayed through a question-answering (QA) mechanism, covering both the Chinese and English domains.

After completing this multi-stage training process, the model exhibits notable improvements in performance across both English and Chinese benchmarks. The following figure illustrates the three stages of training Colossal-LLaMA-2.

<p id="Colossal-LLaMA-2-Multi-stage-training" align="center">
<img src="https://github.com/hpcaitech/public_assets/blob/main/applications/colossal-llama-2/multi-stage-training.png?raw=true" width=600/>
</p>

### Bucket-based Training
Our experiments have revealed that the distribution of the training dataset, as well as the ordering of various topic-related data points, significantly impacts the overall performance of the model, particularly in the context of continual pre-training of LLaMA-2. To achieve a more balanced distribution and to control the dataset's ordering, we divide each sub-dataset into discrete bins, which are then combined into individual data buckets, with one bin contributed by each sub-dataset.

For more details, please refer to our [Github](https://github.com/hpcaitech/ColossalAI/tree/main/applications/Colossal-LLaMA-2).

# Limitations
Colossal-LLaMA-2-7B is a derivation of LLaMA-2 and carries risks with use. Testing conducted to date has been performed exclusively in English and Chinese, and it cannot encompass all possible scenarios. As with other LLMs, the potential outputs of Colossal-LLaMA-2-7B-base cannot be predicted in advance, and in certain situations the model may generate responses that are inaccurate, biased, or otherwise toxic.
Consequently, before deploying any application powered by Colossal-LLaMA-2-7B-base, developers must perform safety testing and tune the model to meet the specific requirements of their applications.

# Citations
```bibtex
@article{bian2021colossal,
    title={Colossal-AI: A Unified Deep Learning System For Large-Scale Parallel Training},
    author={Bian, Zhengda and Liu, Hongxin and Wang, Boxiang and Huang, Haichen and Li, Yongbin and Wang, Chuanrui and Cui, Fan and You, Yang},
    journal={arXiv preprint arXiv:2110.14883},
    year={2021}
}
```
```bibtex
@misc{touvron2023llama,
    title={Llama 2: Open Foundation and Fine-Tuned Chat Models},
    author={Hugo Touvron and Louis Martin and Kevin Stone and Peter Albert and Amjad Almahairi and Yasmine Babaei and Nikolay Bashlykov and Soumya Batra and Prajjwal Bhargava and Shruti Bhosale and Dan Bikel and Lukas Blecher and Cristian Canton Ferrer and Moya Chen and Guillem Cucurull and David Esiobu and Jude Fernandes and Jeremy Fu and Wenyin Fu and Brian Fuller and Cynthia Gao and Vedanuj Goswami and Naman Goyal and Anthony Hartshorn and Saghar Hosseini and Rui Hou and Hakan Inan and Marcin Kardas and Viktor Kerkez and Madian Khabsa and Isabel Kloumann and Artem Korenev and Punit Singh Koura and Marie-Anne Lachaux and Thibaut Lavril and Jenya Lee and Diana Liskovich and Yinghai Lu and Yuning Mao and Xavier Martinet and Todor Mihaylov and Pushkar Mishra and Igor Molybog and Yixin Nie and Andrew Poulton and Jeremy Reizenstein and Rashi Rungta and Kalyan Saladi and Alan Schelten and Ruan Silva and Eric Michael Smith and Ranjan Subramanian and Xiaoqing Ellen Tan and Binh Tang and Ross Taylor and Adina Williams and Jian Xiang Kuan and Puxin Xu and Zheng Yan and Iliyan Zarov and Yuchen Zhang and Angela Fan and Melanie Kambadur and Sharan Narang and Aurelien Rodriguez and Robert Stojnic and Sergey Edunov and Thomas Scialom},
    year={2023},
    eprint={2307.09288},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```
```bibtex
@article{dao2023flashattention2,
    title={Flash{A}ttention-2: Faster Attention with Better Parallelism and Work Partitioning},
    author={Dao, Tri},
    year={2023}
}
```
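As an illustrative footnote to the Tokenizer section above: the mean-value initialization of the newly added word embeddings can be sketched as follows. This is a minimal NumPy sketch under assumed shapes — a random matrix stands in for the pretrained weights, and the real procedure operates on PyTorch tensors inside `transformers` — so it is not the project's actual training code.

```python
import numpy as np

def extend_embeddings(weight: np.ndarray, new_vocab_size: int) -> np.ndarray:
    """Append rows for new tokens, each initialized to the mean of the
    original embedding rows, and return the extended matrix."""
    old_vocab_size, hidden = weight.shape
    assert new_vocab_size >= old_vocab_size
    mean_row = weight.mean(axis=0, keepdims=True)                     # (1, hidden)
    new_rows = np.repeat(mean_row, new_vocab_size - old_vocab_size, axis=0)
    return np.concatenate([weight, new_rows], axis=0)                 # new rows go at the end

# The vocabulary sizes are the real ones (32,000 -> 69,104); the hidden size
# is shrunk here purely to keep the example lightweight.
rng = np.random.default_rng(0)
pretrained = rng.standard_normal((32_000, 8)).astype(np.float32)
extended = extend_embeddings(pretrained, 69_104)
print(extended.shape)  # (69104, 8)
```

Since the card speaks of appending rows to the embedding *matrices* (plural), the same procedure presumably applies to both the input embeddings and the output (LM head) matrix before continual pre-training resumes.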
16,966
[ [ -0.024658203125, -0.05694580078125, 0.019775390625, 0.0143585205078125, -0.03253173828125, 0.01197052001953125, 0.0013647079467773438, -0.037353515625, 0.03448486328125, 0.0081634521484375, -0.047943115234375, -0.0411376953125, -0.052978515625, 0.0087890625, -0.005184173583984375, 0.062042236328125, 0.007381439208984375, -0.00432586669921875, 0.014312744140625, -0.01021575927734375, -0.0360107421875, -0.0247955322265625, -0.04779052734375, -0.02227783203125, 0.0300750732421875, 0.0310211181640625, 0.06024169921875, 0.059173583984375, 0.045013427734375, 0.025634765625, -0.0208587646484375, 0.01419830322265625, -0.036224365234375, -0.02935791015625, 0.01861572265625, -0.0450439453125, -0.055877685546875, -0.00655364990234375, 0.048309326171875, 0.0224456787109375, -0.00617218017578125, 0.036590576171875, 0.01381683349609375, 0.03961181640625, -0.0291595458984375, 0.02886962890625, -0.033172607421875, 0.00704193115234375, -0.017852783203125, -0.00672149658203125, -0.01800537109375, -0.0291595458984375, -0.01480865478515625, -0.06304931640625, -0.004108428955078125, 0.00586700439453125, 0.10247802734375, 0.01456451416015625, -0.0268402099609375, -0.012298583984375, -0.0141448974609375, 0.06817626953125, -0.06829833984375, -0.0012149810791015625, 0.02777099609375, 0.02252197265625, -0.01439666748046875, -0.041839599609375, -0.047576904296875, -0.0165557861328125, -0.024169921875, 0.0224761962890625, -0.0193023681640625, -0.0254058837890625, 0.0213623046875, 0.03466796875, -0.0498046875, 0.0096588134765625, -0.031494140625, -0.007678985595703125, 0.0572509765625, 0.0287933349609375, 0.0223846435546875, -0.0211181640625, -0.0281524658203125, -0.01041412353515625, -0.047119140625, 0.0173492431640625, 0.0159759521484375, 0.0172882080078125, -0.03814697265625, 0.048309326171875, -0.0234222412109375, 0.0285186767578125, 0.0300750732421875, -0.039459228515625, 0.043487548828125, -0.0165863037109375, -0.03240966796875, -0.00739288330078125, 0.0697021484375, 
0.040985107421875, -0.0066680908203125, 0.01087188720703125, -0.00173187255859375, 0.004383087158203125, -0.01898193359375, -0.0670166015625, -0.01030731201171875, 0.029022216796875, -0.037994384765625, -0.04986572265625, -0.00263214111328125, -0.04583740234375, -0.007354736328125, -0.00820159912109375, 0.038543701171875, -0.0219573974609375, -0.03131103515625, -0.00299072265625, 0.004001617431640625, 0.045196533203125, 0.024444580078125, -0.06658935546875, 0.016265869140625, 0.034698486328125, 0.06353759765625, -0.0038623809814453125, -0.0231170654296875, -0.0228271484375, 0.006633758544921875, -0.031768798828125, 0.05322265625, -0.015716552734375, -0.041015625, -0.0206451416015625, 0.0242156982421875, -0.01187896728515625, -0.03143310546875, 0.03607177734375, -0.019256591796875, 0.008148193359375, -0.03814697265625, -0.0301971435546875, -0.0265045166015625, 0.042999267578125, -0.04827880859375, 0.07794189453125, 0.0037994384765625, -0.064697265625, 0.0223388671875, -0.03857421875, -0.00238037109375, -0.0007915496826171875, 0.0011844635009765625, -0.041473388671875, -0.021392822265625, 0.025665283203125, 0.0291900634765625, -0.037811279296875, 0.01629638671875, -0.02069091796875, -0.02874755859375, 0.003810882568359375, -0.01454925537109375, 0.0869140625, 0.021240234375, -0.043121337890625, 0.005992889404296875, -0.054534912109375, 0.005352020263671875, 0.048858642578125, -0.028717041015625, 0.024139404296875, -0.0210113525390625, -0.004058837890625, 0.01544952392578125, 0.035858154296875, -0.0191802978515625, 0.0288238525390625, -0.0125579833984375, 0.035919189453125, 0.06787109375, -0.007289886474609375, 0.0187835693359375, -0.04937744140625, 0.037811279296875, 0.01227569580078125, 0.037567138671875, -0.00641632080078125, -0.047149658203125, -0.0709228515625, -0.024444580078125, 0.017913818359375, 0.04876708984375, -0.040771484375, 0.048858642578125, -0.00848388671875, -0.06353759765625, -0.021575927734375, 0.0020904541015625, 0.032867431640625, 0.03857421875, 
0.01971435546875, -0.020660400390625, -0.037689208984375, -0.0653076171875, 0.0206146240234375, -0.0189361572265625, 0.01383209228515625, 0.03961181640625, 0.06121826171875, -0.020050048828125, 0.04315185546875, -0.055511474609375, -0.0267791748046875, -0.0249176025390625, -0.018768310546875, 0.053436279296875, 0.045654296875, 0.05438232421875, -0.046234130859375, -0.040374755859375, 0.00757598876953125, -0.0745849609375, 0.00165557861328125, -0.00508880615234375, -0.0279693603515625, 0.0164947509765625, 0.0051422119140625, -0.057220458984375, 0.036224365234375, 0.044921875, -0.03369140625, 0.06805419921875, -0.00868988037109375, 0.01503753662109375, -0.08612060546875, 0.0257415771484375, -0.0164794921875, 0.004116058349609375, -0.033294677734375, 0.02294921875, 0.001262664794921875, 0.016571044921875, -0.046417236328125, 0.047760009765625, -0.03125, 0.004177093505859375, 0.0008897781372070312, 0.0090789794921875, 0.00765228271484375, 0.054840087890625, -0.0094146728515625, 0.0491943359375, 0.04931640625, -0.032806396484375, 0.02947998046875, 0.022125244140625, -0.021484375, 0.0194244384765625, -0.059661865234375, -0.0006742477416992188, 0.006992340087890625, 0.0265350341796875, -0.0819091796875, -0.0218658447265625, 0.0310211181640625, -0.052154541015625, 0.018829345703125, 0.00030803680419921875, -0.036376953125, -0.050689697265625, -0.046661376953125, 0.026611328125, 0.037017822265625, -0.036376953125, 0.03533935546875, 0.0264129638671875, 0.011199951171875, -0.056976318359375, -0.052490234375, -0.005352020263671875, -0.0245361328125, -0.048614501953125, 0.023284912109375, -0.0206146240234375, -0.005401611328125, -0.01396942138671875, 0.00028204917907714844, -0.00927734375, 0.01087188720703125, 0.02093505859375, 0.05322265625, -0.02569580078125, -0.0181884765625, -0.00988006591796875, -0.00894927978515625, -0.0031833648681640625, -0.0019588470458984375, 0.047149658203125, -0.026275634765625, -0.013671875, -0.047943115234375, -0.0019989013671875, 
0.040863037109375, -0.00794219970703125, 0.055419921875, 0.05419921875, -0.024688720703125, 0.023895263671875, -0.04620361328125, -0.00067138671875, -0.039459228515625, 0.01496124267578125, -0.03582763671875, -0.0576171875, 0.0572509765625, 0.0164642333984375, 0.01451873779296875, 0.061859130859375, 0.052764892578125, 0.0018768310546875, 0.07666015625, 0.033599853515625, -0.0249786376953125, 0.01122283935546875, -0.060882568359375, -0.01187896728515625, -0.07086181640625, -0.0360107421875, -0.029083251953125, -0.0218353271484375, -0.0479736328125, -0.0255584716796875, 0.016265869140625, 0.0078887939453125, -0.050048828125, 0.03643798828125, -0.053497314453125, 0.0295257568359375, 0.039276123046875, 0.020965576171875, 0.01444244384765625, -0.016082763671875, -0.0251007080078125, -0.001674652099609375, -0.0443115234375, -0.02606201171875, 0.08154296875, 0.0225067138671875, 0.038787841796875, 0.0256805419921875, 0.041656494140625, 0.0156097412109375, 0.024749755859375, -0.047515869140625, 0.03912353515625, 0.0078277587890625, -0.05877685546875, -0.0167236328125, -0.0204315185546875, -0.06585693359375, 0.03741455078125, -0.0218048095703125, -0.06451416015625, 0.004161834716796875, 0.00801849365234375, -0.03607177734375, 0.034027099609375, -0.047332763671875, 0.053985595703125, -0.022125244140625, -0.04168701171875, 0.00911712646484375, -0.04345703125, 0.052978515625, -0.00016379356384277344, 0.021728515625, -0.0191802978515625, 0.0061187744140625, 0.05889892578125, -0.053497314453125, 0.055023193359375, -0.00603485107421875, 0.0005359649658203125, 0.044891357421875, 0.004261016845703125, 0.044219970703125, 0.00972747802734375, -0.0018749237060546875, 0.013214111328125, -0.00569915771484375, -0.03875732421875, -0.013916015625, 0.058380126953125, -0.07440185546875, -0.06097412109375, -0.046875, -0.023529052734375, 0.0091400146484375, 0.02294921875, 0.03363037109375, 0.020477294921875, 0.002887725830078125, 0.016265869140625, 0.029510498046875, -0.0216064453125, 
0.04443359375, 0.0225067138671875, -0.0214080810546875, -0.04302978515625, 0.053802490234375, 0.01355743408203125, 0.01447296142578125, 0.025421142578125, 0.0258941650390625, -0.0287017822265625, -0.0147552490234375, -0.042083740234375, 0.03411865234375, -0.037200927734375, -0.02520751953125, -0.037994384765625, -0.02459716796875, -0.037139892578125, -0.020355224609375, -0.015533447265625, -0.0325927734375, -0.038238525390625, -0.01445770263671875, 0.029510498046875, 0.04345703125, -0.00848388671875, 0.0285797119140625, -0.048675537109375, 0.0113525390625, 0.0362548828125, 0.0026397705078125, 0.0204315185546875, -0.0653076171875, -0.004108428955078125, 0.003002166748046875, -0.045867919921875, -0.055755615234375, 0.04827880859375, -0.0019502639770507812, 0.035308837890625, 0.037200927734375, -0.016082763671875, 0.0699462890625, -0.0244293212890625, 0.07421875, 0.033203125, -0.06402587890625, 0.03753662109375, -0.039276123046875, 0.034332275390625, 0.025177001953125, 0.0243682861328125, -0.01617431640625, -0.020416259765625, -0.05615234375, -0.057586669921875, 0.0594482421875, 0.0289154052734375, -0.00396728515625, 0.01215362548828125, 0.00783538818359375, -0.0132293701171875, 0.004878997802734375, -0.0654296875, -0.039031982421875, -0.0218048095703125, 0.001811981201171875, -0.004161834716796875, -0.0214691162109375, 0.0008234977722167969, -0.043212890625, 0.05596923828125, -0.00136566162109375, 0.0212249755859375, 0.00647735595703125, -0.01282501220703125, -0.005252838134765625, -0.006450653076171875, 0.050140380859375, 0.047119140625, -0.020751953125, -0.0252532958984375, 0.038818359375, -0.05743408203125, 0.007434844970703125, 0.0003821849822998047, -0.0166473388671875, -0.006847381591796875, 0.0275115966796875, 0.046234130859375, 0.0232086181640625, -0.031494140625, 0.032958984375, 0.0025501251220703125, -0.0288848876953125, -0.035369873046875, 0.00734710693359375, 0.0163116455078125, 0.0248260498046875, 0.03753662109375, -0.017059326171875, 
0.005870819091796875, -0.041412353515625, 0.007518768310546875, 0.0228424072265625, -0.01226043701171875, -0.0153350830078125, 0.0726318359375, 0.00447845458984375, -0.0020580291748046875, 0.034271240234375, -0.01073455810546875, -0.03521728515625, 0.0826416015625, 0.03607177734375, 0.043792724609375, -0.0289459228515625, 0.01398468017578125, 0.06610107421875, 0.0228271484375, -0.0010919570922851562, 0.0176544189453125, 0.005710601806640625, -0.033599853515625, -0.01166534423828125, -0.054962158203125, -0.02130126953125, 0.0186614990234375, -0.03594970703125, 0.0335693359375, -0.03875732421875, -0.022918701171875, 0.00414276123046875, 0.0362548828125, -0.050689697265625, 0.0286102294921875, 0.0197601318359375, 0.067138671875, -0.050872802734375, 0.06463623046875, 0.03265380859375, -0.04559326171875, -0.08465576171875, -0.031982421875, 0.0261077880859375, -0.07989501953125, 0.047943115234375, 0.0218353271484375, 0.0134735107421875, -0.0012416839599609375, -0.048431396484375, -0.0894775390625, 0.1278076171875, 0.00850677490234375, -0.033477783203125, 0.0007219314575195312, 0.005779266357421875, 0.0255279541015625, -0.01319122314453125, 0.050689697265625, 0.04315185546875, 0.0404052734375, 0.0271148681640625, -0.07818603515625, 0.018829345703125, -0.03839111328125, -0.00940704345703125, -0.0081634521484375, -0.10308837890625, 0.08843994140625, -0.0231475830078125, -0.00402069091796875, 0.00908660888671875, 0.040496826171875, 0.0478515625, 0.002582550048828125, 0.033416748046875, 0.054779052734375, 0.048919677734375, -0.013580322265625, 0.066162109375, -0.020294189453125, 0.04876708984375, 0.060302734375, -0.005893707275390625, 0.057159423828125, 0.007198333740234375, -0.032257080078125, 0.037017822265625, 0.0621337890625, -0.019378662109375, 0.023223876953125, 0.005405426025390625, -0.00910186767578125, -0.01194000244140625, -0.0022258758544921875, -0.046722412109375, 0.035430908203125, 0.023681640625, -0.0256195068359375, -0.00794219970703125, -0.0299224853515625, 
0.030303955078125, -0.0305633544921875, -0.0235748291015625, 0.0400390625, 0.0089263916015625, -0.03765869140625, 0.07574462890625, -0.00901031494140625, 0.058563232421875, -0.0443115234375, 0.00045228004455566406, -0.03155517578125, 0.00943756103515625, -0.0234375, -0.05828857421875, -0.0005211830139160156, -0.0006823539733886719, 0.0007367134094238281, 0.021575927734375, 0.0404052734375, 0.0003745555877685547, -0.045379638671875, 0.0152435302734375, 0.023101806640625, 0.016204833984375, 0.01312255859375, -0.075927734375, 0.017974853515625, 0.0137481689453125, -0.042938232421875, 0.02752685546875, 0.0157012939453125, 0.006793975830078125, 0.056793212890625, 0.056488037109375, -0.017242431640625, 0.00830078125, -0.0110015869140625, 0.07818603515625, -0.05413818359375, -0.0281829833984375, -0.071044921875, 0.04864501953125, -0.0138092041015625, -0.040771484375, 0.0596923828125, 0.053131103515625, 0.03961181640625, -0.00815582275390625, 0.03857421875, -0.0203857421875, 0.01503753662109375, -0.0288238525390625, 0.06805419921875, -0.05242919921875, 0.00041866302490234375, -0.018707275390625, -0.058135986328125, -0.025177001953125, 0.044342041015625, -0.0116729736328125, -0.00365447998046875, 0.046661376953125, 0.06768798828125, 0.01035308837890625, -0.010528564453125, 0.02227783203125, 0.033447265625, 0.020965576171875, 0.0733642578125, 0.04736328125, -0.0570068359375, 0.05322265625, -0.040069580078125, -0.0234527587890625, -0.0347900390625, -0.051483154296875, -0.06158447265625, -0.031524658203125, -0.017913818359375, -0.0203704833984375, -0.01215362548828125, 0.051788330078125, 0.047210693359375, -0.053497314453125, -0.0309600830078125, 0.0187225341796875, 0.0038909912109375, -0.0080718994140625, -0.0185089111328125, 0.052886962890625, -0.0028934478759765625, -0.05633544921875, 0.0007891654968261719, 0.0039005279541015625, 0.00971221923828125, 0.0008082389831542969, -0.01453399658203125, -0.027435302734375, 0.0039825439453125, 0.04595947265625, 0.0263824462890625, 
-0.061920166015625, -0.014984130859375, 0.01399993896484375, -0.01812744140625, 0.0176849365234375, -0.0028781890869140625, -0.0421142578125, 0.01102447509765625, 0.027099609375, 0.01953125, 0.03973388671875, -0.0003879070281982422, -0.00046706199645996094, -0.0174407958984375, 0.0243072509765625, -0.00981903076171875, 0.038665771484375, 0.010467529296875, -0.03369140625, 0.054962158203125, 0.0256500244140625, -0.046112060546875, -0.06158447265625, -0.017364501953125, -0.093505859375, -0.01427459716796875, 0.08807373046875, -0.0134124755859375, -0.047119140625, 0.020599365234375, -0.029083251953125, 0.0182342529296875, -0.0276336669921875, 0.058563232421875, 0.050872802734375, 0.00222015380859375, -0.004604339599609375, -0.048095703125, 0.017730712890625, 0.0218048095703125, -0.0625, -0.025238037109375, 0.0225677490234375, 0.02459716796875, 0.0196380615234375, 0.055206298828125, -0.01617431640625, 0.03155517578125, 0.0177459716796875, 0.0030460357666015625, -0.0013828277587890625, 0.00769805908203125, -0.0019512176513671875, -0.002468109130859375, 0.0011234283447265625, -0.0294189453125 ] ]
MCG-NJU/videomae-base-finetuned-kinetics
2023-04-22T11:30:54.000Z
[ "transformers", "pytorch", "videomae", "video-classification", "vision", "arxiv:2203.12602", "arxiv:2111.06377", "license:cc-by-nc-4.0", "endpoints_compatible", "has_space", "region:us" ]
video-classification
MCG-NJU
null
null
MCG-NJU/videomae-base-finetuned-kinetics
10
185,265
transformers
2022-07-08T15:01:34
--- license: "cc-by-nc-4.0" tags: - vision - video-classification --- # VideoMAE (base-sized model, fine-tuned on Kinetics-400) VideoMAE model pre-trained for 1600 epochs in a self-supervised way and fine-tuned in a supervised way on Kinetics-400. It was introduced in the paper [VideoMAE: Masked Autoencoders are Data-Efficient Learners for Self-Supervised Video Pre-Training](https://arxiv.org/abs/2203.12602) by Tong et al. and first released in [this repository](https://github.com/MCG-NJU/VideoMAE). Disclaimer: The team releasing VideoMAE did not write a model card for this model, so this model card has been written by the Hugging Face team. ## Model description VideoMAE is an extension of [Masked Autoencoders (MAE)](https://arxiv.org/abs/2111.06377) to video. The architecture of the model is very similar to that of a standard Vision Transformer (ViT), with a decoder on top for predicting pixel values for masked patches. Videos are presented to the model as a sequence of fixed-size patches (resolution 16x16), which are linearly embedded. A [CLS] token is added to the beginning of the sequence for use in classification tasks, and fixed sine/cosine position embeddings are added before feeding the sequence to the layers of the Transformer encoder. By pre-training the model, it learns an inner representation of videos that can then be used to extract features useful for downstream tasks: for instance, if you have a dataset of labeled videos, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder. One typically places a linear layer on top of the [CLS] token, as the last hidden state of this token can be seen as a representation of an entire video. ## Intended uses & limitations You can use the raw model for video classification into one of the 400 possible Kinetics-400 labels. 
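As a rough illustration of the classification setup described above (a linear layer placed on top of the [CLS] token's last hidden state), here is a minimal sketch; the hidden size and label count match the base model and Kinetics-400, but the sequence length and the feature tensor are random stand-ins rather than real encoder output:

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the encoder's output: a batch of 2 videos,
# each a sequence of token embeddings whose first position is [CLS].
hidden_size, num_labels, seq_len = 768, 400, 1569  # seq_len is illustrative
last_hidden_state = torch.randn(2, seq_len, hidden_size)

# Linear classification head over the [CLS] token's final hidden state.
classifier = nn.Linear(hidden_size, num_labels)
logits = classifier(last_hidden_state[:, 0])  # shape: (2, 400)
print(logits.shape)
```

In a real fine-tuning run the random tensor above would be replaced by the pre-trained encoder's output, with only this head (or the whole network) trained on the labeled videos.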
### How to use Here is how to use this model to classify a video: ```python from transformers import VideoMAEImageProcessor, VideoMAEForVideoClassification import numpy as np import torch video = list(np.random.randn(16, 3, 224, 224)) processor = VideoMAEImageProcessor.from_pretrained("MCG-NJU/videomae-base-finetuned-kinetics") model = VideoMAEForVideoClassification.from_pretrained("MCG-NJU/videomae-base-finetuned-kinetics") inputs = processor(video, return_tensors="pt") with torch.no_grad(): outputs = model(**inputs) logits = outputs.logits predicted_class_idx = logits.argmax(-1).item() print("Predicted class:", model.config.id2label[predicted_class_idx]) ``` For more code examples, we refer to the [documentation](https://huggingface.co/transformers/main/model_doc/videomae.html#). ## Training data (to do, feel free to open a PR) ## Training procedure ### Preprocessing (to do, feel free to open a PR) ### Pretraining (to do, feel free to open a PR) ## Evaluation results This model obtains a top-1 accuracy of 80.9 and a top-5 accuracy of 94.7 on the test set of Kinetics-400. ### BibTeX entry and citation info ```bibtex @misc{https://doi.org/10.48550/arxiv.2203.12602, doi = {10.48550/ARXIV.2203.12602}, url = {https://arxiv.org/abs/2203.12602}, author = {Tong, Zhan and Song, Yibing and Wang, Jue and Wang, Limin}, keywords = {Computer Vision and Pattern Recognition (cs.CV), FOS: Computer and information sciences, FOS: Computer and information sciences}, title = {VideoMAE: Masked Autoencoders are Data-Efficient Learners for Self-Supervised Video Pre-Training}, publisher = {arXiv}, year = {2022}, copyright = {Creative Commons Attribution 4.0 International} } ```
3,584
[ [ -0.035888671875, -0.0186767578125, 0.00922393798828125, -0.0169830322265625, -0.029693603515625, 0.0015974044799804688, 0.0086822509765625, 0.000629425048828125, 0.024444580078125, 0.03167724609375, -0.041717529296875, -0.033477783203125, -0.07586669921875, -0.024078369140625, -0.0377197265625, 0.085693359375, -0.016021728515625, 0.006076812744140625, -0.00922393798828125, -0.00885772705078125, -0.0284423828125, -0.052337646484375, -0.035858154296875, -0.0250244140625, 0.024261474609375, 0.00907135009765625, 0.036376953125, 0.077392578125, 0.042388916015625, 0.036163330078125, 0.01110076904296875, -0.01517486572265625, -0.0311737060546875, -0.0240020751953125, 0.00586700439453125, -0.028045654296875, -0.035552978515625, 0.0147552490234375, 0.044403076171875, 0.0218353271484375, 0.006305694580078125, 0.041839599609375, -0.0012197494506835938, 0.01483154296875, -0.0653076171875, 0.00803375244140625, -0.0258331298828125, 0.032867431640625, -0.00350189208984375, -0.015655517578125, -0.026397705078125, -0.00733184814453125, 0.00899505615234375, -0.0244293212890625, 0.0271148681640625, -0.0104827880859375, 0.07904052734375, 0.03253173828125, -0.0248260498046875, 0.012481689453125, -0.06378173828125, 0.040435791015625, -0.0313720703125, 0.04180908203125, 0.01267242431640625, 0.04766845703125, 0.016754150390625, -0.0760498046875, -0.0379638671875, -0.00363922119140625, 0.005035400390625, -0.00128173828125, -0.01091766357421875, 0.0222625732421875, 0.042877197265625, 0.038848876953125, -0.047027587890625, 0.0122528076171875, -0.030792236328125, -0.0297393798828125, 0.04193115234375, 0.00795745849609375, 0.019012451171875, -0.0083160400390625, -0.047393798828125, -0.0306243896484375, -0.0100860595703125, 0.0165252685546875, 0.017425537109375, -0.01236724853515625, -0.01629638671875, 0.03887939453125, -0.0111541748046875, 0.037689208984375, 0.0289459228515625, -0.0207061767578125, 0.043182373046875, -0.005405426025390625, -0.053924560546875, 0.004878997802734375, 
0.05938720703125, 0.0267333984375, 0.0138702392578125, 0.001102447509765625, -0.015869140625, 0.019927978515625, 0.028564453125, -0.0794677734375, -0.0251922607421875, 0.00290679931640625, -0.03631591796875, -0.01812744140625, 0.0283203125, -0.03887939453125, 0.00479888916015625, -0.03466796875, 0.07757568359375, -0.0024204254150390625, -0.00995635986328125, 0.0006823539733886719, -0.0085296630859375, 0.0248870849609375, 0.01092529296875, -0.058013916015625, 0.035125732421875, 0.020965576171875, 0.06915283203125, -0.00872039794921875, -0.022918701171875, -0.036407470703125, 0.0084991455078125, -0.01467132568359375, 0.03826904296875, -0.02032470703125, 0.003875732421875, -0.0008215904235839844, 0.033172607421875, -0.005786895751953125, -0.036407470703125, 0.0205841064453125, -0.041534423828125, 0.0143280029296875, 0.0003292560577392578, -0.040435791015625, -0.031524658203125, 0.01233673095703125, -0.042572021484375, 0.08642578125, -0.005126953125, -0.04583740234375, 0.039276123046875, -0.03704833984375, 0.007049560546875, -0.004909515380859375, 0.0024013519287109375, -0.0465087890625, -0.0108642578125, 0.00824737548828125, 0.042236328125, 0.01371002197265625, 0.0166168212890625, -0.01849365234375, -0.0240325927734375, 0.01270294189453125, -0.034027099609375, 0.04278564453125, 0.02203369140625, -0.036224365234375, 0.0250396728515625, -0.06280517578125, -0.005992889404296875, -0.01526641845703125, -0.0029468536376953125, 0.00930023193359375, -0.029083251953125, -0.00635528564453125, 0.035247802734375, 0.01194000244140625, -0.043975830078125, 0.00678253173828125, 0.0006232261657714844, 0.025146484375, 0.061004638671875, -0.004444122314453125, 0.03985595703125, -0.006134033203125, 0.049896240234375, 0.01206207275390625, 0.033355712890625, -0.021453857421875, -0.0218963623046875, -0.064208984375, -0.0172576904296875, 0.0230560302734375, 0.03521728515625, -0.038787841796875, 0.036651611328125, -0.00856781005859375, -0.04461669921875, -0.029815673828125, 
0.00962066650390625, 0.0291595458984375, 0.034881591796875, 0.0341796875, -0.039886474609375, -0.06463623046875, -0.05792236328125, 0.0139312744140625, 0.00917816162109375, -0.0121002197265625, 0.0078582763671875, 0.054901123046875, -0.0173187255859375, 0.057708740234375, -0.0292816162109375, -0.01555633544921875, 0.010345458984375, 0.00807952880859375, 0.0188140869140625, 0.05206298828125, 0.032958984375, -0.055328369140625, -0.034027099609375, -0.01358795166015625, -0.0693359375, 0.008209228515625, -0.00274658203125, -0.02288818359375, -0.01348876953125, 0.03662109375, -0.040863037109375, 0.054840087890625, 0.0301971435546875, -0.01251220703125, 0.022064208984375, -0.018829345703125, 0.0194091796875, -0.0657958984375, -0.000705718994140625, -0.00530242919921875, -0.026702880859375, -0.052490234375, -0.003841400146484375, -0.01059722900390625, -0.01934814453125, -0.0540771484375, 0.025909423828125, -0.0258941650390625, -0.0243072509765625, -0.033050537109375, -0.0184326171875, -0.01557159423828125, 0.057098388671875, 0.01678466796875, 0.03192138671875, 0.05621337890625, -0.0640869140625, 0.04351806640625, 0.00801849365234375, -0.0279693603515625, 0.0207672119140625, -0.04547119140625, 0.01361846923828125, -0.0185546875, 0.0160064697265625, -0.06683349609375, -0.036163330078125, 0.0187530517578125, -0.03472900390625, 0.035369873046875, -0.0258941650390625, -0.017974853515625, -0.048919677734375, -0.00885009765625, 0.05609130859375, 0.054107666015625, -0.04364013671875, 0.038726806640625, 0.041168212890625, 0.03216552734375, -0.055938720703125, -0.049774169921875, -0.018218994140625, -0.0302734375, -0.0322265625, 0.024993896484375, -0.01031494140625, 0.01922607421875, -0.00434112548828125, -0.01025390625, -0.02154541015625, -0.0180511474609375, 0.03875732421875, 0.016204833984375, -0.01287078857421875, 0.00852203369140625, -0.022796630859375, -0.0245361328125, 0.01284027099609375, -0.036651611328125, 0.048553466796875, -0.00997161865234375, -0.02276611328125, 
-0.053466796875, 0.00547027587890625, 0.04766845703125, -0.0086517333984375, 0.04693603515625, 0.075439453125, -0.05438232421875, 0.005596160888671875, -0.042083740234375, -0.0131378173828125, -0.04241943359375, 0.032470703125, -0.023345947265625, -0.037506103515625, 0.053802490234375, 0.01140594482421875, -0.024810791015625, 0.047821044921875, 0.05194091796875, -0.01332855224609375, 0.07373046875, 0.0518798828125, 0.007389068603515625, 0.054473876953125, -0.057647705078125, -0.009307861328125, -0.050750732421875, -0.04461669921875, -0.01036834716796875, -0.03564453125, -0.036376953125, -0.037078857421875, 0.0285186767578125, 0.02001953125, -0.04962158203125, 0.044921875, -0.040557861328125, 0.03857421875, 0.03717041015625, 0.031219482421875, -0.0146636962890625, -0.002094268798828125, -0.0045928955078125, -0.00650787353515625, -0.058319091796875, -0.02801513671875, 0.067626953125, 0.04876708984375, 0.050567626953125, -0.014007568359375, 0.05157470703125, 0.024078369140625, 0.01025390625, -0.054168701171875, 0.045135498046875, -0.01244354248046875, -0.04315185546875, -0.00550079345703125, -0.00922393798828125, -0.06048583984375, -0.004730224609375, -0.022491455078125, -0.050811767578125, 0.0156097412109375, 0.0288848876953125, -0.0260467529296875, 0.043548583984375, -0.04254150390625, 0.09014892578125, -0.01611328125, -0.02130126953125, 0.0008497238159179688, -0.051727294921875, 0.014984130859375, 0.00809478759765625, 0.0055084228515625, 0.0204315185546875, 0.0221099853515625, 0.08172607421875, -0.061614990234375, 0.06591796875, -0.032867431640625, 0.0202178955078125, 0.05194091796875, -0.012939453125, 0.0216522216796875, -0.0171661376953125, 0.036346435546875, 0.019561767578125, -0.00872802734375, -0.030120849609375, -0.054351806640625, 0.02349853515625, -0.06365966796875, -0.03253173828125, -0.0325927734375, -0.0215301513671875, 0.0251007080078125, 0.01401519775390625, 0.04876708984375, 0.04547119140625, 0.0086212158203125, 0.0204315185546875, 0.0589599609375, 
-0.01386260986328125, 0.050140380859375, 0.0018329620361328125, -0.014923095703125, -0.041473388671875, 0.061004638671875, 0.0232696533203125, 0.026580810546875, 0.0275726318359375, 0.00849151611328125, -0.01806640625, -0.0278778076171875, -0.0261077880859375, 0.00021588802337646484, -0.062225341796875, -0.02532958984375, -0.0369873046875, -0.0499267578125, -0.02911376953125, -0.0146636962890625, -0.039825439453125, -0.01540374755859375, -0.032257080078125, -0.03363037109375, 0.024658203125, 0.049285888671875, -0.0266571044921875, 0.051788330078125, -0.055877685546875, 0.01898193359375, 0.04486083984375, 0.03228759765625, -0.0016717910766601562, -0.0697021484375, -0.036285400390625, -0.00007385015487670898, -0.0223541259765625, -0.054962158203125, 0.038177490234375, 0.01335906982421875, 0.052215576171875, 0.047760009765625, -0.0274505615234375, 0.07086181640625, -0.04376220703125, 0.056793212890625, 0.03204345703125, -0.06982421875, 0.047637939453125, -0.0110015869140625, 0.0174560546875, 0.019287109375, 0.053619384765625, -0.0159759521484375, 0.0059051513671875, -0.057159423828125, -0.038726806640625, 0.046905517578125, 0.01276397705078125, 0.01190948486328125, 0.01038360595703125, 0.0413818359375, 0.005382537841796875, 0.00951385498046875, -0.0780029296875, -0.0216827392578125, -0.055023193359375, -0.00357818603515625, -0.0142364501953125, -0.016326904296875, 0.00479888916015625, -0.038543701171875, 0.039093017578125, 0.0014467239379882812, 0.04608154296875, 0.0141754150390625, -0.0217437744140625, -0.0210723876953125, -0.01934814453125, 0.032501220703125, 0.015625, -0.04248046875, 0.00991058349609375, 0.0206146240234375, -0.05474853515625, 0.0247344970703125, -0.025115966796875, -0.003265380859375, 0.0027103424072265625, 0.025604248046875, 0.08758544921875, 0.0168914794921875, -0.01244354248046875, 0.05059814453125, 0.024658203125, -0.0183258056640625, -0.03692626953125, 0.01324462890625, -0.03582763671875, 0.026580810546875, 0.02252197265625, 
0.0177154541015625, 0.01493072509765625, -0.043365478515625, 0.041473388671875, 0.0248870849609375, -0.033477783203125, -0.036834716796875, 0.07379150390625, -0.006244659423828125, -0.026580810546875, 0.0300750732421875, -0.002399444580078125, -0.052337646484375, 0.0557861328125, 0.02581787109375, 0.07745361328125, -0.0312347412109375, 0.0175628662109375, 0.0601806640625, 0.0105743408203125, -0.0164947509765625, -0.0032806396484375, -0.013427734375, -0.048553466796875, -0.037322998046875, -0.047454833984375, -0.006992340087890625, 0.012664794921875, -0.06610107421875, 0.04180908203125, -0.04473876953125, -0.0251312255859375, 0.002498626708984375, -0.00714111328125, -0.07586669921875, 0.042449951171875, 0.0430908203125, 0.059722900390625, -0.07757568359375, 0.0760498046875, 0.03265380859375, -0.037750244140625, -0.048614501953125, -0.0379638671875, -0.0157928466796875, -0.056365966796875, 0.06842041015625, 0.025115966796875, 0.004207611083984375, -0.0010585784912109375, -0.057037353515625, -0.0777587890625, 0.09259033203125, 0.0204620361328125, -0.02880859375, -0.01166534423828125, 0.0099639892578125, 0.039031982421875, -0.0494384765625, 0.050537109375, 0.00824737548828125, 0.0092926025390625, 0.029327392578125, -0.06719970703125, -0.017486572265625, -0.01287078857421875, -0.0012273788452148438, 0.006702423095703125, -0.054534912109375, 0.07501220703125, -0.0058135986328125, 0.0011281967163085938, -0.0115203857421875, 0.0518798828125, -0.00797271728515625, 0.024658203125, 0.04412841796875, 0.0565185546875, 0.0355224609375, -0.0007367134094238281, 0.06854248046875, 0.0018901824951171875, 0.036285400390625, 0.06591796875, 0.021148681640625, 0.050567626953125, 0.013092041015625, -0.0180511474609375, 0.064697265625, 0.060546875, -0.026214599609375, 0.056549072265625, 0.00516510009765625, -0.00034689903259277344, -0.0380859375, 0.01416015625, -0.02764892578125, 0.05120849609375, 0.0172576904296875, -0.043060302734375, -0.003200531005859375, 0.028167724609375, 
-0.0212860107421875, -0.025115966796875, -0.05059814453125, 0.047821044921875, -0.005886077880859375, -0.036041259765625, 0.047821044921875, -0.0229949951171875, 0.03173828125, -0.04608154296875, -0.005672454833984375, -0.00048661231994628906, 0.0179290771484375, -0.0302734375, -0.0457763671875, 0.0139312744140625, 0.00283050537109375, -0.01433563232421875, -0.0089111328125, 0.042755126953125, -0.037109375, -0.035552978515625, 0.0107269287109375, -0.0005812644958496094, 0.0367431640625, 0.0117645263671875, -0.0386962890625, 0.0003726482391357422, -0.01322174072265625, -0.0184173583984375, 0.035003662109375, 0.0171661376953125, 0.00274658203125, 0.0516357421875, 0.03558349609375, -0.029510498046875, 0.038482666015625, -0.005672454833984375, 0.065185546875, -0.048553466796875, -0.03887939453125, -0.059173583984375, 0.057861328125, 0.00244140625, -0.0146026611328125, 0.055938720703125, 0.044769287109375, 0.0775146484375, -0.0213623046875, 0.0258941650390625, 0.0094146728515625, 0.00591278076171875, -0.047943115234375, 0.03070068359375, -0.02716064453125, -0.005710601806640625, -0.029632568359375, -0.073486328125, -0.0176544189453125, 0.061004638671875, -0.0012035369873046875, 0.01064300537109375, 0.042083740234375, 0.046051025390625, -0.0225372314453125, -0.018798828125, 0.0291748046875, 0.0266571044921875, 0.005077362060546875, 0.040740966796875, 0.038177490234375, -0.067626953125, 0.037353515625, -0.04400634765625, -0.02459716796875, -0.01049041748046875, -0.06378173828125, -0.080322265625, -0.039886474609375, -0.05255126953125, -0.0309600830078125, -0.01087188720703125, 0.060638427734375, 0.08306884765625, -0.0694580078125, -0.018646240234375, -0.00464630126953125, -0.0221099853515625, -0.0115203857421875, -0.01335906982421875, 0.03045654296875, 0.0007424354553222656, -0.0595703125, 0.0027637481689453125, 0.0030765533447265625, 0.019683837890625, -0.0277862548828125, -0.0184478759765625, -0.01708984375, -0.0140838623046875, 0.0433349609375, 0.030181884765625, 
-0.04522705078125, -0.043609619140625, -0.0007762908935546875, 0.00734710693359375, 0.0196380615234375, 0.059326171875, -0.07672119140625, 0.043975830078125, 0.0308837890625, 0.02801513671875, 0.085693359375, -0.00797271728515625, 0.027008056640625, -0.061859130859375, 0.0250396728515625, -0.0018758773803710938, 0.038421630859375, 0.007694244384765625, -0.023040771484375, 0.033447265625, 0.033172607421875, -0.047454833984375, -0.0657958984375, 0.00862884521484375, -0.0927734375, 0.003559112548828125, 0.0753173828125, -0.02069091796875, -0.018890380859375, 0.0206756591796875, -0.007541656494140625, 0.055419921875, 0.00007140636444091797, 0.042572021484375, 0.03387451171875, -0.00275421142578125, -0.040191650390625, -0.039581298828125, 0.03826904296875, 0.004589080810546875, -0.034332275390625, -0.03350830078125, 0.00997161865234375, 0.0297393798828125, 0.01904296875, 0.0281982421875, -0.007289886474609375, 0.0224609375, 0.01403045654296875, 0.02001953125, -0.01432037353515625, -0.039276123046875, -0.034423828125, 0.0179290771484375, -0.033355712890625, -0.053314208984375 ] ]
distilbert-base-cased-distilled-squad
2023-04-12T12:06:44.000Z
[ "transformers", "pytorch", "tf", "rust", "safetensors", "openvino", "distilbert", "question-answering", "en", "dataset:squad", "arxiv:1910.01108", "arxiv:1910.09700", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
question-answering
null
null
null
distilbert-base-cased-distilled-squad
141
184,739
transformers
2022-03-02T23:29:04
--- language: en license: apache-2.0 datasets: - squad metrics: - squad model-index: - name: distilbert-base-cased-distilled-squad results: - task: type: question-answering name: Question Answering dataset: name: squad type: squad config: plain_text split: validation metrics: - type: exact_match value: 79.5998 name: Exact Match verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZTViZDA2Y2E2NjUyMjNjYjkzNTUzODc5OTk2OTNkYjQxMDRmMDhlYjdmYWJjYWQ2N2RlNzY1YmI3OWY1NmRhOSIsInZlcnNpb24iOjF9.ZJHhboAMwsi3pqU-B-XKRCYP_tzpCRb8pEjGr2Oc-TteZeoWHI8CXcpDxugfC3f7d_oBcKWLzh3CClQxBW1iAQ - type: f1 value: 86.9965 name: F1 verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZWZlMzY2MmE1NDNhOGNjNWRmODg0YjQ2Zjk5MjUzZDQ2MDYxOTBlMTNhNzQ4NTA2NjRmNDU3MGIzMTYwMmUyOSIsInZlcnNpb24iOjF9.z0ZDir87aT7UEmUeDm8Uw0oUdAqzlBz343gwnsQP3YLfGsaHe-jGlhco0Z7ISUd9NokyCiJCRc4NNxJQ83IuCw --- # DistilBERT base cased distilled SQuAD ## Table of Contents - [Model Details](#model-details) - [How To Get Started With the Model](#how-to-get-started-with-the-model) - [Uses](#uses) - [Risks, Limitations and Biases](#risks-limitations-and-biases) - [Training](#training) - [Evaluation](#evaluation) - [Environmental Impact](#environmental-impact) - [Technical Specifications](#technical-specifications) - [Citation Information](#citation-information) - [Model Card Authors](#model-card-authors) ## Model Details **Model Description:** The DistilBERT model was proposed in the blog post [Smaller, faster, cheaper, lighter: Introducing DistilBERT, a distilled version of BERT](https://medium.com/huggingface/distilbert-8cf3380435b5), and the paper [DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter](https://arxiv.org/abs/1910.01108). DistilBERT is a small, fast, cheap and light Transformer model trained by distilling BERT base. 
It has 40% fewer parameters than *bert-base-uncased* and runs 60% faster, while preserving over 95% of BERT's performance as measured on the GLUE language understanding benchmark. This model is a fine-tuned checkpoint of [DistilBERT-base-cased](https://huggingface.co/distilbert-base-cased), fine-tuned using (a second step of) knowledge distillation on [SQuAD v1.1](https://huggingface.co/datasets/squad). - **Developed by:** Hugging Face - **Model Type:** Transformer-based language model - **Language(s):** English - **License:** Apache 2.0 - **Related Models:** [DistilBERT-base-cased](https://huggingface.co/distilbert-base-cased) - **Resources for more information:** - See [this repository](https://github.com/huggingface/transformers/tree/main/examples/research_projects/distillation) for more about Distil\* (a class of compressed models including this model) - See [Sanh et al. (2019)](https://arxiv.org/abs/1910.01108) for more information about knowledge distillation and the training procedure ## How to Get Started with the Model Use the code below to get started with the model. ```python >>> from transformers import pipeline >>> question_answerer = pipeline("question-answering", model='distilbert-base-cased-distilled-squad') >>> context = r""" ... Extractive Question Answering is the task of extracting an answer from a text given a question. An example of a ... question answering dataset is the SQuAD dataset, which is entirely based on that task. If you would like to fine-tune ... a model on a SQuAD task, you may leverage the examples/pytorch/question-answering/run_squad.py script. ... """ >>> result = question_answerer(question="What is a good example of a question answering dataset?", context=context) >>> print( ... f"Answer: '{result['answer']}', score: {round(result['score'], 4)}, start: {result['start']}, end: {result['end']}" ...) 
Answer: 'SQuAD dataset', score: 0.5152, start: 147, end: 160 ``` Here is how to use this model in PyTorch: ```python from transformers import DistilBertTokenizer, DistilBertModel import torch tokenizer = DistilBertTokenizer.from_pretrained('distilbert-base-cased-distilled-squad') model = DistilBertModel.from_pretrained('distilbert-base-cased-distilled-squad') question, text = "Who was Jim Henson?", "Jim Henson was a nice puppet" inputs = tokenizer(question, text, return_tensors="pt") with torch.no_grad(): outputs = model(**inputs) print(outputs) ``` And in TensorFlow: ```python from transformers import DistilBertTokenizer, TFDistilBertForQuestionAnswering import tensorflow as tf tokenizer = DistilBertTokenizer.from_pretrained("distilbert-base-cased-distilled-squad") model = TFDistilBertForQuestionAnswering.from_pretrained("distilbert-base-cased-distilled-squad") question, text = "Who was Jim Henson?", "Jim Henson was a nice puppet" inputs = tokenizer(question, text, return_tensors="tf") outputs = model(**inputs) answer_start_index = int(tf.math.argmax(outputs.start_logits, axis=-1)[0]) answer_end_index = int(tf.math.argmax(outputs.end_logits, axis=-1)[0]) predict_answer_tokens = inputs.input_ids[0, answer_start_index : answer_end_index + 1] tokenizer.decode(predict_answer_tokens) ``` ## Uses This model can be used for question answering. #### Misuse and Out-of-scope Use The model should not be used to intentionally create hostile or alienating environments for people. In addition, the model was not trained to be factual or true representations of people or events, and therefore using the model to generate such content is out-of-scope for the abilities of this model. 
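To give a sense of the knowledge-distillation step mentioned in Model Details, here is a minimal sketch of a soft-label distillation loss. This is an illustration only: the temperature value is a placeholder, and the full DistilBERT training recipe combines this term with additional losses rather than using it alone.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence between temperature-softened teacher and student
    distributions, scaled by T**2 so gradient magnitudes stay comparable
    across temperatures. Temperature 2.0 is an illustrative choice."""
    t = temperature
    return F.kl_div(
        F.log_softmax(student_logits / t, dim=-1),
        F.softmax(teacher_logits / t, dim=-1),
        reduction="batchmean",
    ) * t * t

teacher = torch.randn(4, 10)
student = teacher + 0.1 * torch.randn(4, 10)
print(float(distillation_loss(student, teacher)))  # small positive value
print(float(distillation_loss(teacher, teacher)))  # ~0 for identical logits
```

The student is trained to match the teacher's full output distribution rather than only the hard labels, which is what lets the smaller model recover most of the teacher's behavior.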
## Risks, Limitations and Biases **CONTENT WARNING: Readers should be aware that language generated by this model can be disturbing or offensive to some and can propagate historical and current stereotypes.** Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)). Predictions generated by the model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups. For example: ```python >>> from transformers import pipeline >>> question_answerer = pipeline("question-answering", model='distilbert-base-cased-distilled-squad') >>> context = r""" ... Alice is sitting on the bench. Bob is sitting next to her. ... """ >>> result = question_answerer(question="Who is the CEO?", context=context) >>> print( ... f"Answer: '{result['answer']}', score: {round(result['score'], 4)}, start: {result['start']}, end: {result['end']}" ...) Answer: 'Bob', score: 0.7527, start: 32, end: 35 ``` Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. ## Training #### Training Data The [distilbert-base-cased model](https://huggingface.co/distilbert-base-cased) was trained using the same data as the [distilbert-base-uncased model](https://huggingface.co/distilbert-base-uncased). The [distilbert-base-uncased model](https://huggingface.co/distilbert-base-uncased) describes its training data as: > DistilBERT pretrained on the same data as BERT, which is [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers). To learn more about the SQuAD v1.1 dataset, see the [SQuAD v1.1 data card](https://huggingface.co/datasets/squad). 
#### Training Procedure ##### Preprocessing See the [distilbert-base-cased model card](https://huggingface.co/distilbert-base-cased) for further details. ##### Pretraining See the [distilbert-base-cased model card](https://huggingface.co/distilbert-base-cased) for further details. ## Evaluation As discussed in the [model repository](https://github.com/huggingface/transformers/blob/main/examples/research_projects/distillation/README.md) > This model reaches a F1 score of 87.1 on the [SQuAD v1.1] dev set (for comparison, BERT bert-base-cased version reaches a F1 score of 88.7). ## Environmental Impact Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). We present the hardware type and hours used based on the [associated paper](https://arxiv.org/pdf/1910.01108.pdf). Note that these details are just for training DistilBERT, not including the fine-tuning with SQuAD. - **Hardware Type:** 8 16GB V100 GPUs - **Hours used:** 90 hours - **Cloud Provider:** Unknown - **Compute Region:** Unknown - **Carbon Emitted:** Unknown ## Technical Specifications See the [associated paper](https://arxiv.org/abs/1910.01108) for details on the modeling architecture, objective, compute infrastructure, and training details. ## Citation Information ```bibtex @inproceedings{sanh2019distilbert, title={DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter}, author={Sanh, Victor and Debut, Lysandre and Chaumond, Julien and Wolf, Thomas}, booktitle={NeurIPS EMC^2 Workshop}, year={2019} } ``` APA: - Sanh, V., Debut, L., Chaumond, J., & Wolf, T. (2019). DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter. arXiv preprint arXiv:1910.01108. ## Model Card Authors This model card was written by the Hugging Face team.
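As a rough illustration of how the token-level F1 score reported in the Evaluation section is computed, here is a simplified sketch; the official SQuAD evaluation script additionally normalizes answers (lowercasing, stripping punctuation and articles) before comparing tokens, which this version omits.

```python
from collections import Counter

def squad_f1(prediction: str, reference: str) -> float:
    """Token-overlap F1 between a predicted and a reference answer,
    without the official script's answer normalization."""
    pred_tokens, ref_tokens = prediction.split(), reference.split()
    common = Counter(pred_tokens) & Counter(ref_tokens)
    num_same = sum(common.values())
    if num_same == 0:
        return 0.0
    precision = num_same / len(pred_tokens)
    recall = num_same / len(ref_tokens)
    return 2 * precision * recall / (precision + recall)

# 2 of 3 predicted tokens overlap the 2 reference tokens:
# precision 2/3, recall 1.0, F1 0.8
print(squad_f1("the SQuAD dataset", "SQuAD dataset"))
```

Exact match, the other reported metric, is simply whether the (normalized) prediction equals a reference answer string exactly.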
9,515
[ [ -0.0268707275390625, -0.0660400390625, 0.01812744140625, 0.01105499267578125, -0.00925445556640625, 0.01464080810546875, -0.011138916015625, -0.0164337158203125, -0.00392913818359375, 0.0118865966796875, -0.061004638671875, -0.022796630859375, -0.055206298828125, 0.006511688232421875, -0.0226593017578125, 0.09417724609375, -0.004566192626953125, 0.005115509033203125, -0.013885498046875, -0.0038776397705078125, -0.0203399658203125, -0.05938720703125, -0.03369140625, -0.0258941650390625, 0.0200042724609375, 0.00628662109375, 0.032501220703125, 0.02154541015625, 0.041107177734375, 0.03350830078125, -0.017730712890625, -0.0166168212890625, -0.05169677734375, -0.005573272705078125, 0.007205963134765625, -0.03607177734375, -0.03076171875, 0.022979736328125, 0.0201568603515625, 0.05133056640625, -0.0188446044921875, 0.0227203369140625, 0.00727081298828125, 0.04840087890625, -0.01788330078125, 0.0335693359375, -0.053436279296875, -0.00759124755859375, 0.00998687744140625, 0.0169677734375, -0.0277099609375, -0.01016998291015625, 0.029083251953125, -0.0279541015625, 0.04437255859375, -0.0023746490478515625, 0.07867431640625, 0.034881591796875, -0.00783538818359375, -0.0235443115234375, -0.03802490234375, 0.070068359375, -0.061676025390625, 0.0013408660888671875, 0.019805908203125, 0.026641845703125, -0.005657196044921875, -0.06304931640625, -0.058807373046875, -0.01384735107421875, -0.0182037353515625, 0.0238037109375, -0.031280517578125, 0.0023097991943359375, 0.025054931640625, 0.0302581787109375, -0.032440185546875, -0.00855255126953125, -0.0579833984375, -0.004978179931640625, 0.05908203125, 0.0025043487548828125, 0.003215789794921875, -0.007404327392578125, -0.037506103515625, -0.025360107421875, -0.01947021484375, 0.0161895751953125, 0.0299835205078125, 0.0261383056640625, -0.016510009765625, 0.04205322265625, -0.0171356201171875, 0.037689208984375, 0.031707763671875, -0.002971649169921875, 0.0322265625, -0.017486572265625, -0.0277557373046875, 0.01464080810546875, 
0.062744140625, 0.0253448486328125, 0.0194854736328125, -0.00689697265625, -0.0095062255859375, -0.004322052001953125, 0.008575439453125, -0.08135986328125, -0.0369873046875, 0.030975341796875, -0.02508544921875, -0.03802490234375, 0.004085540771484375, -0.0430908203125, -0.0010967254638671875, -0.0041961669921875, 0.035675048828125, -0.03009033203125, -0.0277557373046875, 0.01177978515625, -0.03277587890625, 0.00849151611328125, 0.0079498291015625, -0.0667724609375, 0.005924224853515625, 0.0194854736328125, 0.055816650390625, -0.00672149658203125, -0.0149688720703125, -0.0159454345703125, -0.01806640625, 0.004730224609375, 0.0301055908203125, -0.0077056884765625, -0.0246429443359375, -0.01470184326171875, 0.0172576904296875, 0.0015316009521484375, -0.039642333984375, 0.02294921875, -0.0322265625, 0.0265350341796875, -0.0161590576171875, -0.044952392578125, -0.0189208984375, 0.0118560791015625, -0.048248291015625, 0.0850830078125, 0.0308380126953125, -0.05682373046875, 0.0300750732421875, -0.047943115234375, -0.0308685302734375, -0.00823974609375, 0.0166473388671875, -0.04150390625, 0.0003616809844970703, 0.018218994140625, 0.04315185546875, -0.018035888671875, 0.0382080078125, -0.028076171875, -0.01800537109375, 0.0118408203125, -0.033447265625, 0.1015625, 0.01494598388671875, -0.027801513671875, -0.0188446044921875, -0.0484619140625, -0.00023293495178222656, 0.0217742919921875, -0.0333251953125, -0.0044097900390625, -0.01194000244140625, 0.0025768280029296875, 0.0212249755859375, 0.021453857421875, -0.034942626953125, 0.017913818359375, -0.0094757080078125, 0.0565185546875, 0.05755615234375, -0.0171356201171875, 0.023468017578125, -0.03143310546875, 0.025634765625, 0.02630615234375, 0.0164794921875, 0.004047393798828125, -0.035980224609375, -0.06805419921875, -0.032135009765625, 0.0239715576171875, 0.040618896484375, -0.048797607421875, 0.053741455078125, -0.00931549072265625, -0.055877685546875, -0.03741455078125, 0.00022482872009277344, 0.023345947265625, 
0.0640869140625, 0.04278564453125, 0.0102386474609375, -0.053375244140625, -0.069580078125, 0.01355743408203125, -0.042236328125, -0.0002560615539550781, 0.005245208740234375, 0.06353759765625, -0.005126953125, 0.0809326171875, -0.05145263671875, -0.0118560791015625, -0.039520263671875, 0.01049041748046875, 0.045623779296875, 0.041778564453125, 0.055999755859375, -0.06304931640625, -0.038421630859375, -0.03302001953125, -0.06414794921875, 0.002132415771484375, 0.01480865478515625, -0.005619049072265625, 0.015411376953125, 0.0281829833984375, -0.044769287109375, 0.0302581787109375, 0.036529541015625, -0.0196990966796875, 0.041473388671875, -0.01032257080078125, 0.01035308837890625, -0.0872802734375, 0.01267242431640625, -0.0006780624389648438, -0.002918243408203125, -0.05328369140625, -0.01512908935546875, -0.0174713134765625, 0.002277374267578125, -0.05206298828125, 0.02886962890625, -0.0188446044921875, 0.0252685546875, 0.004634857177734375, -0.0186309814453125, 0.0164031982421875, 0.059417724609375, 0.0108489990234375, 0.055511474609375, 0.036956787109375, -0.044647216796875, 0.03826904296875, 0.03143310546875, -0.0287933349609375, 0.026641845703125, -0.08111572265625, 0.01045989990234375, -0.0183868408203125, 0.016815185546875, -0.08154296875, 0.004825592041015625, 0.005496978759765625, -0.03546142578125, 0.032318115234375, -0.0245361328125, -0.04022216796875, -0.044403076171875, -0.00463104248046875, 0.023040771484375, 0.060943603515625, -0.03228759765625, 0.031097412109375, 0.027435302734375, -0.0012521743774414062, -0.05023193359375, -0.0665283203125, -0.036224365234375, -0.0399169921875, -0.050567626953125, 0.03509521484375, -0.0175933837890625, -0.0279083251953125, -0.0009603500366210938, -0.01047515869140625, -0.036041259765625, 0.00817108154296875, 0.00966644287109375, 0.0457763671875, -0.00765228271484375, 0.007648468017578125, -0.005138397216796875, 0.00592041015625, 0.0016460418701171875, -0.0035266876220703125, 0.038726806640625, -0.0304107666015625, 
0.00559234619140625, -0.03082275390625, 0.0169677734375, 0.03594970703125, 0.0009016990661621094, 0.0670166015625, 0.034027099609375, -0.01904296875, 0.00568389892578125, -0.04156494140625, -0.033172607421875, -0.0390625, 0.051025390625, -0.0178985595703125, -0.048187255859375, 0.044586181640625, 0.01465606689453125, 0.01145172119140625, 0.060516357421875, 0.04486083984375, -0.033843994140625, 0.0675048828125, 0.045135498046875, -0.0161895751953125, 0.0272674560546875, -0.0537109375, 0.00922393798828125, -0.044036865234375, -0.02276611328125, -0.038330078125, -0.036712646484375, -0.040771484375, -0.027435302734375, 0.0233917236328125, 0.032501220703125, -0.034515380859375, 0.046295166015625, -0.05328369140625, 0.0210418701171875, 0.0406494140625, 0.00960540771484375, 0.01280975341796875, 0.0012636184692382812, -0.0032024383544921875, 0.0021762847900390625, -0.0667724609375, -0.038482666015625, 0.08349609375, 0.03143310546875, 0.056427001953125, -0.008880615234375, 0.051849365234375, 0.0169677734375, 0.022247314453125, -0.042999267578125, 0.0256805419921875, -0.004802703857421875, -0.09063720703125, -0.0247955322265625, -0.03485107421875, -0.05517578125, 0.0056915283203125, -0.002681732177734375, -0.0518798828125, 0.01012420654296875, 0.003070831298828125, -0.031585693359375, 0.0220489501953125, -0.07470703125, 0.070556640625, -0.0277099609375, -0.019256591796875, 0.00817108154296875, -0.053192138671875, 0.0195770263671875, 0.005199432373046875, -0.00574493408203125, -0.01000213623046875, 0.0296783447265625, 0.061279296875, -0.05218505859375, 0.06024169921875, -0.029205322265625, 0.01290130615234375, 0.0457763671875, -0.024444580078125, 0.021087646484375, 0.00443267822265625, -0.0212860107421875, 0.042633056640625, 0.01922607421875, -0.029571533203125, -0.0352783203125, 0.032196044921875, -0.0677490234375, -0.047210693359375, -0.05078125, -0.047149658203125, 0.0012369155883789062, 0.006011962890625, 0.034942626953125, 0.01873779296875, -0.0170135498046875, 
0.0217742919921875, 0.05206298828125, -0.024932861328125, 0.044158935546875, 0.034576416015625, -0.005100250244140625, 0.0005512237548828125, 0.03704833984375, 0.00775146484375, 0.0285797119140625, 0.0289306640625, 0.0116729736328125, -0.048370361328125, -0.0256805419921875, -0.0362548828125, 0.00804901123046875, -0.0491943359375, -0.0183868408203125, -0.047454833984375, -0.03778076171875, -0.033843994140625, 0.00464630126953125, -0.028778076171875, -0.031402587890625, -0.0355224609375, -0.01922607421875, 0.04742431640625, 0.04254150390625, 0.0097503662109375, 0.025115966796875, -0.037139892578125, 0.013671875, 0.0280914306640625, 0.0111236572265625, -0.016815185546875, -0.050872802734375, -0.004253387451171875, 0.03851318359375, -0.03662109375, -0.060455322265625, 0.0215301513671875, 0.01139068603515625, 0.0455322265625, 0.016571044921875, 0.021209716796875, 0.059173583984375, -0.032257080078125, 0.058563232421875, 0.0200653076171875, -0.05596923828125, 0.044158935546875, -0.00450897216796875, 0.00841522216796875, 0.0572509765625, 0.03790283203125, -0.005016326904296875, -0.0187835693359375, -0.05914306640625, -0.0582275390625, 0.06854248046875, 0.03369140625, 0.01171875, -0.0026702880859375, 0.0145111083984375, 0.00402069091796875, 0.0325927734375, -0.06231689453125, -0.03997802734375, -0.02655029296875, -0.011474609375, -0.0158233642578125, 0.00021266937255859375, 0.0006389617919921875, -0.056182861328125, 0.061126708984375, 0.01007080078125, 0.0275726318359375, 0.0223541259765625, -0.012542724609375, 0.0257720947265625, -0.0030364990234375, 0.0230712890625, 0.032684326171875, -0.04571533203125, -0.012847900390625, 0.0253143310546875, -0.037567138671875, 0.020416259765625, 0.0199737548828125, -0.00223541259765625, 0.01416778564453125, 0.0205535888671875, 0.064208984375, -0.01360321044921875, -0.04608154296875, 0.034088134765625, -0.00862884521484375, -0.03338623046875, -0.0281982421875, 0.00452423095703125, 0.013580322265625, 0.0269927978515625, 
0.034881591796875, 0.00231170654296875, -0.0005440711975097656, -0.05731201171875, 0.0157318115234375, 0.024932861328125, -0.036041259765625, -0.01192474365234375, 0.057586669921875, 0.0122528076171875, -0.007183074951171875, 0.07196044921875, -0.013397216796875, -0.046295166015625, 0.0689697265625, 0.0224609375, 0.0491943359375, 0.0017871856689453125, 0.0160675048828125, 0.042572021484375, 0.0219879150390625, -0.008758544921875, 0.002437591552734375, 0.0056304931640625, -0.047882080078125, -0.00475311279296875, -0.0601806640625, 0.00749969482421875, 0.020172119140625, -0.0499267578125, 0.0255126953125, -0.0246429443359375, -0.0207061767578125, 0.00994873046875, 0.015899658203125, -0.07666015625, 0.01528167724609375, -0.01065826416015625, 0.061492919921875, -0.06610107421875, 0.063720703125, 0.040130615234375, -0.06146240234375, -0.06719970703125, -0.0157623291015625, -0.0218963623046875, -0.063720703125, 0.0687255859375, 0.022613525390625, 0.01763916015625, 0.0032100677490234375, -0.034637451171875, -0.0419921875, 0.095703125, 0.03778076171875, -0.051116943359375, -0.01407623291015625, 0.024505615234375, 0.04974365234375, -0.0219573974609375, 0.049102783203125, 0.053558349609375, 0.0267181396484375, 0.023529052734375, -0.060394287109375, -0.0063018798828125, -0.0264129638671875, -0.01090240478515625, -0.0130157470703125, -0.055816650390625, 0.08660888671875, -0.0251922607421875, -0.00444793701171875, -0.001552581787109375, 0.03741455078125, 0.0235443115234375, 0.00730133056640625, 0.04083251953125, 0.04742431640625, 0.053863525390625, -0.031524658203125, 0.07916259765625, -0.01520538330078125, 0.04827880859375, 0.072265625, -0.00736236572265625, 0.045074462890625, 0.045379638671875, -0.041717529296875, 0.03863525390625, 0.0472412109375, -0.0185089111328125, 0.062744140625, 0.032806396484375, -0.00760650634765625, -0.006526947021484375, 0.01399993896484375, -0.036712646484375, 0.03692626953125, -0.0019388198852539062, -0.032867431640625, -0.0140228271484375, 
-0.0125885009765625, 0.0012426376342773438, -0.016021728515625, -0.0002894401550292969, 0.04986572265625, -0.0025920867919921875, -0.059967041015625, 0.06787109375, -0.00965118408203125, 0.0595703125, -0.04327392578125, -0.0030460357666015625, -0.016937255859375, 0.020416259765625, -0.0014772415161132812, -0.055999755859375, 0.0204925537109375, 0.00611114501953125, -0.036285400390625, -0.0232391357421875, 0.02398681640625, -0.039520263671875, -0.0667724609375, 0.01251220703125, 0.03497314453125, 0.0192108154296875, -0.00826263427734375, -0.07269287109375, -0.006183624267578125, 0.00846099853515625, -0.018218994140625, 0.01537322998046875, 0.0254974365234375, 0.0252685546875, 0.047088623046875, 0.0426025390625, -0.01904296875, 0.00803375244140625, -0.00897979736328125, 0.0748291015625, -0.0159454345703125, -0.0147857666015625, -0.07879638671875, 0.06451416015625, -0.01290130615234375, -0.033477783203125, 0.040496826171875, 0.05609130859375, 0.06866455078125, -0.0201263427734375, 0.07403564453125, -0.034515380859375, 0.01404571533203125, -0.0220794677734375, 0.06939697265625, -0.0445556640625, 0.010284423828125, -0.026275634765625, -0.0672607421875, 0.02032470703125, 0.06353759765625, -0.0149078369140625, 0.0195770263671875, 0.045318603515625, 0.059173583984375, -0.01068115234375, -0.004978179931640625, 0.00421905517578125, 0.0186614990234375, 0.0200653076171875, 0.0509033203125, 0.05218505859375, -0.05914306640625, 0.05126953125, -0.04473876953125, -0.0219573974609375, -0.00937652587890625, -0.06768798828125, -0.09375, -0.062469482421875, -0.03369140625, -0.03546142578125, -0.004940032958984375, 0.052947998046875, 0.0670166015625, -0.061859130859375, -0.0187530517578125, -0.01593017578125, -0.00176239013671875, -0.0212249755859375, -0.01922607421875, 0.022979736328125, -0.0160369873046875, -0.0718994140625, 0.00146484375, -0.010894775390625, 0.016387939453125, -0.018035888671875, -0.005222320556640625, -0.032806396484375, -0.01302337646484375, 0.03643798828125, 
-0.002727508544921875, -0.0416259765625, -0.014190673828125, 0.0172271728515625, -0.0013828277587890625, 0.01090240478515625, 0.0283355712890625, -0.05322265625, 0.02752685546875, 0.032012939453125, 0.0222930908203125, 0.057647705078125, -0.00568389892578125, 0.03680419921875, -0.057891845703125, 0.0311737060546875, 0.028076171875, 0.0406494140625, 0.01812744140625, -0.0290069580078125, 0.039154052734375, 0.0173187255859375, -0.03662109375, -0.061492919921875, -0.006168365478515625, -0.076904296875, -0.023895263671875, 0.08203125, -0.0287933349609375, -0.01528167724609375, 0.004795074462890625, -0.020965576171875, 0.045196533203125, -0.03125, 0.071533203125, 0.06988525390625, 0.0054168701171875, 0.01039886474609375, -0.042449951171875, 0.03179931640625, 0.0220794677734375, -0.04876708984375, -0.0030803680419921875, 0.019134521484375, 0.0450439453125, 0.00531005859375, 0.03338623046875, -0.0011663436889648438, -0.00689697265625, 0.0037364959716796875, 0.002651214599609375, -0.0210723876953125, -0.00934600830078125, 0.0016260147094726562, -0.00925445556640625, -0.0282135009765625, -0.022216796875 ] ]
hkunlp/instructor-large
2023-04-21T06:04:33.000Z
[ "sentence-transformers", "pytorch", "t5", "text-embedding", "embeddings", "information-retrieval", "beir", "text-classification", "language-model", "text-clustering", "text-semantic-similarity", "text-evaluation", "prompt-retrieval", "text-reranking", "feature-extraction", "sentence-similarity", "transformers", "English", "Sentence Similarity", "natural_questions", "ms_marco", "fever", "hotpot_qa", "mteb", "en", "arxiv:2212.09741", "license:apache-2.0", "model-index", "has_space", "text-generation-inference", "region:us" ]
sentence-similarity
hkunlp
null
null
hkunlp/instructor-large
328
182,758
sentence-transformers
2022-12-20T05:31:06
--- pipeline_tag: sentence-similarity tags: - text-embedding - embeddings - information-retrieval - beir - text-classification - language-model - text-clustering - text-semantic-similarity - text-evaluation - prompt-retrieval - text-reranking - sentence-transformers - feature-extraction - sentence-similarity - transformers - t5 - English - Sentence Similarity - natural_questions - ms_marco - fever - hotpot_qa - mteb language: en inference: false license: apache-2.0 model-index: - name: INSTRUCTOR results: - task: type: Classification dataset: type: mteb/amazon_counterfactual name: MTEB AmazonCounterfactualClassification (en) config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 88.13432835820896 - type: ap value: 59.298209334395665 - type: f1 value: 83.31769058643586 - task: type: Classification dataset: type: mteb/amazon_polarity name: MTEB AmazonPolarityClassification config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 91.526375 - type: ap value: 88.16327709705504 - type: f1 value: 91.51095801287843 - task: type: Classification dataset: type: mteb/amazon_reviews_multi name: MTEB AmazonReviewsClassification (en) config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 47.856 - type: f1 value: 45.41490917650942 - task: type: Retrieval dataset: type: arguana name: MTEB ArguAna config: default split: test revision: None metrics: - type: map_at_1 value: 31.223 - type: map_at_10 value: 47.947 - type: map_at_100 value: 48.742000000000004 - type: map_at_1000 value: 48.745 - type: map_at_3 value: 43.137 - type: map_at_5 value: 45.992 - type: mrr_at_1 value: 32.432 - type: mrr_at_10 value: 48.4 - type: mrr_at_100 value: 49.202 - type: mrr_at_1000 value: 49.205 - type: mrr_at_3 value: 43.551 - type: mrr_at_5 value: 46.467999999999996 - type: ndcg_at_1 value: 31.223 - type: ndcg_at_10 value: 57.045 - type: 
ndcg_at_100 value: 60.175 - type: ndcg_at_1000 value: 60.233000000000004 - type: ndcg_at_3 value: 47.171 - type: ndcg_at_5 value: 52.322 - type: precision_at_1 value: 31.223 - type: precision_at_10 value: 8.599 - type: precision_at_100 value: 0.991 - type: precision_at_1000 value: 0.1 - type: precision_at_3 value: 19.63 - type: precision_at_5 value: 14.282 - type: recall_at_1 value: 31.223 - type: recall_at_10 value: 85.989 - type: recall_at_100 value: 99.075 - type: recall_at_1000 value: 99.502 - type: recall_at_3 value: 58.89 - type: recall_at_5 value: 71.408 - task: type: Clustering dataset: type: mteb/arxiv-clustering-p2p name: MTEB ArxivClusteringP2P config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: v_measure value: 43.1621946393635 - task: type: Clustering dataset: type: mteb/arxiv-clustering-s2s name: MTEB ArxivClusteringS2S config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: v_measure value: 32.56417132407894 - task: type: Reranking dataset: type: mteb/askubuntudupquestions-reranking name: MTEB AskUbuntuDupQuestions config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: map value: 64.29539304390207 - type: mrr value: 76.44484017060196 - task: type: STS dataset: type: mteb/biosses-sts name: MTEB BIOSSES config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cos_sim_spearman value: 84.38746499431112 - task: type: Classification dataset: type: mteb/banking77 name: MTEB Banking77Classification config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 78.51298701298701 - type: f1 value: 77.49041754069235 - task: type: Clustering dataset: type: mteb/biorxiv-clustering-p2p name: MTEB BiorxivClusteringP2P config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: v_measure value: 37.61848554098577 - task: 
type: Clustering dataset: type: mteb/biorxiv-clustering-s2s name: MTEB BiorxivClusteringS2S config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: v_measure value: 31.32623280148178 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackAndroidRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 35.803000000000004 - type: map_at_10 value: 48.848 - type: map_at_100 value: 50.5 - type: map_at_1000 value: 50.602999999999994 - type: map_at_3 value: 45.111000000000004 - type: map_at_5 value: 47.202 - type: mrr_at_1 value: 44.635000000000005 - type: mrr_at_10 value: 55.593 - type: mrr_at_100 value: 56.169999999999995 - type: mrr_at_1000 value: 56.19499999999999 - type: mrr_at_3 value: 53.361999999999995 - type: mrr_at_5 value: 54.806999999999995 - type: ndcg_at_1 value: 44.635000000000005 - type: ndcg_at_10 value: 55.899 - type: ndcg_at_100 value: 60.958 - type: ndcg_at_1000 value: 62.302 - type: ndcg_at_3 value: 51.051 - type: ndcg_at_5 value: 53.351000000000006 - type: precision_at_1 value: 44.635000000000005 - type: precision_at_10 value: 10.786999999999999 - type: precision_at_100 value: 1.6580000000000001 - type: precision_at_1000 value: 0.213 - type: precision_at_3 value: 24.893 - type: precision_at_5 value: 17.740000000000002 - type: recall_at_1 value: 35.803000000000004 - type: recall_at_10 value: 68.657 - type: recall_at_100 value: 89.77199999999999 - type: recall_at_1000 value: 97.67 - type: recall_at_3 value: 54.066 - type: recall_at_5 value: 60.788 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackEnglishRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 33.706 - type: map_at_10 value: 44.896 - type: map_at_100 value: 46.299 - type: map_at_1000 value: 46.44 - type: map_at_3 value: 41.721000000000004 - type: map_at_5 value: 43.486000000000004 - type: mrr_at_1 value: 41.592 - type: mrr_at_10 value: 
50.529 - type: mrr_at_100 value: 51.22 - type: mrr_at_1000 value: 51.258 - type: mrr_at_3 value: 48.205999999999996 - type: mrr_at_5 value: 49.528 - type: ndcg_at_1 value: 41.592 - type: ndcg_at_10 value: 50.77199999999999 - type: ndcg_at_100 value: 55.383 - type: ndcg_at_1000 value: 57.288 - type: ndcg_at_3 value: 46.324 - type: ndcg_at_5 value: 48.346000000000004 - type: precision_at_1 value: 41.592 - type: precision_at_10 value: 9.516 - type: precision_at_100 value: 1.541 - type: precision_at_1000 value: 0.2 - type: precision_at_3 value: 22.399 - type: precision_at_5 value: 15.770999999999999 - type: recall_at_1 value: 33.706 - type: recall_at_10 value: 61.353 - type: recall_at_100 value: 80.182 - type: recall_at_1000 value: 91.896 - type: recall_at_3 value: 48.204 - type: recall_at_5 value: 53.89699999999999 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackGamingRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 44.424 - type: map_at_10 value: 57.169000000000004 - type: map_at_100 value: 58.202 - type: map_at_1000 value: 58.242000000000004 - type: map_at_3 value: 53.825 - type: map_at_5 value: 55.714 - type: mrr_at_1 value: 50.470000000000006 - type: mrr_at_10 value: 60.489000000000004 - type: mrr_at_100 value: 61.096 - type: mrr_at_1000 value: 61.112 - type: mrr_at_3 value: 58.192 - type: mrr_at_5 value: 59.611999999999995 - type: ndcg_at_1 value: 50.470000000000006 - type: ndcg_at_10 value: 63.071999999999996 - type: ndcg_at_100 value: 66.964 - type: ndcg_at_1000 value: 67.659 - type: ndcg_at_3 value: 57.74399999999999 - type: ndcg_at_5 value: 60.367000000000004 - type: precision_at_1 value: 50.470000000000006 - type: precision_at_10 value: 10.019 - type: precision_at_100 value: 1.29 - type: precision_at_1000 value: 0.13899999999999998 - type: precision_at_3 value: 25.558999999999997 - type: precision_at_5 value: 17.467 - type: recall_at_1 value: 44.424 - type: recall_at_10 value: 77.02 - type: 
recall_at_100 value: 93.738 - type: recall_at_1000 value: 98.451 - type: recall_at_3 value: 62.888 - type: recall_at_5 value: 69.138 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackGisRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 26.294 - type: map_at_10 value: 34.503 - type: map_at_100 value: 35.641 - type: map_at_1000 value: 35.724000000000004 - type: map_at_3 value: 31.753999999999998 - type: map_at_5 value: 33.190999999999995 - type: mrr_at_1 value: 28.362 - type: mrr_at_10 value: 36.53 - type: mrr_at_100 value: 37.541000000000004 - type: mrr_at_1000 value: 37.602000000000004 - type: mrr_at_3 value: 33.917 - type: mrr_at_5 value: 35.358000000000004 - type: ndcg_at_1 value: 28.362 - type: ndcg_at_10 value: 39.513999999999996 - type: ndcg_at_100 value: 44.815 - type: ndcg_at_1000 value: 46.839 - type: ndcg_at_3 value: 34.02 - type: ndcg_at_5 value: 36.522 - type: precision_at_1 value: 28.362 - type: precision_at_10 value: 6.101999999999999 - type: precision_at_100 value: 0.9129999999999999 - type: precision_at_1000 value: 0.11399999999999999 - type: precision_at_3 value: 14.161999999999999 - type: precision_at_5 value: 9.966 - type: recall_at_1 value: 26.294 - type: recall_at_10 value: 53.098 - type: recall_at_100 value: 76.877 - type: recall_at_1000 value: 91.834 - type: recall_at_3 value: 38.266 - type: recall_at_5 value: 44.287 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackMathematicaRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 16.407 - type: map_at_10 value: 25.185999999999996 - type: map_at_100 value: 26.533 - type: map_at_1000 value: 26.657999999999998 - type: map_at_3 value: 22.201999999999998 - type: map_at_5 value: 23.923 - type: mrr_at_1 value: 20.522000000000002 - type: mrr_at_10 value: 29.522 - type: mrr_at_100 value: 30.644 - type: mrr_at_1000 value: 30.713 - type: mrr_at_3 value: 26.679000000000002 - type: 
mrr_at_5 value: 28.483000000000004 - type: ndcg_at_1 value: 20.522000000000002 - type: ndcg_at_10 value: 30.656 - type: ndcg_at_100 value: 36.864999999999995 - type: ndcg_at_1000 value: 39.675 - type: ndcg_at_3 value: 25.319000000000003 - type: ndcg_at_5 value: 27.992 - type: precision_at_1 value: 20.522000000000002 - type: precision_at_10 value: 5.795999999999999 - type: precision_at_100 value: 1.027 - type: precision_at_1000 value: 0.13999999999999999 - type: precision_at_3 value: 12.396 - type: precision_at_5 value: 9.328 - type: recall_at_1 value: 16.407 - type: recall_at_10 value: 43.164 - type: recall_at_100 value: 69.695 - type: recall_at_1000 value: 89.41900000000001 - type: recall_at_3 value: 28.634999999999998 - type: recall_at_5 value: 35.308 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackPhysicsRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 30.473 - type: map_at_10 value: 41.676 - type: map_at_100 value: 43.120999999999995 - type: map_at_1000 value: 43.230000000000004 - type: map_at_3 value: 38.306000000000004 - type: map_at_5 value: 40.355999999999995 - type: mrr_at_1 value: 37.536 - type: mrr_at_10 value: 47.643 - type: mrr_at_100 value: 48.508 - type: mrr_at_1000 value: 48.551 - type: mrr_at_3 value: 45.348 - type: mrr_at_5 value: 46.744 - type: ndcg_at_1 value: 37.536 - type: ndcg_at_10 value: 47.823 - type: ndcg_at_100 value: 53.395 - type: ndcg_at_1000 value: 55.271 - type: ndcg_at_3 value: 42.768 - type: ndcg_at_5 value: 45.373000000000005 - type: precision_at_1 value: 37.536 - type: precision_at_10 value: 8.681 - type: precision_at_100 value: 1.34 - type: precision_at_1000 value: 0.165 - type: precision_at_3 value: 20.468 - type: precision_at_5 value: 14.495 - type: recall_at_1 value: 30.473 - type: recall_at_10 value: 60.092999999999996 - type: recall_at_100 value: 82.733 - type: recall_at_1000 value: 94.875 - type: recall_at_3 value: 45.734 - type: recall_at_5 value: 52.691 
- task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackProgrammersRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 29.976000000000003 - type: map_at_10 value: 41.097 - type: map_at_100 value: 42.547000000000004 - type: map_at_1000 value: 42.659000000000006 - type: map_at_3 value: 37.251 - type: map_at_5 value: 39.493 - type: mrr_at_1 value: 37.557 - type: mrr_at_10 value: 46.605000000000004 - type: mrr_at_100 value: 47.487 - type: mrr_at_1000 value: 47.54 - type: mrr_at_3 value: 43.721 - type: mrr_at_5 value: 45.411 - type: ndcg_at_1 value: 37.557 - type: ndcg_at_10 value: 47.449000000000005 - type: ndcg_at_100 value: 53.052 - type: ndcg_at_1000 value: 55.010999999999996 - type: ndcg_at_3 value: 41.439 - type: ndcg_at_5 value: 44.292 - type: precision_at_1 value: 37.557 - type: precision_at_10 value: 8.847 - type: precision_at_100 value: 1.357 - type: precision_at_1000 value: 0.16999999999999998 - type: precision_at_3 value: 20.091 - type: precision_at_5 value: 14.384 - type: recall_at_1 value: 29.976000000000003 - type: recall_at_10 value: 60.99099999999999 - type: recall_at_100 value: 84.245 - type: recall_at_1000 value: 96.97200000000001 - type: recall_at_3 value: 43.794 - type: recall_at_5 value: 51.778999999999996 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 28.099166666666665 - type: map_at_10 value: 38.1365 - type: map_at_100 value: 39.44491666666667 - type: map_at_1000 value: 39.55858333333334 - type: map_at_3 value: 35.03641666666666 - type: map_at_5 value: 36.79833333333334 - type: mrr_at_1 value: 33.39966666666667 - type: mrr_at_10 value: 42.42583333333333 - type: mrr_at_100 value: 43.28575 - type: mrr_at_1000 value: 43.33741666666667 - type: mrr_at_3 value: 39.94975 - type: mrr_at_5 value: 41.41633333333334 - type: ndcg_at_1 value: 33.39966666666667 - type: ndcg_at_10 value: 
43.81741666666667 - type: ndcg_at_100 value: 49.08166666666667 - type: ndcg_at_1000 value: 51.121166666666674 - type: ndcg_at_3 value: 38.73575 - type: ndcg_at_5 value: 41.18158333333333 - type: precision_at_1 value: 33.39966666666667 - type: precision_at_10 value: 7.738916666666667 - type: precision_at_100 value: 1.2265833333333331 - type: precision_at_1000 value: 0.15983333333333336 - type: precision_at_3 value: 17.967416666666665 - type: precision_at_5 value: 12.78675 - type: recall_at_1 value: 28.099166666666665 - type: recall_at_10 value: 56.27049999999999 - type: recall_at_100 value: 78.93291666666667 - type: recall_at_1000 value: 92.81608333333334 - type: recall_at_3 value: 42.09775 - type: recall_at_5 value: 48.42533333333334 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackStatsRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 23.663 - type: map_at_10 value: 30.377 - type: map_at_100 value: 31.426 - type: map_at_1000 value: 31.519000000000002 - type: map_at_3 value: 28.069 - type: map_at_5 value: 29.256999999999998 - type: mrr_at_1 value: 26.687 - type: mrr_at_10 value: 33.107 - type: mrr_at_100 value: 34.055 - type: mrr_at_1000 value: 34.117999999999995 - type: mrr_at_3 value: 31.058000000000003 - type: mrr_at_5 value: 32.14 - type: ndcg_at_1 value: 26.687 - type: ndcg_at_10 value: 34.615 - type: ndcg_at_100 value: 39.776 - type: ndcg_at_1000 value: 42.05 - type: ndcg_at_3 value: 30.322 - type: ndcg_at_5 value: 32.157000000000004 - type: precision_at_1 value: 26.687 - type: precision_at_10 value: 5.491 - type: precision_at_100 value: 0.877 - type: precision_at_1000 value: 0.11499999999999999 - type: precision_at_3 value: 13.139000000000001 - type: precision_at_5 value: 9.049 - type: recall_at_1 value: 23.663 - type: recall_at_10 value: 45.035 - type: recall_at_100 value: 68.554 - type: recall_at_1000 value: 85.077 - type: recall_at_3 value: 32.982 - type: recall_at_5 value: 37.688 - task: 
type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackTexRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 17.403 - type: map_at_10 value: 25.197000000000003 - type: map_at_100 value: 26.355 - type: map_at_1000 value: 26.487 - type: map_at_3 value: 22.733 - type: map_at_5 value: 24.114 - type: mrr_at_1 value: 21.37 - type: mrr_at_10 value: 29.091 - type: mrr_at_100 value: 30.018 - type: mrr_at_1000 value: 30.096 - type: mrr_at_3 value: 26.887 - type: mrr_at_5 value: 28.157 - type: ndcg_at_1 value: 21.37 - type: ndcg_at_10 value: 30.026000000000003 - type: ndcg_at_100 value: 35.416 - type: ndcg_at_1000 value: 38.45 - type: ndcg_at_3 value: 25.764 - type: ndcg_at_5 value: 27.742 - type: precision_at_1 value: 21.37 - type: precision_at_10 value: 5.609 - type: precision_at_100 value: 0.9860000000000001 - type: precision_at_1000 value: 0.14300000000000002 - type: precision_at_3 value: 12.423 - type: precision_at_5 value: 9.009 - type: recall_at_1 value: 17.403 - type: recall_at_10 value: 40.573 - type: recall_at_100 value: 64.818 - type: recall_at_1000 value: 86.53699999999999 - type: recall_at_3 value: 28.493000000000002 - type: recall_at_5 value: 33.660000000000004 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackUnixRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 28.639 - type: map_at_10 value: 38.951 - type: map_at_100 value: 40.238 - type: map_at_1000 value: 40.327 - type: map_at_3 value: 35.842 - type: map_at_5 value: 37.617 - type: mrr_at_1 value: 33.769 - type: mrr_at_10 value: 43.088 - type: mrr_at_100 value: 44.03 - type: mrr_at_1000 value: 44.072 - type: mrr_at_3 value: 40.656 - type: mrr_at_5 value: 42.138999999999996 - type: ndcg_at_1 value: 33.769 - type: ndcg_at_10 value: 44.676 - type: ndcg_at_100 value: 50.416000000000004 - type: ndcg_at_1000 value: 52.227999999999994 - type: ndcg_at_3 value: 39.494 - type: ndcg_at_5 value: 42.013 - 
type: precision_at_1 value: 33.769 - type: precision_at_10 value: 7.668 - type: precision_at_100 value: 1.18 - type: precision_at_1000 value: 0.145 - type: precision_at_3 value: 18.221 - type: precision_at_5 value: 12.966 - type: recall_at_1 value: 28.639 - type: recall_at_10 value: 57.687999999999995 - type: recall_at_100 value: 82.541 - type: recall_at_1000 value: 94.896 - type: recall_at_3 value: 43.651 - type: recall_at_5 value: 49.925999999999995 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackWebmastersRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 29.57 - type: map_at_10 value: 40.004 - type: map_at_100 value: 41.75 - type: map_at_1000 value: 41.97 - type: map_at_3 value: 36.788 - type: map_at_5 value: 38.671 - type: mrr_at_1 value: 35.375 - type: mrr_at_10 value: 45.121 - type: mrr_at_100 value: 45.994 - type: mrr_at_1000 value: 46.04 - type: mrr_at_3 value: 42.227 - type: mrr_at_5 value: 43.995 - type: ndcg_at_1 value: 35.375 - type: ndcg_at_10 value: 46.392 - type: ndcg_at_100 value: 52.196 - type: ndcg_at_1000 value: 54.274 - type: ndcg_at_3 value: 41.163 - type: ndcg_at_5 value: 43.813 - type: precision_at_1 value: 35.375 - type: precision_at_10 value: 8.676 - type: precision_at_100 value: 1.678 - type: precision_at_1000 value: 0.253 - type: precision_at_3 value: 19.104 - type: precision_at_5 value: 13.913 - type: recall_at_1 value: 29.57 - type: recall_at_10 value: 58.779 - type: recall_at_100 value: 83.337 - type: recall_at_1000 value: 95.979 - type: recall_at_3 value: 44.005 - type: recall_at_5 value: 50.975 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackWordpressRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 20.832 - type: map_at_10 value: 29.733999999999998 - type: map_at_100 value: 30.727 - type: map_at_1000 value: 30.843999999999998 - type: map_at_3 value: 26.834999999999997 - type: map_at_5 value: 
28.555999999999997 - type: mrr_at_1 value: 22.921 - type: mrr_at_10 value: 31.791999999999998 - type: mrr_at_100 value: 32.666000000000004 - type: mrr_at_1000 value: 32.751999999999995 - type: mrr_at_3 value: 29.144 - type: mrr_at_5 value: 30.622 - type: ndcg_at_1 value: 22.921 - type: ndcg_at_10 value: 34.915 - type: ndcg_at_100 value: 39.744 - type: ndcg_at_1000 value: 42.407000000000004 - type: ndcg_at_3 value: 29.421000000000003 - type: ndcg_at_5 value: 32.211 - type: precision_at_1 value: 22.921 - type: precision_at_10 value: 5.675 - type: precision_at_100 value: 0.872 - type: precision_at_1000 value: 0.121 - type: precision_at_3 value: 12.753999999999998 - type: precision_at_5 value: 9.353 - type: recall_at_1 value: 20.832 - type: recall_at_10 value: 48.795 - type: recall_at_100 value: 70.703 - type: recall_at_1000 value: 90.187 - type: recall_at_3 value: 34.455000000000005 - type: recall_at_5 value: 40.967 - task: type: Retrieval dataset: type: climate-fever name: MTEB ClimateFEVER config: default split: test revision: None metrics: - type: map_at_1 value: 10.334 - type: map_at_10 value: 19.009999999999998 - type: map_at_100 value: 21.129 - type: map_at_1000 value: 21.328 - type: map_at_3 value: 15.152 - type: map_at_5 value: 17.084 - type: mrr_at_1 value: 23.453 - type: mrr_at_10 value: 36.099 - type: mrr_at_100 value: 37.069 - type: mrr_at_1000 value: 37.104 - type: mrr_at_3 value: 32.096000000000004 - type: mrr_at_5 value: 34.451 - type: ndcg_at_1 value: 23.453 - type: ndcg_at_10 value: 27.739000000000004 - type: ndcg_at_100 value: 35.836 - type: ndcg_at_1000 value: 39.242 - type: ndcg_at_3 value: 21.263 - type: ndcg_at_5 value: 23.677 - type: precision_at_1 value: 23.453 - type: precision_at_10 value: 9.199 - type: precision_at_100 value: 1.791 - type: precision_at_1000 value: 0.242 - type: precision_at_3 value: 16.2 - type: precision_at_5 value: 13.147 - type: recall_at_1 value: 10.334 - type: recall_at_10 value: 35.177 - type: recall_at_100 value: 
63.009 - type: recall_at_1000 value: 81.938 - type: recall_at_3 value: 19.914 - type: recall_at_5 value: 26.077 - task: type: Retrieval dataset: type: dbpedia-entity name: MTEB DBPedia config: default split: test revision: None metrics: - type: map_at_1 value: 8.212 - type: map_at_10 value: 17.386 - type: map_at_100 value: 24.234 - type: map_at_1000 value: 25.724999999999998 - type: map_at_3 value: 12.727 - type: map_at_5 value: 14.785 - type: mrr_at_1 value: 59.25 - type: mrr_at_10 value: 68.687 - type: mrr_at_100 value: 69.133 - type: mrr_at_1000 value: 69.14099999999999 - type: mrr_at_3 value: 66.917 - type: mrr_at_5 value: 67.742 - type: ndcg_at_1 value: 48.625 - type: ndcg_at_10 value: 36.675999999999995 - type: ndcg_at_100 value: 41.543 - type: ndcg_at_1000 value: 49.241 - type: ndcg_at_3 value: 41.373 - type: ndcg_at_5 value: 38.707 - type: precision_at_1 value: 59.25 - type: precision_at_10 value: 28.525 - type: precision_at_100 value: 9.027000000000001 - type: precision_at_1000 value: 1.8339999999999999 - type: precision_at_3 value: 44.833 - type: precision_at_5 value: 37.35 - type: recall_at_1 value: 8.212 - type: recall_at_10 value: 23.188 - type: recall_at_100 value: 48.613 - type: recall_at_1000 value: 73.093 - type: recall_at_3 value: 14.419 - type: recall_at_5 value: 17.798 - task: type: Classification dataset: type: mteb/emotion name: MTEB EmotionClassification config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 52.725 - type: f1 value: 46.50743309855908 - task: type: Retrieval dataset: type: fever name: MTEB FEVER config: default split: test revision: None metrics: - type: map_at_1 value: 55.086 - type: map_at_10 value: 66.914 - type: map_at_100 value: 67.321 - type: map_at_1000 value: 67.341 - type: map_at_3 value: 64.75800000000001 - type: map_at_5 value: 66.189 - type: mrr_at_1 value: 59.28600000000001 - type: mrr_at_10 value: 71.005 - type: mrr_at_100 value: 71.304 - type: mrr_at_1000 
value: 71.313 - type: mrr_at_3 value: 69.037 - type: mrr_at_5 value: 70.35 - type: ndcg_at_1 value: 59.28600000000001 - type: ndcg_at_10 value: 72.695 - type: ndcg_at_100 value: 74.432 - type: ndcg_at_1000 value: 74.868 - type: ndcg_at_3 value: 68.72200000000001 - type: ndcg_at_5 value: 71.081 - type: precision_at_1 value: 59.28600000000001 - type: precision_at_10 value: 9.499 - type: precision_at_100 value: 1.052 - type: precision_at_1000 value: 0.11100000000000002 - type: precision_at_3 value: 27.503 - type: precision_at_5 value: 17.854999999999997 - type: recall_at_1 value: 55.086 - type: recall_at_10 value: 86.453 - type: recall_at_100 value: 94.028 - type: recall_at_1000 value: 97.052 - type: recall_at_3 value: 75.821 - type: recall_at_5 value: 81.6 - task: type: Retrieval dataset: type: fiqa name: MTEB FiQA2018 config: default split: test revision: None metrics: - type: map_at_1 value: 22.262999999999998 - type: map_at_10 value: 37.488 - type: map_at_100 value: 39.498 - type: map_at_1000 value: 39.687 - type: map_at_3 value: 32.529 - type: map_at_5 value: 35.455 - type: mrr_at_1 value: 44.907000000000004 - type: mrr_at_10 value: 53.239000000000004 - type: mrr_at_100 value: 54.086 - type: mrr_at_1000 value: 54.122 - type: mrr_at_3 value: 51.235 - type: mrr_at_5 value: 52.415 - type: ndcg_at_1 value: 44.907000000000004 - type: ndcg_at_10 value: 45.446 - type: ndcg_at_100 value: 52.429 - type: ndcg_at_1000 value: 55.169000000000004 - type: ndcg_at_3 value: 41.882000000000005 - type: ndcg_at_5 value: 43.178 - type: precision_at_1 value: 44.907000000000004 - type: precision_at_10 value: 12.931999999999999 - type: precision_at_100 value: 2.025 - type: precision_at_1000 value: 0.248 - type: precision_at_3 value: 28.652 - type: precision_at_5 value: 21.204 - type: recall_at_1 value: 22.262999999999998 - type: recall_at_10 value: 52.447 - type: recall_at_100 value: 78.045 - type: recall_at_1000 value: 94.419 - type: recall_at_3 value: 38.064 - type: recall_at_5 value: 
44.769 - task: type: Retrieval dataset: type: hotpotqa name: MTEB HotpotQA config: default split: test revision: None metrics: - type: map_at_1 value: 32.519 - type: map_at_10 value: 45.831 - type: map_at_100 value: 46.815 - type: map_at_1000 value: 46.899 - type: map_at_3 value: 42.836 - type: map_at_5 value: 44.65 - type: mrr_at_1 value: 65.037 - type: mrr_at_10 value: 72.16 - type: mrr_at_100 value: 72.51100000000001 - type: mrr_at_1000 value: 72.53 - type: mrr_at_3 value: 70.682 - type: mrr_at_5 value: 71.54599999999999 - type: ndcg_at_1 value: 65.037 - type: ndcg_at_10 value: 55.17999999999999 - type: ndcg_at_100 value: 58.888 - type: ndcg_at_1000 value: 60.648 - type: ndcg_at_3 value: 50.501 - type: ndcg_at_5 value: 52.977 - type: precision_at_1 value: 65.037 - type: precision_at_10 value: 11.530999999999999 - type: precision_at_100 value: 1.4460000000000002 - type: precision_at_1000 value: 0.168 - type: precision_at_3 value: 31.483 - type: precision_at_5 value: 20.845 - type: recall_at_1 value: 32.519 - type: recall_at_10 value: 57.657000000000004 - type: recall_at_100 value: 72.30199999999999 - type: recall_at_1000 value: 84.024 - type: recall_at_3 value: 47.225 - type: recall_at_5 value: 52.113 - task: type: Classification dataset: type: mteb/imdb name: MTEB ImdbClassification config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 88.3168 - type: ap value: 83.80165516037135 - type: f1 value: 88.29942471066407 - task: type: Retrieval dataset: type: msmarco name: MTEB MSMARCO config: default split: dev revision: None metrics: - type: map_at_1 value: 20.724999999999998 - type: map_at_10 value: 32.736 - type: map_at_100 value: 33.938 - type: map_at_1000 value: 33.991 - type: map_at_3 value: 28.788000000000004 - type: map_at_5 value: 31.016 - type: mrr_at_1 value: 21.361 - type: mrr_at_10 value: 33.323 - type: mrr_at_100 value: 34.471000000000004 - type: mrr_at_1000 value: 34.518 - type: mrr_at_3 value: 
29.453000000000003 - type: mrr_at_5 value: 31.629 - type: ndcg_at_1 value: 21.361 - type: ndcg_at_10 value: 39.649 - type: ndcg_at_100 value: 45.481 - type: ndcg_at_1000 value: 46.775 - type: ndcg_at_3 value: 31.594 - type: ndcg_at_5 value: 35.543 - type: precision_at_1 value: 21.361 - type: precision_at_10 value: 6.3740000000000006 - type: precision_at_100 value: 0.931 - type: precision_at_1000 value: 0.104 - type: precision_at_3 value: 13.514999999999999 - type: precision_at_5 value: 10.100000000000001 - type: recall_at_1 value: 20.724999999999998 - type: recall_at_10 value: 61.034 - type: recall_at_100 value: 88.062 - type: recall_at_1000 value: 97.86399999999999 - type: recall_at_3 value: 39.072 - type: recall_at_5 value: 48.53 - task: type: Classification dataset: type: mteb/mtop_domain name: MTEB MTOPDomainClassification (en) config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 93.8919288645691 - type: f1 value: 93.57059586398059 - task: type: Classification dataset: type: mteb/mtop_intent name: MTEB MTOPIntentClassification (en) config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 67.97993616051072 - type: f1 value: 48.244319183606535 - task: type: Classification dataset: type: mteb/amazon_massive_intent name: MTEB MassiveIntentClassification (en) config: en split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 68.90047074646941 - type: f1 value: 66.48999056063725 - task: type: Classification dataset: type: mteb/amazon_massive_scenario name: MTEB MassiveScenarioClassification (en) config: en split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 73.34566240753195 - type: f1 value: 73.54164154290658 - task: type: Clustering dataset: type: mteb/medrxiv-clustering-p2p name: MTEB MedrxivClusteringP2P config: default split: test revision: 
e7a26af6f3ae46b30dde8737f02c07b1505bcc73 metrics: - type: v_measure value: 34.21866934757011 - task: type: Clustering dataset: type: mteb/medrxiv-clustering-s2s name: MTEB MedrxivClusteringS2S config: default split: test revision: 35191c8c0dca72d8ff3efcd72aa802307d469663 metrics: - type: v_measure value: 32.000936217235534 - task: type: Reranking dataset: type: mteb/mind_small name: MTEB MindSmallReranking config: default split: test revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69 metrics: - type: map value: 31.68189362520352 - type: mrr value: 32.69603637784303 - task: type: Retrieval dataset: type: nfcorpus name: MTEB NFCorpus config: default split: test revision: None metrics: - type: map_at_1 value: 6.078 - type: map_at_10 value: 12.671 - type: map_at_100 value: 16.291 - type: map_at_1000 value: 17.855999999999998 - type: map_at_3 value: 9.610000000000001 - type: map_at_5 value: 11.152 - type: mrr_at_1 value: 43.963 - type: mrr_at_10 value: 53.173 - type: mrr_at_100 value: 53.718999999999994 - type: mrr_at_1000 value: 53.756 - type: mrr_at_3 value: 50.980000000000004 - type: mrr_at_5 value: 52.42 - type: ndcg_at_1 value: 42.415000000000006 - type: ndcg_at_10 value: 34.086 - type: ndcg_at_100 value: 32.545 - type: ndcg_at_1000 value: 41.144999999999996 - type: ndcg_at_3 value: 39.434999999999995 - type: ndcg_at_5 value: 37.888 - type: precision_at_1 value: 43.653 - type: precision_at_10 value: 25.014999999999997 - type: precision_at_100 value: 8.594 - type: precision_at_1000 value: 2.169 - type: precision_at_3 value: 37.049 - type: precision_at_5 value: 33.065 - type: recall_at_1 value: 6.078 - type: recall_at_10 value: 16.17 - type: recall_at_100 value: 34.512 - type: recall_at_1000 value: 65.447 - type: recall_at_3 value: 10.706 - type: recall_at_5 value: 13.158 - task: type: Retrieval dataset: type: nq name: MTEB NQ config: default split: test revision: None metrics: - type: map_at_1 value: 27.378000000000004 - type: map_at_10 value: 42.178 - type: 
map_at_100 value: 43.32 - type: map_at_1000 value: 43.358000000000004 - type: map_at_3 value: 37.474000000000004 - type: map_at_5 value: 40.333000000000006 - type: mrr_at_1 value: 30.823 - type: mrr_at_10 value: 44.626 - type: mrr_at_100 value: 45.494 - type: mrr_at_1000 value: 45.519 - type: mrr_at_3 value: 40.585 - type: mrr_at_5 value: 43.146 - type: ndcg_at_1 value: 30.794 - type: ndcg_at_10 value: 50.099000000000004 - type: ndcg_at_100 value: 54.900999999999996 - type: ndcg_at_1000 value: 55.69499999999999 - type: ndcg_at_3 value: 41.238 - type: ndcg_at_5 value: 46.081 - type: precision_at_1 value: 30.794 - type: precision_at_10 value: 8.549 - type: precision_at_100 value: 1.124 - type: precision_at_1000 value: 0.12 - type: precision_at_3 value: 18.926000000000002 - type: precision_at_5 value: 14.16 - type: recall_at_1 value: 27.378000000000004 - type: recall_at_10 value: 71.842 - type: recall_at_100 value: 92.565 - type: recall_at_1000 value: 98.402 - type: recall_at_3 value: 49.053999999999995 - type: recall_at_5 value: 60.207 - task: type: Retrieval dataset: type: quora name: MTEB QuoraRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 70.557 - type: map_at_10 value: 84.729 - type: map_at_100 value: 85.369 - type: map_at_1000 value: 85.382 - type: map_at_3 value: 81.72 - type: map_at_5 value: 83.613 - type: mrr_at_1 value: 81.3 - type: mrr_at_10 value: 87.488 - type: mrr_at_100 value: 87.588 - type: mrr_at_1000 value: 87.589 - type: mrr_at_3 value: 86.53 - type: mrr_at_5 value: 87.18599999999999 - type: ndcg_at_1 value: 81.28999999999999 - type: ndcg_at_10 value: 88.442 - type: ndcg_at_100 value: 89.637 - type: ndcg_at_1000 value: 89.70700000000001 - type: ndcg_at_3 value: 85.55199999999999 - type: ndcg_at_5 value: 87.154 - type: precision_at_1 value: 81.28999999999999 - type: precision_at_10 value: 13.489999999999998 - type: precision_at_100 value: 1.54 - type: precision_at_1000 value: 0.157 - type: precision_at_3 value: 
37.553 - type: precision_at_5 value: 24.708 - type: recall_at_1 value: 70.557 - type: recall_at_10 value: 95.645 - type: recall_at_100 value: 99.693 - type: recall_at_1000 value: 99.995 - type: recall_at_3 value: 87.359 - type: recall_at_5 value: 91.89699999999999 - task: type: Clustering dataset: type: mteb/reddit-clustering name: MTEB RedditClustering config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: v_measure value: 63.65060114776209 - task: type: Clustering dataset: type: mteb/reddit-clustering-p2p name: MTEB RedditClusteringP2P config: default split: test revision: 282350215ef01743dc01b456c7f5241fa8937f16 metrics: - type: v_measure value: 64.63271250680617 - task: type: Retrieval dataset: type: scidocs name: MTEB SCIDOCS config: default split: test revision: None metrics: - type: map_at_1 value: 4.263 - type: map_at_10 value: 10.801 - type: map_at_100 value: 12.888 - type: map_at_1000 value: 13.224 - type: map_at_3 value: 7.362 - type: map_at_5 value: 9.149000000000001 - type: mrr_at_1 value: 21 - type: mrr_at_10 value: 31.416 - type: mrr_at_100 value: 32.513 - type: mrr_at_1000 value: 32.58 - type: mrr_at_3 value: 28.116999999999997 - type: mrr_at_5 value: 29.976999999999997 - type: ndcg_at_1 value: 21 - type: ndcg_at_10 value: 18.551000000000002 - type: ndcg_at_100 value: 26.657999999999998 - type: ndcg_at_1000 value: 32.485 - type: ndcg_at_3 value: 16.834 - type: ndcg_at_5 value: 15.204999999999998 - type: precision_at_1 value: 21 - type: precision_at_10 value: 9.84 - type: precision_at_100 value: 2.16 - type: precision_at_1000 value: 0.35500000000000004 - type: precision_at_3 value: 15.667 - type: precision_at_5 value: 13.62 - type: recall_at_1 value: 4.263 - type: recall_at_10 value: 19.922 - type: recall_at_100 value: 43.808 - type: recall_at_1000 value: 72.14500000000001 - type: recall_at_3 value: 9.493 - type: recall_at_5 value: 13.767999999999999 - task: type: STS dataset: type: mteb/sickr-sts name: MTEB 
SICK-R config: default split: test revision: a6ea5a8cab320b040a23452cc28066d9beae2cee metrics: - type: cos_sim_spearman value: 81.27446313317233 - task: type: STS dataset: type: mteb/sts12-sts name: MTEB STS12 config: default split: test revision: a0d554a64d88156834ff5ae9920b964011b16384 metrics: - type: cos_sim_spearman value: 76.27963301217527 - task: type: STS dataset: type: mteb/sts13-sts name: MTEB STS13 config: default split: test revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca metrics: - type: cos_sim_spearman value: 88.18495048450949 - task: type: STS dataset: type: mteb/sts14-sts name: MTEB STS14 config: default split: test revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 metrics: - type: cos_sim_spearman value: 81.91982338692046 - task: type: STS dataset: type: mteb/sts15-sts name: MTEB STS15 config: default split: test revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 metrics: - type: cos_sim_spearman value: 89.00896818385291 - task: type: STS dataset: type: mteb/sts16-sts name: MTEB STS16 config: default split: test revision: 4d8694f8f0e0100860b497b999b3dbed754a0513 metrics: - type: cos_sim_spearman value: 85.48814644586132 - task: type: STS dataset: type: mteb/sts17-crosslingual-sts name: MTEB STS17 (en-en) config: en-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_spearman value: 90.30116926966582 - task: type: STS dataset: type: mteb/sts22-crosslingual-sts name: MTEB STS22 (en) config: en split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_spearman value: 67.74132963032342 - task: type: STS dataset: type: mteb/stsbenchmark-sts name: MTEB STSBenchmark config: default split: test revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 metrics: - type: cos_sim_spearman value: 86.87741355780479 - task: type: Reranking dataset: type: mteb/scidocs-reranking name: MTEB SciDocsRR config: default split: test revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab metrics: - type: map 
value: 82.0019012295875 - type: mrr value: 94.70267024188593 - task: type: Retrieval dataset: type: scifact name: MTEB SciFact config: default split: test revision: None metrics: - type: map_at_1 value: 50.05 - type: map_at_10 value: 59.36 - type: map_at_100 value: 59.967999999999996 - type: map_at_1000 value: 60.023 - type: map_at_3 value: 56.515 - type: map_at_5 value: 58.272999999999996 - type: mrr_at_1 value: 53 - type: mrr_at_10 value: 61.102000000000004 - type: mrr_at_100 value: 61.476 - type: mrr_at_1000 value: 61.523 - type: mrr_at_3 value: 58.778 - type: mrr_at_5 value: 60.128 - type: ndcg_at_1 value: 53 - type: ndcg_at_10 value: 64.43100000000001 - type: ndcg_at_100 value: 66.73599999999999 - type: ndcg_at_1000 value: 68.027 - type: ndcg_at_3 value: 59.279 - type: ndcg_at_5 value: 61.888 - type: precision_at_1 value: 53 - type: precision_at_10 value: 8.767 - type: precision_at_100 value: 1.01 - type: precision_at_1000 value: 0.11100000000000002 - type: precision_at_3 value: 23.444000000000003 - type: precision_at_5 value: 15.667 - type: recall_at_1 value: 50.05 - type: recall_at_10 value: 78.511 - type: recall_at_100 value: 88.5 - type: recall_at_1000 value: 98.333 - type: recall_at_3 value: 64.117 - type: recall_at_5 value: 70.867 - task: type: PairClassification dataset: type: mteb/sprintduplicatequestions-pairclassification name: MTEB SprintDuplicateQuestions config: default split: test revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 metrics: - type: cos_sim_accuracy value: 99.72178217821782 - type: cos_sim_ap value: 93.0728601593541 - type: cos_sim_f1 value: 85.6727976766699 - type: cos_sim_precision value: 83.02063789868667 - type: cos_sim_recall value: 88.5 - type: dot_accuracy value: 99.72178217821782 - type: dot_ap value: 93.07287396168348 - type: dot_f1 value: 85.6727976766699 - type: dot_precision value: 83.02063789868667 - type: dot_recall value: 88.5 - type: euclidean_accuracy value: 99.72178217821782 - type: euclidean_ap value: 
93.07285657982895 - type: euclidean_f1 value: 85.6727976766699 - type: euclidean_precision value: 83.02063789868667 - type: euclidean_recall value: 88.5 - type: manhattan_accuracy value: 99.72475247524753 - type: manhattan_ap value: 93.02792973059809 - type: manhattan_f1 value: 85.7727737973388 - type: manhattan_precision value: 87.84067085953879 - type: manhattan_recall value: 83.8 - type: max_accuracy value: 99.72475247524753 - type: max_ap value: 93.07287396168348 - type: max_f1 value: 85.7727737973388 - task: type: Clustering dataset: type: mteb/stackexchange-clustering name: MTEB StackExchangeClustering config: default split: test revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 metrics: - type: v_measure value: 68.77583615550819 - task: type: Clustering dataset: type: mteb/stackexchange-clustering-p2p name: MTEB StackExchangeClusteringP2P config: default split: test revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: v_measure value: 36.151636938606956 - task: type: Reranking dataset: type: mteb/stackoverflowdupquestions-reranking name: MTEB StackOverflowDupQuestions config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - type: map value: 52.16607939471187 - type: mrr value: 52.95172046091163 - task: type: Summarization dataset: type: mteb/summeval name: MTEB SummEval config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cos_sim_pearson value: 31.314646669495666 - type: cos_sim_spearman value: 31.83562491439455 - type: dot_pearson value: 31.314590842874157 - type: dot_spearman value: 31.83363065810437 - task: type: Retrieval dataset: type: trec-covid name: MTEB TRECCOVID config: default split: test revision: None metrics: - type: map_at_1 value: 0.198 - type: map_at_10 value: 1.3010000000000002 - type: map_at_100 value: 7.2139999999999995 - type: map_at_1000 value: 20.179 - type: map_at_3 value: 0.528 - type: map_at_5 value: 0.8019999999999999 - type: mrr_at_1 value: 
72 - type: mrr_at_10 value: 83.39999999999999 - type: mrr_at_100 value: 83.39999999999999 - type: mrr_at_1000 value: 83.39999999999999 - type: mrr_at_3 value: 81.667 - type: mrr_at_5 value: 83.06700000000001 - type: ndcg_at_1 value: 66 - type: ndcg_at_10 value: 58.059000000000005 - type: ndcg_at_100 value: 44.316 - type: ndcg_at_1000 value: 43.147000000000006 - type: ndcg_at_3 value: 63.815999999999995 - type: ndcg_at_5 value: 63.005 - type: precision_at_1 value: 72 - type: precision_at_10 value: 61.4 - type: precision_at_100 value: 45.62 - type: precision_at_1000 value: 19.866 - type: precision_at_3 value: 70 - type: precision_at_5 value: 68.8 - type: recall_at_1 value: 0.198 - type: recall_at_10 value: 1.517 - type: recall_at_100 value: 10.587 - type: recall_at_1000 value: 41.233 - type: recall_at_3 value: 0.573 - type: recall_at_5 value: 0.907 - task: type: Retrieval dataset: type: webis-touche2020 name: MTEB Touche2020 config: default split: test revision: None metrics: - type: map_at_1 value: 1.894 - type: map_at_10 value: 8.488999999999999 - type: map_at_100 value: 14.445 - type: map_at_1000 value: 16.078 - type: map_at_3 value: 4.589 - type: map_at_5 value: 6.019 - type: mrr_at_1 value: 22.448999999999998 - type: mrr_at_10 value: 39.82 - type: mrr_at_100 value: 40.752 - type: mrr_at_1000 value: 40.771 - type: mrr_at_3 value: 34.354 - type: mrr_at_5 value: 37.721 - type: ndcg_at_1 value: 19.387999999999998 - type: ndcg_at_10 value: 21.563 - type: ndcg_at_100 value: 33.857 - type: ndcg_at_1000 value: 46.199 - type: ndcg_at_3 value: 22.296 - type: ndcg_at_5 value: 21.770999999999997 - type: precision_at_1 value: 22.448999999999998 - type: precision_at_10 value: 19.796 - type: precision_at_100 value: 7.142999999999999 - type: precision_at_1000 value: 1.541 - type: precision_at_3 value: 24.490000000000002 - type: precision_at_5 value: 22.448999999999998 - type: recall_at_1 value: 1.894 - type: recall_at_10 value: 14.931 - type: recall_at_100 value: 45.524 - type: 
recall_at_1000 value: 83.243 - type: recall_at_3 value: 5.712 - type: recall_at_5 value: 8.386000000000001 - task: type: Classification dataset: type: mteb/toxic_conversations_50k name: MTEB ToxicConversationsClassification config: default split: test revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c metrics: - type: accuracy value: 71.049 - type: ap value: 13.85116971310922 - type: f1 value: 54.37504302487686 - task: type: Classification dataset: type: mteb/tweet_sentiment_extraction name: MTEB TweetSentimentExtractionClassification config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 64.1312959818902 - type: f1 value: 64.11413877009383 - task: type: Clustering dataset: type: mteb/twentynewsgroups-clustering name: MTEB TwentyNewsgroupsClustering config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: v_measure value: 54.13103431861502 - task: type: PairClassification dataset: type: mteb/twittersemeval2015-pairclassification name: MTEB TwitterSemEval2015 config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 87.327889372355 - type: cos_sim_ap value: 77.42059895975699 - type: cos_sim_f1 value: 71.02706903250873 - type: cos_sim_precision value: 69.75324344950394 - type: cos_sim_recall value: 72.34828496042216 - type: dot_accuracy value: 87.327889372355 - type: dot_ap value: 77.4209479346677 - type: dot_f1 value: 71.02706903250873 - type: dot_precision value: 69.75324344950394 - type: dot_recall value: 72.34828496042216 - type: euclidean_accuracy value: 87.327889372355 - type: euclidean_ap value: 77.42096495861037 - type: euclidean_f1 value: 71.02706903250873 - type: euclidean_precision value: 69.75324344950394 - type: euclidean_recall value: 72.34828496042216 - type: manhattan_accuracy value: 87.31000774870358 - type: manhattan_ap value: 77.38930750711619 - type: manhattan_f1 value: 71.07935314027831 - 
type: manhattan_precision value: 67.70957726295677 - type: manhattan_recall value: 74.80211081794195 - type: max_accuracy value: 87.327889372355 - type: max_ap value: 77.42096495861037 - type: max_f1 value: 71.07935314027831 - task: type: PairClassification dataset: type: mteb/twitterurlcorpus-pairclassification name: MTEB TwitterURLCorpus config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 89.58939729110878 - type: cos_sim_ap value: 87.17594155025475 - type: cos_sim_f1 value: 79.21146953405018 - type: cos_sim_precision value: 76.8918527109307 - type: cos_sim_recall value: 81.67539267015707 - type: dot_accuracy value: 89.58939729110878 - type: dot_ap value: 87.17593963273593 - type: dot_f1 value: 79.21146953405018 - type: dot_precision value: 76.8918527109307 - type: dot_recall value: 81.67539267015707 - type: euclidean_accuracy value: 89.58939729110878 - type: euclidean_ap value: 87.17592466925834 - type: euclidean_f1 value: 79.21146953405018 - type: euclidean_precision value: 76.8918527109307 - type: euclidean_recall value: 81.67539267015707 - type: manhattan_accuracy value: 89.62626615438352 - type: manhattan_ap value: 87.16589873161546 - type: manhattan_f1 value: 79.25143598295348 - type: manhattan_precision value: 76.39494177323712 - type: manhattan_recall value: 82.32984293193716 - type: max_accuracy value: 89.62626615438352 - type: max_ap value: 87.17594155025475 - type: max_f1 value: 79.25143598295348 --- # hkunlp/instructor-large We introduce **Instructor**👨‍🏫, an instruction-finetuned text embedding model that can generate text embeddings tailored to any task (e.g., classification, retrieval, clustering, text evaluation, etc.) and domains (e.g., science, finance, etc.) ***by simply providing the task instruction, without any finetuning***. Instructor👨‍ achieves sota on 70 diverse embedding tasks ([MTEB leaderboard](https://huggingface.co/spaces/mteb/leaderboard))! 
The model is easy to use with **our customized** `sentence-transformer` library. For more details, check out [our paper](https://arxiv.org/abs/2212.09741) and [project page](https://instructor-embedding.github.io/)! **************************** **Updates** **************************** * 12/28: We released a new [checkpoint](https://huggingface.co/hkunlp/instructor-large) trained with hard negatives, which gives better performance. * 12/21: We released our [paper](https://arxiv.org/abs/2212.09741), [code](https://github.com/HKUNLP/instructor-embedding), [checkpoint](https://huggingface.co/hkunlp/instructor-large) and [project page](https://instructor-embedding.github.io/)! Check them out! ## Quick start <hr /> ## Installation ```bash pip install InstructorEmbedding ``` ## Compute your customized embeddings Then you can use the model like this to calculate domain-specific and task-aware embeddings: ```python from InstructorEmbedding import INSTRUCTOR model = INSTRUCTOR('hkunlp/instructor-large') sentence = "3D ActionSLAM: wearable person tracking in multi-floor environments" instruction = "Represent the Science title:" embeddings = model.encode([[instruction,sentence]]) print(embeddings) ``` ## Use cases <hr /> ## Calculate embeddings for your customized texts If you want to calculate customized embeddings for specific sentences, you may follow the unified template to write instructions: &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;Represent the `domain` `text_type` for `task_objective`: * `domain` is optional, and it specifies the domain of the text, e.g., science, finance, medicine, etc. * `text_type` is required, and it specifies the encoding unit, e.g., sentence, document, paragraph, etc. * `task_objective` is optional, and it specifies the objective of embedding, e.g., retrieve a document, classify the sentence, etc. 
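The three slots of the template compose mechanically. As an illustration, a small helper that assembles an instruction string from the required and optional parts (the `build_instruction` function below is our own sketch, not part of the `InstructorEmbedding` API):

```python
def build_instruction(text_type, domain=None, task_objective=None):
    """Assemble an INSTRUCTOR prompt from the unified template:
    'Represent the `domain` `text_type` for `task_objective`:'
    Only text_type is required; domain and task_objective are optional."""
    parts = ["Represent the"]
    if domain:
        parts.append(domain)
    parts.append(text_type)
    if task_objective:
        parts.append("for " + task_objective)
    # The card's examples end instructions with a colon and a trailing space
    return " ".join(parts) + ": "

print(build_instruction("sentence", domain="Science", task_objective="clustering"))
# Represent the Science sentence for clustering: 
print(build_instruction("document"))
# Represent the document: 
```

The resulting strings can be paired with any input text and passed to `model.encode`, exactly as in the snippets below.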
## Calculate Sentence similarities

You can further use the model to compute similarities between two groups of sentences, with **customized embeddings**.

```python
from sklearn.metrics.pairwise import cosine_similarity

sentences_a = [['Represent the Science sentence: ', 'Parton energy loss in QCD matter'],
               ['Represent the Financial statement: ', 'The Federal Reserve on Wednesday raised its benchmark interest rate.']]
sentences_b = [['Represent the Science sentence: ', 'The Chiral Phase Transition in Dissipative Dynamics'],
               ['Represent the Financial statement: ', 'The funds rose less than 0.5 per cent on Friday']]

embeddings_a = model.encode(sentences_a)
embeddings_b = model.encode(sentences_b)
similarities = cosine_similarity(embeddings_a, embeddings_b)
print(similarities)
```

## Information Retrieval

You can also use **customized embeddings** for information retrieval.

```python
import numpy as np
from sklearn.metrics.pairwise import cosine_similarity

query = [['Represent the Wikipedia question for retrieving supporting documents: ', 'where is the food stored in a yam plant']]
corpus = [['Represent the Wikipedia document for retrieval: ', 'Capitalism has been dominant in the Western world since the end of feudalism, but most feel[who?] that the term "mixed economies" more precisely describes most contemporary economies, due to their containing both private-owned and state-owned enterprises. In capitalism, prices determine the demand-supply scale. For example, higher demand for certain goods and services lead to higher prices and lower demand for certain goods lead to lower prices.'],
          ['Represent the Wikipedia document for retrieval: ', "The disparate impact theory is especially controversial under the Fair Housing Act because the Act regulates many activities relating to housing, insurance, and mortgage loans—and some scholars have argued that the theory's use under the Fair Housing Act, combined with extensions of the Community Reinvestment Act, contributed to rise of sub-prime lending and the crash of the U.S. housing market and ensuing global economic recession"],
          ['Represent the Wikipedia document for retrieval: ', 'Disparate impact in United States labor law refers to practices in employment, housing, and other areas that adversely affect one group of people of a protected characteristic more than another, even though rules applied by employers or landlords are formally neutral. Although the protected classes vary by statute, most federal civil rights laws protect based on race, color, religion, national origin, and sex as protected traits, and some laws include disability status and other traits as well.']]

query_embeddings = model.encode(query)
corpus_embeddings = model.encode(corpus)
similarities = cosine_similarity(query_embeddings, corpus_embeddings)
retrieved_doc_id = np.argmax(similarities)
print(retrieved_doc_id)
```

## Clustering

Use **customized embeddings** for clustering texts in groups.

```python
import sklearn.cluster

sentences = [['Represent the Medicine sentence for clustering: ', 'Dynamical Scalar Degree of Freedom in Horava-Lifshitz Gravity'],
             ['Represent the Medicine sentence for clustering: ', 'Comparison of Atmospheric Neutrino Flux Calculations at Low Energies'],
             ['Represent the Medicine sentence for clustering: ', 'Fermion Bags in the Massive Gross-Neveu Model'],
             ['Represent the Medicine sentence for clustering: ', "QCD corrections to Associated t-tbar-H production at the Tevatron"],
             ['Represent the Medicine sentence for clustering: ', 'A New Analysis of the R Measurements: Resonance Parameters of the Higher, Vector States of Charmonium']]

embeddings = model.encode(sentences)
clustering_model = sklearn.cluster.MiniBatchKMeans(n_clusters=2)
clustering_model.fit(embeddings)
cluster_assignment = clustering_model.labels_
print(cluster_assignment)
```
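The cluster labels come back as a bare array in input order, so a common follow-up step is mapping them back to the sentences they belong to. The sketch below shows that step; it substitutes synthetic placeholder vectors for `model.encode(...)` (so it runs without downloading any model), and the two well-separated groups are an assumption made purely for illustration.

```python
import numpy as np
import sklearn.cluster

# Placeholder embeddings standing in for model.encode(sentences);
# two deliberately well-separated groups so the clustering is unambiguous.
rng = np.random.default_rng(0)
embeddings = np.vstack([
    rng.normal(loc=0.0, scale=0.1, size=(3, 8)),  # first tight group
    rng.normal(loc=5.0, scale=0.1, size=(2, 8)),  # second tight group
])
labels = ["doc%d" % i for i in range(5)]

clustering_model = sklearn.cluster.MiniBatchKMeans(n_clusters=2, n_init=3, random_state=0)
clustering_model.fit(embeddings)

# Group input labels by their assigned cluster id
clusters = {}
for label, cluster_id in zip(labels, clustering_model.labels_):
    clusters.setdefault(int(cluster_id), []).append(label)
print(clusters)
```

The same grouping loop works unchanged on real INSTRUCTOR embeddings; only the `embeddings` array changes.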
66,300
sentence-transformers/paraphrase-multilingual-mpnet-base-v2
2023-11-02T09:45:42.000Z
[ "sentence-transformers", "pytorch", "tf", "xlm-roberta", "feature-extraction", "sentence-similarity", "transformers", "multilingual", "ar", "bg", "ca", "cs", "da", "de", "el", "en", "es", "et", "fa", "fi", "fr", "gl", "gu", "he", "hi", "hr", "hu", "hy", "id", "it", "ja", "ka", "ko", "ku", "lt", "lv", "mk", "mn", "mr", "ms", "my", "nb", "nl", "pl", "pt", "ro", "ru", "sk", "sl", "sq", "sr", "sv", "th", "tr", "uk", "ur", "vi", "arxiv:1908.10084", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
sentence-similarity
sentence-transformers
null
null
sentence-transformers/paraphrase-multilingual-mpnet-base-v2
148
180,114
sentence-transformers
2022-03-02T23:29:05
---
language:
- multilingual
- ar
- bg
- ca
- cs
- da
- de
- el
- en
- es
- et
- fa
- fi
- fr
- gl
- gu
- he
- hi
- hr
- hu
- hy
- id
- it
- ja
- ka
- ko
- ku
- lt
- lv
- mk
- mn
- mr
- ms
- my
- nb
- nl
- pl
- pt
- ro
- ru
- sk
- sl
- sq
- sr
- sv
- th
- tr
- uk
- ur
- vi
language_bcp47:
- fr-ca
- pt-br
- zh-cn
- zh-tw
pipeline_tag: sentence-similarity
license: apache-2.0
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
---

# sentence-transformers/paraphrase-multilingual-mpnet-base-v2

This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for tasks like clustering or semantic search.

## Usage (Sentence-Transformers)

Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:

```
pip install -U sentence-transformers
```

Then you can use the model like this:

```python
from sentence_transformers import SentenceTransformer

sentences = ["This is an example sentence", "Each sentence is converted"]

model = SentenceTransformer('sentence-transformers/paraphrase-multilingual-mpnet-base-v2')
embeddings = model.encode(sentences)
print(embeddings)
```

## Usage (HuggingFace Transformers)

Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first, pass your input through the transformer model, then apply the right pooling operation on top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch

# Mean Pooling - Take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
    token_embeddings = model_output[0]  # First element of model_output contains all token embeddings
    input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)

# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']

# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('sentence-transformers/paraphrase-multilingual-mpnet-base-v2')
model = AutoModel.from_pretrained('sentence-transformers/paraphrase-multilingual-mpnet-base-v2')

# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')

# Compute token embeddings
with torch.no_grad():
    model_output = model(**encoded_input)

# Perform pooling. In this case, average pooling
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])

print("Sentence embeddings:")
print(sentence_embeddings)
```

## Evaluation Results

For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=sentence-transformers/paraphrase-multilingual-mpnet-base-v2)

## Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: XLMRobertaModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```

## Citing & Authors

This model was trained by [sentence-transformers](https://www.sbert.net/).

If you find this model helpful, feel free to cite our publication [Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks](https://arxiv.org/abs/1908.10084):

```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "http://arxiv.org/abs/1908.10084",
}
```
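The `mean_pooling` helper in the HuggingFace Transformers usage above can be sanity-checked without downloading the model by feeding it dummy tensors. This is only an illustrative sketch of the masking behaviour; the shapes and values are made up, chosen so that the effect of the attention mask is visible.

```python
import torch

def mean_pooling(model_output, attention_mask):
    token_embeddings = model_output[0]  # all token embeddings
    input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)

# Dummy batch: 2 sentences, 4 tokens, hidden size 3.
# The second sentence's last position is padding, given a huge value on purpose.
token_embeddings = torch.ones(2, 4, 3)
token_embeddings[1, 3] = 100.0
attention_mask = torch.tensor([[1, 1, 1, 1],
                               [1, 1, 1, 0]])

pooled = mean_pooling((token_embeddings,), attention_mask)
print(pooled)  # padding is masked out, so both rows average to all ones
```

If the mask were ignored, the second row would be pulled toward 100; because the mask zeroes that position in both the numerator and the token count, both sentences pool to the same vector.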
4,102
Helsinki-NLP/opus-mt-it-en
2023-08-16T11:58:49.000Z
[ "transformers", "pytorch", "tf", "marian", "text2text-generation", "translation", "it", "en", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
translation
Helsinki-NLP
null
null
Helsinki-NLP/opus-mt-it-en
8
179,643
transformers
2022-03-02T23:29:04
---
tags:
- translation
license: apache-2.0
---

### opus-mt-it-en

* source languages: it
* target languages: en
* OPUS readme: [it-en](https://github.com/Helsinki-NLP/OPUS-MT-train/blob/master/models/it-en/README.md)
* dataset: opus
* model: transformer-align
* pre-processing: normalization + SentencePiece
* download original weights: [opus-2019-12-18.zip](https://object.pouta.csc.fi/OPUS-MT-models/it-en/opus-2019-12-18.zip)
* test set translations: [opus-2019-12-18.test.txt](https://object.pouta.csc.fi/OPUS-MT-models/it-en/opus-2019-12-18.test.txt)
* test set scores: [opus-2019-12-18.eval.txt](https://object.pouta.csc.fi/OPUS-MT-models/it-en/opus-2019-12-18.eval.txt)

## Benchmarks

| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| newssyscomb2009.it.en | 35.3 | 0.600 |
| newstest2009.it.en | 34.0 | 0.594 |
| Tatoeba.it.en | 70.9 | 0.808 |
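The chr-F column in the benchmarks above is a character n-gram F-score. As a rough illustration of the idea only — the real chrF metric (as implemented in sacreBLEU) averages over character n-gram orders up to 6 and weights recall more heavily — a minimal character-unigram sketch looks like this:

```python
from collections import Counter

def char_f1(hypothesis: str, reference: str) -> float:
    """Toy character-unigram F1 -- a simplified stand-in for chr-F."""
    hyp = Counter(hypothesis.replace(" ", ""))
    ref = Counter(reference.replace(" ", ""))
    overlap = sum((hyp & ref).values())  # characters matched between the two strings
    if overlap == 0:
        return 0.0
    precision = overlap / sum(hyp.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

print(char_f1("the cat sat", "the cat sat"))  # identical strings score 1.0
```

Because it scores character overlap rather than whole-word matches, a chrF-style metric gives partial credit for near-miss word forms, which is one reason it is reported alongside BLEU for morphologically rich language pairs.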
901
[ [ -0.0203094482421875, -0.0238494873046875, 0.0155029296875, 0.03216552734375, -0.035919189453125, -0.020416259765625, -0.032379150390625, 0.0016050338745117188, 0.0008111000061035156, 0.0290679931640625, -0.046478271484375, -0.046844482421875, -0.042816162109375, 0.01265716552734375, -0.014617919921875, 0.051483154296875, -0.01477813720703125, 0.03472900390625, 0.01837158203125, -0.029296875, -0.028411865234375, -0.0301055908203125, -0.03497314453125, -0.0209197998046875, 0.01348114013671875, 0.03173828125, 0.0293426513671875, 0.0343017578125, 0.0706787109375, 0.019195556640625, -0.0035114288330078125, -0.0068817138671875, -0.029388427734375, -0.01517486572265625, 0.00372314453125, -0.03631591796875, -0.049652099609375, -0.0131988525390625, 0.08709716796875, 0.033050537109375, -0.00403594970703125, 0.029388427734375, -0.0009908676147460938, 0.072021484375, -0.0243988037109375, 0.00983428955078125, -0.043792724609375, 0.00467681884765625, -0.0310821533203125, -0.028045654296875, -0.0498046875, -0.01486968994140625, 0.004688262939453125, -0.048583984375, -0.003173828125, 0.01617431640625, 0.1099853515625, 0.0244598388671875, -0.0258941650390625, -0.00262451171875, -0.043731689453125, 0.07647705078125, -0.06243896484375, 0.044677734375, 0.03228759765625, 0.023834228515625, 0.0220947265625, -0.03765869140625, -0.0187530517578125, 0.003437042236328125, -0.0161895751953125, 0.0155029296875, -0.010833740234375, -0.0211334228515625, 0.0213165283203125, 0.061370849609375, -0.057342529296875, -0.0022125244140625, -0.054229736328125, 0.0006551742553710938, 0.052459716796875, 0.01446533203125, 0.00980377197265625, -0.0081634521484375, -0.0282440185546875, -0.043548583984375, -0.05609130859375, 0.010009765625, 0.0229339599609375, 0.021484375, -0.0289764404296875, 0.05609130859375, -0.011199951171875, 0.05224609375, -0.00606536865234375, 0.0005483627319335938, 0.072509765625, -0.031402587890625, -0.0227203369140625, -0.0114288330078125, 0.0909423828125, 0.03204345703125, 
0.008270263671875, 0.0033416748046875, -0.01160430908203125, -0.01226806640625, 0.0160675048828125, -0.06915283203125, -0.002971649169921875, 0.017974853515625, -0.02703857421875, -0.00183868408203125, 0.0008072853088378906, -0.042572021484375, 0.01511383056640625, -0.031646728515625, 0.045867919921875, -0.053619384765625, -0.0265655517578125, 0.0271148681640625, -0.0018825531005859375, 0.022186279296875, -0.0100250244140625, -0.039398193359375, 0.0144500732421875, 0.0279083251953125, 0.051910400390625, -0.038238525390625, -0.0232086181640625, -0.03326416015625, -0.01548004150390625, -0.0113372802734375, 0.05181884765625, -0.01003265380859375, -0.0301055908203125, -0.0034770965576171875, 0.031341552734375, -0.035614013671875, -0.0264739990234375, 0.101318359375, -0.0196075439453125, 0.05279541015625, -0.031829833984375, -0.04803466796875, -0.0252838134765625, 0.040374755859375, -0.0452880859375, 0.09393310546875, 0.007694244384765625, -0.065673828125, 0.0197906494140625, -0.056182861328125, -0.0133056640625, -0.006191253662109375, 0.0037384033203125, -0.048919677734375, 0.005218505859375, 0.0072479248046875, 0.0276336669921875, -0.035186767578125, 0.02899169921875, -0.0027103424072265625, -0.0248260498046875, -0.0014390945434570312, -0.0195770263671875, 0.07879638671875, 0.0227203369140625, -0.026519775390625, 0.0217437744140625, -0.07586669921875, 0.006320953369140625, 0.00015819072723388672, -0.038116455078125, -0.0088653564453125, 0.00809478759765625, 0.02081298828125, 0.004146575927734375, 0.0185699462890625, -0.0479736328125, 0.0217437744140625, -0.04095458984375, 0.01457977294921875, 0.0518798828125, -0.0242919921875, 0.0308380126953125, -0.033721923828125, 0.01690673828125, 0.006134033203125, 0.01255035400390625, 0.0037631988525390625, -0.0305328369140625, -0.05780029296875, -0.03045654296875, 0.041778564453125, 0.07684326171875, -0.052520751953125, 0.059112548828125, -0.04937744140625, -0.057342529296875, -0.05621337890625, -0.00890350341796875, 
0.025177001953125, 0.02960205078125, 0.03497314453125, -0.00701141357421875, -0.031219482421875, -0.0762939453125, -0.0167694091796875, -0.013397216796875, -0.012969970703125, 0.00391387939453125, 0.04595947265625, -0.01076507568359375, 0.0304412841796875, -0.028900146484375, -0.040435791015625, -0.006622314453125, 0.00548553466796875, 0.038970947265625, 0.049652099609375, 0.036163330078125, -0.061004638671875, -0.046295166015625, -0.004024505615234375, -0.05401611328125, -0.01019287109375, 0.0112152099609375, -0.01129913330078125, 0.0008325576782226562, 0.003692626953125, -0.0171966552734375, 0.0061798095703125, 0.047149658203125, -0.050628662109375, 0.032562255859375, -0.006595611572265625, 0.0285797119140625, -0.10333251953125, 0.01210784912109375, -0.00815582275390625, -0.000823974609375, -0.031585693359375, 0.004425048828125, 0.023162841796875, 0.010833740234375, -0.04791259765625, 0.047882080078125, -0.0222625732421875, -0.0130615234375, 0.02215576171875, 0.003566741943359375, 0.004810333251953125, 0.049896240234375, -0.0024509429931640625, 0.0645751953125, 0.057586669921875, -0.04156494140625, 0.0049591064453125, 0.0379638671875, -0.032073974609375, 0.03668212890625, -0.055633544921875, -0.02398681640625, 0.0294036865234375, -0.0161285400390625, -0.04901123046875, 0.0080718994140625, 0.0180206298828125, -0.0433349609375, 0.030792236328125, 0.003513336181640625, -0.060455322265625, -0.0089111328125, -0.02947998046875, 0.034637451171875, 0.047454833984375, -0.0136260986328125, 0.048431396484375, 0.004222869873046875, 0.001556396484375, -0.03582763671875, -0.06634521484375, -0.003627777099609375, -0.031524658203125, -0.061920166015625, 0.019805908203125, -0.037994384765625, -0.0122222900390625, 0.00484466552734375, 0.022003173828125, -0.0160980224609375, 0.006134033203125, 0.004520416259765625, 0.0197601318359375, -0.036346435546875, 0.00897979736328125, -0.0034923553466796875, -0.01507568359375, -0.0160064697265625, -0.01442718505859375, 0.03936767578125, 
-0.02850341796875, -0.0227203369140625, -0.046417236328125, 0.009002685546875, 0.042144775390625, -0.0286865234375, 0.058685302734375, 0.042510986328125, -0.006954193115234375, 0.018402099609375, -0.0304107666015625, 0.00647735595703125, -0.0328369140625, 0.006778717041015625, -0.0396728515625, -0.05499267578125, 0.039764404296875, 0.00681304931640625, 0.041717529296875, 0.06439208984375, 0.046783447265625, 0.005802154541015625, 0.05609130859375, 0.0277252197265625, -0.006855010986328125, 0.036773681640625, -0.035858154296875, -0.0125885009765625, -0.07403564453125, 0.0101470947265625, -0.059295654296875, -0.0210418701171875, -0.062255859375, -0.0158538818359375, 0.024505615234375, 0.0024280548095703125, -0.0255584716796875, 0.05267333984375, -0.04449462890625, 0.012542724609375, 0.0447998046875, -0.01055145263671875, 0.0307769775390625, 0.00554656982421875, -0.0391845703125, -0.024078369140625, -0.03338623046875, -0.03497314453125, 0.09521484375, 0.023284912109375, 0.0234832763671875, 0.016815185546875, 0.034912109375, 0.0032291412353515625, 0.01861572265625, -0.04302978515625, 0.0264892578125, -0.0214996337890625, -0.054962158203125, -0.01323699951171875, -0.041778564453125, -0.054107666015625, 0.02850341796875, -0.01097869873046875, -0.0509033203125, 0.00782012939453125, 0.0013723373413085938, -0.002544403076171875, 0.03472900390625, -0.0511474609375, 0.0826416015625, -0.00489044189453125, -0.01495361328125, 0.0169219970703125, -0.0236663818359375, 0.01708984375, 0.00469970703125, 0.02081298828125, -0.01183319091796875, 0.0088653564453125, 0.047760009765625, -0.00814056396484375, 0.02593994140625, -0.00006109476089477539, -0.005512237548828125, 0.00476837158203125, 0.00823974609375, 0.028533935546875, -0.0122833251953125, -0.04193115234375, 0.0340576171875, 0.003978729248046875, -0.038787841796875, -0.0008449554443359375, 0.03814697265625, -0.052947998046875, -0.007659912109375, -0.031005859375, -0.053741455078125, -0.0009908676147460938, 0.0261993408203125, 
0.0595703125, 0.04949951171875, -0.01446533203125, 0.042633056640625, 0.060394287109375, -0.0177764892578125, 0.0277252197265625, 0.05609130859375, -0.0167236328125, -0.04241943359375, 0.0662841796875, 0.00843048095703125, 0.0256805419921875, 0.042938232421875, 0.00888824462890625, -0.00823211669921875, -0.05523681640625, -0.056884765625, 0.01494598388671875, -0.02801513671875, -0.01116943359375, -0.045318603515625, -0.006160736083984375, -0.01334381103515625, 0.017303466796875, -0.043182373046875, -0.04107666015625, -0.009796142578125, -0.0167236328125, 0.0194244384765625, 0.0205535888671875, -0.0052947998046875, 0.03924560546875, -0.07183837890625, 0.0157623291015625, -0.0180206298828125, 0.0288543701171875, -0.045196533203125, -0.06304931640625, -0.0257568359375, -0.0036144256591796875, -0.0430908203125, -0.052947998046875, 0.03631591796875, 0.01226806640625, 0.021942138671875, 0.0215911865234375, 0.0117034912109375, 0.026153564453125, -0.050567626953125, 0.0760498046875, -0.0091094970703125, -0.054840087890625, 0.027740478515625, -0.03033447265625, 0.035186767578125, 0.06866455078125, 0.0190277099609375, -0.01873779296875, -0.0416259765625, -0.050689697265625, -0.06280517578125, 0.060211181640625, 0.054107666015625, -0.0138397216796875, 0.0201263427734375, -0.01116943359375, -0.00197601318359375, 0.0179443359375, -0.0802001953125, -0.0302734375, 0.004886627197265625, -0.03173828125, -0.013427734375, -0.017730712890625, -0.020751953125, -0.0189361572265625, 0.083740234375, 0.004177093505859375, 0.021759033203125, 0.0239105224609375, -0.0099334716796875, -0.016326904296875, 0.02471923828125, 0.06842041015625, 0.039398193359375, -0.038360595703125, -0.01220703125, 0.01349639892578125, -0.023834228515625, -0.01175689697265625, 0.014129638671875, -0.026336669921875, 0.0230255126953125, 0.03326416015625, 0.0787353515625, 0.0091552734375, -0.043670654296875, 0.032501220703125, -0.03076171875, -0.031707763671875, -0.044921875, -0.0111236572265625, 0.0097503662109375, 
0.00774383544921875, 0.0201873779296875, 0.0091705322265625, 0.0174560546875, -0.0166168212890625, 0.01222991943359375, 0.01084136962890625, -0.044158935546875, -0.03826904296875, 0.033477783203125, 0.003459930419921875, -0.0238494873046875, 0.032958984375, -0.0295257568359375, -0.04571533203125, 0.032257080078125, 0.01183319091796875, 0.08038330078125, -0.0173797607421875, -0.0133209228515625, 0.06524658203125, 0.0494384765625, -0.0205230712890625, 0.04193115234375, 0.0014505386352539062, -0.045379638671875, -0.03570556640625, -0.050628662109375, -0.0121612548828125, 0.0068817138671875, -0.059417724609375, 0.028961181640625, 0.01556396484375, 0.00380706787109375, -0.023406982421875, 0.0151824951171875, -0.045166015625, 0.00662994384765625, -0.0271453857421875, 0.08538818359375, -0.06695556640625, 0.058349609375, 0.039520263671875, -0.0212860107421875, -0.0643310546875, -0.0163421630859375, -0.0134124755859375, -0.037139892578125, 0.0457763671875, 0.0118865966796875, 0.0291595458984375, -0.007335662841796875, -0.01654052734375, -0.06500244140625, 0.083740234375, 0.0162811279296875, -0.04864501953125, 0.0052032470703125, 0.0142669677734375, 0.03326416015625, -0.0245361328125, 0.0157623291015625, 0.032806396484375, 0.055816650390625, 0.004230499267578125, -0.08648681640625, -0.0205078125, -0.0489501953125, -0.028839111328125, 0.036376953125, -0.044586181640625, 0.08038330078125, 0.03131103515625, -0.01171112060546875, 0.005706787109375, 0.032135009765625, 0.02056884765625, 0.0236053466796875, 0.03790283203125, 0.0823974609375, 0.03173828125, -0.0360107421875, 0.07940673828125, -0.0301513671875, 0.035369873046875, 0.0919189453125, -0.0049591064453125, 0.0654296875, 0.0225677490234375, -0.0053863525390625, 0.031829833984375, 0.051422119140625, -0.01873779296875, 0.035430908203125, 0.0023288726806640625, 0.01184844970703125, -0.00867462158203125, 0.0137481689453125, -0.050384521484375, 0.01380157470703125, 0.0136566162109375, -0.0189666748046875, -0.0038909912109375, 
-0.002887725830078125, 0.004878997802734375, -0.0014934539794921875, -0.00899505615234375, 0.052947998046875, -0.0035552978515625, -0.04486083984375, 0.05780029296875, -0.00836181640625, 0.042938232421875, -0.051605224609375, 0.0110931396484375, -0.004138946533203125, 0.02642822265625, 0.00018537044525146484, -0.046417236328125, 0.034820556640625, 0.0074310302734375, -0.015625, -0.028533935546875, 0.0167694091796875, -0.032379150390625, -0.06536865234375, 0.03216552734375, 0.0301055908203125, 0.0192718505859375, -0.007183074951171875, -0.06951904296875, 0.01239013671875, 0.01287841796875, -0.047821044921875, 0.0010251998901367188, 0.048583984375, 0.016845703125, 0.039764404296875, 0.03546142578125, 0.012115478515625, 0.020782470703125, -0.00437164306640625, 0.054473876953125, -0.035614013671875, -0.02984619140625, -0.057342529296875, 0.06829833984375, -0.0164794921875, -0.046844482421875, 0.052032470703125, 0.06915283203125, 0.07489013671875, -0.0012445449829101562, 0.0197296142578125, -0.00713348388671875, 0.0479736328125, -0.042938232421875, 0.034759521484375, -0.072021484375, 0.024444580078125, -0.00925445556640625, -0.07037353515625, -0.0093536376953125, 0.0199127197265625, -0.01371002197265625, -0.02557373046875, 0.06317138671875, 0.044647216796875, -0.0106964111328125, -0.01171112060546875, 0.022674560546875, 0.022613525390625, 0.019378662109375, 0.041778564453125, 0.02947998046875, -0.07904052734375, 0.0361328125, -0.0298309326171875, -0.005828857421875, 0.0029048919677734375, -0.05706787109375, -0.06585693359375, -0.05218505859375, -0.018524169921875, -0.01873779296875, -0.0231475830078125, 0.0653076171875, 0.03662109375, -0.06982421875, -0.042724609375, -0.0012331008911132812, 0.00768280029296875, -0.0170440673828125, -0.020538330078125, 0.041839599609375, -0.023956298828125, -0.06951904296875, 0.0298919677734375, 0.005268096923828125, -0.0186767578125, -0.0006146430969238281, -0.0170135498046875, -0.03515625, -0.00801849365234375, 0.0155181884765625, 
0.0013055801391601562, -0.035064697265625, 0.01074981689453125, 0.016845703125, -0.00859832763671875, 0.032989501953125, 0.028472900390625, -0.0176544189453125, 0.0234832763671875, 0.05877685546875, 0.03460693359375, 0.0338134765625, -0.0184478759765625, 0.040771484375, -0.056854248046875, 0.019073486328125, 0.01183319091796875, 0.04669189453125, 0.03521728515625, -0.0053863525390625, 0.0665283203125, 0.01294708251953125, -0.043792724609375, -0.07952880859375, -0.0008325576782226562, -0.0845947265625, -0.0015583038330078125, 0.07379150390625, -0.019195556640625, -0.023101806640625, 0.02947998046875, -0.0167083740234375, 0.0113983154296875, -0.02911376953125, 0.0361328125, 0.061004638671875, 0.0303955078125, 0.006717681884765625, -0.060882568359375, 0.0266876220703125, 0.041107177734375, -0.0615234375, -0.0132598876953125, 0.01486968994140625, 0.01285552978515625, 0.03564453125, 0.042755126953125, -0.02008056640625, 0.009002685546875, -0.0173492431640625, 0.0289154052734375, -0.0014753341674804688, -0.0091705322265625, -0.0213470458984375, 0.0021724700927734375, -0.001750946044921875, -0.0225830078125 ] ]
MoritzLaurer/DeBERTa-v3-large-mnli-fever-anli-ling-wanli
2023-03-20T08:27:01.000Z
[ "transformers", "pytorch", "safetensors", "deberta-v2", "text-classification", "zero-shot-classification", "en", "dataset:multi_nli", "dataset:anli", "dataset:fever", "dataset:lingnli", "dataset:alisawuffles/WANLI", "arxiv:2104.07179", "arxiv:2111.09543", "license:mit", "model-index", "endpoints_compatible", "has_space", "region:us" ]
zero-shot-classification
MoritzLaurer
null
null
MoritzLaurer/DeBERTa-v3-large-mnli-fever-anli-ling-wanli
61
179,353
transformers
2022-06-06T18:28:10
---
language:
- en
tags:
- text-classification
- zero-shot-classification
license: mit
metrics:
- accuracy
datasets:
- multi_nli
- anli
- fever
- lingnli
- alisawuffles/WANLI
pipeline_tag: zero-shot-classification
#- text-classification
#widget:
#- text: "I first thought that I really liked the movie, but upon second thought it was actually disappointing. [SEP] The movie was not good."
model-index: # info: https://github.com/huggingface/hub-docs/blame/main/modelcard.md
- name: DeBERTa-v3-large-mnli-fever-anli-ling-wanli
  results:
  - task:
      type: text-classification  # Required. Example: automatic-speech-recognition
      name: Natural Language Inference  # Optional. Example: Speech Recognition
    dataset:
      type: multi_nli  # Required. Use dataset id from https://hf.co/datasets
      name: MultiNLI-matched  # Required. A pretty name for the dataset.
      split: validation_matched  # Optional. Example: test
    metrics:
    - type: accuracy  # Required. Use metric id from https://hf.co/metrics
      value: 0.912
      verified: false  # Optional. If true, indicates that evaluation was generated by Hugging Face (vs. self-reported).
  - task:
      type: text-classification
      name: Natural Language Inference
    dataset:
      type: multi_nli
      name: MultiNLI-mismatched
      split: validation_mismatched
    metrics:
    - type: accuracy
      value: 0.908
      verified: false
  - task:
      type: text-classification
      name: Natural Language Inference
    dataset:
      type: anli
      name: ANLI-all
      split: test_r1+test_r2+test_r3
    metrics:
    - type: accuracy
      value: 0.702
      verified: false
  - task:
      type: text-classification
      name: Natural Language Inference
    dataset:
      type: anli
      name: ANLI-r3
      split: test_r3
    metrics:
    - type: accuracy
      value: 0.64
      verified: false
  - task:
      type: text-classification
      name: Natural Language Inference
    dataset:
      type: alisawuffles/WANLI
      name: WANLI
      split: test
    metrics:
    - type: accuracy
      value: 0.77
      verified: false
  - task:
      type: text-classification
      name: Natural Language Inference
    dataset:
      type: lingnli
      name: LingNLI
      split: test
    metrics:
    - type: accuracy
      value: 0.87
      verified: false
---

# DeBERTa-v3-large-mnli-fever-anli-ling-wanli

## Model description

This model was fine-tuned on the [MultiNLI](https://huggingface.co/datasets/multi_nli), [Fever-NLI](https://github.com/easonnie/combine-FEVER-NSMN/blob/master/other_resources/nli_fever.md), Adversarial-NLI ([ANLI](https://huggingface.co/datasets/anli)), [LingNLI](https://arxiv.org/pdf/2104.07179.pdf) and [WANLI](https://huggingface.co/datasets/alisawuffles/WANLI) datasets, which together comprise 885,242 NLI hypothesis-premise pairs. As of 06.06.22, this is the best-performing NLI model on the Hugging Face Hub and can be used for zero-shot classification. It significantly outperforms all other large models on the [ANLI benchmark](https://github.com/facebookresearch/anli). The foundation model is [DeBERTa-v3-large from Microsoft](https://huggingface.co/microsoft/deberta-v3-large).
DeBERTa-v3 combines several recent innovations over classical masked language models like BERT and RoBERTa; see the [paper](https://arxiv.org/abs/2111.09543) for details.

### How to use the model

#### Simple zero-shot classification pipeline

```python
from transformers import pipeline

classifier = pipeline("zero-shot-classification", model="MoritzLaurer/DeBERTa-v3-large-mnli-fever-anli-ling-wanli")
sequence_to_classify = "Angela Merkel is a politician in Germany and leader of the CDU"
candidate_labels = ["politics", "economy", "entertainment", "environment"]
output = classifier(sequence_to_classify, candidate_labels, multi_label=False)
print(output)
```

#### NLI use-case

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

device = torch.device("cuda") if torch.cuda.is_available() else torch.device("cpu")
model_name = "MoritzLaurer/DeBERTa-v3-large-mnli-fever-anli-ling-wanli"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name).to(device)  # move the model to the same device as the inputs

premise = "I first thought that I liked the movie, but upon second thought it was actually disappointing."
hypothesis = "The movie was not good."

inputs = tokenizer(premise, hypothesis, truncation=True, return_tensors="pt").to(device)
output = model(**inputs)  # pass input_ids and attention_mask together
prediction = torch.softmax(output["logits"][0], -1).tolist()
label_names = ["entailment", "neutral", "contradiction"]
prediction = {name: round(float(pred) * 100, 1) for pred, name in zip(prediction, label_names)}
print(prediction)
```

### Training data

DeBERTa-v3-large-mnli-fever-anli-ling-wanli was trained on the [MultiNLI](https://huggingface.co/datasets/multi_nli), [Fever-NLI](https://github.com/easonnie/combine-FEVER-NSMN/blob/master/other_resources/nli_fever.md), Adversarial-NLI ([ANLI](https://huggingface.co/datasets/anli)), [LingNLI](https://arxiv.org/pdf/2104.07179.pdf) and [WANLI](https://huggingface.co/datasets/alisawuffles/WANLI) datasets, which together comprise 885,242 NLI hypothesis-premise pairs. Note that [SNLI](https://huggingface.co/datasets/snli) was explicitly excluded due to quality issues with the dataset. More data does not necessarily make for better NLI models.

### Training procedure

DeBERTa-v3-large-mnli-fever-anli-ling-wanli was trained using the Hugging Face trainer with the following hyperparameters. Note that longer training with more epochs hurt performance in my tests (overfitting).

```python
training_args = TrainingArguments(
    num_train_epochs=4,              # total number of training epochs
    learning_rate=5e-06,
    per_device_train_batch_size=16,  # batch size per device during training
    gradient_accumulation_steps=2,   # doubles the effective batch size to 32, while decreasing memory requirements
    per_device_eval_batch_size=64,   # batch size for evaluation
    warmup_ratio=0.06,               # ratio of warmup steps for the learning rate scheduler
    weight_decay=0.01,               # strength of weight decay
    fp16=True                        # mixed precision training
)
```

### Eval results

The model was evaluated using the test sets for MultiNLI, ANLI, LingNLI, WANLI and the dev set for Fever-NLI. The metric used is accuracy.
The model achieves state-of-the-art performance on each dataset. Surprisingly, it outperforms the previous [state-of-the-art on ANLI](https://github.com/facebookresearch/anli) (ALBERT-XXL) by 8.3%. I assume that this is because ANLI was created to fool masked language models like RoBERTa (or ALBERT), while DeBERTa-v3 uses a better pre-training objective (RTD) and disentangled attention, and was fine-tuned on higher-quality NLI data.

|Datasets|mnli_test_m|mnli_test_mm|anli_test|anli_test_r3|ling_test|wanli_test|
| :---: | :---: | :---: | :---: | :---: | :---: | :---: |
|Accuracy|0.912|0.908|0.702|0.64|0.87|0.77|
|Speed (text/sec, A100 GPU)|696.0|697.0|488.0|425.0|828.0|980.0|

## Limitations and bias

Please consult the original DeBERTa-v3 paper and the literature on the different NLI datasets for more information on the training data and potential biases. The model will reproduce statistical patterns in the training data.

## Citation

If you use this model, please cite: Laurer, Moritz, Wouter van Atteveldt, Andreu Salleras Casas, and Kasper Welbers. 2022. ‘Less Annotating, More Classifying – Addressing the Data Scarcity Issue of Supervised Machine Learning with Deep Transfer Learning and BERT-NLI’. Preprint, June. Open Science Framework. https://osf.io/74b8k.

### Ideas for cooperation or questions?

If you have questions or ideas for cooperation, contact me at m{dot}laurer{at}vu{dot}nl or [LinkedIn](https://www.linkedin.com/in/moritz-laurer/).

### Debugging and issues

Note that DeBERTa-v3 was released on 06.12.21, and older versions of HF Transformers seem to have issues running the model (e.g. problems with the tokenizer). Using Transformers>=4.13 might solve some issues.
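The dictionary comprehension in the NLI snippet above simply converts softmax probabilities into rounded percentages per label. That post-processing can be checked in isolation with hypothetical logits (the numbers below are made up for illustration, not actual model output):

```python
import math

# Hypothetical entailment/neutral/contradiction logits -- illustrative values
# only, not the output of the actual model.
logits = [3.2, -0.8, -1.5]
exps = [math.exp(x) for x in logits]
probs = [e / sum(exps) for e in exps]  # softmax over the three NLI classes
label_names = ["entailment", "neutral", "contradiction"]
prediction = {name: round(p * 100, 1) for p, name in zip(probs, label_names)}
print(prediction)  # percentages sum to ~100, with most mass on "entailment"
```

Because softmax normalizes over all three classes, the percentages always sum to roughly 100, and the argmax label is the model's predicted NLI relation.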
11,394
[ [ -0.028717041015625, -0.041412353515625, 0.003604888916015625, 0.023406982421875, -0.004848480224609375, -0.00565338134765625, -0.00366973876953125, -0.04229736328125, 0.0250244140625, 0.017730712890625, -0.03240966796875, -0.0304718017578125, -0.0465087890625, 0.013458251953125, -0.0164794921875, 0.06988525390625, 0.004978179931640625, 0.006450653076171875, 0.00836181640625, -0.01739501953125, -0.016876220703125, -0.05242919921875, -0.055145263671875, -0.03619384765625, 0.03240966796875, 0.0215606689453125, 0.0518798828125, 0.0389404296875, 0.0309600830078125, 0.01849365234375, -0.01303863525390625, 0.005634307861328125, -0.0246734619140625, -0.012969970703125, 0.0285186767578125, -0.036102294921875, -0.04638671875, 0.0118408203125, 0.037261962890625, 0.0247802734375, 0.0011663436889648438, 0.0184478759765625, -0.0109710693359375, 0.054962158203125, -0.04571533203125, 0.006397247314453125, -0.03839111328125, 0.018310546875, -0.0153350830078125, 0.018890380859375, -0.0258331298828125, -0.01386260986328125, 0.01445770263671875, -0.0304718017578125, 0.0041656494140625, -0.00943756103515625, 0.09765625, 0.024078369140625, -0.0246124267578125, -0.0026397705078125, -0.043304443359375, 0.06927490234375, -0.08160400390625, 0.02508544921875, 0.0223541259765625, 0.0079498291015625, 0.006496429443359375, -0.0308837890625, -0.05206298828125, -0.015899658203125, -0.007350921630859375, 0.0149688720703125, -0.0321044921875, -0.0191802978515625, 0.034088134765625, 0.02056884765625, -0.0572509765625, 0.0132598876953125, -0.04010009765625, -0.005687713623046875, 0.0604248046875, -0.004062652587890625, 0.01390838623046875, -0.0240325927734375, -0.0279083251953125, -0.0195770263671875, -0.045257568359375, 0.0006437301635742188, 0.0214691162109375, 0.005382537841796875, -0.0189666748046875, 0.0164337158203125, -0.01073455810546875, 0.0653076171875, 0.01812744140625, -0.0178985595703125, 0.06866455078125, -0.0143890380859375, -0.0239715576171875, 0.006511688232421875, 
0.073974609375, 0.029937744140625, 0.0219268798828125, -0.0000247955322265625, 0.0036754608154296875, -0.01290130615234375, -0.00911712646484375, -0.0792236328125, -0.0232086181640625, 0.026458740234375, -0.026031494140625, -0.034759521484375, -0.0003058910369873047, -0.06640625, -0.01947021484375, -0.02655029296875, 0.03662109375, -0.045379638671875, -0.03662109375, -0.004974365234375, -0.004840850830078125, 0.0224456787109375, 0.003299713134765625, -0.059478759765625, 0.004337310791015625, 0.0172271728515625, 0.0626220703125, -0.0088958740234375, -0.0306549072265625, -0.021636962890625, -0.0081634521484375, -0.01556396484375, 0.0179595947265625, -0.0200042724609375, -0.00954437255859375, -0.01558685302734375, -0.0016126632690429688, -0.045074462890625, -0.0401611328125, 0.0233001708984375, -0.028106689453125, 0.0240020751953125, -0.00860595703125, -0.0259246826171875, -0.0300140380859375, 0.014923095703125, -0.04510498046875, 0.0751953125, 0.006130218505859375, -0.07989501953125, 0.01495361328125, -0.047271728515625, -0.0177459716796875, -0.025115966796875, 0.00826263427734375, -0.05108642578125, -0.0152740478515625, 0.0299072265625, 0.04522705078125, -0.015106201171875, 0.04345703125, -0.034515380859375, -0.0266571044921875, 0.029083251953125, -0.02838134765625, 0.0875244140625, 0.0295867919921875, -0.04925537109375, 0.0052947998046875, -0.07611083984375, -0.006805419921875, 0.0285186767578125, 0.0030345916748046875, -0.0073394775390625, -0.0302734375, -0.002353668212890625, 0.0299072265625, 0.00986480712890625, -0.036041259765625, 0.0131378173828125, -0.047607421875, 0.048919677734375, 0.02752685546875, -0.00807952880859375, 0.0178985595703125, -0.0328369140625, 0.017333984375, 0.021270751953125, 0.0321044921875, 0.00901031494140625, -0.051025390625, -0.078125, -0.0251922607421875, 0.032623291015625, 0.061248779296875, -0.053802490234375, 0.037384033203125, -0.0102691650390625, -0.0638427734375, -0.0460205078125, 0.0122222900390625, 0.0309906005859375, 
0.036468505859375, 0.03497314453125, -0.00876617431640625, -0.057098388671875, -0.06768798828125, 0.01055908203125, -0.01331329345703125, 0.0009279251098632812, 0.01910400390625, 0.0518798828125, -0.0335693359375, 0.06488037109375, -0.027679443359375, -0.03839111328125, -0.026275634765625, -0.0023899078369140625, 0.048187255859375, 0.047332763671875, 0.06951904296875, -0.0567626953125, -0.03363037109375, -0.01459503173828125, -0.07305908203125, 0.00946807861328125, -0.00025153160095214844, -0.0103759765625, 0.037078857421875, 0.0208587646484375, -0.0321044921875, 0.0255889892578125, 0.04571533203125, -0.012054443359375, 0.01013946533203125, -0.001392364501953125, 0.01335906982421875, -0.08807373046875, 0.030364990234375, 0.00472259521484375, 0.01026153564453125, -0.07904052734375, -0.00396728515625, -0.0015659332275390625, -0.0006246566772460938, -0.046905517578125, 0.043182373046875, -0.01003265380859375, 0.0236663818359375, -0.007171630859375, -0.0003650188446044922, 0.0153961181640625, 0.050506591796875, 0.002315521240234375, 0.0389404296875, 0.06524658203125, -0.050048828125, 0.0218353271484375, 0.01459503173828125, -0.004894256591796875, 0.0006494522094726562, -0.060516357421875, 0.00948333740234375, -0.007350921630859375, 0.0134429931640625, -0.04937744140625, -0.0213623046875, 0.037872314453125, -0.0322265625, 0.039520263671875, 0.0035648345947265625, -0.026519775390625, -0.042999267578125, -0.02923583984375, 0.030487060546875, 0.053619384765625, -0.0535888671875, 0.041717529296875, 0.0226287841796875, 0.0190582275390625, -0.068359375, -0.0601806640625, -0.01161956787109375, -0.0400390625, -0.044097900390625, 0.034332275390625, -0.0033588409423828125, -0.016693115234375, -0.0020084381103515625, 0.012451171875, -0.01540374755859375, 0.01256561279296875, 0.023406982421875, 0.0303802490234375, -0.01383209228515625, -0.01200103759765625, -0.01535797119140625, 0.00046253204345703125, -0.0077972412109375, -0.00919342041015625, 0.03790283203125, -0.030670166015625, 
0.004497528076171875, -0.0445556640625, 0.005645751953125, 0.047271728515625, -0.0157470703125, 0.06597900390625, 0.0799560546875, -0.036590576171875, 0.0273284912109375, -0.05426025390625, -0.00662994384765625, -0.02978515625, 0.011322021484375, -0.0297393798828125, -0.046142578125, 0.043701171875, 0.02630615234375, -0.00406646728515625, 0.061920166015625, 0.0276947021484375, 0.0313720703125, 0.07159423828125, 0.036590576171875, -0.029083251953125, 0.0234222412109375, -0.058502197265625, 0.015899658203125, -0.06500244140625, -0.0216522216796875, -0.0289154052734375, -0.007137298583984375, -0.040374755859375, -0.024688720703125, 0.0221710205078125, 0.029144287109375, -0.0226898193359375, 0.03643798828125, -0.039398193359375, 0.0087127685546875, 0.0474853515625, 0.01018524169921875, 0.005863189697265625, -0.0085906982421875, 0.01290130615234375, 0.00287628173828125, -0.0628662109375, -0.0190582275390625, 0.07965087890625, 0.04486083984375, 0.03460693359375, 0.01030731201171875, 0.06768798828125, -0.01200103759765625, 0.0235443115234375, -0.03277587890625, 0.0251617431640625, -0.023651123046875, -0.052703857421875, -0.0203094482421875, -0.037628173828125, -0.06817626953125, 0.0202178955078125, -0.022674560546875, -0.064697265625, 0.043182373046875, 0.0089111328125, -0.038909912109375, 0.0273284912109375, -0.055450439453125, 0.06787109375, -0.006977081298828125, -0.02008056640625, 0.00458526611328125, -0.049285888671875, 0.046112060546875, -0.0126953125, 0.01444244384765625, -0.02593994140625, 0.0298004150390625, 0.072509765625, -0.0174102783203125, 0.06829833984375, -0.0238494873046875, 0.007129669189453125, 0.0290679931640625, -0.0221405029296875, 0.0060272216796875, 0.020416259765625, -0.040802001953125, 0.05218505859375, 0.008453369140625, -0.03350830078125, -0.034423828125, 0.0665283203125, -0.077880859375, -0.047698974609375, -0.044525146484375, -0.0164337158203125, 0.0019197463989257812, 0.00589752197265625, 0.0447998046875, 0.051605224609375, 
-0.0014181137084960938, 0.005199432373046875, 0.05059814453125, -0.0276947021484375, 0.037841796875, 0.020782470703125, -0.0236358642578125, -0.03619384765625, 0.07586669921875, 0.01270294189453125, 0.00943756103515625, 0.0148468017578125, 0.0171356201171875, -0.015350341796875, -0.0282135009765625, -0.05694580078125, 0.0270843505859375, -0.044036865234375, -0.0310211181640625, -0.078857421875, -0.04010009765625, -0.058624267578125, 0.01004791259765625, -0.02471923828125, -0.0279388427734375, -0.0313720703125, 0.007297515869140625, 0.04449462890625, 0.04620361328125, -0.011810302734375, 0.005340576171875, -0.050384521484375, 0.00881195068359375, 0.016815185546875, 0.0067138671875, 0.005451202392578125, -0.057952880859375, -0.007526397705078125, 0.007083892822265625, -0.0215606689453125, -0.069580078125, 0.05615234375, 0.031768798828125, 0.0313720703125, 0.0215301513671875, 0.00769805908203125, 0.0513916015625, -0.0189666748046875, 0.058349609375, 0.0133819580078125, -0.07232666015625, 0.050567626953125, -0.0013484954833984375, 0.03179931640625, 0.03692626953125, 0.0518798828125, -0.01055908203125, -0.035430908203125, -0.053619384765625, -0.067138671875, 0.05645751953125, 0.0209503173828125, -0.0031566619873046875, 0.0068359375, 0.0244598388671875, -0.007205963134765625, 0.005458831787109375, -0.06585693359375, -0.0455322265625, -0.02923583984375, -0.0077056884765625, -0.0214385986328125, -0.015777587890625, -0.00571441650390625, -0.040863037109375, 0.08111572265625, -0.0014743804931640625, 0.0245208740234375, 0.04400634765625, -0.0068359375, -0.0002677440643310547, 0.003833770751953125, 0.044830322265625, 0.04083251953125, -0.0325927734375, -0.0078887939453125, 0.034576416015625, -0.026123046875, 0.00027871131896972656, 0.02764892578125, -0.023223876953125, 0.0166168212890625, 0.03314208984375, 0.09747314453125, -0.0095977783203125, -0.03546142578125, 0.046051025390625, -0.012481689453125, -0.021881103515625, -0.031585693359375, 0.01013946533203125, 
-0.013458251953125, 0.0168609619140625, 0.0267181396484375, 0.0213623046875, 0.0257720947265625, -0.031280517578125, 0.020111083984375, 0.01364898681640625, -0.0265960693359375, -0.024688720703125, 0.0523681640625, 0.0077972412109375, -0.0009546279907226562, 0.04559326171875, -0.0280303955078125, -0.04644775390625, 0.056060791015625, 0.0340576171875, 0.053741455078125, -0.0262908935546875, 0.0232086181640625, 0.0648193359375, 0.0135040283203125, 0.001422882080078125, 0.005390167236328125, 0.028167724609375, -0.050262451171875, -0.037689208984375, -0.058624267578125, -0.00902557373046875, 0.0335693359375, -0.051788330078125, 0.0259857177734375, -0.0215911865234375, -0.01459503173828125, 0.0139617919921875, 0.00829315185546875, -0.057037353515625, 0.01172637939453125, 0.017578125, 0.057037353515625, -0.078125, 0.08099365234375, 0.040008544921875, -0.0292510986328125, -0.0655517578125, 0.002643585205078125, -0.004207611083984375, -0.049774169921875, 0.0587158203125, 0.04449462890625, 0.00984954833984375, -0.0051727294921875, -0.01151275634765625, -0.08612060546875, 0.08099365234375, 0.016815185546875, -0.042724609375, 0.00498199462890625, -0.01016998291015625, 0.0467529296875, -0.0300140380859375, 0.032562255859375, 0.04937744140625, 0.027313232421875, 0.0196990966796875, -0.07415771484375, 0.0091552734375, -0.031707763671875, -0.0002810955047607422, 0.00179290771484375, -0.050445556640625, 0.06732177734375, -0.033935546875, -0.0022296905517578125, 0.003978729248046875, 0.0556640625, 0.01861572265625, 0.040740966796875, 0.0433349609375, 0.04193115234375, 0.05120849609375, -0.0070648193359375, 0.0711669921875, -0.0307464599609375, 0.0208282470703125, 0.05206298828125, -0.017120361328125, 0.0660400390625, 0.017578125, -0.006206512451171875, 0.03448486328125, 0.047607421875, -0.0228729248046875, 0.0306549072265625, 0.0159454345703125, -0.00261688232421875, -0.0009813308715820312, -0.00844573974609375, -0.048309326171875, 0.037750244140625, 0.0173797607421875, 
-0.015869140625, 0.00531005859375, 0.0307464599609375, 0.0188446044921875, -0.01439666748046875, 0.0026721954345703125, 0.0489501953125, -0.0018072128295898438, -0.059051513671875, 0.093994140625, -0.00662994384765625, 0.0672607421875, -0.01800537109375, 0.0102081298828125, -0.01519012451171875, 0.0161590576171875, -0.0247802734375, -0.024444580078125, 0.0297698974609375, -0.0009713172912597656, -0.00928497314453125, 0.006866455078125, 0.036712646484375, -0.03729248046875, -0.04119873046875, 0.0283050537109375, 0.024383544921875, 0.0203399658203125, 0.01085662841796875, -0.078125, 0.0187835693359375, 0.01073455810546875, -0.029052734375, 0.026611328125, 0.0087432861328125, 0.014892578125, 0.04351806640625, 0.037933349609375, -0.01071929931640625, -0.005077362060546875, 0.00835418701171875, 0.0615234375, -0.0305633544921875, 0.00164031982421875, -0.0712890625, 0.02008056640625, -0.0178375244140625, -0.03399658203125, 0.058868408203125, 0.045867919921875, 0.059478759765625, -0.011993408203125, 0.036865234375, -0.01971435546875, 0.0266876220703125, -0.04449462890625, 0.04327392578125, -0.043182373046875, 0.0050048828125, -0.021148681640625, -0.062347412109375, -0.040496826171875, 0.046722412109375, -0.018402099609375, 0.001811981201171875, 0.03497314453125, 0.06719970703125, 0.01971435546875, -0.004352569580078125, 0.0144805908203125, 0.0215301513671875, 0.014495849609375, 0.05645751953125, 0.0439453125, -0.06298828125, 0.0270843505859375, -0.062042236328125, -0.0289154052734375, -0.01493072509765625, -0.05206298828125, -0.08612060546875, -0.041717529296875, -0.055816650390625, -0.04510498046875, 0.013458251953125, 0.074462890625, 0.06011962890625, -0.07855224609375, -0.0121002197265625, 0.00669097900390625, -0.00301361083984375, -0.022247314453125, -0.017608642578125, 0.04095458984375, -0.0182342529296875, -0.07281494140625, 0.0248870849609375, 0.001739501953125, 0.0173797607421875, -0.01232147216796875, -0.0051727294921875, -0.0278472900390625, -0.003875732421875, 
0.037017822265625, 0.021270751953125, -0.061431884765625, -0.0013704299926757812, 0.015655517578125, -0.00887298583984375, 0.00055694580078125, 0.015960693359375, -0.05023193359375, 0.0171051025390625, 0.02337646484375, 0.02777099609375, 0.043853759765625, -0.0207977294921875, 0.0242919921875, -0.04022216796875, 0.0251312255859375, 0.0037670135498046875, 0.0224151611328125, 0.0274658203125, -0.031280517578125, 0.051177978515625, 0.013671875, -0.04022216796875, -0.0712890625, -0.00359344482421875, -0.0701904296875, -0.021820068359375, 0.101318359375, -0.01486968994140625, -0.035369873046875, 0.00925445556640625, -0.0109100341796875, 0.0265350341796875, -0.022674560546875, 0.04718017578125, 0.03338623046875, -0.014373779296875, 0.002338409423828125, -0.048187255859375, 0.055206298828125, 0.040191650390625, -0.036163330078125, -0.0152435302734375, 0.00017249584197998047, 0.01198577880859375, 0.040283203125, 0.048004150390625, -0.0083160400390625, 0.002582550048828125, -0.01617431640625, 0.0073394775390625, -0.00757598876953125, -0.023468017578125, -0.04315185546875, -0.00017690658569335938, -0.0011310577392578125, -0.0124969482421875 ] ]
stabilityai/stable-diffusion-2-inpainting
2023-07-05T16:19:10.000Z
[ "diffusers", "stable-diffusion", "arxiv:2112.10752", "arxiv:2202.00512", "arxiv:1910.09700", "license:openrail++", "has_space", "diffusers:StableDiffusionInpaintPipeline", "region:us" ]
null
stabilityai
null
null
stabilityai/stable-diffusion-2-inpainting
371
177,563
diffusers
2022-11-23T17:41:55
---
license: openrail++
tags:
- stable-diffusion
inference: false
---

# Stable Diffusion v2 Model Card
This model card focuses on the model associated with Stable Diffusion v2, available [here](https://github.com/Stability-AI/stablediffusion).

This `stable-diffusion-2-inpainting` model is resumed from [stable-diffusion-2-base](https://huggingface.co/stabilityai/stable-diffusion-2-base) (`512-base-ema.ckpt`) and trained for another 200k steps. It follows the mask-generation strategy presented in [LAMA](https://github.com/saic-mdal/lama); the mask, in combination with the latent VAE representations of the masked image, is used as additional conditioning.

![image](https://huggingface.co/stabilityai/stable-diffusion-2-inpainting/resolve/main/merged-leopards.png)

- Use it with the [`stablediffusion`](https://github.com/Stability-AI/stablediffusion) repository: download the `512-inpainting-ema.ckpt` [here](https://huggingface.co/stabilityai/stable-diffusion-2-inpainting/resolve/main/512-inpainting-ema.ckpt).
- Use it with 🧨 [`diffusers`](https://huggingface.co/stabilityai/stable-diffusion-2-inpainting#examples)

## Model Details
- **Developed by:** Robin Rombach, Patrick Esser
- **Model type:** Diffusion-based text-to-image generation model
- **Language(s):** English
- **License:** [CreativeML Open RAIL++-M License](https://huggingface.co/stabilityai/stable-diffusion-2/blob/main/LICENSE-MODEL)
- **Model Description:** This is a model that can be used to generate and modify images based on text prompts. It is a [Latent Diffusion Model](https://arxiv.org/abs/2112.10752) that uses a fixed, pretrained text encoder ([OpenCLIP-ViT/H](https://github.com/mlfoundations/open_clip)).
- **Resources for more information:** [GitHub Repository](https://github.com/Stability-AI/).
- **Cite as:**

      @InProceedings{Rombach_2022_CVPR,
          author    = {Rombach, Robin and Blattmann, Andreas and Lorenz, Dominik and Esser, Patrick and Ommer, Bj\"orn},
          title     = {High-Resolution Image Synthesis With Latent Diffusion Models},
          booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
          month     = {June},
          year      = {2022},
          pages     = {10684-10695}
      }

## Examples

Use the [🤗 Diffusers library](https://github.com/huggingface/diffusers) to run Stable Diffusion 2 inpainting in a simple and efficient manner.

```bash
pip install diffusers transformers accelerate scipy safetensors
```

```python
import torch
from diffusers import StableDiffusionInpaintPipeline

pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-inpainting",
    torch_dtype=torch.float16,
)
pipe.to("cuda")

prompt = "Face of a yellow cat, high resolution, sitting on a park bench"
# image and mask_image should be PIL images.
# The mask structure is white for inpainting and black for keeping as is.
image = pipe(prompt=prompt, image=image, mask_image=mask_image).images[0]
image.save("./yellow_cat_on_park_bench.png")
```

**Notes**:
- Although it is not a dependency, we highly recommend installing [xformers](https://github.com/facebookresearch/xformers) for memory-efficient attention (better performance).
- If you have low GPU RAM available, add `pipe.enable_attention_slicing()` after moving the pipeline to `cuda` to reduce VRAM usage (at the cost of speed).

**How it works:**

`image` | `mask_image`
:-------------------------:|:-------------------------:|
<img src="https://raw.githubusercontent.com/CompVis/latent-diffusion/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo.png" alt="drawing" width="300"/> | <img src="https://raw.githubusercontent.com/CompVis/latent-diffusion/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo_mask.png" alt="drawing" width="300"/>

`prompt` | `Output`
:-------------------------:|:-------------------------:|
<span style="position: relative;bottom: 150px;">Face of a yellow cat, high resolution, sitting on a park bench</span> | <img src="https://huggingface.co/datasets/patrickvonplaten/images/resolve/main/test.png" alt="drawing" width="300"/>

# Uses

## Direct Use
The model is intended for research purposes only. Possible research areas and tasks include

- Safe deployment of models which have the potential to generate harmful content.
- Probing and understanding the limitations and biases of generative models.
- Generation of artworks and use in design and other artistic processes.
- Applications in educational or creative tools.
- Research on generative models.

Excluded uses are described below.

### Misuse, Malicious Use, and Out-of-Scope Use
_Note: This section was originally taken from the [DALLE-MINI model card](https://huggingface.co/dalle-mini/dalle-mini) and used for Stable Diffusion v1; it applies in the same way to Stable Diffusion v2_.

The model should not be used to intentionally create or disseminate images that create hostile or alienating environments for people. This includes generating images that people would foreseeably find disturbing, distressing, or offensive, or content that propagates historical or current stereotypes.

#### Out-of-Scope Use
The model was not trained to produce factual or true representations of people or events, so using the model to generate such content is out-of-scope for its abilities.

#### Misuse and Malicious Use
Using the model to generate content that is cruel to individuals is a misuse of this model. This includes, but is not limited to:

- Generating demeaning, dehumanizing, or otherwise harmful representations of people or their environments, cultures, religions, etc.
- Intentionally promoting or propagating discriminatory content or harmful stereotypes.
- Impersonating individuals without their consent.
- Sexual content without consent of the people who might see it.
- Mis- and disinformation.
- Representations of egregious violence and gore.
- Sharing of copyrighted or licensed material in violation of its terms of use.
- Sharing content that is an alteration of copyrighted or licensed material in violation of its terms of use.

## Limitations and Bias

### Limitations
- The model does not achieve perfect photorealism.
- The model cannot render legible text.
- The model does not perform well on more difficult tasks which involve compositionality, such as rendering an image corresponding to “A red cube on top of a blue sphere”.
- Faces and people in general may not be generated properly.
- The model was trained mainly with English captions and will not work as well in other languages.
- The autoencoding part of the model is lossy.
- The model was trained on a subset of the large-scale dataset [LAION-5B](https://laion.ai/blog/laion-5b/), which contains adult, violent and sexual content. To partially mitigate this, we have filtered the dataset using LAION's NSFW detector (see Training section).

### Bias
While the capabilities of image generation models are impressive, they can also reinforce or exacerbate social biases. Stable Diffusion v2 was primarily trained on subsets of [LAION-2B(en)](https://laion.ai/blog/laion-5b/), which consists of images that are limited to English descriptions. Texts and images from communities and cultures that use other languages are likely to be insufficiently accounted for. This affects the overall output of the model, as white and western cultures are often set as the default. Further, the ability of the model to generate content with non-English prompts is significantly worse than with English-language prompts. Stable Diffusion v2 mirrors and exacerbates biases to such a degree that viewer discretion must be advised irrespective of the input or its intent.
## Training

**Training Data**
The model developers used the following dataset for training the model:

- LAION-5B and subsets (details below). The training data is further filtered using LAION's NSFW detector, with a "p_unsafe" score of 0.1 (conservative). For more details, please refer to LAION-5B's [NeurIPS 2022](https://openreview.net/forum?id=M3Y74vmsMcY) paper and reviewer discussions on the topic.

**Training Procedure**
Stable Diffusion v2 is a latent diffusion model which combines an autoencoder with a diffusion model that is trained in the latent space of the autoencoder. During training,

- Images are encoded through an encoder, which turns images into latent representations. The autoencoder uses a relative downsampling factor of 8 and maps images of shape H x W x 3 to latents of shape H/f x W/f x 4.
- Text prompts are encoded through the OpenCLIP-ViT/H text encoder.
- The output of the text encoder is fed into the UNet backbone of the latent diffusion model via cross-attention.
- The loss is a reconstruction objective between the noise that was added to the latent and the prediction made by the UNet. We also use the so-called _v-objective_; see https://arxiv.org/abs/2202.00512.

We currently provide the following checkpoints:

- `512-base-ema.ckpt`: 550k steps at resolution `256x256` on a subset of [LAION-5B](https://laion.ai/blog/laion-5b/) filtered for explicit pornographic material, using the [LAION-NSFW classifier](https://github.com/LAION-AI/CLIP-based-NSFW-Detector) with `punsafe=0.1` and an [aesthetic score](https://github.com/christophschuhmann/improved-aesthetic-predictor) >= `4.5`, then 850k steps at resolution `512x512` on the same dataset with resolution `>= 512x512`.
- `768-v-ema.ckpt`: Resumed from `512-base-ema.ckpt` and trained for 150k steps using a [v-objective](https://arxiv.org/abs/2202.00512) on the same dataset, then resumed for another 140k steps on a `768x768` subset of our dataset.
- `512-depth-ema.ckpt`: Resumed from `512-base-ema.ckpt` and finetuned for 200k steps. Added an extra input channel to process the (relative) depth prediction produced by [MiDaS](https://github.com/isl-org/MiDaS) (`dpt_hybrid`), which is used as additional conditioning. The additional input channels of the U-Net which process this extra information were zero-initialized.
- `512-inpainting-ema.ckpt`: Resumed from `512-base-ema.ckpt` and trained for another 200k steps. Follows the mask-generation strategy presented in [LAMA](https://github.com/saic-mdal/lama); the mask, in combination with the latent VAE representations of the masked image, is used as additional conditioning. The additional input channels of the U-Net which process this extra information were zero-initialized. The same strategy was used to train the [1.5-inpainting checkpoint](https://github.com/saic-mdal/lama).
- `x4-upscaling-ema.ckpt`: Trained for 1.25M steps on a 10M subset of LAION containing images `>2048x2048`. The model was trained on crops of size `512x512` and is a text-guided [latent upscaling diffusion model](https://arxiv.org/abs/2112.10752). In addition to the textual input, it receives a `noise_level` as an input parameter, which can be used to add noise to the low-resolution input according to a [predefined diffusion schedule](configs/stable-diffusion/x4-upscaling.yaml).

- **Hardware:** 32 x 8 x A100 GPUs
- **Optimizer:** AdamW
- **Gradient Accumulations:** 1
- **Batch:** 32 x 8 x 2 x 4 = 2048
- **Learning rate:** warmup to 0.0001 for 10,000 steps and then kept constant

## Evaluation Results
Evaluations with different classifier-free guidance scales (1.5, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0) and 50 DDIM sampling steps show the relative improvements of the checkpoints:

![pareto](model-variants.jpg)

Evaluated using 50 DDIM steps and 10000 random prompts from the COCO2017 validation set, evaluated at 512x512 resolution. Not optimized for FID scores.
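As a sanity check on the training setup described above: with a downsampling factor of f = 8 and 4 latent channels, a 512x512 RGB image maps to a 64x64x4 latent, and the listed batch factors multiply out to the stated global batch size. A small sketch of that arithmetic (no diffusers dependency; the helper name is ours):

```python
def latent_shape(h, w, f=8, latent_channels=4):
    """Map an H x W x 3 image to its H/f x W/f x 4 latent shape,
    per the training procedure above."""
    assert h % f == 0 and w % f == 0, "dimensions must be divisible by f"
    return (h // f, w // f, latent_channels)

print(latent_shape(512, 512))  # (64, 64, 4)
print(latent_shape(768, 768))  # (96, 96, 4)

# Global batch from the listed factors: 32 x 8 x 2 x 4
assert 32 * 8 * 2 * 4 == 2048
```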
## Environmental Impact

**Stable Diffusion v1 Estimated Emissions**
Based on that information, we estimate the following CO2 emissions using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). The hardware, runtime, cloud provider, and compute region were utilized to estimate the carbon impact.

- **Hardware Type:** A100 PCIe 40GB
- **Hours used:** 200000
- **Cloud Provider:** AWS
- **Compute Region:** US-east
- **Carbon Emitted (Power consumption x Time x Carbon produced based on location of power grid):** 15000 kg CO2 eq.

## Citation

    @InProceedings{Rombach_2022_CVPR,
        author    = {Rombach, Robin and Blattmann, Andreas and Lorenz, Dominik and Esser, Patrick and Ommer, Bj\"orn},
        title     = {High-Resolution Image Synthesis With Latent Diffusion Models},
        booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
        month     = {June},
        year      = {2022},
        pages     = {10684-10695}
    }

*This model card was written by: Robin Rombach, Patrick Esser and David Ha and is based on the [Stable Diffusion v1](https://github.com/CompVis/stable-diffusion/blob/main/Stable_Diffusion_v1_Model_Card.md) and [DALL-E Mini model card](https://huggingface.co/dalle-mini/dalle-mini).*
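The inpainting examples in the card above rely on the mask convention noted in the code comments (white = regions to repaint, black = regions to keep). A minimal sketch of building such a mask as a NumPy array; the helper name and box layout are ours, and converting to the PIL image that `mask_image` expects would be `PIL.Image.fromarray(mask)`:

```python
import numpy as np

def make_box_mask(height, width, box):
    """Build an inpainting mask: 255 (white) inside `box` marks pixels
    to repaint, 0 (black) elsewhere marks pixels to keep.
    box = (top, left, bottom, right) in pixel coordinates."""
    mask = np.zeros((height, width), dtype=np.uint8)
    top, left, bottom, right = box
    mask[top:bottom, left:right] = 255
    return mask

# Repaint the central 256x256 square of a 512x512 image
mask = make_box_mask(512, 512, (128, 128, 384, 384))
print(mask[256, 256], mask[0, 0])  # 255 0
```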
13,062
[ [ -0.035614013671875, -0.06195068359375, 0.02667236328125, 0.022918701171875, -0.0135498046875, -0.0146026611328125, 0.0054931640625, -0.0367431640625, 0.00534820556640625, 0.031951904296875, -0.033782958984375, -0.0258636474609375, -0.044891357421875, -0.00807952880859375, -0.01422119140625, 0.0673828125, -0.0013580322265625, -0.0017652511596679688, -0.0241241455078125, 0.004512786865234375, -0.0260772705078125, -0.01126861572265625, -0.079345703125, -0.0236663818359375, 0.0228271484375, 0.0196533203125, 0.04376220703125, 0.03680419921875, 0.0303497314453125, 0.020477294921875, -0.0310516357421875, 0.005153656005859375, -0.049285888671875, -0.002864837646484375, 0.0023975372314453125, -0.0250244140625, -0.0301361083984375, 0.00977325439453125, 0.05615234375, 0.0201568603515625, 0.0007109642028808594, 0.0008192062377929688, 0.006580352783203125, 0.045654296875, -0.039459228515625, -0.0097808837890625, -0.0186920166015625, 0.011627197265625, -0.019683837890625, 0.0201263427734375, -0.0286407470703125, -0.0198974609375, 0.0036029815673828125, -0.06036376953125, 0.0177459716796875, -0.0233917236328125, 0.0869140625, 0.0255889892578125, -0.0157928466796875, -0.003253936767578125, -0.041107177734375, 0.045318603515625, -0.049072265625, 0.008087158203125, 0.03424072265625, 0.01422119140625, -0.01605224609375, -0.0753173828125, -0.0546875, -0.00677490234375, 0.0028896331787109375, 0.039276123046875, -0.0301361083984375, -0.00981903076171875, 0.03619384765625, 0.0258026123046875, -0.0411376953125, -0.00803375244140625, -0.029632568359375, -0.002918243408203125, 0.045928955078125, 0.00691986083984375, 0.0276641845703125, -0.012176513671875, -0.036895751953125, -0.011871337890625, -0.037811279296875, 0.01306915283203125, 0.0310821533203125, -0.01593017578125, -0.03253173828125, 0.034881591796875, 0.00313568115234375, 0.033447265625, 0.030059814453125, -0.01971435546875, 0.024627685546875, -0.022186279296875, -0.0181427001953125, -0.035430908203125, 0.06890869140625, 
bert-large-uncased-whole-word-masking-finetuned-squad
2023-04-06T13:42:50.000Z
[ "transformers", "pytorch", "tf", "jax", "safetensors", "bert", "question-answering", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:1810.04805", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
question-answering
null
null
null
bert-large-uncased-whole-word-masking-finetuned-squad
97
177,375
transformers
2022-03-02T23:29:04
---
language: en
license: apache-2.0
datasets:
- bookcorpus
- wikipedia
---

# BERT large model (uncased) whole word masking finetuned on SQuAD

Pretrained model on English language using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/abs/1810.04805) and first released in [this repository](https://github.com/google-research/bert). This model is uncased: it does not make a difference between english and English.

Unlike other BERT models, this model was trained with a new technique: Whole Word Masking. In this case, all of the tokens corresponding to a word are masked at once. The overall masking rate remains the same. The training is identical -- each masked WordPiece token is predicted independently.

After pre-training, this model was fine-tuned on the SQuAD dataset with one of our fine-tuning scripts. See below for more information regarding this fine-tuning.

Disclaimer: The team releasing BERT did not write a model card for this model, so this model card has been written by the Hugging Face team.

## Model description

BERT is a transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data), with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives:

- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs), which usually see the words one after the other, and from autoregressive models like GPT, which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence.
- Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict whether the two sentences followed each other or not.

This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard classifier using the features produced by the BERT model as inputs.

This model has the following configuration:

- 24 layers
- 1024-dimensional hidden states
- 16 attention heads
- 336M parameters

## Intended uses & limitations

This model should be used as a question-answering model. You may use it in a question answering pipeline, or use it to output raw results given a query and a context. You may see other use cases in the [task summary](https://huggingface.co/transformers/task_summary.html#extractive-question-answering) of the transformers documentation.

## Training data

The BERT model was pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books, and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers).

## Training procedure

### Preprocessing

The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form:

```
[CLS] Sentence A [SEP] Sentence B [SEP]
```

With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus; in the other cases, sentence B is another random sentence from the corpus. Note that what is considered a sentence here is a consecutive span of text, usually longer than a single sentence. The only constraint is that the result with the two "sentences" has a combined length of less than 512 tokens.
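The sentence-pair construction just described can be sketched in plain Python. This is an illustrative approximation only: it uses whitespace tokenization instead of WordPiece, and the corpus in the usage example is invented for the demonstration.

```python
import random

def make_nsp_pair(sentences, idx, rng=None):
    """Sketch of BERT-style sentence-pair construction: with probability
    0.5, sentence B is the true next sentence (label True); otherwise it
    is a random sentence from the corpus (label False)."""
    rng = rng or random.Random(0)
    sent_a = sentences[idx]
    if rng.random() < 0.5 and idx + 1 < len(sentences):
        sent_b, is_next = sentences[idx + 1], True
    else:
        sent_b, is_next = rng.choice(sentences), False
    # Assemble the [CLS] A [SEP] B [SEP] input described above.
    tokens = ["[CLS]"] + sent_a.split() + ["[SEP]"] + sent_b.split() + ["[SEP]"]
    assert len(tokens) < 512  # the combined-length constraint
    return tokens, is_next

# Toy corpus, purely for illustration.
corpus = ["the cat sat on the mat", "it then fell asleep", "dogs run fast"]
tokens, is_next = make_nsp_pair(corpus, 0)
```

The real preprocessing additionally applies WordPiece tokenization and packs spans longer than single sentences, as noted above.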
The details of the masking procedure for each sentence are the following:

- 15% of the tokens are masked.
- In 80% of the cases, the masked tokens are replaced by `[MASK]`.
- In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace).
- In the 10% remaining cases, the masked tokens are left as is.

### Pretraining

The model was trained on 4 cloud TPUs in Pod configuration (16 TPU chips total) for one million steps with a batch size of 256. The sequence length was limited to 128 tokens for 90% of the steps and 512 for the remaining 10%. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after.

### Fine-tuning

After pre-training, this model was fine-tuned on the SQuAD dataset with one of our fine-tuning scripts. In order to reproduce the training, you may use the following command:

```
python -m torch.distributed.launch --nproc_per_node=8 ./examples/question-answering/run_qa.py \
    --model_name_or_path bert-large-uncased-whole-word-masking \
    --dataset_name squad \
    --do_train \
    --do_eval \
    --learning_rate 3e-5 \
    --num_train_epochs 2 \
    --max_seq_length 384 \
    --doc_stride 128 \
    --output_dir ./examples/models/wwm_uncased_finetuned_squad/ \
    --per_device_eval_batch_size=3 \
    --per_device_train_batch_size=3
```

## Evaluation results

The results obtained are the following:

```
f1 = 93.15
exact_match = 86.91
```

### BibTeX entry and citation info

```bibtex
@article{DBLP:journals/corr/abs-1810-04805,
  author        = {Jacob Devlin and Ming{-}Wei Chang and Kenton Lee and Kristina Toutanova},
  title         = {{BERT:} Pre-training of Deep Bidirectional Transformers for Language Understanding},
  journal       = {CoRR},
  volume        = {abs/1810.04805},
  year          = {2018},
  url           = {http://arxiv.org/abs/1810.04805},
  archivePrefix = {arXiv},
  eprint        = {1810.04805},
  timestamp     = {Tue, 30 Oct 2018 20:39:56 +0100},
  biburl        = {https://dblp.org/rec/journals/corr/abs-1810-04805.bib},
  bibsource     = {dblp computer science bibliography, https://dblp.org}
}
```
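The masking procedure described under "Training procedure" (15% of tokens selected; of those, 80% replaced by `[MASK]`, 10% by a random token, 10% left unchanged) can be sketched in plain Python. This is an illustrative approximation operating on whole tokens rather than WordPiece pieces, and the toy vocabulary is invented for the example.

```python
import random

def mask_tokens(tokens, mask_rate=0.15, seed=0):
    """Sketch of BERT-style MLM masking. Returns the corrupted sequence
    and per-position labels (None = position excluded from the loss)."""
    rng = random.Random(seed)
    vocab = ["the", "cat", "sat", "on", "mat", "dog", "ran"]  # toy vocab
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_rate:
            labels.append(tok)          # position contributes to the MLM loss
            r = rng.random()
            if r < 0.8:
                masked.append("[MASK]")             # 80%: mask token
            elif r < 0.9:
                masked.append(rng.choice(vocab))    # 10%: random replacement
            else:
                masked.append(tok)                  # 10%: kept, still predicted
        else:
            labels.append(None)         # ignored by the loss
            masked.append(tok)
    return masked, labels
```

Under Whole Word Masking, as noted earlier in this card, selection would happen per word so that all WordPiece tokens of a chosen word are masked together; this sketch selects positions independently.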
6,240
meta-llama/Llama-2-70b-hf
2023-08-09T15:30:59.000Z
[ "transformers", "pytorch", "safetensors", "llama", "text-generation", "facebook", "meta", "llama-2", "en", "arxiv:2307.09288", "has_space", "text-generation-inference", "region:us" ]
text-generation
meta-llama
null
null
meta-llama/Llama-2-70b-hf
650
177,169
transformers
2023-07-11T08:56:34
---
extra_gated_heading: Access Llama 2 on Hugging Face
extra_gated_description: >-
  This is a form to enable access to Llama 2 on Hugging Face after you have
  been granted access from Meta. Please visit the [Meta
  website](https://ai.meta.com/resources/models-and-libraries/llama-downloads)
  and accept our license terms and acceptable use policy before submitting this
  form. Requests will be processed in 1-2 days.
extra_gated_prompt: "**Your Hugging Face account email address MUST match the email you provide on the Meta website, or your request will not be approved.**"
extra_gated_button_content: Submit
extra_gated_fields:
  I agree to share my name, email address and username with Meta and confirm that I have already been granted download access on the Meta website: checkbox
language:
- en
pipeline_tag: text-generation
inference: false
tags:
- facebook
- meta
- pytorch
- llama
- llama-2
---

# **Llama 2**

Llama 2 is a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. This is the repository for the 70B pretrained model, converted for the Hugging Face Transformers format. Links to other models can be found in the index at the bottom.

## Model Details

*Note: Use of this model is governed by the Meta license. In order to download the model weights and tokenizer, please visit the [website](https://ai.meta.com/resources/models-and-libraries/llama-downloads/) and accept our License before requesting access here.*

Meta developed and publicly released the Llama 2 family of large language models (LLMs), a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. Our fine-tuned LLMs, called Llama-2-Chat, are optimized for dialogue use cases.
Llama-2-Chat models outperform open-source chat models on most benchmarks we tested, and in our human evaluations for helpfulness and safety, are on par with some popular closed-source models like ChatGPT and PaLM.

**Model Developers** Meta

**Variations** Llama 2 comes in a range of parameter sizes (7B, 13B, and 70B) as well as pretrained and fine-tuned variations.

**Input** Models input text only.

**Output** Models generate text only.

**Model Architecture** Llama 2 is an auto-regressive language model that uses an optimized transformer architecture. The tuned versions use supervised fine-tuning (SFT) and reinforcement learning with human feedback (RLHF) to align to human preferences for helpfulness and safety.

||Training Data|Params|Content Length|GQA|Tokens|LR|
|---|---|---|---|---|---|---|
|Llama 2|*A new mix of publicly available online data*|7B|4k|&#10007;|2.0T|3.0 x 10<sup>-4</sup>|
|Llama 2|*A new mix of publicly available online data*|13B|4k|&#10007;|2.0T|3.0 x 10<sup>-4</sup>|
|Llama 2|*A new mix of publicly available online data*|70B|4k|&#10004;|2.0T|1.5 x 10<sup>-4</sup>|

*Llama 2 family of models.* Token counts refer to pretraining data only. All models are trained with a global batch size of 4M tokens. The largest model, 70B, uses Grouped-Query Attention (GQA) for improved inference scalability.

**Model Dates** Llama 2 was trained between January 2023 and July 2023.

**Status** This is a static model trained on an offline dataset. Future versions of the tuned models will be released as we improve model safety with community feedback.

**License** A custom commercial license is available at: [https://ai.meta.com/resources/models-and-libraries/llama-downloads/](https://ai.meta.com/resources/models-and-libraries/llama-downloads/)

**Research Paper** ["Llama-2: Open Foundation and Fine-tuned Chat Models"](https://arxiv.org/abs/2307.09288)

## Intended Use

**Intended Use Cases** Llama 2 is intended for commercial and research use in English.
Tuned models are intended for assistant-like chat, whereas pretrained models can be adapted for a variety of natural language generation tasks.

To get the expected features and performance for the chat versions, a specific formatting needs to be followed, including the `INST` and `<<SYS>>` tags, `BOS` and `EOS` tokens, and the whitespace and line breaks in between (we recommend calling `strip()` on inputs to avoid double spaces). See our reference code on GitHub for details: [`chat_completion`](https://github.com/facebookresearch/llama/blob/main/llama/generation.py#L212).

**Out-of-scope Uses** Use in any manner that violates applicable laws or regulations (including trade compliance laws). Use in languages other than English. Use in any other way that is prohibited by the Acceptable Use Policy and Licensing Agreement for Llama 2.

## Hardware and Software

**Training Factors** We used custom training libraries, Meta's Research Super Cluster, and production clusters for pretraining. Fine-tuning, annotation, and evaluation were also performed on third-party cloud compute.

**Carbon Footprint** Pretraining utilized a cumulative 3.3M GPU hours of computation on hardware of type A100-80GB (TDP of 350-400W). Estimated total emissions were 539 tCO<sub>2</sub>eq, 100% of which were offset by Meta's sustainability program.

||Time (GPU hours)|Power Consumption (W)|Carbon Emitted (tCO<sub>2</sub>eq)|
|---|---|---|---|
|Llama 2 7B|184320|400|31.22|
|Llama 2 13B|368640|400|62.44|
|Llama 2 70B|1720320|400|291.42|
|Total|3311616||539.00|

**CO<sub>2</sub> emissions during pretraining.** Time: total GPU time required for training each model. Power Consumption: peak power capacity per GPU device for the GPUs used, adjusted for power usage efficiency. 100% of the emissions are directly offset by Meta's sustainability program, and because we are openly releasing these models, the pretraining costs do not need to be incurred by others.
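As a hedged illustration of the `[INST]`/`<<SYS>>` chat format described under Intended Use (the tag strings below are assumptions drawn from the referenced `chat_completion` reference code, not verbatim from this card), a single-turn prompt can be sketched as:

```python
# Hypothetical sketch of a single-turn Llama-2-Chat prompt layout.
# The authoritative implementation is `chat_completion` in
# github.com/facebookresearch/llama; tag strings here follow that format.
B_INST, E_INST = "[INST]", "[/INST]"
B_SYS, E_SYS = "<<SYS>>\n", "\n<</SYS>>\n\n"

def build_prompt(system_msg: str, user_msg: str) -> str:
    """Wrap a system message and a user message in Llama-2 chat tags.

    `strip()` is applied to avoid the double-space issues noted above.
    BOS/EOS tokens are added by the tokenizer, not here.
    """
    system = B_SYS + system_msg.strip() + E_SYS
    return f"{B_INST} {system}{user_msg.strip()} {E_INST}"

prompt = build_prompt("You are a helpful assistant.", "What is GQA?")
print(prompt)
```

Multi-turn conversations repeat this `[INST] ... [/INST]` pattern per user turn, with model replies in between; see the reference code for the exact token boundaries.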
## Training Data

**Overview** Llama 2 was pretrained on 2 trillion tokens of data from publicly available sources. The fine-tuning data includes publicly available instruction datasets, as well as over one million new human-annotated examples. Neither the pretraining nor the fine-tuning datasets include Meta user data.

**Data Freshness** The pretraining data has a cutoff of September 2022, but some tuning data is more recent, up to July 2023.

## Evaluation Results

In this section, we report the results for the Llama 1 and Llama 2 models on standard academic benchmarks. For all the evaluations, we use our internal evaluations library.

|Model|Size|Code|Commonsense Reasoning|World Knowledge|Reading Comprehension|Math|MMLU|BBH|AGI Eval|
|---|---|---|---|---|---|---|---|---|---|
|Llama 1|7B|14.1|60.8|46.2|58.5|6.95|35.1|30.3|23.9|
|Llama 1|13B|18.9|66.1|52.6|62.3|10.9|46.9|37.0|33.9|
|Llama 1|33B|26.0|70.0|58.4|67.6|21.4|57.8|39.8|41.7|
|Llama 1|65B|30.7|70.7|60.5|68.6|30.8|63.4|43.5|47.6|
|Llama 2|7B|16.8|63.9|48.9|61.3|14.6|45.3|32.6|29.3|
|Llama 2|13B|24.5|66.9|55.4|65.8|28.7|54.8|39.4|39.1|
|Llama 2|70B|**37.5**|**71.9**|**63.6**|**69.4**|**35.2**|**68.9**|**51.2**|**54.2**|

**Overall performance on grouped academic benchmarks.** *Code:* We report the average pass@1 scores of our models on HumanEval and MBPP. *Commonsense Reasoning:* We report the average of PIQA, SIQA, HellaSwag, WinoGrande, ARC easy and challenge, OpenBookQA, and CommonsenseQA. We report 7-shot results for CommonsenseQA and 0-shot results for all other benchmarks. *World Knowledge:* We evaluate the 5-shot performance on NaturalQuestions and TriviaQA and report the average. *Reading Comprehension:* For reading comprehension, we report the 0-shot average on SQuAD, QuAC, and BoolQ. *Math:* We report the average of the GSM8K (8-shot) and MATH (4-shot) benchmarks at top 1.
|||TruthfulQA|Toxigen|
|---|---|---|---|
|Llama 1|7B|27.42|23.00|
|Llama 1|13B|41.74|23.08|
|Llama 1|33B|44.19|22.57|
|Llama 1|65B|48.71|21.77|
|Llama 2|7B|33.29|**21.25**|
|Llama 2|13B|41.86|26.10|
|Llama 2|70B|**50.18**|24.60|

**Evaluation of pretrained LLMs on automatic safety benchmarks.** For TruthfulQA, we present the percentage of generations that are both truthful and informative (the higher the better). For ToxiGen, we present the percentage of toxic generations (the smaller the better).

|||TruthfulQA|Toxigen|
|---|---|---|---|
|Llama-2-Chat|7B|57.04|**0.00**|
|Llama-2-Chat|13B|62.18|**0.00**|
|Llama-2-Chat|70B|**64.14**|0.01|

**Evaluation of fine-tuned LLMs on different safety datasets.** Same metric definitions as above.

## Ethical Considerations and Limitations

Llama 2 is a new technology that carries risks with use. Testing conducted to date has been in English, and has not covered, nor could it cover, all scenarios. For these reasons, as with all LLMs, Llama 2's potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased or other objectionable responses to user prompts. Therefore, before deploying any applications of Llama 2, developers should perform safety testing and tuning tailored to their specific applications of the model.
Please see the Responsible Use Guide available at [https://ai.meta.com/llama/responsible-use-guide/](https://ai.meta.com/llama/responsible-use-guide/)

## Reporting Issues

Please report any software "bug" or other problems with the models through one of the following means:
- Reporting issues with the model: [github.com/facebookresearch/llama](http://github.com/facebookresearch/llama)
- Reporting problematic content generated by the model: [developers.facebook.com/llama_output_feedback](http://developers.facebook.com/llama_output_feedback)
- Reporting bugs and security concerns: [facebook.com/whitehat/info](http://facebook.com/whitehat/info)

## Llama Model Index

|Model|Llama2|Llama2-hf|Llama2-chat|Llama2-chat-hf|
|---|---|---|---|---|
|7B| [Link](https://huggingface.co/meta-llama/Llama-2-7b) | [Link](https://huggingface.co/meta-llama/Llama-2-7b-hf) | [Link](https://huggingface.co/meta-llama/Llama-2-7b-chat) | [Link](https://huggingface.co/meta-llama/Llama-2-7b-chat-hf)|
|13B| [Link](https://huggingface.co/meta-llama/Llama-2-13b) | [Link](https://huggingface.co/meta-llama/Llama-2-13b-hf) | [Link](https://huggingface.co/meta-llama/Llama-2-13b-chat) | [Link](https://huggingface.co/meta-llama/Llama-2-13b-chat-hf)|
|70B| [Link](https://huggingface.co/meta-llama/Llama-2-70b) | [Link](https://huggingface.co/meta-llama/Llama-2-70b-hf) | [Link](https://huggingface.co/meta-llama/Llama-2-70b-chat) | [Link](https://huggingface.co/meta-llama/Llama-2-70b-chat-hf)|
10,360
[ [ -0.016357421875, -0.05291748046875, 0.02789306640625, 0.01482391357421875, -0.0281982421875, 0.0178375244140625, -0.00429534912109375, -0.05621337890625, 0.0052490234375, 0.022735595703125, -0.05303955078125, -0.041839599609375, -0.050811767578125, 0.0051116943359375, -0.0166168212890625, 0.08074951171875, -0.0014276504516601562, -0.021484375, -0.00916290283203125, 0.00699615478515625, -0.036407470703125, -0.029815673828125, -0.039886474609375, -0.03173828125, 0.0291748046875, 0.03643798828125, 0.04510498046875, 0.04876708984375, 0.041046142578125, 0.0183258056640625, -0.0192108154296875, 0.016510009765625, -0.05377197265625, -0.0200042724609375, 0.0096893310546875, -0.03729248046875, -0.051605224609375, 0.0122222900390625, 0.027130126953125, 0.01302337646484375, -0.0211639404296875, 0.03985595703125, 0.005710601806640625, 0.035858154296875, -0.042144775390625, 0.0126495361328125, -0.055023193359375, 0.00286865234375, -0.017120361328125, -0.00608062744140625, -0.014434814453125, -0.0220489501953125, -0.01438140869140625, -0.0621337890625, -0.00865936279296875, 0.0057220458984375, 0.07867431640625, 0.048736572265625, -0.034088134765625, -0.00861358642578125, -0.021820068359375, 0.07135009765625, -0.06353759765625, 0.004352569580078125, 0.044036865234375, 0.0215301513671875, -0.0167694091796875, -0.057464599609375, -0.048370361328125, -0.0102996826171875, 0.004833221435546875, 0.0266876220703125, -0.0308380126953125, 0.00030922889709472656, 0.01277923583984375, 0.0282135009765625, -0.043304443359375, 0.04327392578125, -0.03839111328125, -0.01300048828125, 0.07916259765625, 0.017913818359375, -0.0007805824279785156, -0.0031833648681640625, -0.03692626953125, -0.0218963623046875, -0.059906005859375, 0.013275146484375, 0.0367431640625, -0.0032901763916015625, -0.0355224609375, 0.04656982421875, -0.03125, 0.02154541015625, 0.0017766952514648438, -0.038543701171875, 0.03631591796875, -0.03546142578125, -0.0203399658203125, -0.00963592529296875, 0.0672607421875, 
0.05450439453125, 0.0122222900390625, 0.007541656494140625, -0.004673004150390625, 0.00864410400390625, -0.0009784698486328125, -0.061676025390625, -0.0038661956787109375, 0.0184478759765625, -0.028045654296875, -0.044464111328125, -0.02264404296875, -0.055999755859375, -0.011627197265625, -0.007427215576171875, 0.0187225341796875, -0.0029850006103515625, -0.0291900634765625, 0.00852203369140625, 0.0036106109619140625, 0.041656494140625, 0.0157928466796875, -0.071533203125, 0.0167388916015625, 0.04241943359375, 0.0589599609375, -0.01849365234375, -0.0269622802734375, 0.0011653900146484375, -0.0017871856689453125, -0.023895263671875, 0.06805419921875, -0.0263824462890625, -0.040252685546875, -0.0167083740234375, -0.0018434524536132812, 0.012359619140625, -0.038421630859375, 0.0318603515625, -0.0298614501953125, 0.013580322265625, -0.02508544921875, -0.0280609130859375, -0.0249176025390625, 0.0148162841796875, -0.0291748046875, 0.109375, 0.0090789794921875, -0.036529541015625, 0.0231170654296875, -0.050933837890625, -0.01397705078125, -0.0152740478515625, 0.0074310302734375, -0.039642333984375, -0.02032470703125, 0.00966644287109375, 0.0272979736328125, -0.048431396484375, 0.035400390625, -0.0152740478515625, -0.032867431640625, 0.0033111572265625, -0.03125, 0.06317138671875, 0.0219268798828125, -0.035064697265625, 0.0052642822265625, -0.061920166015625, 0.005008697509765625, 0.03436279296875, -0.03582763671875, 0.0206451416015625, 0.005840301513671875, -0.0088348388671875, 0.0143280029296875, 0.03729248046875, -0.02728271484375, 0.0123138427734375, -0.0237579345703125, 0.03778076171875, 0.05633544921875, 0.0031108856201171875, 0.01258087158203125, -0.03887939453125, 0.038604736328125, -0.00299072265625, 0.029296875, 0.001514434814453125, -0.053680419921875, -0.07708740234375, -0.01375579833984375, -0.0027294158935546875, 0.06353759765625, -0.0192108154296875, 0.052703857421875, -0.0011425018310546875, -0.05609130859375, -0.031219482421875, 0.02777099609375, 
0.050872802734375, 0.037811279296875, 0.032135009765625, -0.021453857421875, -0.04620361328125, -0.07598876953125, 0.004329681396484375, -0.033477783203125, -0.00213623046875, 0.026641845703125, 0.049072265625, -0.025299072265625, 0.055084228515625, -0.040771484375, -0.01320648193359375, -0.01971435546875, -0.01004791259765625, 0.004383087158203125, 0.02642822265625, 0.049468994140625, -0.0290069580078125, -0.0164337158203125, -0.009490966796875, -0.0677490234375, -0.007740020751953125, 0.009002685546875, -0.01617431640625, 0.0177459716796875, 0.0234832763671875, -0.0460205078125, 0.0340576171875, 0.053558349609375, -0.01326751708984375, 0.039337158203125, 0.0002799034118652344, -0.0130157470703125, -0.0811767578125, 0.0027313232421875, -0.0158538818359375, 0.0024585723876953125, -0.03271484375, -0.00311279296875, -0.0158538818359375, 0.006374359130859375, -0.04595947265625, 0.044769287109375, -0.023162841796875, -0.0122222900390625, -0.00989532470703125, 0.004367828369140625, 0.004425048828125, 0.046630859375, -0.0095977783203125, 0.08050537109375, 0.0303192138671875, -0.044097900390625, 0.0195770263671875, 0.0298614501953125, -0.03759765625, 0.0115966796875, -0.06658935546875, 0.0278167724609375, 0.008514404296875, 0.040069580078125, -0.07391357421875, -0.0289764404296875, 0.0242767333984375, -0.03271484375, 0.007396697998046875, 0.0175323486328125, -0.041656494140625, -0.030364990234375, -0.032318115234375, 0.0237274169921875, 0.061920166015625, -0.03399658203125, 0.012969970703125, 0.0288848876953125, 0.0019855499267578125, -0.052001953125, -0.0626220703125, 0.004474639892578125, -0.027130126953125, -0.040069580078125, 0.0225830078125, -0.01407623291015625, -0.017608642578125, -0.0196685791015625, 0.00528717041015625, -0.00033855438232421875, 0.028411865234375, 0.0277557373046875, 0.0276336669921875, -0.00914764404296875, -0.0015821456909179688, 0.0109100341796875, -0.01546478271484375, 0.0027923583984375, 0.0149993896484375, 0.04486083984375, 
-0.01297760009765625, -0.016876220703125, -0.05560302734375, 0.0030364990234375, 0.021270751953125, -0.0193023681640625, 0.045806884765625, 0.032379150390625, -0.01641845703125, 0.017486572265625, -0.05841064453125, -0.00843048095703125, -0.040374755859375, 0.041229248046875, -0.016265869140625, -0.062744140625, 0.039794921875, -0.0005884170532226562, 0.03302001953125, 0.056121826171875, 0.0472412109375, -0.00661468505859375, 0.060211181640625, 0.042938232421875, -0.005313873291015625, 0.0258026123046875, -0.03692626953125, -0.00795745849609375, -0.07073974609375, -0.046630859375, -0.0240631103515625, -0.033050537109375, -0.0496826171875, -0.031585693359375, 0.0196685791015625, 0.014312744140625, -0.05145263671875, 0.0242156982421875, -0.043975830078125, 0.0433349609375, 0.040008544921875, 0.01010894775390625, 0.0223388671875, 0.00832366943359375, 0.01108551025390625, 0.0037899017333984375, -0.038848876953125, -0.055908203125, 0.111083984375, 0.032379150390625, 0.033477783203125, 0.007762908935546875, 0.050872802734375, 0.01067352294921875, 0.02447509765625, -0.053070068359375, 0.049102783203125, 0.0035343170166015625, -0.0540771484375, -0.011749267578125, -0.00865936279296875, -0.067138671875, 0.01102447509765625, -0.0156402587890625, -0.059112548828125, 0.0016603469848632812, -0.0017642974853515625, -0.027496337890625, 0.02203369140625, -0.050323486328125, 0.04510498046875, -0.042755126953125, -0.0232696533203125, -0.0265655517578125, -0.060333251953125, 0.051513671875, -0.01541900634765625, 0.007038116455078125, -0.037506103515625, -0.019744873046875, 0.0679931640625, -0.026336669921875, 0.075439453125, -0.0037708282470703125, -0.0076141357421875, 0.043243408203125, -0.0138397216796875, 0.033935546875, 0.0023784637451171875, -0.020294189453125, 0.050323486328125, -0.0097503662109375, -0.0243377685546875, -0.0115509033203125, 0.040069580078125, -0.091552734375, -0.059356689453125, -0.0367431640625, -0.038238525390625, -0.00331878662109375, 0.00653839111328125, 
0.038787841796875, -0.007266998291015625, -0.00258636474609375, 0.00933074951171875, 0.034393310546875, -0.038177490234375, 0.03533935546875, 0.04168701171875, -0.007541656494140625, -0.034576416015625, 0.049163818359375, 0.003704071044921875, 0.0275421142578125, 0.01690673828125, 0.0029754638671875, -0.0310211181640625, -0.031890869140625, -0.03802490234375, 0.0208587646484375, -0.035186767578125, -0.036529541015625, -0.04071044921875, -0.02679443359375, -0.0247802734375, -0.00543975830078125, -0.033416748046875, -0.0328369140625, -0.056182861328125, -0.02886962890625, 0.03936767578125, 0.061309814453125, -0.00003331899642944336, 0.048614501953125, -0.0245361328125, 0.01374053955078125, 0.0287933349609375, 0.01399993896484375, -0.001613616943359375, -0.0582275390625, 0.004688262939453125, 0.010162353515625, -0.057342529296875, -0.04638671875, 0.0177001953125, 0.02105712890625, 0.03509521484375, 0.035980224609375, -0.005748748779296875, 0.058624267578125, -0.0267791748046875, 0.08221435546875, 0.0272369384765625, -0.049835205078125, 0.0528564453125, -0.01538848876953125, 0.0029449462890625, 0.048065185546875, 0.020050048828125, -0.005878448486328125, -0.0120086669921875, -0.047821044921875, -0.05078125, 0.060516357421875, 0.01763916015625, 0.014251708984375, 0.00470733642578125, 0.0345458984375, 0.00439453125, 0.0081329345703125, -0.061767578125, -0.023193359375, -0.0206146240234375, -0.007717132568359375, -0.0149688720703125, -0.038238525390625, -0.005268096923828125, -0.0238189697265625, 0.0477294921875, 0.0041961669921875, 0.0260162353515625, -0.0105133056640625, 0.0014638900756835938, -0.0077972412109375, 0.00327301025390625, 0.05487060546875, 0.037139892578125, -0.0193328857421875, -0.0112762451171875, 0.048553466796875, -0.04779052734375, 0.02593994140625, 0.0006551742553710938, -0.00927734375, -0.027862548828125, 0.0306396484375, 0.06658935546875, 0.0195465087890625, -0.053131103515625, 0.0256195068359375, 0.01065826416015625, -0.0279083251953125, 
-0.031707763671875, 0.0276336669921875, 0.006557464599609375, 0.024932861328125, 0.0209503173828125, -0.010772705078125, 0.005542755126953125, -0.03802490234375, -0.008941650390625, 0.02911376953125, 0.0087432861328125, -0.032073974609375, 0.07513427734375, 0.0239715576171875, -0.0218353271484375, 0.04010009765625, -0.01293182373046875, -0.027496337890625, 0.0679931640625, 0.04754638671875, 0.04901123046875, -0.020233154296875, 0.00901031494140625, 0.053497314453125, 0.033935546875, -0.0173797607421875, 0.0172271728515625, -0.0011005401611328125, -0.036773681640625, -0.0160369873046875, -0.05255126953125, -0.035369873046875, 0.02679443359375, -0.04241943359375, 0.0233917236328125, -0.046966552734375, -0.0207366943359375, -0.02398681640625, 0.03436279296875, -0.051300048828125, 0.01546478271484375, 0.008087158203125, 0.06939697265625, -0.054107666015625, 0.057830810546875, 0.037109375, -0.037109375, -0.06658935546875, -0.0221710205078125, 0.0149078369140625, -0.09320068359375, 0.03997802734375, 0.027984619140625, -0.00467681884765625, 0.00954437255859375, -0.05706787109375, -0.09136962890625, 0.127685546875, 0.034332275390625, -0.05706787109375, -0.0018377304077148438, 0.0257568359375, 0.0372314453125, -0.0084686279296875, 0.034088134765625, 0.0621337890625, 0.037200927734375, 0.00936126708984375, -0.07989501953125, 0.00711822509765625, -0.02679443359375, -0.0024929046630859375, -0.01451873779296875, -0.0987548828125, 0.06109619140625, -0.0298309326171875, -0.0177001953125, 0.016265869140625, 0.048492431640625, 0.051605224609375, 0.041229248046875, 0.02642822265625, 0.059234619140625, 0.068359375, -0.002552032470703125, 0.08331298828125, -0.0274658203125, 0.01373291015625, 0.06707763671875, -0.0223388671875, 0.0732421875, 0.01788330078125, -0.044921875, 0.04632568359375, 0.0760498046875, -0.0021228790283203125, 0.044647216796875, 0.00469970703125, -0.01235198974609375, -0.0137176513671875, -0.01245880126953125, -0.049468994140625, 0.03887939453125, 
0.0186614990234375, -0.01045989990234375, -0.0021915435791015625, -0.0251617431640625, 0.017333984375, -0.0253143310546875, -0.00026106834411621094, 0.06072998046875, 0.01230621337890625, -0.04632568359375, 0.06695556640625, 0.0031833648681640625, 0.06402587890625, -0.049346923828125, 0.007167816162109375, -0.039337158203125, 0.0008020401000976562, -0.0279388427734375, -0.0531005859375, 0.005336761474609375, 0.0277557373046875, -0.00019085407257080078, -0.007152557373046875, 0.041107177734375, 0.002880096435546875, -0.042144775390625, 0.0263671875, 0.0208282470703125, 0.026763916015625, 0.0159149169921875, -0.050689697265625, 0.0135650634765625, 0.006801605224609375, -0.041046142578125, 0.02886962890625, 0.00247955322265625, -0.004512786865234375, 0.0596923828125, 0.0557861328125, -0.01555633544921875, 0.0102081298828125, -0.016357421875, 0.07513427734375, -0.037200927734375, -0.01497650146484375, -0.05718994140625, 0.040069580078125, 0.003467559814453125, -0.053558349609375, 0.040924072265625, 0.0484619140625, 0.052215576171875, 0.0203857421875, 0.048980712890625, 0.00530242919921875, 0.0236663818359375, -0.0394287109375, 0.04632568359375, -0.058349609375, 0.0284576416015625, 0.006023406982421875, -0.073486328125, -0.004512786865234375, 0.050048828125, -0.0179595947265625, 0.003940582275390625, 0.0276031494140625, 0.064208984375, 0.01282501220703125, -0.012420654296875, 0.009033203125, 0.01297760009765625, 0.026611328125, 0.0672607421875, 0.0633544921875, -0.047821044921875, 0.05328369140625, -0.02899169921875, -0.018218994140625, -0.0209197998046875, -0.05517578125, -0.07330322265625, -0.0203857421875, -0.01837158203125, -0.01172637939453125, 0.004695892333984375, 0.055816650390625, 0.038238525390625, -0.044036865234375, -0.02215576171875, -0.005176544189453125, -0.006641387939453125, 0.0029010772705078125, -0.011932373046875, 0.0254058837890625, -0.00902557373046875, -0.04388427734375, 0.035858154296875, 0.0004513263702392578, 0.01546478271484375, 
-0.024505615234375, -0.0205535888671875, -0.0143890380859375, 0.01082611083984375, 0.046356201171875, 0.0213470458984375, -0.06939697265625, -0.017181396484375, 0.003238677978515625, -0.01111602783203125, 0.00930023193359375, 0.0012340545654296875, -0.05792236328125, 0.006988525390625, 0.0109100341796875, 0.028594970703125, 0.0501708984375, 0.00435638427734375, 0.004360198974609375, -0.039031982421875, 0.034393310546875, 0.00067901611328125, 0.01042938232421875, 0.022491455078125, -0.03070068359375, 0.0594482421875, 0.0110321044921875, -0.052703857421875, -0.071533203125, 0.00827789306640625, -0.0784912109375, -0.00021839141845703125, 0.1036376953125, 0.0006709098815917969, -0.00901031494140625, 0.01454925537109375, -0.0157623291015625, 0.0290069580078125, -0.0281524658203125, 0.0606689453125, 0.042083740234375, -0.005870819091796875, -0.00711822509765625, -0.059417724609375, 0.026336669921875, 0.029815673828125, -0.08209228515625, -0.019287109375, 0.033721923828125, 0.03692626953125, -0.007251739501953125, 0.0518798828125, 0.0015277862548828125, 0.0175933837890625, 0.0055999755859375, 0.00803375244140625, -0.0186920166015625, -0.01116943359375, -0.00708770751953125, -0.01959228515625, -0.00409698486328125, -0.0169219970703125 ] ]
sentence-transformers/multi-qa-mpnet-base-dot-v1
2023-11-02T09:30:37.000Z
[ "sentence-transformers", "pytorch", "mpnet", "feature-extraction", "sentence-similarity", "en", "dataset:flax-sentence-embeddings/stackexchange_xml", "dataset:ms_marco", "dataset:gooaq", "dataset:yahoo_answers_topics", "dataset:search_qa", "dataset:eli5", "dataset:natural_questions", "dataset:trivia_qa", "dataset:embedding-data/QQP", "dataset:embedding-data/PAQ_pairs", "dataset:embedding-data/Amazon-QA", "dataset:embedding-data/WikiAnswers", "endpoints_compatible", "has_space", "region:us" ]
sentence-similarity
sentence-transformers
null
null
sentence-transformers/multi-qa-mpnet-base-dot-v1
115
175,312
sentence-transformers
2022-03-02T23:29:05
---
language:
- en
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
datasets:
- flax-sentence-embeddings/stackexchange_xml
- ms_marco
- gooaq
- yahoo_answers_topics
- search_qa
- eli5
- natural_questions
- trivia_qa
- embedding-data/QQP
- embedding-data/PAQ_pairs
- embedding-data/Amazon-QA
- embedding-data/WikiAnswers
---

# multi-qa-mpnet-base-dot-v1

This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences & paragraphs to a 768-dimensional dense vector space and was designed for **semantic search**. It has been trained on 215M (question, answer) pairs from diverse sources. For an introduction to semantic search, have a look at: [SBERT.net - Semantic Search](https://www.sbert.net/examples/applications/semantic-search/README.html)

## Usage (Sentence-Transformers)

Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:

```
pip install -U sentence-transformers
```

Then you can use the model like this:

```python
from sentence_transformers import SentenceTransformer, util

query = "How many people live in London?"
docs = ["Around 9 Million people live in London", "London is known for its financial district"]

# Load the model
model = SentenceTransformer('sentence-transformers/multi-qa-mpnet-base-dot-v1')

# Encode query and documents
query_emb = model.encode(query)
doc_emb = model.encode(docs)

# Compute dot score between query and all document embeddings
scores = util.dot_score(query_emb, doc_emb)[0].cpu().tolist()

# Combine docs & scores
doc_score_pairs = list(zip(docs, scores))

# Sort by decreasing score
doc_score_pairs = sorted(doc_score_pairs, key=lambda x: x[1], reverse=True)

# Output passages & scores
for doc, score in doc_score_pairs:
    print(score, doc)
```

## Usage (HuggingFace Transformers)

Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first, you pass your input through the transformer model, then you apply the correct pooling operation on top of the contextualized word embeddings.

```python
from transformers import AutoTokenizer, AutoModel
import torch

# CLS pooling - take output from first token
def cls_pooling(model_output):
    return model_output.last_hidden_state[:, 0]

# Encode text
def encode(texts):
    # Tokenize sentences
    encoded_input = tokenizer(texts, padding=True, truncation=True, return_tensors='pt')

    # Compute token embeddings
    with torch.no_grad():
        model_output = model(**encoded_input, return_dict=True)

    # Perform pooling
    embeddings = cls_pooling(model_output)

    return embeddings

# Sentences we want sentence embeddings for
query = "How many people live in London?"
docs = ["Around 9 Million people live in London", "London is known for its financial district"]

# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained("sentence-transformers/multi-qa-mpnet-base-dot-v1")
model = AutoModel.from_pretrained("sentence-transformers/multi-qa-mpnet-base-dot-v1")

# Encode query and docs
query_emb = encode(query)
doc_emb = encode(docs)

# Compute dot score between query and all document embeddings
scores = torch.mm(query_emb, doc_emb.transpose(0, 1))[0].cpu().tolist()

# Combine docs & scores
doc_score_pairs = list(zip(docs, scores))

# Sort by decreasing score
doc_score_pairs = sorted(doc_score_pairs, key=lambda x: x[1], reverse=True)

# Output passages & scores
for doc, score in doc_score_pairs:
    print(score, doc)
```

## Technical Details

Below are some technical details on how this model must be used:

| Setting | Value |
| --- | :---: |
| Dimensions | 768 |
| Produces normalized embeddings | No |
| Pooling method | CLS pooling |
| Suitable score functions | dot-product (e.g. `util.dot_score`) |

----

## Background

The project aims to train sentence embedding models on very large sentence-level datasets using a self-supervised contrastive learning objective. We use a contrastive learning objective: given a sentence from the pair, the model should predict which of a set of randomly sampled other sentences was actually paired with it in our dataset.

We developed this model during the [Community week using JAX/Flax for NLP & CV](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104), organized by Hugging Face. We developed this model as part of the project: [Train the Best Sentence Embedding Model Ever with 1B Training Pairs](https://discuss.huggingface.co/t/train-the-best-sentence-embedding-model-ever-with-1b-training-pairs/7354).
We benefited from efficient hardware infrastructure to run the project: 7 TPU v3-8s, as well as intervention from Google's Flax, JAX, and Cloud team members about efficient deep learning frameworks.

## Intended uses

Our model is intended to be used for semantic search: it encodes queries / questions and text paragraphs in a dense vector space, and finds relevant documents for a given query.

Note that there is a limit of 512 word pieces: text longer than that will be truncated. Further note that the model was just trained on input text up to 250 word pieces; it might not work well for longer text.

## Training procedure

The full training script is accessible in this current repository: `train_script.py`.

### Pre-training

We use the pretrained [`mpnet-base`](https://huggingface.co/microsoft/mpnet-base) model. Please refer to the model card for more detailed information about the pre-training procedure.

### Training

We use the concatenation of multiple datasets to fine-tune our model. In total we have about 215M (question, answer) pairs. We sampled each dataset with a weighted probability; the configuration is detailed in the `data_config.json` file.

The model was trained with [MultipleNegativesRankingLoss](https://www.sbert.net/docs/package_reference/losses.html#multiplenegativesrankingloss) using CLS pooling, dot-product as the similarity function, and a scale of 1.
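Because this model produces unnormalized embeddings meant to be scored with dot-product (see Technical Details), vector magnitude affects ranking, which is what distinguishes dot-product from cosine similarity. A minimal sketch with made-up vectors (illustrative only, not from the card):

```python
# Illustrative sketch: with unnormalized embeddings, dot-product and
# cosine similarity can rank the same documents differently.
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cosine(a, b):
    norm = lambda v: math.sqrt(sum(x * x for x in v))
    return dot(a, b) / (norm(a) * norm(b))

query = [1.0, 1.0]
doc_long = [3.0, 0.0]   # large magnitude, only partial direction match
doc_close = [1.0, 0.9]  # small magnitude, near-identical direction

# Dot score prefers the long vector; cosine prefers the aligned one.
assert dot(query, doc_long) > dot(query, doc_close)
assert cosine(query, doc_close) > cosine(query, doc_long)
```

This is why the card pairs CLS pooling with `util.dot_score` rather than cosine similarity: the model was trained so that useful relevance signal lives partly in the embedding magnitude.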
| Dataset | Number of training tuples |
|--------------------------------------------------------|:--------------------------:|
| [WikiAnswers](https://github.com/afader/oqa#wikianswers-corpus) Duplicate question pairs from WikiAnswers | 77,427,422 |
| [PAQ](https://github.com/facebookresearch/PAQ) Automatically generated (Question, Paragraph) pairs for each paragraph in Wikipedia | 64,371,441 |
| [Stack Exchange](https://huggingface.co/datasets/flax-sentence-embeddings/stackexchange_xml) (Title, Body) pairs from all StackExchanges | 25,316,456 |
| [Stack Exchange](https://huggingface.co/datasets/flax-sentence-embeddings/stackexchange_xml) (Title, Answer) pairs from all StackExchanges | 21,396,559 |
| [MS MARCO](https://microsoft.github.io/msmarco/) Triplets (query, answer, hard_negative) for 500k queries from Bing search engine | 17,579,773 |
| [GOOAQ: Open Question Answering with Diverse Answer Types](https://github.com/allenai/gooaq) (query, answer) pairs for 3M Google queries and Google featured snippet | 3,012,496 |
| [Amazon-QA](http://jmcauley.ucsd.edu/data/amazon/qa/) (Question, Answer) pairs from Amazon product pages | 2,448,839 |
| [Yahoo Answers](https://www.kaggle.com/soumikrakshit/yahoo-answers-dataset) (Title, Answer) pairs from Yahoo Answers | 1,198,260 |
| [Yahoo Answers](https://www.kaggle.com/soumikrakshit/yahoo-answers-dataset) (Question, Answer) pairs from Yahoo Answers | 681,164 |
| [Yahoo Answers](https://www.kaggle.com/soumikrakshit/yahoo-answers-dataset) (Title, Question) pairs from Yahoo Answers | 659,896 |
| [SearchQA](https://huggingface.co/datasets/search_qa) (Question, Answer) pairs for 140k questions, each with Top5 Google snippets on that question | 582,261 |
| [ELI5](https://huggingface.co/datasets/eli5) (Question, Answer) pairs from Reddit ELI5 (explainlikeimfive) | 325,475 |
| [Stack Exchange](https://huggingface.co/datasets/flax-sentence-embeddings/stackexchange_xml) Duplicate questions pairs (titles) | 304,525 |
| [Quora Question Triplets](https://quoradata.quora.com/First-Quora-Dataset-Release-Question-Pairs) (Question, Duplicate_Question, Hard_Negative) triplets for Quora Questions Pairs dataset | 103,663 |
| [Natural Questions (NQ)](https://ai.google.com/research/NaturalQuestions) (Question, Paragraph) pairs for 100k real Google queries with relevant Wikipedia paragraph | 100,231 |
| [SQuAD2.0](https://rajpurkar.github.io/SQuAD-explorer/) (Question, Paragraph) pairs from SQuAD2.0 dataset | 87,599 |
| [TriviaQA](https://huggingface.co/datasets/trivia_qa) (Question, Evidence) pairs | 73,346 |
| **Total** | **214,988,242** |
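The training objective above — MultipleNegativesRankingLoss with dot-product scores and a scale of 1 — can be sketched in plain Python. This toy example with hand-made 4-d "embeddings" and a hypothetical `mnr_loss` helper is illustrative only, not the actual training code:

```python
import math

def mnr_loss(query_embs, pos_embs, scale=1.0):
    """Toy MultipleNegativesRankingLoss over one batch.

    For each query i, passage i is the positive and every other passage
    in the batch acts as an in-batch negative.  Scores are dot products
    multiplied by `scale` (the card trains with scale = 1), and the loss
    is the mean cross-entropy of softmax(scores) against the diagonal.
    """
    losses = []
    for i, q in enumerate(query_embs):
        scores = [scale * sum(a * b for a, b in zip(q, p)) for p in pos_embs]
        log_denom = math.log(sum(math.exp(s) for s in scores))
        losses.append(log_denom - scores[i])  # -log softmax at the true passage
    return sum(losses) / len(losses)

# Hand-made 4-d "embeddings" for 3 (query, passage) pairs.
queries  = [[1.0, 0.0, 0.0, 0.0], [0.0, 1.0, 0.0, 0.0], [0.0, 0.0, 1.0, 0.0]]
passages = [[0.9, 0.1, 0.0, 0.0], [0.1, 0.9, 0.0, 0.0], [0.0, 0.0, 0.9, 0.1]]
print(mnr_loss(queries, passages))  # below log(3), since each positive scores highest
```

With real embeddings the batch is much larger, which is why this loss benefits from many in-batch negatives.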
8,664
microsoft/deberta-xlarge-mnli
2022-06-27T15:47:33.000Z
[ "transformers", "pytorch", "tf", "deberta", "text-classification", "deberta-v1", "deberta-mnli", "en", "arxiv:2006.03654", "license:mit", "endpoints_compatible", "has_space", "region:us" ]
text-classification
microsoft
null
null
microsoft/deberta-xlarge-mnli
15
174,741
transformers
2022-03-02T23:29:05
---
language: en
tags:
- deberta-v1
- deberta-mnli
tasks: mnli
thumbnail: https://huggingface.co/front/thumbnails/microsoft.png
license: mit
widget:
- text: "[CLS] I love you. [SEP] I like you. [SEP]"
---

## DeBERTa: Decoding-enhanced BERT with Disentangled Attention

[DeBERTa](https://arxiv.org/abs/2006.03654) improves the BERT and RoBERTa models using disentangled attention and an enhanced mask decoder. It outperforms BERT and RoBERTa on a majority of NLU tasks with 80GB of training data.

Please check the [official repository](https://github.com/microsoft/DeBERTa) for more details and updates. This is the DeBERTa XLarge model (750M parameters) fine-tuned on the MNLI task.

### Fine-tuning on NLU tasks

We present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.

| Model | SQuAD 1.1 | SQuAD 2.0 | MNLI-m/mm | SST-2 | QNLI | CoLA | RTE | MRPC | QQP | STS-B |
|---------------------------|-----------|-----------|-------------|-------|------|------|--------|-------|-------|------|
| | F1/EM | F1/EM | Acc | Acc | Acc | MCC | Acc | Acc/F1 | Acc/F1 | P/S |
| BERT-Large | 90.9/84.1 | 81.8/79.0 | 86.6/- | 93.2 | 92.3 | 60.6 | 70.4 | 88.0/- | 91.3/- | 90.0/- |
| RoBERTa-Large | 94.6/88.9 | 89.4/86.5 | 90.2/- | 96.4 | 93.9 | 68.0 | 86.6 | 90.9/- | 92.2/- | 92.4/- |
| XLNet-Large | 95.1/89.7 | 90.6/87.9 | 90.8/- | 97.0 | 94.9 | 69.0 | 85.9 | 90.8/- | 92.3/- | 92.5/- |
| [DeBERTa-Large](https://huggingface.co/microsoft/deberta-large)<sup>1</sup> | 95.5/90.1 | 90.7/88.0 | 91.3/91.1 | 96.5 | 95.3 | 69.5 | 91.0 | 92.6/94.6 | 92.3/- | 92.8/92.5 |
| [DeBERTa-XLarge](https://huggingface.co/microsoft/deberta-xlarge)<sup>1</sup> | -/- | -/- | 91.5/91.2 | 97.0 | - | - | 93.1 | 92.1/94.3 | - | 92.9/92.7 |
| [DeBERTa-V2-XLarge](https://huggingface.co/microsoft/deberta-v2-xlarge)<sup>1</sup> | 95.8/90.8 | 91.4/88.9 | 91.7/91.6 | **97.5** | 95.8 | 71.1 | **93.9** | 92.0/94.2 | 92.3/89.8 | 92.9/92.9 |
| **[DeBERTa-V2-XXLarge](https://huggingface.co/microsoft/deberta-v2-xxlarge)<sup>1,2</sup>** | **96.1/91.4** | **92.2/89.7** | **91.7/91.9** | 97.2 | **96.0** | **72.0** | 93.5 | **93.1/94.9** | **92.7/90.3** | **93.2/93.1** |

--------

#### Notes

- <sup>1</sup> Following RoBERTa, for RTE, MRPC, and STS-B, we fine-tune the tasks starting from [DeBERTa-Large-MNLI](https://huggingface.co/microsoft/deberta-large-mnli), [DeBERTa-XLarge-MNLI](https://huggingface.co/microsoft/deberta-xlarge-mnli), [DeBERTa-V2-XLarge-MNLI](https://huggingface.co/microsoft/deberta-v2-xlarge-mnli), and [DeBERTa-V2-XXLarge-MNLI](https://huggingface.co/microsoft/deberta-v2-xxlarge-mnli). The results on SST-2/QQP/QNLI/SQuADv2 also improve slightly when starting from MNLI fine-tuned models; however, for those 4 tasks we only report the numbers fine-tuned from the pretrained base models.
- <sup>2</sup> To try the **XXLarge** model with **[HF transformers](https://huggingface.co/transformers/main_classes/trainer.html)**, you need to specify **--sharded_ddp**

```bash
cd transformers/examples/text-classification/
export TASK_NAME=mrpc
python -m torch.distributed.launch --nproc_per_node=8 run_glue.py --model_name_or_path microsoft/deberta-v2-xxlarge \
  --task_name $TASK_NAME --do_train --do_eval --max_seq_length 128 --per_device_train_batch_size 4 \
  --learning_rate 3e-6 --num_train_epochs 3 --output_dir /tmp/$TASK_NAME/ --overwrite_output_dir --sharded_ddp --fp16
```

### Citation

If you find DeBERTa useful for your work, please cite the following paper:

```latex
@inproceedings{he2021deberta,
  title={DEBERTA: DECODING-ENHANCED BERT WITH DISENTANGLED ATTENTION},
  author={Pengcheng He and Xiaodong Liu and Jianfeng Gao and Weizhu Chen},
  booktitle={International Conference on Learning Representations},
  year={2021},
  url={https://openreview.net/forum?id=XPZIaotutsD}
}
```
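For reference, the CoLA column in the GLUE table above reports Matthews correlation coefficient (MCC). A minimal pure-Python computation for binary labels — an illustrative sketch, not code from the DeBERTa repository:

```python
import math

def matthews_corrcoef(y_true, y_pred):
    """Matthews correlation coefficient for binary labels (0/1).

    MCC = (TP*TN - FP*FN) / sqrt((TP+FP)(TP+FN)(TN+FP)(TN+FN)),
    defined as 0 when any factor in the denominator is 0.
    """
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return 0.0 if denom == 0 else (tp * tn - fp * fn) / denom

print(matthews_corrcoef([1, 1, 0, 0], [1, 0, 0, 0]))  # ≈ 0.577
```

Unlike plain accuracy, MCC stays near 0 on imbalanced data when a classifier just predicts the majority class, which is why CoLA uses it.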
3,909
sentence-transformers/clip-ViT-B-32-multilingual-v1
2022-06-15T20:17:26.000Z
[ "sentence-transformers", "pytorch", "tf", "distilbert", "feature-extraction", "sentence-similarity", "transformers", "multilingual", "arxiv:2004.09813", "arxiv:1908.10084", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
sentence-similarity
sentence-transformers
null
null
sentence-transformers/clip-ViT-B-32-multilingual-v1
56
173,720
sentence-transformers
2022-03-02T23:29:05
---
pipeline_tag: sentence-similarity
language: multilingual
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
license: apache-2.0
---

# sentence-transformers/clip-ViT-B-32-multilingual-v1

This is a multilingual version of the OpenAI CLIP-ViT-B32 model. You can map text (in 50+ languages) and images to a common dense vector space such that images and their matching texts are close. This model can be used for **image search** (users search through a large collection of images) and for **multilingual zero-shot image classification** (image labels are defined as text).


## Usage (Sentence-Transformers)

Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:

```
pip install -U sentence-transformers
```

Then you can use the model like this:

```python
from sentence_transformers import SentenceTransformer, util
from PIL import Image
import requests
import torch

# We use the original clip-ViT-B-32 for encoding images
img_model = SentenceTransformer('clip-ViT-B-32')

# Our text embedding model is aligned to the img_model and maps 50+
# languages to the same vector space
text_model = SentenceTransformer('sentence-transformers/clip-ViT-B-32-multilingual-v1')


# Now we load and encode the images
def load_image(url_or_path):
    if url_or_path.startswith("http://") or url_or_path.startswith("https://"):
        return Image.open(requests.get(url_or_path, stream=True).raw)
    else:
        return Image.open(url_or_path)

# We load 3 images. You can either pass URLs or
# a path on your disk
img_paths = [
    # Dog image
    "https://unsplash.com/photos/QtxgNsmJQSs/download?ixid=MnwxMjA3fDB8MXxhbGx8fHx8fHx8fHwxNjM1ODQ0MjY3&w=640",

    # Cat image
    "https://unsplash.com/photos/9UUoGaaHtNE/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8Mnx8Y2F0fHwwfHx8fDE2MzU4NDI1ODQ&w=640",

    # Beach image
    "https://unsplash.com/photos/Siuwr3uCir0/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8NHx8YmVhY2h8fDB8fHx8MTYzNTg0MjYzMg&w=640"
]

images = [load_image(img) for img in img_paths]

# Map images to the vector space
img_embeddings = img_model.encode(images)

# Now we encode our text:
texts = [
    "A dog in the snow",
    "Eine Katze",              # German: a cat
    "Una playa con palmeras."  # Spanish: a beach with palm trees
]
text_embeddings = text_model.encode(texts)

# Compute cosine similarities:
cos_sim = util.cos_sim(text_embeddings, img_embeddings)

for text, scores in zip(texts, cos_sim):
    max_img_idx = torch.argmax(scores)
    print("Text:", text)
    print("Score:", scores[max_img_idx])
    print("Path:", img_paths[max_img_idx], "\n")
```

## Multilingual Image Search - Demo

For a demo of multilingual image search, have a look at [Image_Search-multilingual.ipynb](https://github.com/UKPLab/sentence-transformers/tree/master/examples/applications/image-search/Image_Search-multilingual.ipynb) ([Colab version](https://colab.research.google.com/drive/1N6woBKL4dzYsHboDNqtv-8gjZglKOZcn?usp=sharing)).

For more details on image search and zero-shot image classification, have a look at the documentation on [SBERT.net](https://www.sbert.net/examples/applications/image-search/README.html).

## Training

This model was created using [Multilingual Knowledge Distillation](https://arxiv.org/abs/2004.09813). As teacher model, we used the original `clip-ViT-B-32` and then trained a [multilingual DistilBERT](https://huggingface.co/distilbert-base-multilingual-cased) model as student model. Using parallel data, the multilingual student model learns to align the teacher's vector space across many languages. As a result, you get a text embedding model that works for 50+ languages.

The image encoder from CLIP is unchanged, i.e. you can use the original CLIP image encoder to encode images.

Have a look at the [SBERT.net - Multilingual-Models documentation](https://www.sbert.net/examples/training/multilingual/README.html) for more details and for the **training code**.

We used the following 50+ languages to align the vector spaces: ar, bg, ca, cs, da, de, el, es, et, fa, fi, fr, fr-ca, gl, gu, he, hi, hr, hu, hy, id, it, ja, ka, ko, ku, lt, lv, mk, mn, mr, ms, my, nb, nl, pl, pt, pt-br, ro, ru, sk, sl, sq, sr, sv, th, tr, uk, ur, vi, zh-cn, zh-tw.

The original multilingual DistilBERT supports 100+ languages. The model also works for these languages, but might not yield the best results.

## Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: DistilBertModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
  (2): Dense({'in_features': 768, 'out_features': 512, 'bias': False, 'activation_function': 'torch.nn.modules.linear.Identity'})
)
```

## Citing & Authors

This model was trained by [sentence-transformers](https://www.sbert.net/).

If you find this model helpful, feel free to cite our publication [Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks](https://arxiv.org/abs/1908.10084):

```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "http://arxiv.org/abs/1908.10084",
}
```
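The zero-shot image-classification use case described in this card reduces to a cosine-similarity argmax: embed the candidate labels (in any supported language) with the text model, embed the image with the CLIP image model, and pick the label whose embedding is closest. A minimal sketch of that scoring step; the small mock vectors below stand in for real `text_model.encode` / `img_model.encode` outputs, so only the decision logic is illustrated:

```python
import numpy as np

def cosine_sim(a, b):
    # Normalize rows so the dot product equals cosine similarity
    a = a / np.linalg.norm(a, axis=-1, keepdims=True)
    b = b / np.linalg.norm(b, axis=-1, keepdims=True)
    return a @ b.T

# Candidate labels in different languages (the zero-shot "classes")
labels = ["Eine Katze", "Un perro", "une plage"]  # cat (de), dog (es), beach (fr)

# Mock 4-dim embeddings standing in for text_model.encode(labels)
label_emb = np.array([[1.0, 0.0, 0.0, 0.1],
                      [0.0, 1.0, 0.0, 0.1],
                      [0.0, 0.0, 1.0, 0.1]])

# Mock embedding standing in for img_model.encode([dog_image])
img_emb = np.array([[0.1, 0.9, 0.05, 0.1]])

scores = cosine_sim(img_emb, label_emb)[0]
predicted = labels[int(np.argmax(scores))]
print(predicted)  # → Un perro
```

With the real models, `label_emb` and `img_emb` live in the same 512-dimensional CLIP space, so the same argmax works unchanged.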
5,626
[ [ -0.027984619140625, -0.052001953125, 0.0202789306640625, 0.035919189453125, -0.0245513916015625, -0.012939453125, -0.0244140625, -0.02685546875, 0.00968170166015625, 0.0159759521484375, -0.033447265625, -0.042266845703125, -0.039215087890625, 0.0273895263671875, -0.01403045654296875, 0.058380126953125, -0.018218994140625, 0.0187225341796875, -0.00714111328125, -0.020904541015625, -0.032257080078125, -0.034423828125, -0.0284881591796875, -0.0204620361328125, 0.0213623046875, 0.01433563232421875, 0.0474853515625, 0.03363037109375, 0.037017822265625, 0.0303497314453125, -0.00420379638671875, 0.00220489501953125, -0.0244598388671875, -0.0025539398193359375, -0.00396728515625, -0.03460693359375, 0.001209259033203125, -0.00579071044921875, 0.034698486328125, 0.01399993896484375, -0.005649566650390625, 0.0028076171875, -0.00778961181640625, 0.0318603515625, -0.031494140625, 0.0193939208984375, -0.041900634765625, 0.010589599609375, -0.007610321044921875, -0.007610321044921875, -0.0394287109375, -0.0281982421875, 0.0062103271484375, -0.049774169921875, 0.01416778564453125, -0.0031337738037109375, 0.1044921875, 0.004901885986328125, -0.034912109375, -0.034820556640625, -0.02734375, 0.0638427734375, -0.04931640625, 0.0230255126953125, 0.01849365234375, 0.0094146728515625, -0.00965118408203125, -0.051513671875, -0.056976318359375, 0.00439453125, -0.01345062255859375, 0.0185546875, -0.0178375244140625, -0.01163482666015625, 0.01171875, 0.029876708984375, -0.04541015625, -0.0017375946044921875, -0.038726806640625, -0.01023101806640625, 0.04534912109375, -0.006320953369140625, 0.04974365234375, -0.0283355712890625, -0.043243408203125, -0.038909912109375, -0.0303802490234375, 0.0162200927734375, 0.0255584716796875, 0.01073455810546875, -0.0246124267578125, 0.04833984375, 0.00978851318359375, 0.048248291015625, -0.00711822509765625, -0.014373779296875, 0.044464111328125, -0.0230255126953125, -0.0068206787109375, -0.00023353099822998047, 0.09539794921875, 0.038848876953125, 
0.024749755859375, 0.00490570068359375, -0.00443267822265625, 0.0090179443359375, -0.00901031494140625, -0.07489013671875, -0.0244140625, 0.004436492919921875, -0.0292510986328125, -0.01305389404296875, 0.005035400390625, -0.057342529296875, -0.0003294944763183594, 0.006847381591796875, 0.04632568359375, -0.0654296875, -0.005779266357421875, 0.0308837890625, -0.0224609375, 0.0133819580078125, -0.00138092041015625, -0.053131103515625, 0.0012388229370117188, 0.0097198486328125, 0.07354736328125, 0.01404571533203125, -0.0499267578125, -0.00797271728515625, 0.0007224082946777344, -0.0050506591796875, 0.05474853515625, -0.0254058837890625, -0.0137939453125, 0.01336669921875, 0.0296630859375, -0.01556396484375, -0.020263671875, 0.0447998046875, -0.0264434814453125, 0.039947509765625, -0.00963592529296875, -0.031280517578125, -0.040740966796875, 0.023773193359375, -0.0556640625, 0.0914306640625, -0.006816864013671875, -0.073486328125, 0.006572723388671875, -0.04931640625, -0.0341796875, -0.01436614990234375, -0.004207611083984375, -0.04193115234375, -0.00806427001953125, 0.050048828125, 0.051300048828125, -0.012725830078125, 0.018890380859375, -0.0192108154296875, -0.0289306640625, 0.0310821533203125, -0.033843994140625, 0.09063720703125, 0.006195068359375, -0.018890380859375, -0.01284027099609375, -0.038604736328125, -0.0011301040649414062, 0.03680419921875, -0.03173828125, -0.029052734375, -0.01345062255859375, 0.0246124267578125, 0.03192138671875, 0.025482177734375, -0.06298828125, 0.01824951171875, -0.0189361572265625, 0.0482177734375, 0.04193115234375, -0.0029468536376953125, 0.02880859375, -0.0230865478515625, 0.032012939453125, 0.018035888671875, 0.006671905517578125, -0.040863037109375, -0.03466796875, -0.049957275390625, -0.0328369140625, 0.0184478759765625, 0.052947998046875, -0.07073974609375, 0.036529541015625, -0.0302734375, -0.044830322265625, -0.060333251953125, -0.0027828216552734375, 0.0256500244140625, 0.0254058837890625, 0.041229248046875, 
-0.016815185546875, -0.03338623046875, -0.06640625, 0.00511932373046875, -0.0010442733764648438, 0.0166778564453125, 0.0184326171875, 0.054534912109375, -0.01800537109375, 0.057281494140625, -0.050262451171875, -0.035736083984375, -0.0183258056640625, 0.0078277587890625, 0.02783203125, 0.031890869140625, 0.065185546875, -0.073974609375, -0.06109619140625, -0.0029811859130859375, -0.0657958984375, 0.01099395751953125, -0.010528564453125, -0.0034999847412109375, 0.0232391357421875, 0.0242462158203125, -0.054534912109375, 0.024017333984375, 0.046417236328125, -0.0194244384765625, 0.041107177734375, -0.034393310546875, 0.0206451416015625, -0.09844970703125, -0.004535675048828125, 0.01279449462890625, -0.00888824462890625, -0.0302886962890625, 0.012054443359375, 0.0191802978515625, -0.00933074951171875, -0.049041748046875, 0.0321044921875, -0.044677734375, 0.01216888427734375, 0.0032863616943359375, 0.01220703125, 0.0105438232421875, 0.054534912109375, 0.01568603515625, 0.0640869140625, 0.06488037109375, -0.033477783203125, 0.03399658203125, 0.0440673828125, -0.0406494140625, 0.04034423828125, -0.053466796875, -0.0028743743896484375, -0.0215301513671875, 0.01690673828125, -0.074462890625, -0.0034084320068359375, 0.01352691650390625, -0.039947509765625, 0.01538848876953125, 0.0014524459838867188, -0.053863525390625, -0.03912353515625, -0.036895751953125, 0.0249481201171875, 0.029876708984375, -0.052947998046875, 0.0281829833984375, 0.0168609619140625, 0.00009608268737792969, -0.057952880859375, -0.079345703125, 0.00670623779296875, -0.0161895751953125, -0.0513916015625, 0.037200927734375, -0.001407623291015625, 0.019378662109375, 0.0194549560546875, 0.0184326171875, -0.01947021484375, -0.01116943359375, 0.005939483642578125, 0.0223388671875, -0.0145263671875, 0.0123291015625, -0.000835418701171875, 0.00908660888671875, -0.018707275390625, -0.01212310791015625, 0.05902099609375, -0.0260009765625, -0.0198974609375, -0.039215087890625, 0.026153564453125, 0.038360595703125, 
-0.029449462890625, 0.061920166015625, 0.0721435546875, -0.025360107421875, 0.010955810546875, -0.025787353515625, -0.0018224716186523438, -0.03704833984375, 0.04180908203125, -0.036041259765625, -0.059661865234375, 0.038787841796875, 0.0113525390625, 0.0005125999450683594, 0.035491943359375, 0.041473388671875, -0.01230621337890625, 0.0631103515625, 0.051361083984375, -0.00939178466796875, 0.03802490234375, -0.047943115234375, 0.0178070068359375, -0.053741455078125, -0.021881103515625, -0.0292510986328125, -0.01715087890625, -0.052734375, -0.03668212890625, 0.022247314453125, 0.00861358642578125, -0.01561737060546875, 0.052093505859375, -0.052520751953125, 0.03802490234375, 0.0357666015625, 0.0197296142578125, 0.0032444000244140625, 0.0194854736328125, -0.0252838134765625, -0.02276611328125, -0.048980712890625, -0.035430908203125, 0.06475830078125, 0.0264739990234375, 0.052825927734375, 0.0086517333984375, 0.051727294921875, -0.0041656494140625, -0.0077056884765625, -0.061279296875, 0.04052734375, -0.035797119140625, -0.044097900390625, -0.017913818359375, -0.01544189453125, -0.0811767578125, 0.037384033203125, -0.0207977294921875, -0.057861328125, 0.005565643310546875, -0.01568603515625, -0.002193450927734375, 0.034454345703125, -0.0543212890625, 0.07586669921875, -0.030914306640625, -0.018280029296875, -0.0008864402770996094, -0.03887939453125, 0.0169525146484375, 0.00919342041015625, 0.0203399658203125, -0.002925872802734375, 0.0041656494140625, 0.049163818359375, -0.025848388671875, 0.061370849609375, -0.00485992431640625, 0.0033893585205078125, 0.0180206298828125, -0.0085601806640625, 0.0190887451171875, -0.0044097900390625, -0.00250244140625, 0.021636962890625, 0.01235198974609375, -0.0234375, -0.029510498046875, 0.04443359375, -0.07049560546875, -0.0203857421875, -0.044464111328125, -0.044677734375, 0.016998291015625, 0.0277099609375, 0.041534423828125, 0.02532958984375, -0.008544921875, 0.0264129638671875, 0.038238525390625, -0.050628662109375, 
0.048614501953125, 0.0255889892578125, -0.0174102783203125, -0.043365478515625, 0.06591796875, 0.004207611083984375, 0.0008325576782226562, 0.041290283203125, 0.01690673828125, -0.032989501953125, -0.0095062255859375, -0.03759765625, 0.031829833984375, -0.059814453125, -0.0248260498046875, -0.051849365234375, -0.0144195556640625, -0.03680419921875, -0.013641357421875, -0.023406982421875, -0.0218658447265625, -0.038726806640625, -0.0013742446899414062, 0.04388427734375, 0.038909912109375, 0.0009684562683105469, 0.03375244140625, -0.053131103515625, 0.0210418701171875, 0.0033779144287109375, 0.0142669677734375, -0.0036163330078125, -0.036163330078125, -0.01287841796875, 0.00934600830078125, -0.04364013671875, -0.08392333984375, 0.049591064453125, 0.031463623046875, 0.03546142578125, 0.021087646484375, -0.006961822509765625, 0.06512451171875, -0.046875, 0.06549072265625, 0.04095458984375, -0.07061767578125, 0.034881591796875, -0.01261138916015625, 0.025909423828125, 0.0290679931640625, 0.050628662109375, -0.053253173828125, -0.020263671875, -0.0299835205078125, -0.0687255859375, 0.05902099609375, 0.0138092041015625, 0.032135009765625, -0.007434844970703125, 0.00977325439453125, 0.000621795654296875, 0.0014276504516601562, -0.09588623046875, -0.0280303955078125, -0.031768798828125, -0.0301666259765625, -0.01412200927734375, -0.01038360595703125, 0.0125579833984375, -0.04022216796875, 0.05914306640625, -0.003070831298828125, 0.033843994140625, 0.029205322265625, -0.032745361328125, 0.019775390625, 0.0172271728515625, 0.037994384765625, 0.0095367431640625, -0.00977325439453125, 0.00830078125, 0.0087890625, -0.042510986328125, 0.0245513916015625, 0.01253509521484375, -0.0219573974609375, 0.027435302734375, 0.022125244140625, 0.07818603515625, 0.015625, -0.0428466796875, 0.057708740234375, -0.01317596435546875, -0.0203399658203125, -0.0284576416015625, -0.0258331298828125, 0.0101776123046875, 0.022613525390625, 0.02215576171875, -0.004924774169921875, -0.005035400390625, 
-0.040008544921875, 0.012054443359375, 0.0193939208984375, -0.023773193359375, -0.0189361572265625, 0.04534912109375, -0.011016845703125, -0.01446533203125, 0.05706787109375, -0.029876708984375, -0.06182861328125, 0.050994873046875, 0.0633544921875, 0.05145263671875, -0.0055694580078125, 0.041900634765625, 0.05242919921875, 0.035186767578125, -0.0160980224609375, 0.02587890625, 0.013641357421875, -0.061492919921875, -0.019927978515625, -0.043060302734375, -0.0027446746826171875, 0.0091705322265625, -0.040557861328125, 0.03228759765625, -0.0083770751953125, -0.01245880126953125, -0.0005421638488769531, -0.009674072265625, -0.056427001953125, -0.0017070770263671875, 0.005859375, 0.0701904296875, -0.07366943359375, 0.07470703125, 0.07562255859375, -0.06292724609375, -0.058746337890625, -0.00803375244140625, -0.02362060546875, -0.05535888671875, 0.05859375, 0.038055419921875, 0.0148162841796875, -0.00839996337890625, -0.03533935546875, -0.060638427734375, 0.09197998046875, 0.03948974609375, -0.027679443359375, 0.0028705596923828125, 0.01837158203125, 0.04541015625, -0.0242462158203125, 0.03314208984375, 0.00839996337890625, 0.025360107421875, 0.004657745361328125, -0.069580078125, 0.0082550048828125, -0.0274810791015625, 0.020111083984375, 0.00502777099609375, -0.048431396484375, 0.08514404296875, -0.01763916015625, -0.0232086181640625, 0.0103912353515625, 0.0411376953125, 0.0146942138671875, 0.0036773681640625, 0.020660400390625, 0.0579833984375, 0.043548583984375, -0.01328277587890625, 0.063232421875, -0.0204620361328125, 0.052642822265625, 0.06365966796875, 0.0013217926025390625, 0.07012939453125, 0.03851318359375, -0.0147705078125, 0.05078125, 0.048797607421875, -0.030181884765625, 0.05364990234375, -0.0005965232849121094, -0.0097198486328125, -0.00250244140625, -0.001800537109375, -0.0169677734375, 0.03948974609375, 0.01517486572265625, -0.0343017578125, -0.0225372314453125, 0.0286712646484375, 0.00450897216796875, 0.01279449462890625, 0.0014190673828125, 
0.035675048828125, 0.0015687942504882812, -0.03497314453125, 0.05419921875, 0.0168304443359375, 0.07843017578125, -0.0299072265625, 0.00720977783203125, 0.0011644363403320312, 0.0255584716796875, -0.00846099853515625, -0.0819091796875, 0.010009765625, -0.002758026123046875, -0.0018625259399414062, -0.0192108154296875, 0.044281005859375, -0.053253173828125, -0.03924560546875, 0.03228759765625, 0.0157623291015625, 0.0228118896484375, 0.004611968994140625, -0.0809326171875, 0.006572723388671875, 0.01145172119140625, -0.0193023681640625, 0.006633758544921875, 0.034515380859375, 0.0034236907958984375, 0.04876708984375, 0.03271484375, 0.0018739700317382812, 0.0159454345703125, -0.008087158203125, 0.049774169921875, -0.0445556640625, -0.0316162109375, -0.06658935546875, 0.0364990234375, -0.00653839111328125, -0.023773193359375, 0.057830810546875, 0.054168701171875, 0.0858154296875, -0.037139892578125, 0.057281494140625, -0.0204315185546875, 0.00040721893310546875, -0.040740966796875, 0.056854248046875, -0.060302734375, -0.02020263671875, -0.022216796875, -0.053131103515625, -0.023040771484375, 0.06732177734375, -0.0171661376953125, -0.01549530029296875, 0.056671142578125, 0.0672607421875, -0.0172576904296875, -0.0191650390625, 0.0100250244140625, 0.0076141357421875, 0.0188446044921875, 0.0555419921875, 0.04461669921875, -0.06982421875, 0.055419921875, -0.049102783203125, 0.005466461181640625, -0.0016088485717773438, -0.05975341796875, -0.07220458984375, -0.07366943359375, -0.028289794921875, -0.009765625, -0.0108795166015625, 0.05560302734375, 0.0413818359375, -0.053466796875, -0.0243377685546875, 0.0038394927978515625, 0.00302886962890625, -0.01076507568359375, -0.0196075439453125, 0.039642333984375, -0.0156097412109375, -0.07037353515625, -0.0023517608642578125, 0.01097869873046875, 0.0160064697265625, -0.0086517333984375, -0.007480621337890625, -0.05364990234375, -0.005153656005859375, 0.04931640625, 0.00872039794921875, -0.05157470703125, -0.007568359375, 
0.017730712890625, -0.021728515625, 0.01824951171875, 0.026123046875, -0.033111572265625, 0.034698486328125, 0.0367431640625, 0.035430908203125, 0.05218505859375, -0.021331787109375, 0.01202392578125, -0.0592041015625, 0.04046630859375, -0.0062103271484375, 0.05072021484375, 0.039093017578125, -0.0175018310546875, 0.035186767578125, 0.017669677734375, -0.0313720703125, -0.058685302734375, -0.0019626617431640625, -0.08477783203125, -0.0345458984375, 0.08203125, -0.036163330078125, -0.0246429443359375, 0.0199127197265625, -0.0360107421875, 0.031402587890625, -0.0249786376953125, 0.04833984375, 0.06304931640625, 0.0121917724609375, -0.031524658203125, -0.03668212890625, 0.005390167236328125, 0.0189971923828125, -0.04107666015625, -0.0270233154296875, 0.0280303955078125, 0.02740478515625, 0.0307464599609375, 0.0379638671875, -0.01496124267578125, 0.01513671875, -0.005382537841796875, 0.037506103515625, -0.0022335052490234375, -0.00164794921875, -0.0298004150390625, -0.0008444786071777344, -0.022064208984375, -0.04156494140625 ] ]
obi/deid_roberta_i2b2
2022-08-22T13:28:26.000Z
[ "transformers", "pytorch", "roberta", "token-classification", "deidentification", "medical notes", "ehr", "phi", "en", "dataset:I2B2", "arxiv:1907.11692", "license:mit", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
token-classification
obi
null
null
obi/deid_roberta_i2b2
7
172,479
transformers
2022-03-02T23:29:05
---
language:
- en
thumbnail: "https://www.onebraveidea.org/wp-content/uploads/2019/07/OBI-Logo-Website.png"
tags:
- deidentification
- medical notes
- ehr
- phi
datasets:
- I2B2
metrics:
- F1
- Recall
- Precision
widget:
- text: "Physician Discharge Summary Admit date: 10/12/1982 Discharge date: 10/22/1982 Patient Information Jack Reacher, 54 y.o. male (DOB = 1/21/1928)."
- text: "Home Address: 123 Park Drive, San Diego, CA, 03245. Home Phone: 202-555-0199 (home)."
- text: "Hospital Care Team Service: Orthopedics Inpatient Attending: Roger C Kelly, MD Attending phys phone: (634)743-5135 Discharge Unit: HCS843 Primary Care Physician: Hassan V Kim, MD 512-832-5025."
license: mit
---

# Model Description

* A RoBERTa [[Liu et al., 2019]](https://arxiv.org/pdf/1907.11692.pdf) model fine-tuned for de-identification of medical notes.
* Sequence labeling (token classification): the model was trained to predict protected health information (PHI/PII) entities (spans). A list of protected health information categories is given by [HIPAA](https://www.hhs.gov/hipaa/for-professionals/privacy/laws-regulations/index.html).
* A token can either be classified as non-PHI or as one of the 11 PHI types. Token predictions are aggregated into spans by making use of BILOU tagging.
* The PHI labels that were used for training and other details can be found here: [Annotation Guidelines](https://github.com/obi-ml-public/ehr_deidentification/blob/master/AnnotationGuidelines.md)
* More details on how to use this model, the format of data, and other useful information are present in the GitHub repo: [Robust DeID](https://github.com/obi-ml-public/ehr_deidentification).

# How to use

* A demo of how the model works (using model predictions to de-identify a medical note) is on this space: [Medical-Note-Deidentification](https://huggingface.co/spaces/obi/Medical-Note-Deidentification).
* Steps on how this model can be used to run a forward pass can be found here: [Forward Pass](https://github.com/obi-ml-public/ehr_deidentification/tree/master/steps/forward_pass)
* In brief, the steps are:
  * Sentencize (the model aggregates the sentences back to the note level) and tokenize the dataset.
  * Use the predict function of this model to gather the predictions (i.e., predictions for each token).
  * Additionally, the model predictions can be used to remove PHI from the original note/text.

# Dataset

* The I2B2 2014 [[Stubbs and Uzuner, 2015]](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4978170/) dataset was used to train this model.

|           | I2B2                  |            | I2B2                 |            |
| --------- | --------------------- | ---------- | -------------------- | ---------- |
|           | TRAIN SET - 790 NOTES |            | TEST SET - 514 NOTES |            |
| PHI LABEL | COUNT                 | PERCENTAGE | COUNT                | PERCENTAGE |
| DATE      | 7502                  | 43.69      | 4980                 | 44.14      |
| STAFF     | 3149                  | 18.34      | 2004                 | 17.76      |
| HOSP      | 1437                  | 8.37       | 875                  | 7.76       |
| AGE       | 1233                  | 7.18       | 764                  | 6.77       |
| LOC       | 1206                  | 7.02       | 856                  | 7.59       |
| PATIENT   | 1316                  | 7.66       | 879                  | 7.79       |
| PHONE     | 317                   | 1.85       | 217                  | 1.92       |
| ID        | 881                   | 5.13       | 625                  | 5.54       |
| PATORG    | 124                   | 0.72       | 82                   | 0.73       |
| EMAIL     | 4                     | 0.02       | 1                    | 0.01       |
| OTHERPHI  | 2                     | 0.01       | 0                    | 0          |
| TOTAL     | 17171                 | 100        | 11283                | 100        |

# Training procedure

* Steps on how this model was trained can be found here: [Training](https://github.com/obi-ml-public/ehr_deidentification/tree/master/steps/train). The "model_name_or_path" was set to "roberta-large".
* The dataset was sentencized with the en_core_sci_sm sentencizer from spaCy.
* The dataset was then tokenized with a custom tokenizer built on top of the en_core_sci_sm tokenizer from spaCy.
* For each sentence we added 32 tokens on the left (from previous sentences) and 32 tokens on the right (from the next sentences).
* The added tokens are not used for learning - i.e., the loss is not computed on these tokens - they are used as additional context.
* Each sequence contained a maximum of 128 tokens (including the 32 tokens added on). Longer sequences were split.
* The sentencized and tokenized dataset with the token-level labels based on the BILOU notation was used to train the model.
* The model is fine-tuned from a pre-trained RoBERTa model.
* Training details:
  * Input sequence length: 128
  * Batch size: 32 (16 with 2 gradient accumulation steps)
  * Optimizer: AdamW
  * Learning rate: 5e-5
  * Dropout: 0.1

## Results

# Questions?

Post a GitHub issue on the repo: [Robust DeID](https://github.com/obi-ml-public/ehr_deidentification).
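The span-aggregation step mentioned in the model description (turning per-token BILOU labels back into PHI spans) can be illustrated with a short decoding helper. This is a generic sketch of BILOU decoding, not code from the Robust DeID repo; the `bilou_to_spans` name and the toy example are assumptions for illustration:

```python
def bilou_to_spans(tokens, labels):
    """Aggregate per-token BILOU labels into (phi_type, text) spans.

    U-X marks a single-token span; B-X ... L-X delimits a multi-token
    span with I-X tokens inside; O marks non-PHI tokens.
    """
    spans, current, current_type = [], [], None
    for token, label in zip(tokens, labels):
        tag, _, phi_type = label.partition("-")
        if tag == "U":                    # unit-length span
            spans.append((phi_type, token))
        elif tag == "B":                  # beginning of a new span
            current, current_type = [token], phi_type
        elif tag == "I" and current:      # inside the open span
            current.append(token)
        elif tag == "L" and current:      # last token closes the span
            current.append(token)
            spans.append((current_type, " ".join(current)))
            current, current_type = [], None
        else:                             # O (non-PHI) or malformed tag
            current, current_type = [], None
    return spans

tokens = ["Admit", "date", ":", "10/12/1982", ".", "Jack", "Reacher"]
labels = ["O", "O", "O", "U-DATE", "O", "B-PATIENT", "L-PATIENT"]
print(bilou_to_spans(tokens, labels))
# → [('DATE', '10/12/1982'), ('PATIENT', 'Jack Reacher')]
```

The recovered spans can then be mapped back to character offsets in the note to redact or replace the PHI text.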
5,280
[ [ -0.002288818359375, -0.062103271484375, 0.034698486328125, -0.01434326171875, -0.0265045166015625, -0.017730712890625, 0.0021610260009765625, -0.032958984375, 0.0179443359375, 0.042816162109375, -0.04046630859375, -0.0625, -0.051605224609375, -0.0009398460388183594, -0.0158538818359375, 0.0943603515625, 0.007781982421875, 0.0214385986328125, 0.004558563232421875, -0.007080078125, -0.033294677734375, -0.044403076171875, -0.03985595703125, -0.024139404296875, 0.025421142578125, 0.031280517578125, 0.006122589111328125, 0.06964111328125, 0.06414794921875, 0.0197296142578125, -0.0306549072265625, -0.005474090576171875, -0.0208282470703125, -0.013275146484375, 0.01194000244140625, -0.0237274169921875, -0.06536865234375, -0.0020351409912109375, 0.0391845703125, 0.0482177734375, -0.006488800048828125, 0.02667236328125, 0.006229400634765625, 0.042938232421875, -0.025238037109375, 0.01264190673828125, -0.041778564453125, 0.0016832351684570312, -0.025634765625, -0.00736236572265625, -0.032989501953125, -0.0110626220703125, 0.0213165283203125, -0.0301513671875, 0.00798797607421875, -0.0038051605224609375, 0.0916748046875, 0.0165252685546875, -0.036376953125, -0.0138702392578125, -0.046234130859375, 0.039794921875, -0.06890869140625, 0.014892578125, 0.04083251953125, -0.004932403564453125, 0.00659942626953125, -0.054901123046875, -0.02471923828125, -0.009185791015625, -0.01419830322265625, 0.01421356201171875, 0.0085906982421875, 0.0208740234375, 0.04425048828125, 0.039215087890625, -0.043182373046875, 0.0236968994140625, -0.0535888671875, -0.01251220703125, 0.0518798828125, 0.0171661376953125, 0.01261138916015625, -0.02911376953125, -0.045562744140625, 0.00018894672393798828, -0.02203369140625, 0.0093231201171875, 0.0098876953125, 0.0236358642578125, -0.0277099609375, 0.047027587890625, 0.004093170166015625, 0.04443359375, 0.0247039794921875, -0.0253143310546875, 0.059326171875, -0.0136871337890625, -0.035247802734375, 0.022430419921875, 0.0732421875, 0.0168304443359375, 
-0.000025331974029541016, 0.014739990234375, -0.025177001953125, -0.007793426513671875, 0.0063629150390625, -0.05987548828125, -0.031646728515625, 0.025665283203125, -0.05230712890625, -0.0269317626953125, 0.01502227783203125, -0.054840087890625, -0.00899505615234375, -0.040802001953125, 0.0357666015625, -0.055694580078125, 0.004184722900390625, 0.01210784912109375, -0.0031337738037109375, -0.003246307373046875, 0.005382537841796875, -0.0738525390625, 0.0218048095703125, 0.04217529296875, 0.059478759765625, 0.0016956329345703125, -0.0157623291015625, -0.03131103515625, 0.00921630859375, -0.00788116455078125, 0.040679931640625, -0.00031280517578125, -0.028961181640625, -0.0118255615234375, 0.01552581787109375, -0.042510986328125, -0.03961181640625, 0.05419921875, -0.019805908203125, 0.031585693359375, -0.00302886962890625, -0.056549072265625, -0.0209503173828125, 0.0212860107421875, -0.06072998046875, 0.07232666015625, 0.0144500732421875, -0.090576171875, 0.03546142578125, -0.056610107421875, -0.033905029296875, 0.002948760986328125, 0.0014009475708007812, -0.06927490234375, -0.00847625732421875, 0.00724029541015625, 0.037109375, -0.006015777587890625, 0.03472900390625, -0.023712158203125, -0.0310516357421875, 0.004222869873046875, -0.030364990234375, 0.08294677734375, 0.013580322265625, -0.0396728515625, 0.0035076141357421875, -0.08599853515625, -0.00888824462890625, 0.01000213623046875, -0.059173583984375, -0.028106689453125, -0.0306549072265625, 0.018310546875, 0.01372528076171875, 0.01947021484375, -0.03302001953125, 0.00943756103515625, -0.050140380859375, 0.032562255859375, 0.0526123046875, 0.0224609375, 0.017730712890625, -0.038421630859375, 0.02301025390625, 0.0301666259765625, 0.0180816650390625, -0.0192108154296875, -0.039031982421875, -0.04864501953125, -0.04638671875, 0.04388427734375, 0.06451416015625, 0.00180816650390625, 0.07049560546875, -0.0302276611328125, -0.030303955078125, -0.047454833984375, -0.003429412841796875, 0.0390625, 0.06512451171875, 
0.041107177734375, -0.02001953125, -0.04254150390625, -0.06597900390625, 0.004241943359375, -0.0242462158203125, 0.0047760009765625, 0.018951416015625, 0.05169677734375, -0.035614013671875, 0.047332763671875, -0.026763916015625, -0.040740966796875, -0.0142669677734375, 0.01349639892578125, 0.021514892578125, 0.055999755859375, 0.0270538330078125, -0.051483154296875, -0.0254364013671875, -0.013458251953125, -0.052276611328125, 0.00896453857421875, -0.026824951171875, -0.005779266357421875, 0.011444091796875, 0.0340576171875, -0.045501708984375, 0.048797607421875, 0.011444091796875, -0.01654052734375, 0.0217132568359375, -0.037109375, 0.003665924072265625, -0.10321044921875, 0.04144287109375, 0.00746917724609375, -0.0007505416870117188, -0.049285888671875, -0.0291290283203125, 0.0102081298828125, 0.0030460357666015625, -0.015167236328125, 0.031402587890625, -0.04547119140625, 0.0109710693359375, 0.00920867919921875, -0.01207733154296875, 0.007293701171875, 0.048095703125, 0.0005116462707519531, 0.0249176025390625, 0.036163330078125, -0.05029296875, -0.01482391357421875, 0.038482666015625, -0.027130126953125, 0.02947998046875, -0.05364990234375, 0.0010662078857421875, -0.01454925537109375, 0.006313323974609375, -0.064453125, -0.01554107666015625, 0.0323486328125, -0.03216552734375, 0.044647216796875, -0.01000213623046875, -0.020538330078125, -0.04449462890625, -0.0130157470703125, 0.0169219970703125, 0.0318603515625, -0.0225372314453125, 0.046051025390625, 0.0282440185546875, -0.004390716552734375, -0.044647216796875, -0.06689453125, -0.00530242919921875, -0.038330078125, -0.022369384765625, 0.058441162109375, -0.0033779144287109375, -0.0211029052734375, -0.005466461181640625, 0.006336212158203125, -0.0228118896484375, 0.0255279541015625, 0.03924560546875, 0.047515869140625, -0.009735107421875, 0.0310516357421875, 0.018402099609375, -0.00643157958984375, 0.01059722900390625, -0.01357269287109375, 0.02703857421875, 0.005580902099609375, -0.0269775390625, 
-0.07147216796875, 0.0207977294921875, 0.054290771484375, -0.03350830078125, 0.0596923828125, 0.042816162109375, -0.050140380859375, 0.0225830078125, -0.032562255859375, -0.0239105224609375, -0.0277252197265625, 0.03546142578125, -0.02984619140625, -0.033782958984375, 0.046173095703125, -0.007038116455078125, 0.0196075439453125, 0.05767822265625, 0.03173828125, -0.006153106689453125, 0.06634521484375, 0.037109375, -0.02880859375, 0.0295257568359375, -0.05218505859375, 0.0086669921875, -0.069580078125, -0.038818359375, -0.037994384765625, -0.0172882080078125, -0.03900146484375, -0.0074462890625, 0.017425537109375, 0.01201629638671875, -0.035888671875, 0.0189971923828125, -0.04718017578125, -0.002613067626953125, 0.050689697265625, 0.041046142578125, 0.005008697509765625, -0.008544921875, -0.0247955322265625, 0.00600433349609375, -0.047332763671875, -0.04473876953125, 0.091552734375, 0.049652099609375, 0.03851318359375, -0.0138092041015625, 0.0755615234375, 0.0168914794921875, 0.016357421875, -0.049957275390625, 0.0287322998046875, -0.01331329345703125, -0.043792724609375, -0.0189361572265625, -0.029937744140625, -0.078125, -0.00736236572265625, -0.02301025390625, -0.06683349609375, 0.0144195556640625, 0.0128326416015625, -0.04168701171875, 0.027435302734375, -0.0362548828125, 0.080810546875, -0.0283203125, -0.0025539398193359375, -0.00299072265625, -0.07293701171875, 0.022125244140625, -0.006954193115234375, 0.0095672607421875, 0.004016876220703125, 0.0169677734375, 0.06817626953125, -0.04510498046875, 0.062255859375, -0.011962890625, -0.0006189346313476562, 0.031951904296875, -0.0238189697265625, 0.036590576171875, -0.00878143310546875, -0.006305694580078125, 0.0138702392578125, 0.00022113323211669922, -0.0134429931640625, -0.00785064697265625, 0.04107666015625, -0.049224853515625, -0.0297698974609375, -0.030792236328125, -0.0293426513671875, 0.01187896728515625, 0.030059814453125, 0.0458984375, 0.05267333984375, -0.0054931640625, -0.0016574859619140625, 
0.06951904296875, -0.01396942138671875, 0.02191162109375, 0.019989013671875, 0.019256591796875, -0.047698974609375, 0.038848876953125, 0.007396697998046875, 0.00665283203125, 0.04058837890625, 0.014739990234375, -0.03546142578125, -0.03924560546875, -0.037017822265625, 0.035400390625, -0.05352783203125, -0.004055023193359375, -0.07373046875, -0.02362060546875, -0.04656982421875, 0.01971435546875, -0.0019683837890625, -0.0230865478515625, -0.06536865234375, -0.03729248046875, 0.035552978515625, 0.04559326171875, 0.004791259765625, 0.017730712890625, -0.065185546875, 0.00560760498046875, -0.0033283233642578125, 0.00867462158203125, -0.0123291015625, -0.07421875, -0.0227508544921875, 0.005107879638671875, -0.0133514404296875, -0.0738525390625, 0.0282440185546875, 0.01110076904296875, 0.04486083984375, 0.04681396484375, 0.004848480224609375, 0.06201171875, -0.038604736328125, 0.049652099609375, 0.0102081298828125, -0.05535888671875, 0.059661865234375, -0.02362060546875, 0.009124755859375, 0.05908203125, 0.0277557373046875, -0.0186767578125, -0.0138092041015625, -0.071533203125, -0.07330322265625, 0.052703857421875, -0.01194000244140625, -0.0031757354736328125, -0.002925872802734375, 0.047821044921875, -0.01123046875, 0.005504608154296875, -0.05609130859375, -0.030303955078125, 0.0104522705078125, -0.048675537109375, -0.0025691986083984375, -0.02880859375, -0.0090789794921875, -0.022430419921875, 0.06964111328125, 0.00909423828125, 0.0276336669921875, 0.028350830078125, -0.0006165504455566406, -0.004032135009765625, 0.01421356201171875, 0.04351806640625, 0.0247955322265625, -0.0288848876953125, -0.00878143310546875, 0.003757476806640625, -0.049652099609375, -0.00021982192993164062, 0.036468505859375, -0.032257080078125, 0.0162200927734375, 0.0249786376953125, 0.07373046875, 0.0035400390625, -0.04443359375, 0.03704833984375, 0.00228118896484375, -0.035430908203125, -0.035003662109375, -0.005214691162109375, -0.006603240966796875, -0.0020885467529296875, 
-0.0038280487060546875, 0.00881195068359375, 0.0250244140625, -0.0246734619140625, 0.027923583984375, 0.0161285400390625, -0.0478515625, -0.0167236328125, 0.07720947265625, 9.5367431640625e-7, -0.0272369384765625, 0.05938720703125, 0.00183868408203125, -0.036163330078125, 0.0572509765625, 0.0236663818359375, 0.06646728515625, -0.01837158203125, 0.00452423095703125, 0.060882568359375, 0.00970458984375, -0.0107421875, 0.027130126953125, 0.00612640380859375, -0.0158233642578125, -0.004924774169921875, -0.0276641845703125, 0.00811004638671875, 0.031402587890625, -0.07293701171875, 0.0264434814453125, -0.0367431640625, -0.03607177734375, 0.0187225341796875, 0.007091522216796875, -0.0484619140625, 0.03460693359375, 0.004344940185546875, 0.06732177734375, -0.07293701171875, 0.061920166015625, 0.057708740234375, -0.047882080078125, -0.07977294921875, 0.006275177001953125, 0.0088043212890625, -0.033172607421875, 0.05487060546875, 0.0382080078125, 0.03436279296875, -0.00803375244140625, -0.00775146484375, -0.0509033203125, 0.095703125, 0.0066375732421875, -0.059814453125, -0.0108489990234375, 0.005523681640625, 0.04144287109375, -0.00873565673828125, 0.017974853515625, 0.0286712646484375, 0.017333984375, -0.0111083984375, -0.078369140625, 0.00555419921875, -0.02630615234375, -0.0131683349609375, 0.012359619140625, -0.041595458984375, 0.07891845703125, -0.03228759765625, -0.00318145751953125, 0.0249786376953125, 0.0281524658203125, 0.027191162109375, 0.04583740234375, 0.03765869140625, 0.0712890625, 0.0853271484375, 0.003253936767578125, 0.055328369140625, -0.0367431640625, 0.0263671875, 0.09136962890625, -0.01346588134765625, 0.0498046875, 0.036712646484375, 0.0021266937255859375, 0.05303955078125, 0.06915283203125, -0.0066375732421875, 0.037750244140625, 0.0207672119140625, -0.017242431640625, 0.0020999908447265625, 0.00704193115234375, -0.0199432373046875, 0.027099609375, 0.0205230712890625, -0.0758056640625, 0.0237579345703125, 0.00795745849609375, 0.01117706298828125, 
-0.00336456298828125, -0.01377105712890625, 0.07220458984375, -0.0050506591796875, -0.053955078125, 0.0380859375, 0.005588531494140625, 0.02850341796875, -0.027923583984375, -0.00551605224609375, -0.0010089874267578125, 0.03729248046875, -0.0103607177734375, -0.021209716796875, 0.005161285400390625, 0.000012040138244628906, -0.026092529296875, -0.00827789306640625, 0.034454345703125, -0.03778076171875, -0.041778564453125, 0.00904083251953125, 0.0264892578125, 0.0204620361328125, 0.0165863037109375, -0.08221435546875, -0.0254974365234375, 0.00724029541015625, -0.0196380615234375, 0.01334381103515625, 0.0484619140625, 0.00005239248275756836, 0.04071044921875, 0.0289459228515625, 0.006622314453125, -0.01456451416015625, 0.01451873779296875, 0.06585693359375, -0.037017822265625, -0.044647216796875, -0.06671142578125, 0.0533447265625, -0.009979248046875, -0.044097900390625, 0.038299560546875, 0.06512451171875, 0.05316162109375, -0.0019893646240234375, 0.04632568359375, -0.0005617141723632812, 0.058624267578125, -0.048675537109375, 0.05126953125, -0.03875732421875, 0.029388427734375, -0.0303802490234375, -0.0229034423828125, -0.0170440673828125, 0.052520751953125, -0.0159759521484375, 0.0078582763671875, 0.031951904296875, 0.052276611328125, -0.002445220947265625, -0.00864410400390625, -0.00518035888671875, 0.00991058349609375, 0.01390838623046875, 0.04302978515625, 0.025421142578125, -0.06622314453125, 0.0241241455078125, -0.034515380859375, -0.0164642333984375, -0.016326904296875, -0.04168701171875, -0.0792236328125, -0.0460205078125, -0.050048828125, -0.03509521484375, -0.00124359130859375, 0.06634521484375, 0.0538330078125, -0.050994873046875, 0.004123687744140625, -0.017913818359375, -0.0203399658203125, -0.0223236083984375, -0.01352691650390625, 0.0572509765625, 0.0020542144775390625, -0.047210693359375, 0.0020084381103515625, 0.00664520263671875, 0.041351318359375, 0.00069427490234375, -0.0022411346435546875, -0.032135009765625, -0.0153045654296875, 
0.00914764404296875, 0.0074615478515625, -0.04071044921875, 0.0052642822265625, 0.005458831787109375, -0.033538818359375, 0.01824951171875, 0.04388427734375, -0.03924560546875, 0.02178955078125, 0.0234375, 0.0323486328125, 0.0310516357421875, 0.0218048095703125, 0.006656646728515625, -0.0282745361328125, 0.00902557373046875, 0.02740478515625, 0.03875732421875, 0.0167236328125, -0.0517578125, 0.03472900390625, 0.0252227783203125, -0.049224853515625, -0.060302734375, -0.0226593017578125, -0.07830810546875, -0.0125885009765625, 0.07373046875, -0.0305328369140625, -0.0294647216796875, -0.0015668869018554688, -0.026580810546875, 0.03216552734375, -0.019195556640625, 0.062744140625, 0.0206146240234375, 0.007411956787109375, -0.0023193359375, -0.0288848876953125, 0.0350341796875, 0.02734375, -0.064453125, -0.0022735595703125, 0.042327880859375, 0.0279998779296875, 0.005779266357421875, 0.06689453125, -0.0182342529296875, 0.0125732421875, -0.01026153564453125, 0.005542755126953125, 0.0006451606750488281, -0.0190277099609375, -0.045501708984375, -0.011260986328125, 0.0006432533264160156, -0.046875 ] ]
cross-encoder/ms-marco-TinyBERT-L-2-v2
2021-08-05T08:39:45.000Z
[ "transformers", "pytorch", "jax", "bert", "text-classification", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
text-classification
cross-encoder
null
null
cross-encoder/ms-marco-TinyBERT-L-2-v2
9
171,523
transformers
2022-03-02T23:29:05
---
license: apache-2.0
---

# Cross-Encoder for MS Marco

This model was trained on the [MS Marco Passage Ranking](https://github.com/microsoft/MSMARCO-Passage-Ranking) task.

The model can be used for Information Retrieval: given a query, encode the query with all possible passages (e.g. retrieved with ElasticSearch), then sort the passages in decreasing order of score. See [SBERT.net Retrieve & Re-rank](https://www.sbert.net/examples/applications/retrieve_rerank/README.html) for more details. The training code is available here: [SBERT.net Training MS Marco](https://github.com/UKPLab/sentence-transformers/tree/master/examples/training/ms_marco)

## Usage with Transformers

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

model = AutoModelForSequenceClassification.from_pretrained('model_name')
tokenizer = AutoTokenizer.from_pretrained('model_name')

features = tokenizer(
    ['How many people live in Berlin?', 'How many people live in Berlin?'],
    ['Berlin has a population of 3,520,031 registered inhabitants in an area of 891.82 square kilometers.',
     'New York City is famous for the Metropolitan Museum of Art.'],
    padding=True, truncation=True, return_tensors="pt")

model.eval()
with torch.no_grad():
    scores = model(**features).logits
    print(scores)
```

## Usage with SentenceTransformers

The usage becomes easier when you have [SentenceTransformers](https://www.sbert.net/) installed.
Then, you can use the pre-trained models like this:

```python
from sentence_transformers import CrossEncoder

model = CrossEncoder('model_name', max_length=512)
scores = model.predict([('Query', 'Paragraph1'), ('Query', 'Paragraph2'), ('Query', 'Paragraph3')])
```

## Performance

In the following table, we provide various pre-trained Cross-Encoders together with their performance on the [TREC Deep Learning 2019](https://microsoft.github.io/TREC-2019-Deep-Learning/) and the [MS Marco Passage Reranking](https://github.com/microsoft/MSMARCO-Passage-Ranking/) dataset.

| Model-Name | NDCG@10 (TREC DL 19) | MRR@10 (MS Marco Dev) | Docs / Sec |
| ------------- | :------------- | ----- | --- |
| **Version 2 models** | | | |
| cross-encoder/ms-marco-TinyBERT-L-2-v2 | 69.84 | 32.56 | 9000 |
| cross-encoder/ms-marco-MiniLM-L-2-v2 | 71.01 | 34.85 | 4100 |
| cross-encoder/ms-marco-MiniLM-L-4-v2 | 73.04 | 37.70 | 2500 |
| cross-encoder/ms-marco-MiniLM-L-6-v2 | 74.30 | 39.01 | 1800 |
| cross-encoder/ms-marco-MiniLM-L-12-v2 | 74.31 | 39.02 | 960 |
| **Version 1 models** | | | |
| cross-encoder/ms-marco-TinyBERT-L-2 | 67.43 | 30.15 | 9000 |
| cross-encoder/ms-marco-TinyBERT-L-4 | 68.09 | 34.50 | 2900 |
| cross-encoder/ms-marco-TinyBERT-L-6 | 69.57 | 36.13 | 680 |
| cross-encoder/ms-marco-electra-base | 71.99 | 36.41 | 340 |
| **Other models** | | | |
| nboost/pt-tinybert-msmarco | 63.63 | 28.80 | 2900 |
| nboost/pt-bert-base-uncased-msmarco | 70.94 | 34.75 | 340 |
| nboost/pt-bert-large-msmarco | 73.36 | 36.48 | 100 |
| Capreolus/electra-base-msmarco | 71.23 | 36.89 | 340 |
| amberoad/bert-multilingual-passage-reranking-msmarco | 68.40 | 35.54 | 330 |
| sebastian-hofstaetter/distilbert-cat-margin_mse-T2-msmarco | 72.82 | 37.88 | 720 |

Note: Runtime was computed on a V100 GPU.
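The re-ranking step the card describes (score every query–passage pair, then sort passages by score in decreasing order) can be sketched without the model itself. This is a minimal illustration only: the `scores` below are made-up placeholder values standing in for the logits a cross-encoder's `predict` call would return, not real model output.

```python
# Candidate passages, e.g. the top hits from a first-stage retriever such as ElasticSearch.
passages = [
    "Berlin has a population of 3,520,031 registered inhabitants.",
    "New York City is famous for the Metropolitan Museum of Art.",
    "Berlin is the capital of Germany.",
]

# Placeholder relevance scores for the query "How many people live in Berlin?"
# (one score per query-passage pair; assumed values, not actual model output).
scores = [9.2, -8.1, 5.6]

# Sort passage indices by score in decreasing order, as described above.
ranking = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
reranked = [passages[i] for i in ranking]

print(reranked[0])  # most relevant passage first
```

In a real pipeline the `scores` list would come from `CrossEncoder.predict` (or the raw logits in the Transformers variant); the sorting step is identical either way.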
3,233
[ [ -0.03228759765625, -0.043670654296875, 0.0250396728515625, 0.01168060302734375, -0.0127105712890625, 0.01073455810546875, -0.01338958740234375, -0.038543701171875, 0.025146484375, 0.0255889892578125, -0.041229248046875, -0.051055908203125, -0.058013916015625, 0.003063201904296875, -0.0333251953125, 0.059326171875, -0.0015039443969726562, 0.0123291015625, -0.01372528076171875, -0.00827789306640625, -0.0193939208984375, -0.0308074951171875, -0.04132080078125, -0.0218048095703125, 0.035980224609375, 0.01593017578125, 0.05816650390625, 0.02978515625, 0.0419921875, 0.032958984375, -0.0084075927734375, 0.007022857666015625, -0.01415252685546875, 0.00011080503463745117, 0.005214691162109375, -0.0286865234375, -0.041717529296875, -0.0091400146484375, 0.033416748046875, 0.0266265869140625, 0.0005674362182617188, 0.019989013671875, -0.00020313262939453125, 0.04327392578125, -0.0293731689453125, -0.00376129150390625, -0.02587890625, 0.0184478759765625, -0.0149383544921875, -0.018768310546875, -0.035186767578125, -0.0162811279296875, 0.01331329345703125, -0.043914794921875, 0.0299072265625, 0.01174163818359375, 0.09521484375, 0.026275634765625, -0.01641845703125, -0.0196533203125, -0.035400390625, 0.05419921875, -0.051483154296875, 0.053314208984375, 0.01374053955078125, 0.01324462890625, 0.0088958740234375, -0.07318115234375, -0.033660888671875, -0.0162506103515625, -0.01439666748046875, 0.01922607421875, -0.031829833984375, -0.006427764892578125, 0.0313720703125, 0.0309600830078125, -0.07501220703125, -0.00608062744140625, -0.053802490234375, -0.0092010498046875, 0.04913330078125, 0.0202484130859375, 0.020172119140625, -0.0188446044921875, -0.02423095703125, -0.01071929931640625, -0.037811279296875, 0.016326904296875, 0.0207366943359375, 0.0008802413940429688, -0.015472412109375, 0.03082275390625, -0.01776123046875, 0.059600830078125, 0.0084075927734375, 0.007080078125, 0.058074951171875, -0.019500732421875, -0.0178985595703125, 0.00186920166015625, 0.07366943359375, 
0.021514892578125, 0.00785064697265625, -0.0095367431640625, -0.0169830322265625, -0.01285552978515625, 0.0303497314453125, -0.066162109375, -0.020111083984375, 0.0219879150390625, -0.040313720703125, -0.0100860595703125, 0.01227569580078125, -0.06396484375, 0.01207733154296875, -0.0098876953125, 0.045501708984375, -0.030059814453125, 0.0024127960205078125, 0.01776123046875, -0.0106964111328125, 0.02166748046875, 0.01345062255859375, -0.0557861328125, 0.0010099411010742188, 0.0260009765625, 0.070556640625, -0.00864410400390625, -0.028472900390625, -0.01212310791015625, -0.002803802490234375, -0.01251220703125, 0.042694091796875, -0.03546142578125, -0.0235137939453125, -0.005451202392578125, 0.021514892578125, -0.01129150390625, -0.0229034423828125, 0.0535888671875, -0.034820556640625, 0.038299560546875, -0.00946807861328125, -0.0261993408203125, -0.01183319091796875, 0.017730712890625, -0.059112548828125, 0.091064453125, 0.002994537353515625, -0.06390380859375, 0.01236724853515625, -0.052947998046875, -0.0256805419921875, -0.01207733154296875, 0.0030956268310546875, -0.05743408203125, 0.0034542083740234375, 0.030609130859375, 0.019378662109375, -0.0241241455078125, 0.007495880126953125, -0.0130767822265625, -0.0340576171875, 0.012298583984375, -0.031219482421875, 0.08184814453125, 0.0298309326171875, -0.0369873046875, 0.0040435791015625, -0.0506591796875, 0.00890350341796875, 0.0214385986328125, -0.031890869140625, -0.0004322528839111328, -0.021514892578125, 0.01038360595703125, 0.0301055908203125, 0.032958984375, -0.037689208984375, 0.007785797119140625, -0.02099609375, 0.036376953125, 0.034698486328125, -0.008209228515625, 0.0258941650390625, -0.0226593017578125, 0.05029296875, 0.0095062255859375, 0.03265380859375, 0.00067901611328125, -0.047607421875, -0.06622314453125, -0.01019287109375, 0.038299560546875, 0.044158935546875, -0.055450439453125, 0.04083251953125, -0.038970947265625, -0.0531005859375, -0.062042236328125, -0.007534027099609375, 0.031524658203125, 
0.0255126953125, 0.049713134765625, -0.006671905517578125, -0.055084228515625, -0.07501220703125, -0.025146484375, 0.0017757415771484375, 0.002986907958984375, 0.0179595947265625, 0.048370361328125, -0.019775390625, 0.055328369140625, -0.040008544921875, -0.0163116455078125, -0.034423828125, 0.0002512931823730469, 0.019012451171875, 0.050018310546875, 0.04730224609375, -0.052642822265625, -0.0408935546875, -0.0142669677734375, -0.052215576171875, 0.005352020263671875, 0.0027637481689453125, -0.0104522705078125, 0.02032470703125, 0.04638671875, -0.051971435546875, 0.051177978515625, 0.037109375, -0.034393310546875, 0.0279998779296875, -0.03302001953125, 0.0218658447265625, -0.09063720703125, 0.007625579833984375, -0.0025157928466796875, -0.01189422607421875, -0.038787841796875, -0.0119476318359375, 0.006992340087890625, -0.0018863677978515625, -0.0261077880859375, 0.0248565673828125, -0.04547119140625, -0.0025920867919921875, 0.009185791015625, 0.00579833984375, 0.01277923583984375, 0.047637939453125, 0.024566650390625, 0.058349609375, 0.039031982421875, -0.0266571044921875, 0.0180816650390625, 0.0276031494140625, -0.04620361328125, 0.0284881591796875, -0.0693359375, -0.0006427764892578125, -0.00978851318359375, 0.00811767578125, -0.07489013671875, 0.01265716552734375, 0.0178985595703125, -0.0655517578125, 0.0234832763671875, -0.0102691650390625, -0.0296478271484375, -0.049591064453125, -0.013458251953125, 0.0246734619140625, 0.037750244140625, -0.03564453125, 0.043487548828125, 0.0255279541015625, 0.0005202293395996094, -0.052947998046875, -0.091552734375, 0.01369476318359375, -0.00421905517578125, -0.055267333984375, 0.04754638671875, -0.0153961181640625, 0.01108551025390625, 0.002925872802734375, -0.00334930419921875, -0.00313568115234375, -0.00838470458984375, 0.01454925537109375, 0.0247650146484375, -0.0140838623046875, 0.0011796951293945312, 0.0009307861328125, -0.01641845703125, 0.00524139404296875, -0.01568603515625, 0.04803466796875, -0.01324462890625, 
-0.0095062255859375, -0.018951416015625, 0.01490020751953125, 0.037322998046875, -0.042388916015625, 0.054046630859375, 0.061004638671875, -0.024322509765625, -0.00823974609375, -0.031402587890625, -0.007595062255859375, -0.037811279296875, 0.03387451171875, -0.043548583984375, -0.057952880859375, 0.039886474609375, 0.0227813720703125, 0.0020294189453125, 0.038360595703125, 0.03662109375, -0.0015134811401367188, 0.07733154296875, 0.036346435546875, -0.003635406494140625, 0.04937744140625, -0.053619384765625, 0.022308349609375, -0.058074951171875, -0.044281005859375, -0.049591064453125, -0.033447265625, -0.051300048828125, -0.0264129638671875, 0.0228729248046875, -0.0100860595703125, -0.01702880859375, 0.05230712890625, -0.05633544921875, 0.024322509765625, 0.0546875, 0.0208892822265625, 0.00777435302734375, 0.01097869873046875, -0.0191802978515625, -0.00926971435546875, -0.06256103515625, -0.024505615234375, 0.097900390625, 0.012603759765625, 0.0526123046875, 0.0012807846069335938, 0.057952880859375, 0.023101806640625, -0.00286102294921875, -0.03240966796875, 0.032989501953125, -0.01132965087890625, -0.05841064453125, -0.017303466796875, -0.0316162109375, -0.08074951171875, 0.0255584716796875, -0.0159912109375, -0.043701171875, 0.038543701171875, -0.00670623779296875, -0.029266357421875, 0.02374267578125, -0.0419921875, 0.09814453125, -0.031280517578125, -0.0268707275390625, -0.007358551025390625, -0.0555419921875, 0.0127410888671875, 0.01555633544921875, 0.0025653839111328125, 0.006885528564453125, -0.0124664306640625, 0.05682373046875, -0.02764892578125, 0.0261077880859375, -0.010955810546875, 0.01145172119140625, 0.014068603515625, -0.007411956787109375, 0.02874755859375, -0.0006070137023925781, -0.00780487060546875, 0.0250091552734375, -0.0032405853271484375, -0.029876708984375, -0.03192138671875, 0.060943603515625, -0.06878662109375, -0.0313720703125, -0.040771484375, -0.0272674560546875, -0.002227783203125, 0.01551055908203125, 0.057769775390625, 
0.031707763671875, 0.00033092498779296875, 0.0323486328125, 0.05584716796875, -0.0235748291015625, 0.04302978515625, 0.0284271240234375, -0.003986358642578125, -0.055389404296875, 0.058074951171875, 0.02294921875, 0.012481689453125, 0.0430908203125, -0.0136566162109375, -0.0357666015625, -0.040557861328125, -0.0266571044921875, 0.01265716552734375, -0.039947509765625, -0.01678466796875, -0.05487060546875, -0.0307769775390625, -0.037567138671875, -0.005542755126953125, -0.03125, -0.032012939453125, -0.0180816650390625, -0.01311492919921875, 0.0164794921875, 0.045745849609375, 0.00982666015625, 0.01531219482421875, -0.046051025390625, 0.01605224609375, 0.00049591064453125, 0.011810302734375, -0.0079498291015625, -0.0657958984375, -0.03411865234375, -0.005157470703125, -0.0308380126953125, -0.06207275390625, 0.051055908203125, -0.006534576416015625, 0.054901123046875, 0.0113983154296875, 0.004344940185546875, 0.056427001953125, -0.029205322265625, 0.06744384765625, 0.0123291015625, -0.064697265625, 0.050018310546875, 0.0019779205322265625, 0.02935791015625, 0.04730224609375, 0.04193115234375, -0.040008544921875, -0.0193939208984375, -0.057891845703125, -0.071044921875, 0.0675048828125, 0.0224609375, -0.0082244873046875, 0.005519866943359375, 0.00151824951171875, -0.00902557373046875, 0.021209716796875, -0.07275390625, -0.036865234375, -0.034027099609375, -0.028533935546875, -0.023681640625, -0.01242828369140625, 0.01543426513671875, -0.047027587890625, 0.0582275390625, 0.01316070556640625, 0.042999267578125, 0.0455322265625, -0.031036376953125, 0.006595611572265625, 0.00827789306640625, 0.05181884765625, 0.048370361328125, -0.0203857421875, -0.0017566680908203125, 0.01593017578125, -0.038360595703125, -0.01056671142578125, 0.01776123046875, -0.034820556640625, 0.02899169921875, 0.025360107421875, 0.07562255859375, 0.0168609619140625, -0.0290374755859375, 0.04888916015625, 0.0038509368896484375, -0.020904541015625, -0.0377197265625, -0.0149383544921875, 
0.0015134811401367188, 0.028717041015625, 0.0185089111328125, 0.00479888916015625, 0.0191497802734375, -0.031036376953125, 0.01174163818359375, 0.0269012451171875, -0.044158935546875, -0.015411376953125, 0.0682373046875, 0.01306915283203125, -0.032073974609375, 0.0516357421875, 0.0017795562744140625, -0.061553955078125, 0.03900146484375, 0.0273590087890625, 0.078857421875, -0.0213623046875, 0.01332855224609375, 0.052215576171875, 0.051361083984375, 0.005550384521484375, 0.0262603759765625, -0.011871337890625, -0.039886474609375, -0.0010747909545898438, -0.0413818359375, -0.00887298583984375, -0.005031585693359375, -0.050811767578125, 0.022308349609375, -0.01331329345703125, -0.0245361328125, -0.01439666748046875, 0.02044677734375, -0.0631103515625, 0.012481689453125, 0.004119873046875, 0.083740234375, -0.04107666015625, 0.0804443359375, 0.043487548828125, -0.066162109375, -0.043731689453125, -0.00977325439453125, -0.0299835205078125, -0.052215576171875, 0.04278564453125, 0.009490966796875, 0.00878143310546875, -0.000026702880859375, -0.0267486572265625, -0.061767578125, 0.111083984375, 0.0152740478515625, -0.05120849609375, -0.01378631591796875, 0.033111572265625, 0.03857421875, -0.02593994140625, 0.050933837890625, 0.032623291015625, 0.037200927734375, -0.0145721435546875, -0.0706787109375, 0.01125335693359375, -0.03704833984375, -0.0032634735107421875, 0.006015777587890625, -0.062286376953125, 0.0794677734375, -0.0166473388671875, 0.01275634765625, 0.0125732421875, 0.04534912109375, 0.01544952392578125, 0.025848388671875, 0.0264434814453125, 0.0634765625, 0.05157470703125, -0.0298309326171875, 0.0665283203125, -0.042327880859375, 0.042877197265625, 0.06787109375, 0.015289306640625, 0.06707763671875, 0.032379150390625, -0.024627685546875, 0.05596923828125, 0.054779052734375, -0.016387939453125, 0.038787841796875, 0.0029277801513671875, 0.0012254714965820312, -0.0310211181640625, 0.0289459228515625, -0.051025390625, 0.01800537109375, 0.011810302734375, 
-0.060302734375, -0.005710601806640625, -0.00397491455078125, -0.008331298828125, -0.0123291015625, -0.0185546875, 0.03387451171875, -0.0058135986328125, -0.0430908203125, 0.051513671875, 0.0022754669189453125, 0.056427001953125, -0.04998779296875, 0.0139312744140625, -0.0194549560546875, 0.020294189453125, -0.017059326171875, -0.06591796875, 0.00714111328125, -0.0037097930908203125, -0.011322021484375, -0.020843505859375, 0.036865234375, -0.043975830078125, -0.043060302734375, 0.0309600830078125, 0.0240631103515625, 0.0159912109375, -0.007198333740234375, -0.07806396484375, 0.0166778564453125, 0.0159759521484375, -0.037750244140625, 0.0084381103515625, 0.031829833984375, 0.0098114013671875, 0.0504150390625, 0.03662109375, -0.0091400146484375, 0.031524658203125, 0.0025482177734375, 0.053253173828125, -0.06591796875, -0.039642333984375, -0.04345703125, 0.045806884765625, -0.0219879150390625, -0.040008544921875, 0.0679931640625, 0.07818603515625, 0.0748291015625, -0.0242156982421875, 0.0504150390625, -0.01110076904296875, 0.0186767578125, -0.029449462890625, 0.05877685546875, -0.064208984375, 0.0188446044921875, -0.0164794921875, -0.0625, -0.013214111328125, 0.04803466796875, -0.033111572265625, 0.0194549560546875, 0.050384521484375, 0.07061767578125, 0.0005860328674316406, -0.001922607421875, 0.018524169921875, 0.011932373046875, 0.0135498046875, 0.06591796875, 0.048919677734375, -0.06964111328125, 0.075927734375, -0.03289794921875, 0.01218414306640625, -0.0166473388671875, -0.031829833984375, -0.06414794921875, -0.043914794921875, -0.024871826171875, -0.03173828125, 0.0123291015625, 0.0626220703125, 0.054840087890625, -0.0562744140625, -0.01555633544921875, -0.0017461776733398438, 0.00749969482421875, -0.01042938232421875, -0.0171966552734375, 0.0325927734375, -0.020721435546875, -0.0718994140625, 0.02520751953125, 0.0010089874267578125, 0.000698089599609375, -0.0185089111328125, -0.03289794921875, -0.0222015380859375, 0.003265380859375, 0.034393310546875, 
0.00815582275390625, -0.05499267578125, -0.00936126708984375, 0.01422119140625, -0.022430419921875, 0.0221710205078125, 0.045806884765625, -0.058868408203125, 0.0172576904296875, 0.062286376953125, 0.031524658203125, 0.068359375, -0.01568603515625, 0.0208282470703125, -0.0311279296875, -0.0030803680419921875, 0.01171112060546875, 0.04345703125, 0.0107879638671875, -0.01445770263671875, 0.04534912109375, 0.0296478271484375, -0.04547119140625, -0.061798095703125, -0.0135955810546875, -0.086669921875, -0.026580810546875, 0.06787109375, -0.01015472412109375, -0.033416748046875, 0.01340484619140625, -0.0111083984375, 0.017822265625, -0.0283355712890625, 0.035552978515625, 0.04962158203125, 0.00440216064453125, -0.0204925537109375, -0.043365478515625, 0.0309906005859375, 0.0176544189453125, -0.052276611328125, -0.01372528076171875, 0.01335906982421875, 0.03564453125, 0.01515960693359375, 0.03326416015625, -0.0309600830078125, 0.0238189697265625, 0.0120086669921875, 0.031036376953125, -0.02178955078125, -0.031280517578125, -0.0252685546875, 0.01305389404296875, -0.031219482421875, -0.038604736328125 ] ]
BAAI/bge-small-en-v1.5
2023-11-02T10:47:51.000Z
[ "sentence-transformers", "pytorch", "bert", "feature-extraction", "sentence-similarity", "transformers", "mteb", "en", "arxiv:2310.07554", "arxiv:2309.07597", "license:mit", "model-index", "endpoints_compatible", "has_space", "region:us" ]
feature-extraction
BAAI
null
null
BAAI/bge-small-en-v1.5
37
170,376
sentence-transformers
2023-09-12T05:20:55
--- tags: - sentence-transformers - feature-extraction - sentence-similarity - transformers - mteb model-index: - name: bge-small-en-v1.5 results: - task: type: Classification dataset: type: mteb/amazon_counterfactual name: MTEB AmazonCounterfactualClassification (en) config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 73.79104477611939 - type: ap value: 37.21923821573361 - type: f1 value: 68.0914945617093 - task: type: Classification dataset: type: mteb/amazon_polarity name: MTEB AmazonPolarityClassification config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 92.75377499999999 - type: ap value: 89.46766124546022 - type: f1 value: 92.73884001331487 - task: type: Classification dataset: type: mteb/amazon_reviews_multi name: MTEB AmazonReviewsClassification (en) config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 46.986 - type: f1 value: 46.55936786727896 - task: type: Retrieval dataset: type: arguana name: MTEB ArguAna config: default split: test revision: None metrics: - type: map_at_1 value: 35.846000000000004 - type: map_at_10 value: 51.388 - type: map_at_100 value: 52.132999999999996 - type: map_at_1000 value: 52.141000000000005 - type: map_at_3 value: 47.037 - type: map_at_5 value: 49.579 - type: mrr_at_1 value: 36.558 - type: mrr_at_10 value: 51.658 - type: mrr_at_100 value: 52.402 - type: mrr_at_1000 value: 52.410000000000004 - type: mrr_at_3 value: 47.345 - type: mrr_at_5 value: 49.797999999999995 - type: ndcg_at_1 value: 35.846000000000004 - type: ndcg_at_10 value: 59.550000000000004 - type: ndcg_at_100 value: 62.596 - type: ndcg_at_1000 value: 62.759 - type: ndcg_at_3 value: 50.666999999999994 - type: ndcg_at_5 value: 55.228 - type: precision_at_1 value: 35.846000000000004 - type: precision_at_10 value: 8.542 - type: precision_at_100 value: 0.984 - type: precision_at_1000 value: 0.1 - 
type: precision_at_3 value: 20.389 - type: precision_at_5 value: 14.438 - type: recall_at_1 value: 35.846000000000004 - type: recall_at_10 value: 85.42 - type: recall_at_100 value: 98.43499999999999 - type: recall_at_1000 value: 99.644 - type: recall_at_3 value: 61.166 - type: recall_at_5 value: 72.191 - task: type: Clustering dataset: type: mteb/arxiv-clustering-p2p name: MTEB ArxivClusteringP2P config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: v_measure value: 47.402770198163594 - task: type: Clustering dataset: type: mteb/arxiv-clustering-s2s name: MTEB ArxivClusteringS2S config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: v_measure value: 40.01545436974177 - task: type: Reranking dataset: type: mteb/askubuntudupquestions-reranking name: MTEB AskUbuntuDupQuestions config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: map value: 62.586465273207196 - type: mrr value: 74.42169019038825 - task: type: STS dataset: type: mteb/biosses-sts name: MTEB BIOSSES config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cos_sim_pearson value: 85.1891186537969 - type: cos_sim_spearman value: 83.75492046087288 - type: euclidean_pearson value: 84.11766204805357 - type: euclidean_spearman value: 84.01456493126516 - type: manhattan_pearson value: 84.2132950502772 - type: manhattan_spearman value: 83.89227298813377 - task: type: Classification dataset: type: mteb/banking77 name: MTEB Banking77Classification config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 85.74025974025975 - type: f1 value: 85.71493566466381 - task: type: Clustering dataset: type: mteb/biorxiv-clustering-p2p name: MTEB BiorxivClusteringP2P config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: v_measure value: 38.467181385006434 - task: type: 
Clustering dataset: type: mteb/biorxiv-clustering-s2s name: MTEB BiorxivClusteringS2S config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: v_measure value: 34.719496037339056 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackAndroidRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 29.587000000000003 - type: map_at_10 value: 41.114 - type: map_at_100 value: 42.532 - type: map_at_1000 value: 42.661 - type: map_at_3 value: 37.483 - type: map_at_5 value: 39.652 - type: mrr_at_1 value: 36.338 - type: mrr_at_10 value: 46.763 - type: mrr_at_100 value: 47.393 - type: mrr_at_1000 value: 47.445 - type: mrr_at_3 value: 43.538 - type: mrr_at_5 value: 45.556000000000004 - type: ndcg_at_1 value: 36.338 - type: ndcg_at_10 value: 47.658 - type: ndcg_at_100 value: 52.824000000000005 - type: ndcg_at_1000 value: 54.913999999999994 - type: ndcg_at_3 value: 41.989 - type: ndcg_at_5 value: 44.944 - type: precision_at_1 value: 36.338 - type: precision_at_10 value: 9.156 - type: precision_at_100 value: 1.4789999999999999 - type: precision_at_1000 value: 0.196 - type: precision_at_3 value: 20.076 - type: precision_at_5 value: 14.85 - type: recall_at_1 value: 29.587000000000003 - type: recall_at_10 value: 60.746 - type: recall_at_100 value: 82.157 - type: recall_at_1000 value: 95.645 - type: recall_at_3 value: 44.821 - type: recall_at_5 value: 52.819 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackEnglishRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 30.239 - type: map_at_10 value: 39.989000000000004 - type: map_at_100 value: 41.196 - type: map_at_1000 value: 41.325 - type: map_at_3 value: 37.261 - type: map_at_5 value: 38.833 - type: mrr_at_1 value: 37.516 - type: mrr_at_10 value: 46.177 - type: mrr_at_100 value: 46.806 - type: mrr_at_1000 value: 46.849000000000004 - type: mrr_at_3 value: 44.002 - type: mrr_at_5 
value: 45.34 - type: ndcg_at_1 value: 37.516 - type: ndcg_at_10 value: 45.586 - type: ndcg_at_100 value: 49.897000000000006 - type: ndcg_at_1000 value: 51.955 - type: ndcg_at_3 value: 41.684 - type: ndcg_at_5 value: 43.617 - type: precision_at_1 value: 37.516 - type: precision_at_10 value: 8.522 - type: precision_at_100 value: 1.374 - type: precision_at_1000 value: 0.184 - type: precision_at_3 value: 20.105999999999998 - type: precision_at_5 value: 14.152999999999999 - type: recall_at_1 value: 30.239 - type: recall_at_10 value: 55.03 - type: recall_at_100 value: 73.375 - type: recall_at_1000 value: 86.29599999999999 - type: recall_at_3 value: 43.269000000000005 - type: recall_at_5 value: 48.878 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackGamingRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 38.338 - type: map_at_10 value: 50.468999999999994 - type: map_at_100 value: 51.553000000000004 - type: map_at_1000 value: 51.608 - type: map_at_3 value: 47.107 - type: map_at_5 value: 49.101 - type: mrr_at_1 value: 44.201 - type: mrr_at_10 value: 54.057 - type: mrr_at_100 value: 54.764 - type: mrr_at_1000 value: 54.791000000000004 - type: mrr_at_3 value: 51.56699999999999 - type: mrr_at_5 value: 53.05 - type: ndcg_at_1 value: 44.201 - type: ndcg_at_10 value: 56.379000000000005 - type: ndcg_at_100 value: 60.645 - type: ndcg_at_1000 value: 61.73499999999999 - type: ndcg_at_3 value: 50.726000000000006 - type: ndcg_at_5 value: 53.58500000000001 - type: precision_at_1 value: 44.201 - type: precision_at_10 value: 9.141 - type: precision_at_100 value: 1.216 - type: precision_at_1000 value: 0.135 - type: precision_at_3 value: 22.654 - type: precision_at_5 value: 15.723999999999998 - type: recall_at_1 value: 38.338 - type: recall_at_10 value: 70.30499999999999 - type: recall_at_100 value: 88.77199999999999 - type: recall_at_1000 value: 96.49799999999999 - type: recall_at_3 value: 55.218 - type: recall_at_5 value: 
62.104000000000006 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackGisRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 25.682 - type: map_at_10 value: 33.498 - type: map_at_100 value: 34.461000000000006 - type: map_at_1000 value: 34.544000000000004 - type: map_at_3 value: 30.503999999999998 - type: map_at_5 value: 32.216 - type: mrr_at_1 value: 27.683999999999997 - type: mrr_at_10 value: 35.467999999999996 - type: mrr_at_100 value: 36.32 - type: mrr_at_1000 value: 36.386 - type: mrr_at_3 value: 32.618 - type: mrr_at_5 value: 34.262 - type: ndcg_at_1 value: 27.683999999999997 - type: ndcg_at_10 value: 38.378 - type: ndcg_at_100 value: 43.288 - type: ndcg_at_1000 value: 45.413 - type: ndcg_at_3 value: 32.586 - type: ndcg_at_5 value: 35.499 - type: precision_at_1 value: 27.683999999999997 - type: precision_at_10 value: 5.864 - type: precision_at_100 value: 0.882 - type: precision_at_1000 value: 0.11 - type: precision_at_3 value: 13.446 - type: precision_at_5 value: 9.718 - type: recall_at_1 value: 25.682 - type: recall_at_10 value: 51.712 - type: recall_at_100 value: 74.446 - type: recall_at_1000 value: 90.472 - type: recall_at_3 value: 36.236000000000004 - type: recall_at_5 value: 43.234 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackMathematicaRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 16.073999999999998 - type: map_at_10 value: 24.352999999999998 - type: map_at_100 value: 25.438 - type: map_at_1000 value: 25.545 - type: map_at_3 value: 21.614 - type: map_at_5 value: 23.104 - type: mrr_at_1 value: 19.776 - type: mrr_at_10 value: 28.837000000000003 - type: mrr_at_100 value: 29.755 - type: mrr_at_1000 value: 29.817 - type: mrr_at_3 value: 26.201999999999998 - type: mrr_at_5 value: 27.714 - type: ndcg_at_1 value: 19.776 - type: ndcg_at_10 value: 29.701 - type: ndcg_at_100 value: 35.307 - type: ndcg_at_1000 value: 37.942 - type: 
ndcg_at_3 value: 24.764 - type: ndcg_at_5 value: 27.025 - type: precision_at_1 value: 19.776 - type: precision_at_10 value: 5.659 - type: precision_at_100 value: 0.971 - type: precision_at_1000 value: 0.133 - type: precision_at_3 value: 12.065 - type: precision_at_5 value: 8.905000000000001 - type: recall_at_1 value: 16.073999999999998 - type: recall_at_10 value: 41.647 - type: recall_at_100 value: 66.884 - type: recall_at_1000 value: 85.91499999999999 - type: recall_at_3 value: 27.916 - type: recall_at_5 value: 33.729 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackPhysicsRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 28.444999999999997 - type: map_at_10 value: 38.218999999999994 - type: map_at_100 value: 39.595 - type: map_at_1000 value: 39.709 - type: map_at_3 value: 35.586 - type: map_at_5 value: 36.895 - type: mrr_at_1 value: 34.841 - type: mrr_at_10 value: 44.106 - type: mrr_at_100 value: 44.98 - type: mrr_at_1000 value: 45.03 - type: mrr_at_3 value: 41.979 - type: mrr_at_5 value: 43.047999999999995 - type: ndcg_at_1 value: 34.841 - type: ndcg_at_10 value: 43.922 - type: ndcg_at_100 value: 49.504999999999995 - type: ndcg_at_1000 value: 51.675000000000004 - type: ndcg_at_3 value: 39.858 - type: ndcg_at_5 value: 41.408 - type: precision_at_1 value: 34.841 - type: precision_at_10 value: 7.872999999999999 - type: precision_at_100 value: 1.2449999999999999 - type: precision_at_1000 value: 0.161 - type: precision_at_3 value: 18.993 - type: precision_at_5 value: 13.032 - type: recall_at_1 value: 28.444999999999997 - type: recall_at_10 value: 54.984 - type: recall_at_100 value: 78.342 - type: recall_at_1000 value: 92.77 - type: recall_at_3 value: 42.842999999999996 - type: recall_at_5 value: 47.247 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackProgrammersRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 23.072 - type: map_at_10 
value: 32.354 - type: map_at_100 value: 33.800000000000004 - type: map_at_1000 value: 33.908 - type: map_at_3 value: 29.232000000000003 - type: map_at_5 value: 31.049 - type: mrr_at_1 value: 29.110000000000003 - type: mrr_at_10 value: 38.03 - type: mrr_at_100 value: 39.032 - type: mrr_at_1000 value: 39.086999999999996 - type: mrr_at_3 value: 35.407 - type: mrr_at_5 value: 36.76 - type: ndcg_at_1 value: 29.110000000000003 - type: ndcg_at_10 value: 38.231 - type: ndcg_at_100 value: 44.425 - type: ndcg_at_1000 value: 46.771 - type: ndcg_at_3 value: 33.095 - type: ndcg_at_5 value: 35.459 - type: precision_at_1 value: 29.110000000000003 - type: precision_at_10 value: 7.215000000000001 - type: precision_at_100 value: 1.2109999999999999 - type: precision_at_1000 value: 0.157 - type: precision_at_3 value: 16.058 - type: precision_at_5 value: 11.644 - type: recall_at_1 value: 23.072 - type: recall_at_10 value: 50.285999999999994 - type: recall_at_100 value: 76.596 - type: recall_at_1000 value: 92.861 - type: recall_at_3 value: 35.702 - type: recall_at_5 value: 42.152 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 24.937916666666666 - type: map_at_10 value: 33.755250000000004 - type: map_at_100 value: 34.955999999999996 - type: map_at_1000 value: 35.070499999999996 - type: map_at_3 value: 30.98708333333333 - type: map_at_5 value: 32.51491666666666 - type: mrr_at_1 value: 29.48708333333333 - type: mrr_at_10 value: 37.92183333333334 - type: mrr_at_100 value: 38.76583333333333 - type: mrr_at_1000 value: 38.82466666666667 - type: mrr_at_3 value: 35.45125 - type: mrr_at_5 value: 36.827000000000005 - type: ndcg_at_1 value: 29.48708333333333 - type: ndcg_at_10 value: 39.05225 - type: ndcg_at_100 value: 44.25983333333334 - type: ndcg_at_1000 value: 46.568333333333335 - type: ndcg_at_3 value: 34.271583333333325 - type: ndcg_at_5 value: 36.483916666666666 - type: 
precision_at_1 value: 29.48708333333333 - type: precision_at_10 value: 6.865749999999999 - type: precision_at_100 value: 1.1195833333333332 - type: precision_at_1000 value: 0.15058333333333335 - type: precision_at_3 value: 15.742083333333333 - type: precision_at_5 value: 11.221916666666667 - type: recall_at_1 value: 24.937916666666666 - type: recall_at_10 value: 50.650416666666665 - type: recall_at_100 value: 73.55383333333334 - type: recall_at_1000 value: 89.61691666666667 - type: recall_at_3 value: 37.27808333333334 - type: recall_at_5 value: 42.99475 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackStatsRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 23.947 - type: map_at_10 value: 30.575000000000003 - type: map_at_100 value: 31.465 - type: map_at_1000 value: 31.558000000000003 - type: map_at_3 value: 28.814 - type: map_at_5 value: 29.738999999999997 - type: mrr_at_1 value: 26.994 - type: mrr_at_10 value: 33.415 - type: mrr_at_100 value: 34.18 - type: mrr_at_1000 value: 34.245 - type: mrr_at_3 value: 31.621 - type: mrr_at_5 value: 32.549 - type: ndcg_at_1 value: 26.994 - type: ndcg_at_10 value: 34.482 - type: ndcg_at_100 value: 38.915 - type: ndcg_at_1000 value: 41.355 - type: ndcg_at_3 value: 31.139 - type: ndcg_at_5 value: 32.589 - type: precision_at_1 value: 26.994 - type: precision_at_10 value: 5.322 - type: precision_at_100 value: 0.8160000000000001 - type: precision_at_1000 value: 0.11100000000000002 - type: precision_at_3 value: 13.344000000000001 - type: precision_at_5 value: 8.988 - type: recall_at_1 value: 23.947 - type: recall_at_10 value: 43.647999999999996 - type: recall_at_100 value: 63.851 - type: recall_at_1000 value: 82.0 - type: recall_at_3 value: 34.288000000000004 - type: recall_at_5 value: 38.117000000000004 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackTexRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 
16.197 - type: map_at_10 value: 22.968 - type: map_at_100 value: 24.095 - type: map_at_1000 value: 24.217 - type: map_at_3 value: 20.771 - type: map_at_5 value: 21.995 - type: mrr_at_1 value: 19.511 - type: mrr_at_10 value: 26.55 - type: mrr_at_100 value: 27.500999999999998 - type: mrr_at_1000 value: 27.578999999999997 - type: mrr_at_3 value: 24.421 - type: mrr_at_5 value: 25.604 - type: ndcg_at_1 value: 19.511 - type: ndcg_at_10 value: 27.386 - type: ndcg_at_100 value: 32.828 - type: ndcg_at_1000 value: 35.739 - type: ndcg_at_3 value: 23.405 - type: ndcg_at_5 value: 25.255 - type: precision_at_1 value: 19.511 - type: precision_at_10 value: 5.017 - type: precision_at_100 value: 0.91 - type: precision_at_1000 value: 0.133 - type: precision_at_3 value: 11.023 - type: precision_at_5 value: 8.025 - type: recall_at_1 value: 16.197 - type: recall_at_10 value: 37.09 - type: recall_at_100 value: 61.778 - type: recall_at_1000 value: 82.56599999999999 - type: recall_at_3 value: 26.034000000000002 - type: recall_at_5 value: 30.762 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackUnixRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 25.41 - type: map_at_10 value: 33.655 - type: map_at_100 value: 34.892 - type: map_at_1000 value: 34.995 - type: map_at_3 value: 30.94 - type: map_at_5 value: 32.303 - type: mrr_at_1 value: 29.477999999999998 - type: mrr_at_10 value: 37.443 - type: mrr_at_100 value: 38.383 - type: mrr_at_1000 value: 38.440000000000005 - type: mrr_at_3 value: 34.949999999999996 - type: mrr_at_5 value: 36.228 - type: ndcg_at_1 value: 29.477999999999998 - type: ndcg_at_10 value: 38.769 - type: ndcg_at_100 value: 44.245000000000005 - type: ndcg_at_1000 value: 46.593 - type: ndcg_at_3 value: 33.623 - type: ndcg_at_5 value: 35.766 - type: precision_at_1 value: 29.477999999999998 - type: precision_at_10 value: 6.455 - type: precision_at_100 value: 1.032 - type: precision_at_1000 value: 0.135 - type: 
precision_at_3 value: 14.893999999999998 - type: precision_at_5 value: 10.485 - type: recall_at_1 value: 25.41 - type: recall_at_10 value: 50.669 - type: recall_at_100 value: 74.084 - type: recall_at_1000 value: 90.435 - type: recall_at_3 value: 36.679 - type: recall_at_5 value: 41.94 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackWebmastersRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 23.339 - type: map_at_10 value: 31.852000000000004 - type: map_at_100 value: 33.411 - type: map_at_1000 value: 33.62 - type: map_at_3 value: 28.929 - type: map_at_5 value: 30.542 - type: mrr_at_1 value: 28.063 - type: mrr_at_10 value: 36.301 - type: mrr_at_100 value: 37.288 - type: mrr_at_1000 value: 37.349 - type: mrr_at_3 value: 33.663 - type: mrr_at_5 value: 35.165 - type: ndcg_at_1 value: 28.063 - type: ndcg_at_10 value: 37.462 - type: ndcg_at_100 value: 43.620999999999995 - type: ndcg_at_1000 value: 46.211 - type: ndcg_at_3 value: 32.68 - type: ndcg_at_5 value: 34.981 - type: precision_at_1 value: 28.063 - type: precision_at_10 value: 7.1739999999999995 - type: precision_at_100 value: 1.486 - type: precision_at_1000 value: 0.23500000000000001 - type: precision_at_3 value: 15.217 - type: precision_at_5 value: 11.265 - type: recall_at_1 value: 23.339 - type: recall_at_10 value: 48.376999999999995 - type: recall_at_100 value: 76.053 - type: recall_at_1000 value: 92.455 - type: recall_at_3 value: 34.735 - type: recall_at_5 value: 40.71 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackWordpressRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 18.925 - type: map_at_10 value: 26.017000000000003 - type: map_at_100 value: 27.034000000000002 - type: map_at_1000 value: 27.156000000000002 - type: map_at_3 value: 23.604 - type: map_at_5 value: 24.75 - type: mrr_at_1 value: 20.333000000000002 - type: mrr_at_10 value: 27.915 - type: mrr_at_100 value: 
28.788000000000004 - type: mrr_at_1000 value: 28.877999999999997 - type: mrr_at_3 value: 25.446999999999996 - type: mrr_at_5 value: 26.648 - type: ndcg_at_1 value: 20.333000000000002 - type: ndcg_at_10 value: 30.673000000000002 - type: ndcg_at_100 value: 35.618 - type: ndcg_at_1000 value: 38.517 - type: ndcg_at_3 value: 25.71 - type: ndcg_at_5 value: 27.679 - type: precision_at_1 value: 20.333000000000002 - type: precision_at_10 value: 4.9910000000000005 - type: precision_at_100 value: 0.8130000000000001 - type: precision_at_1000 value: 0.117 - type: precision_at_3 value: 11.029 - type: precision_at_5 value: 7.8740000000000006 - type: recall_at_1 value: 18.925 - type: recall_at_10 value: 43.311 - type: recall_at_100 value: 66.308 - type: recall_at_1000 value: 87.49 - type: recall_at_3 value: 29.596 - type: recall_at_5 value: 34.245 - task: type: Retrieval dataset: type: climate-fever name: MTEB ClimateFEVER config: default split: test revision: None metrics: - type: map_at_1 value: 13.714 - type: map_at_10 value: 23.194 - type: map_at_100 value: 24.976000000000003 - type: map_at_1000 value: 25.166 - type: map_at_3 value: 19.709 - type: map_at_5 value: 21.523999999999997 - type: mrr_at_1 value: 30.619000000000003 - type: mrr_at_10 value: 42.563 - type: mrr_at_100 value: 43.386 - type: mrr_at_1000 value: 43.423 - type: mrr_at_3 value: 39.555 - type: mrr_at_5 value: 41.268 - type: ndcg_at_1 value: 30.619000000000003 - type: ndcg_at_10 value: 31.836 - type: ndcg_at_100 value: 38.652 - type: ndcg_at_1000 value: 42.088 - type: ndcg_at_3 value: 26.733 - type: ndcg_at_5 value: 28.435 - type: precision_at_1 value: 30.619000000000003 - type: precision_at_10 value: 9.751999999999999 - type: precision_at_100 value: 1.71 - type: precision_at_1000 value: 0.23500000000000001 - type: precision_at_3 value: 19.935 - type: precision_at_5 value: 14.984 - type: recall_at_1 value: 13.714 - type: recall_at_10 value: 37.26 - type: recall_at_100 value: 60.546 - type: recall_at_1000 value: 
79.899 - type: recall_at_3 value: 24.325 - type: recall_at_5 value: 29.725 - task: type: Retrieval dataset: type: dbpedia-entity name: MTEB DBPedia config: default split: test revision: None metrics: - type: map_at_1 value: 8.462 - type: map_at_10 value: 18.637 - type: map_at_100 value: 26.131999999999998 - type: map_at_1000 value: 27.607 - type: map_at_3 value: 13.333 - type: map_at_5 value: 15.654000000000002 - type: mrr_at_1 value: 66.25 - type: mrr_at_10 value: 74.32600000000001 - type: mrr_at_100 value: 74.60900000000001 - type: mrr_at_1000 value: 74.62 - type: mrr_at_3 value: 72.667 - type: mrr_at_5 value: 73.817 - type: ndcg_at_1 value: 53.87499999999999 - type: ndcg_at_10 value: 40.028999999999996 - type: ndcg_at_100 value: 44.199 - type: ndcg_at_1000 value: 51.629999999999995 - type: ndcg_at_3 value: 44.113 - type: ndcg_at_5 value: 41.731 - type: precision_at_1 value: 66.25 - type: precision_at_10 value: 31.900000000000002 - type: precision_at_100 value: 10.043000000000001 - type: precision_at_1000 value: 1.926 - type: precision_at_3 value: 47.417 - type: precision_at_5 value: 40.65 - type: recall_at_1 value: 8.462 - type: recall_at_10 value: 24.293 - type: recall_at_100 value: 50.146 - type: recall_at_1000 value: 74.034 - type: recall_at_3 value: 14.967 - type: recall_at_5 value: 18.682000000000002 - task: type: Classification dataset: type: mteb/emotion name: MTEB EmotionClassification config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 47.84499999999999 - type: f1 value: 42.48106691979349 - task: type: Retrieval dataset: type: fever name: MTEB FEVER config: default split: test revision: None metrics: - type: map_at_1 value: 74.034 - type: map_at_10 value: 82.76 - type: map_at_100 value: 82.968 - type: map_at_1000 value: 82.98299999999999 - type: map_at_3 value: 81.768 - type: map_at_5 value: 82.418 - type: mrr_at_1 value: 80.048 - type: mrr_at_10 value: 87.64999999999999 - type: mrr_at_100 
value: 87.712 - type: mrr_at_1000 value: 87.713 - type: mrr_at_3 value: 87.01100000000001 - type: mrr_at_5 value: 87.466 - type: ndcg_at_1 value: 80.048 - type: ndcg_at_10 value: 86.643 - type: ndcg_at_100 value: 87.361 - type: ndcg_at_1000 value: 87.606 - type: ndcg_at_3 value: 85.137 - type: ndcg_at_5 value: 86.016 - type: precision_at_1 value: 80.048 - type: precision_at_10 value: 10.372 - type: precision_at_100 value: 1.093 - type: precision_at_1000 value: 0.11299999999999999 - type: precision_at_3 value: 32.638 - type: precision_at_5 value: 20.177 - type: recall_at_1 value: 74.034 - type: recall_at_10 value: 93.769 - type: recall_at_100 value: 96.569 - type: recall_at_1000 value: 98.039 - type: recall_at_3 value: 89.581 - type: recall_at_5 value: 91.906 - task: type: Retrieval dataset: type: fiqa name: MTEB FiQA2018 config: default split: test revision: None metrics: - type: map_at_1 value: 20.5 - type: map_at_10 value: 32.857 - type: map_at_100 value: 34.589 - type: map_at_1000 value: 34.778 - type: map_at_3 value: 29.160999999999998 - type: map_at_5 value: 31.033 - type: mrr_at_1 value: 40.123 - type: mrr_at_10 value: 48.776 - type: mrr_at_100 value: 49.495 - type: mrr_at_1000 value: 49.539 - type: mrr_at_3 value: 46.605000000000004 - type: mrr_at_5 value: 47.654 - type: ndcg_at_1 value: 40.123 - type: ndcg_at_10 value: 40.343 - type: ndcg_at_100 value: 46.56 - type: ndcg_at_1000 value: 49.777 - type: ndcg_at_3 value: 37.322 - type: ndcg_at_5 value: 37.791000000000004 - type: precision_at_1 value: 40.123 - type: precision_at_10 value: 11.08 - type: precision_at_100 value: 1.752 - type: precision_at_1000 value: 0.232 - type: precision_at_3 value: 24.897 - type: precision_at_5 value: 17.809 - type: recall_at_1 value: 20.5 - type: recall_at_10 value: 46.388 - type: recall_at_100 value: 69.552 - type: recall_at_1000 value: 89.011 - type: recall_at_3 value: 33.617999999999995 - type: recall_at_5 value: 38.211 - task: type: Retrieval dataset: type: hotpotqa name: 
MTEB HotpotQA config: default split: test revision: None metrics: - type: map_at_1 value: 39.135999999999996 - type: map_at_10 value: 61.673 - type: map_at_100 value: 62.562 - type: map_at_1000 value: 62.62 - type: map_at_3 value: 58.467999999999996 - type: map_at_5 value: 60.463 - type: mrr_at_1 value: 78.271 - type: mrr_at_10 value: 84.119 - type: mrr_at_100 value: 84.29299999999999 - type: mrr_at_1000 value: 84.299 - type: mrr_at_3 value: 83.18900000000001 - type: mrr_at_5 value: 83.786 - type: ndcg_at_1 value: 78.271 - type: ndcg_at_10 value: 69.935 - type: ndcg_at_100 value: 73.01299999999999 - type: ndcg_at_1000 value: 74.126 - type: ndcg_at_3 value: 65.388 - type: ndcg_at_5 value: 67.906 - type: precision_at_1 value: 78.271 - type: precision_at_10 value: 14.562 - type: precision_at_100 value: 1.6969999999999998 - type: precision_at_1000 value: 0.184 - type: precision_at_3 value: 41.841 - type: precision_at_5 value: 27.087 - type: recall_at_1 value: 39.135999999999996 - type: recall_at_10 value: 72.809 - type: recall_at_100 value: 84.86200000000001 - type: recall_at_1000 value: 92.208 - type: recall_at_3 value: 62.76199999999999 - type: recall_at_5 value: 67.718 - task: type: Classification dataset: type: mteb/imdb name: MTEB ImdbClassification config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 90.60600000000001 - type: ap value: 86.6579587804335 - type: f1 value: 90.5938853929307 - task: type: Retrieval dataset: type: msmarco name: MTEB MSMARCO config: default split: dev revision: None metrics: - type: map_at_1 value: 21.852 - type: map_at_10 value: 33.982 - type: map_at_100 value: 35.116 - type: map_at_1000 value: 35.167 - type: map_at_3 value: 30.134 - type: map_at_5 value: 32.340999999999994 - type: mrr_at_1 value: 22.479 - type: mrr_at_10 value: 34.594 - type: mrr_at_100 value: 35.672 - type: mrr_at_1000 value: 35.716 - type: mrr_at_3 value: 30.84 - type: mrr_at_5 value: 32.998 - type: 
ndcg_at_1 value: 22.493 - type: ndcg_at_10 value: 40.833000000000006 - type: ndcg_at_100 value: 46.357 - type: ndcg_at_1000 value: 47.637 - type: ndcg_at_3 value: 32.995999999999995 - type: ndcg_at_5 value: 36.919000000000004 - type: precision_at_1 value: 22.493 - type: precision_at_10 value: 6.465999999999999 - type: precision_at_100 value: 0.9249999999999999 - type: precision_at_1000 value: 0.104 - type: precision_at_3 value: 14.030999999999999 - type: precision_at_5 value: 10.413 - type: recall_at_1 value: 21.852 - type: recall_at_10 value: 61.934999999999995 - type: recall_at_100 value: 87.611 - type: recall_at_1000 value: 97.441 - type: recall_at_3 value: 40.583999999999996 - type: recall_at_5 value: 49.992999999999995 - task: type: Classification dataset: type: mteb/mtop_domain name: MTEB MTOPDomainClassification (en) config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 93.36069311445507 - type: f1 value: 93.16456330371453 - task: type: Classification dataset: type: mteb/mtop_intent name: MTEB MTOPIntentClassification (en) config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 74.74692202462381 - type: f1 value: 58.17903579421599 - task: type: Classification dataset: type: mteb/amazon_massive_intent name: MTEB MassiveIntentClassification (en) config: en split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 74.80833893745796 - type: f1 value: 72.70786592684664 - task: type: Classification dataset: type: mteb/amazon_massive_scenario name: MTEB MassiveScenarioClassification (en) config: en split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 78.69872225958305 - type: f1 value: 78.61626934504731 - task: type: Clustering dataset: type: mteb/medrxiv-clustering-p2p name: MTEB MedrxivClusteringP2P config: default split: test revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73 
metrics: - type: v_measure value: 33.058658628717694 - task: type: Clustering dataset: type: mteb/medrxiv-clustering-s2s name: MTEB MedrxivClusteringS2S config: default split: test revision: 35191c8c0dca72d8ff3efcd72aa802307d469663 metrics: - type: v_measure value: 30.85561739360599 - task: type: Reranking dataset: type: mteb/mind_small name: MTEB MindSmallReranking config: default split: test revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69 metrics: - type: map value: 31.290259910144385 - type: mrr value: 32.44223046102856 - task: type: Retrieval dataset: type: nfcorpus name: MTEB NFCorpus config: default split: test revision: None metrics: - type: map_at_1 value: 5.288 - type: map_at_10 value: 12.267999999999999 - type: map_at_100 value: 15.557000000000002 - type: map_at_1000 value: 16.98 - type: map_at_3 value: 8.866 - type: map_at_5 value: 10.418 - type: mrr_at_1 value: 43.653 - type: mrr_at_10 value: 52.681 - type: mrr_at_100 value: 53.315999999999995 - type: mrr_at_1000 value: 53.357 - type: mrr_at_3 value: 51.393 - type: mrr_at_5 value: 51.903999999999996 - type: ndcg_at_1 value: 42.415000000000006 - type: ndcg_at_10 value: 34.305 - type: ndcg_at_100 value: 30.825999999999997 - type: ndcg_at_1000 value: 39.393 - type: ndcg_at_3 value: 39.931 - type: ndcg_at_5 value: 37.519999999999996 - type: precision_at_1 value: 43.653 - type: precision_at_10 value: 25.728 - type: precision_at_100 value: 7.932 - type: precision_at_1000 value: 2.07 - type: precision_at_3 value: 38.184000000000005 - type: precision_at_5 value: 32.879000000000005 - type: recall_at_1 value: 5.288 - type: recall_at_10 value: 16.195 - type: recall_at_100 value: 31.135 - type: recall_at_1000 value: 61.531000000000006 - type: recall_at_3 value: 10.313 - type: recall_at_5 value: 12.754999999999999 - task: type: Retrieval dataset: type: nq name: MTEB NQ config: default split: test revision: None metrics: - type: map_at_1 value: 28.216 - type: map_at_10 value: 42.588 - type: map_at_100 value: 
43.702999999999996 - type: map_at_1000 value: 43.739 - type: map_at_3 value: 38.177 - type: map_at_5 value: 40.754000000000005 - type: mrr_at_1 value: 31.866 - type: mrr_at_10 value: 45.189 - type: mrr_at_100 value: 46.056000000000004 - type: mrr_at_1000 value: 46.081 - type: mrr_at_3 value: 41.526999999999994 - type: mrr_at_5 value: 43.704 - type: ndcg_at_1 value: 31.837 - type: ndcg_at_10 value: 50.178 - type: ndcg_at_100 value: 54.98800000000001 - type: ndcg_at_1000 value: 55.812 - type: ndcg_at_3 value: 41.853 - type: ndcg_at_5 value: 46.153 - type: precision_at_1 value: 31.837 - type: precision_at_10 value: 8.43 - type: precision_at_100 value: 1.1119999999999999 - type: precision_at_1000 value: 0.11900000000000001 - type: precision_at_3 value: 19.023 - type: precision_at_5 value: 13.911000000000001 - type: recall_at_1 value: 28.216 - type: recall_at_10 value: 70.8 - type: recall_at_100 value: 91.857 - type: recall_at_1000 value: 97.941 - type: recall_at_3 value: 49.196 - type: recall_at_5 value: 59.072 - task: type: Retrieval dataset: type: quora name: MTEB QuoraRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 71.22800000000001 - type: map_at_10 value: 85.115 - type: map_at_100 value: 85.72 - type: map_at_1000 value: 85.737 - type: map_at_3 value: 82.149 - type: map_at_5 value: 84.029 - type: mrr_at_1 value: 81.96 - type: mrr_at_10 value: 88.00200000000001 - type: mrr_at_100 value: 88.088 - type: mrr_at_1000 value: 88.089 - type: mrr_at_3 value: 87.055 - type: mrr_at_5 value: 87.715 - type: ndcg_at_1 value: 82.01 - type: ndcg_at_10 value: 88.78 - type: ndcg_at_100 value: 89.91 - type: ndcg_at_1000 value: 90.013 - type: ndcg_at_3 value: 85.957 - type: ndcg_at_5 value: 87.56 - type: precision_at_1 value: 82.01 - type: precision_at_10 value: 13.462 - type: precision_at_100 value: 1.528 - type: precision_at_1000 value: 0.157 - type: precision_at_3 value: 37.553 - type: precision_at_5 value: 24.732000000000003 - type: 
recall_at_1 value: 71.22800000000001 - type: recall_at_10 value: 95.69 - type: recall_at_100 value: 99.531 - type: recall_at_1000 value: 99.98 - type: recall_at_3 value: 87.632 - type: recall_at_5 value: 92.117 - task: type: Clustering dataset: type: mteb/reddit-clustering name: MTEB RedditClustering config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: v_measure value: 52.31768034366916 - task: type: Clustering dataset: type: mteb/reddit-clustering-p2p name: MTEB RedditClusteringP2P config: default split: test revision: 282350215ef01743dc01b456c7f5241fa8937f16 metrics: - type: v_measure value: 60.640266772723606 - task: type: Retrieval dataset: type: scidocs name: MTEB SCIDOCS config: default split: test revision: None metrics: - type: map_at_1 value: 4.7780000000000005 - type: map_at_10 value: 12.299 - type: map_at_100 value: 14.363000000000001 - type: map_at_1000 value: 14.71 - type: map_at_3 value: 8.738999999999999 - type: map_at_5 value: 10.397 - type: mrr_at_1 value: 23.599999999999998 - type: mrr_at_10 value: 34.845 - type: mrr_at_100 value: 35.916 - type: mrr_at_1000 value: 35.973 - type: mrr_at_3 value: 31.7 - type: mrr_at_5 value: 33.535 - type: ndcg_at_1 value: 23.599999999999998 - type: ndcg_at_10 value: 20.522000000000002 - type: ndcg_at_100 value: 28.737000000000002 - type: ndcg_at_1000 value: 34.596 - type: ndcg_at_3 value: 19.542 - type: ndcg_at_5 value: 16.958000000000002 - type: precision_at_1 value: 23.599999999999998 - type: precision_at_10 value: 10.67 - type: precision_at_100 value: 2.259 - type: precision_at_1000 value: 0.367 - type: precision_at_3 value: 18.333 - type: precision_at_5 value: 14.879999999999999 - type: recall_at_1 value: 4.7780000000000005 - type: recall_at_10 value: 21.617 - type: recall_at_100 value: 45.905 - type: recall_at_1000 value: 74.42 - type: recall_at_3 value: 11.148 - type: recall_at_5 value: 15.082999999999998 - task: type: STS dataset: type: mteb/sickr-sts name: MTEB 
SICK-R config: default split: test revision: a6ea5a8cab320b040a23452cc28066d9beae2cee metrics: - type: cos_sim_pearson value: 83.22372750297885 - type: cos_sim_spearman value: 79.40972617119405 - type: euclidean_pearson value: 80.6101072020434 - type: euclidean_spearman value: 79.53844217225202 - type: manhattan_pearson value: 80.57265975286111 - type: manhattan_spearman value: 79.46335611792958 - task: type: STS dataset: type: mteb/sts12-sts name: MTEB STS12 config: default split: test revision: a0d554a64d88156834ff5ae9920b964011b16384 metrics: - type: cos_sim_pearson value: 85.43713315520749 - type: cos_sim_spearman value: 77.44128693329532 - type: euclidean_pearson value: 81.63869928101123 - type: euclidean_spearman value: 77.29512977961515 - type: manhattan_pearson value: 81.63704185566183 - type: manhattan_spearman value: 77.29909412738657 - task: type: STS dataset: type: mteb/sts13-sts name: MTEB STS13 config: default split: test revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca metrics: - type: cos_sim_pearson value: 81.59451537860527 - type: cos_sim_spearman value: 82.97994638856723 - type: euclidean_pearson value: 82.89478688288412 - type: euclidean_spearman value: 83.58740751053104 - type: manhattan_pearson value: 82.69140840941608 - type: manhattan_spearman value: 83.33665956040555 - task: type: STS dataset: type: mteb/sts14-sts name: MTEB STS14 config: default split: test revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 metrics: - type: cos_sim_pearson value: 82.00756527711764 - type: cos_sim_spearman value: 81.83560996841379 - type: euclidean_pearson value: 82.07684151976518 - type: euclidean_spearman value: 82.00913052060511 - type: manhattan_pearson value: 82.05690778488794 - type: manhattan_spearman value: 82.02260252019525 - task: type: STS dataset: type: mteb/sts15-sts name: MTEB STS15 config: default split: test revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 metrics: - type: cos_sim_pearson value: 86.13710262895447 - type: cos_sim_spearman 
value: 87.26412811156248 - type: euclidean_pearson value: 86.94151453230228 - type: euclidean_spearman value: 87.5363796699571 - type: manhattan_pearson value: 86.86989424083748 - type: manhattan_spearman value: 87.47315940781353 - task: type: STS dataset: type: mteb/sts16-sts name: MTEB STS16 config: default split: test revision: 4d8694f8f0e0100860b497b999b3dbed754a0513 metrics: - type: cos_sim_pearson value: 83.0230597603627 - type: cos_sim_spearman value: 84.93344499318864 - type: euclidean_pearson value: 84.23754743431141 - type: euclidean_spearman value: 85.09707376597099 - type: manhattan_pearson value: 84.04325160987763 - type: manhattan_spearman value: 84.89353071339909 - task: type: STS dataset: type: mteb/sts17-crosslingual-sts name: MTEB STS17 (en-en) config: en-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 86.75620824563921 - type: cos_sim_spearman value: 87.15065513706398 - type: euclidean_pearson value: 88.26281533633521 - type: euclidean_spearman value: 87.51963738643983 - type: manhattan_pearson value: 88.25599267618065 - type: manhattan_spearman value: 87.58048736047483 - task: type: STS dataset: type: mteb/sts22-crosslingual-sts name: MTEB STS22 (en) config: en split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 64.74645319195137 - type: cos_sim_spearman value: 65.29996325037214 - type: euclidean_pearson value: 67.04297794086443 - type: euclidean_spearman value: 65.43841726694343 - type: manhattan_pearson value: 67.39459955690904 - type: manhattan_spearman value: 65.92864704413651 - task: type: STS dataset: type: mteb/stsbenchmark-sts name: MTEB STSBenchmark config: default split: test revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 metrics: - type: cos_sim_pearson value: 84.31291020270801 - type: cos_sim_spearman value: 85.86473738688068 - type: euclidean_pearson value: 85.65537275064152 - type: euclidean_spearman value: 
86.13087454209642 - type: manhattan_pearson value: 85.43946955047609 - type: manhattan_spearman value: 85.91568175344916 - task: type: Reranking dataset: type: mteb/scidocs-reranking name: MTEB SciDocsRR config: default split: test revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab metrics: - type: map value: 85.93798118350695 - type: mrr value: 95.93536274908824 - task: type: Retrieval dataset: type: scifact name: MTEB SciFact config: default split: test revision: None metrics: - type: map_at_1 value: 57.594 - type: map_at_10 value: 66.81899999999999 - type: map_at_100 value: 67.368 - type: map_at_1000 value: 67.4 - type: map_at_3 value: 64.061 - type: map_at_5 value: 65.47 - type: mrr_at_1 value: 60.667 - type: mrr_at_10 value: 68.219 - type: mrr_at_100 value: 68.655 - type: mrr_at_1000 value: 68.684 - type: mrr_at_3 value: 66.22200000000001 - type: mrr_at_5 value: 67.289 - type: ndcg_at_1 value: 60.667 - type: ndcg_at_10 value: 71.275 - type: ndcg_at_100 value: 73.642 - type: ndcg_at_1000 value: 74.373 - type: ndcg_at_3 value: 66.521 - type: ndcg_at_5 value: 68.581 - type: precision_at_1 value: 60.667 - type: precision_at_10 value: 9.433 - type: precision_at_100 value: 1.0699999999999998 - type: precision_at_1000 value: 0.11299999999999999 - type: precision_at_3 value: 25.556 - type: precision_at_5 value: 16.8 - type: recall_at_1 value: 57.594 - type: recall_at_10 value: 83.622 - type: recall_at_100 value: 94.167 - type: recall_at_1000 value: 99.667 - type: recall_at_3 value: 70.64399999999999 - type: recall_at_5 value: 75.983 - task: type: PairClassification dataset: type: mteb/sprintduplicatequestions-pairclassification name: MTEB SprintDuplicateQuestions config: default split: test revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 metrics: - type: cos_sim_accuracy value: 99.85841584158416 - type: cos_sim_ap value: 96.66996142314342 - type: cos_sim_f1 value: 92.83208020050125 - type: cos_sim_precision value: 93.06532663316584 - type: cos_sim_recall value: 
92.60000000000001 - type: dot_accuracy value: 99.85841584158416 - type: dot_ap value: 96.6775307676576 - type: dot_f1 value: 92.69289729177312 - type: dot_precision value: 94.77533960292581 - type: dot_recall value: 90.7 - type: euclidean_accuracy value: 99.86138613861387 - type: euclidean_ap value: 96.6338454403108 - type: euclidean_f1 value: 92.92214357937311 - type: euclidean_precision value: 93.96728016359918 - type: euclidean_recall value: 91.9 - type: manhattan_accuracy value: 99.86237623762376 - type: manhattan_ap value: 96.60370449645053 - type: manhattan_f1 value: 92.91177970423253 - type: manhattan_precision value: 94.7970863683663 - type: manhattan_recall value: 91.10000000000001 - type: max_accuracy value: 99.86237623762376 - type: max_ap value: 96.6775307676576 - type: max_f1 value: 92.92214357937311 - task: type: Clustering dataset: type: mteb/stackexchange-clustering name: MTEB StackExchangeClustering config: default split: test revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 metrics: - type: v_measure value: 60.77977058695198 - task: type: Clustering dataset: type: mteb/stackexchange-clustering-p2p name: MTEB StackExchangeClusteringP2P config: default split: test revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: v_measure value: 35.2725272535638 - task: type: Reranking dataset: type: mteb/stackoverflowdupquestions-reranking name: MTEB StackOverflowDupQuestions config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - type: map value: 53.64052466362125 - type: mrr value: 54.533067014684654 - task: type: Summarization dataset: type: mteb/summeval name: MTEB SummEval config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cos_sim_pearson value: 30.677624219206578 - type: cos_sim_spearman value: 30.121368518123447 - type: dot_pearson value: 30.69870088041608 - type: dot_spearman value: 29.61284927093751 - task: type: Retrieval dataset: type: trec-covid name: 
MTEB TRECCOVID config: default split: test revision: None metrics: - type: map_at_1 value: 0.22 - type: map_at_10 value: 1.855 - type: map_at_100 value: 9.885 - type: map_at_1000 value: 23.416999999999998 - type: map_at_3 value: 0.637 - type: map_at_5 value: 1.024 - type: mrr_at_1 value: 88.0 - type: mrr_at_10 value: 93.067 - type: mrr_at_100 value: 93.067 - type: mrr_at_1000 value: 93.067 - type: mrr_at_3 value: 92.667 - type: mrr_at_5 value: 93.067 - type: ndcg_at_1 value: 82.0 - type: ndcg_at_10 value: 75.899 - type: ndcg_at_100 value: 55.115 - type: ndcg_at_1000 value: 48.368 - type: ndcg_at_3 value: 79.704 - type: ndcg_at_5 value: 78.39699999999999 - type: precision_at_1 value: 88.0 - type: precision_at_10 value: 79.60000000000001 - type: precision_at_100 value: 56.06 - type: precision_at_1000 value: 21.206 - type: precision_at_3 value: 84.667 - type: precision_at_5 value: 83.2 - type: recall_at_1 value: 0.22 - type: recall_at_10 value: 2.078 - type: recall_at_100 value: 13.297 - type: recall_at_1000 value: 44.979 - type: recall_at_3 value: 0.6689999999999999 - type: recall_at_5 value: 1.106 - task: type: Retrieval dataset: type: webis-touche2020 name: MTEB Touche2020 config: default split: test revision: None metrics: - type: map_at_1 value: 2.258 - type: map_at_10 value: 10.439 - type: map_at_100 value: 16.89 - type: map_at_1000 value: 18.407999999999998 - type: map_at_3 value: 5.668 - type: map_at_5 value: 7.718 - type: mrr_at_1 value: 32.653 - type: mrr_at_10 value: 51.159 - type: mrr_at_100 value: 51.714000000000006 - type: mrr_at_1000 value: 51.714000000000006 - type: mrr_at_3 value: 47.959 - type: mrr_at_5 value: 50.407999999999994 - type: ndcg_at_1 value: 29.592000000000002 - type: ndcg_at_10 value: 26.037 - type: ndcg_at_100 value: 37.924 - type: ndcg_at_1000 value: 49.126999999999995 - type: ndcg_at_3 value: 30.631999999999998 - type: ndcg_at_5 value: 28.571 - type: precision_at_1 value: 32.653 - type: precision_at_10 value: 22.857 - type: 
precision_at_100 value: 7.754999999999999 - type: precision_at_1000 value: 1.529 - type: precision_at_3 value: 34.014 - type: precision_at_5 value: 29.796 - type: recall_at_1 value: 2.258 - type: recall_at_10 value: 16.554 - type: recall_at_100 value: 48.439 - type: recall_at_1000 value: 82.80499999999999 - type: recall_at_3 value: 7.283 - type: recall_at_5 value: 10.732 - task: type: Classification dataset: type: mteb/toxic_conversations_50k name: MTEB ToxicConversationsClassification config: default split: test revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c metrics: - type: accuracy value: 69.8858 - type: ap value: 13.835684144362109 - type: f1 value: 53.803351693244586 - task: type: Classification dataset: type: mteb/tweet_sentiment_extraction name: MTEB TweetSentimentExtractionClassification config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 60.50650820599886 - type: f1 value: 60.84357825979259 - task: type: Clustering dataset: type: mteb/twentynewsgroups-clustering name: MTEB TwentyNewsgroupsClustering config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: v_measure value: 48.52131044852134 - task: type: PairClassification dataset: type: mteb/twittersemeval2015-pairclassification name: MTEB TwitterSemEval2015 config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 85.59337187816654 - type: cos_sim_ap value: 73.23925826533437 - type: cos_sim_f1 value: 67.34693877551021 - type: cos_sim_precision value: 62.40432237730752 - type: cos_sim_recall value: 73.13984168865434 - type: dot_accuracy value: 85.31322644096085 - type: dot_ap value: 72.30723963807422 - type: dot_f1 value: 66.47051612112296 - type: dot_precision value: 62.0792305930845 - type: dot_recall value: 71.53034300791556 - type: euclidean_accuracy value: 85.61125350181797 - type: euclidean_ap value: 73.32843720487845 - type: euclidean_f1 
value: 67.36549633745895 - type: euclidean_precision value: 64.60755813953489 - type: euclidean_recall value: 70.36939313984169 - type: manhattan_accuracy value: 85.63509566668654 - type: manhattan_ap value: 73.16658488311325 - type: manhattan_f1 value: 67.20597386434349 - type: manhattan_precision value: 63.60424028268551 - type: manhattan_recall value: 71.2401055408971 - type: max_accuracy value: 85.63509566668654 - type: max_ap value: 73.32843720487845 - type: max_f1 value: 67.36549633745895 - task: type: PairClassification dataset: type: mteb/twitterurlcorpus-pairclassification name: MTEB TwitterURLCorpus config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 88.33779640625606 - type: cos_sim_ap value: 84.83868375898157 - type: cos_sim_f1 value: 77.16506154017773 - type: cos_sim_precision value: 74.62064005753327 - type: cos_sim_recall value: 79.88912842623961 - type: dot_accuracy value: 88.02732176815307 - type: dot_ap value: 83.95089283763002 - type: dot_f1 value: 76.29635101196631 - type: dot_precision value: 73.31771720613288 - type: dot_recall value: 79.52725592854944 - type: euclidean_accuracy value: 88.44452206310397 - type: euclidean_ap value: 84.98384576824827 - type: euclidean_f1 value: 77.29311047696697 - type: euclidean_precision value: 74.51232583065381 - type: euclidean_recall value: 80.28949799815214 - type: manhattan_accuracy value: 88.47362906042613 - type: manhattan_ap value: 84.91421462218432 - type: manhattan_f1 value: 77.05107637204792 - type: manhattan_precision value: 74.74484256243214 - type: manhattan_recall value: 79.50415768401602 - type: max_accuracy value: 88.47362906042613 - type: max_ap value: 84.98384576824827 - type: max_f1 value: 77.29311047696697 license: mit language: - en --- <h1 align="center">FlagEmbedding</h1> <h4 align="center"> <p> <a href=#model-list>Model List</a> | <a href=#frequently-asked-questions>FAQ</a> | <a href=#usage>Usage</a> | <a 
href="#evaluation">Evaluation</a> |
<a href="#train">Train</a> |
<a href="#contact">Contact</a> |
<a href="#citation">Citation</a> |
<a href="#license">License</a>
<p>
</h4>

For more details, please refer to our GitHub repository: [FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding).

[English](README.md) | [中文](https://github.com/FlagOpen/FlagEmbedding/blob/master/README_zh.md)

FlagEmbedding can map any text to a low-dimensional dense vector, which can be used for tasks like retrieval, classification, clustering, and semantic search. It can also be used in vector databases for LLMs.

************* 🌟**Updates**🌟 *************
- 10/12/2023: Release [LLM-Embedder](./FlagEmbedding/llm_embedder/README.md), a unified embedding model to support diverse retrieval-augmentation needs for LLMs. [Paper](https://arxiv.org/pdf/2310.07554.pdf) :fire:
- 09/15/2023: The [technical report](https://arxiv.org/pdf/2309.07597.pdf) of BGE has been released.
- 09/15/2023: The [massive training data](https://data.baai.ac.cn/details/BAAI-MTP) of BGE has been released.
- 09/12/2023: New models:
    - **New reranker models**: released the cross-encoder models `BAAI/bge-reranker-base` and `BAAI/bge-reranker-large`, which are more powerful than embedding models. We recommend using/fine-tuning them to re-rank the top-k documents returned by embedding models.
    - **Updated embedding models**: released the `bge-*-v1.5` embedding models to alleviate the issue of the similarity distribution and to enhance retrieval ability without instructions.

<details>
<summary>More</summary>
<!-- ### More -->

- 09/07/2023: Update the [fine-tune code](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md): add a script to mine hard negatives and support adding an instruction during fine-tuning.
- 08/09/2023: BGE models are integrated into **LangChain**; you can use them like [this](#using-langchain). The C-MTEB **leaderboard** is [available](https://huggingface.co/spaces/mteb/leaderboard).
- 08/05/2023: Release base-scale and small-scale models, with the **best performance among models of the same size 🤗**
- 08/02/2023: Release the `bge-large-*` (short for BAAI General Embedding) models, which **rank 1st on the MTEB and C-MTEB benchmarks!** :tada: :tada:
- 08/01/2023: We release the [Chinese Massive Text Embedding Benchmark](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB) (**C-MTEB**), consisting of 31 test datasets.

</details>

## Model List

`bge` is short for `BAAI general embedding`.

| Model | Language | | Description | query instruction for retrieval [1] |
|:-------------------------------|:--------:| :--------:| :--------:|:--------:|
| [BAAI/llm-embedder](https://huggingface.co/BAAI/llm-embedder) | English | [Inference](./FlagEmbedding/llm_embedder/README.md) [Fine-tune](./FlagEmbedding/llm_embedder/README.md) | a unified embedding model to support diverse retrieval augmentation needs for LLMs | See [README](./FlagEmbedding/llm_embedder/README.md) |
| [BAAI/bge-reranker-large](https://huggingface.co/BAAI/bge-reranker-large) | Chinese and English | [Inference](#usage-for-reranker) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker) | a cross-encoder model that is more accurate but less efficient [2] | |
| [BAAI/bge-reranker-base](https://huggingface.co/BAAI/bge-reranker-base) | Chinese and English | [Inference](#usage-for-reranker) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker) | a cross-encoder model that is more accurate but less efficient [2] | |
| [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with a more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with a more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with a more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-large-zh-v1.5](https://huggingface.co/BAAI/bge-large-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with a more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-base-zh-v1.5](https://huggingface.co/BAAI/bge-base-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with a more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-small-zh-v1.5](https://huggingface.co/BAAI/bge-small-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with a more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-large-en](https://huggingface.co/BAAI/bge-large-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | :trophy: rank **1st** on the [MTEB](https://huggingface.co/spaces/mteb/leaderboard) leaderboard | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-base-en](https://huggingface.co/BAAI/bge-base-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a base-scale model with ability similar to `bge-large-en` | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-small-en](https://huggingface.co/BAAI/bge-small-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a small-scale model with competitive performance | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-large-zh](https://huggingface.co/BAAI/bge-large-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | :trophy: rank **1st** on the [C-MTEB](https://github.com/FlagOpen/FlagEmbedding/tree/master/C_MTEB) benchmark | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-base-zh](https://huggingface.co/BAAI/bge-base-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a base-scale model with ability similar to `bge-large-zh` | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-small-zh](https://huggingface.co/BAAI/bge-small-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a small-scale model with competitive performance | `为这个句子生成表示以用于检索相关文章:` |

[1\]: If you need to search for relevant passages for a query, we suggest adding the instruction to the query; in other cases, no instruction is needed, and you can use the original query directly. In all cases, **no instruction** needs to be added to the passages.

[2\]: Unlike an embedding model, a reranker takes a question and a document as input and directly outputs a similarity score instead of an embedding. To balance accuracy and time cost, a cross-encoder is widely used to re-rank the top-k documents retrieved by other, simpler models.
For example, use the bge embedding model to retrieve the top 100 relevant documents, and then use the bge reranker to re-rank those 100 documents to get the final top 3 results.

All models have been uploaded to the Hugging Face Hub; you can find them at https://huggingface.co/BAAI. If you cannot access the Hugging Face Hub, you can also download the models at https://model.baai.ac.cn/models .

## Frequently asked questions

<details>
<summary>1. How to fine-tune bge embedding model?</summary>

<!-- ### How to fine-tune bge embedding model? -->

Follow this [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) to prepare data and fine-tune your model.
Some suggestions:
- Mine hard negatives following this [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune#hard-negatives), which can improve retrieval performance.
- If you pre-train bge on your own data, the pre-trained model cannot be used directly to calculate similarity; it must be fine-tuned with contrastive learning before computing similarity.
- If the accuracy of the fine-tuned model is still not high enough, we recommend using/fine-tuning the cross-encoder model (bge-reranker) to re-rank the top-k results. Hard negatives are also needed to fine-tune the reranker.

</details>

<details>
<summary>2. The similarity score between two dissimilar sentences is higher than 0.5</summary>

<!-- ### The similarity score between two dissimilar sentences is higher than 0.5 -->

**We suggest using bge v1.5, which alleviates the issue of the similarity distribution.**

Since we fine-tune the models by contrastive learning with a temperature of 0.01, the similarity distribution of the current BGE models lies roughly in the interval \[0.6, 1\]. So a similarity score greater than 0.5 does not indicate that the two sentences are similar.
For downstream tasks, such as passage retrieval or semantic similarity, **what matters is the relative order of the scores, not their absolute values.**
If you need to filter similar sentences based on a similarity threshold, please select an appropriate threshold based on the similarity distribution on your data (such as 0.8, 0.85, or even 0.9).

</details>

<details>
<summary>3. When does the query instruction need to be used</summary>

<!-- ### When does the query instruction need to be used -->

For `bge-*-v1.5`, we improved their retrieval ability when no instruction is used: omitting the instruction causes only a slight degradation in retrieval performance compared with using one. So, for convenience, you can generate embeddings without an instruction in all cases.

For a retrieval task that uses short queries to find long related documents, it is recommended to add instructions to these short queries. **The best way to decide whether to add instructions to queries is to choose the setting that achieves better performance on your task.** In all cases, the documents/passages do not need the instruction.

</details>

## Usage

### Usage for Embedding Model

Here are some examples of using `bge` models with [FlagEmbedding](#using-flagembedding), [Sentence-Transformers](#using-sentence-transformers), [Langchain](#using-langchain), or [Huggingface Transformers](#using-huggingface-transformers).

#### Using FlagEmbedding
```
pip install -U FlagEmbedding
```
If this doesn't work for you, see [FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md) for more ways to install FlagEmbedding.
```python
from FlagEmbedding import FlagModel

sentences_1 = ["样例数据-1", "样例数据-2"]
sentences_2 = ["样例数据-3", "样例数据-4"]
model = FlagModel('BAAI/bge-large-zh-v1.5',
                  query_instruction_for_retrieval="为这个句子生成表示以用于检索相关文章:",
                  use_fp16=True)  # setting use_fp16 to True speeds up computation with a slight performance degradation
embeddings_1 = model.encode(sentences_1)
embeddings_2 = model.encode(sentences_2)
similarity = embeddings_1 @ embeddings_2.T
print(similarity)

# For an s2p (short query to long passage) retrieval task, use encode_queries(),
# which automatically adds the instruction to each query.
# The corpus in a retrieval task can still use encode() or encode_corpus(),
# since passages do not need an instruction.
queries = ['query_1', 'query_2']
passages = ["样例文档-1", "样例文档-2"]
q_embeddings = model.encode_queries(queries)
p_embeddings = model.encode(passages)
scores = q_embeddings @ p_embeddings.T
```
For the value of the argument `query_instruction_for_retrieval`, see [Model List](https://github.com/FlagOpen/FlagEmbedding/tree/master#model-list).

By default, FlagModel uses all available GPUs when encoding. Set `os.environ["CUDA_VISIBLE_DEVICES"]` to select specific GPUs, or set `os.environ["CUDA_VISIBLE_DEVICES"]=""` to make all GPUs unavailable.
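As a concrete sketch of the GPU-selection note above (the device ID `"0"` is illustrative): the environment variable must be set before the model, and hence the CUDA context, is created.

```python
import os

# Restrict encoding to GPU 0; an empty string hides all GPUs (CPU-only encoding).
# This must be done before FlagModel is constructed.
os.environ["CUDA_VISIBLE_DEVICES"] = "0"
# os.environ["CUDA_VISIBLE_DEVICES"] = ""  # CPU-only encoding

# from FlagEmbedding import FlagModel  # construct the model only after setting the variable
# model = FlagModel('BAAI/bge-large-zh-v1.5')
```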
#### Using Sentence-Transformers

You can also use the `bge` models with [sentence-transformers](https://www.SBERT.net):
```
pip install -U sentence-transformers
```
```python
from sentence_transformers import SentenceTransformer

sentences_1 = ["样例数据-1", "样例数据-2"]
sentences_2 = ["样例数据-3", "样例数据-4"]
model = SentenceTransformer('BAAI/bge-large-zh-v1.5')
embeddings_1 = model.encode(sentences_1, normalize_embeddings=True)
embeddings_2 = model.encode(sentences_2, normalize_embeddings=True)
similarity = embeddings_1 @ embeddings_2.T
print(similarity)
```
For an s2p (short query to long passage) retrieval task, each short query should start with an instruction (for the instructions, see [Model List](https://github.com/FlagOpen/FlagEmbedding/tree/master#model-list)); the instruction is not needed for passages.
```python
from sentence_transformers import SentenceTransformer

queries = ['query_1', 'query_2']
passages = ["样例文档-1", "样例文档-2"]
instruction = "为这个句子生成表示以用于检索相关文章:"
model = SentenceTransformer('BAAI/bge-large-zh-v1.5')
q_embeddings = model.encode([instruction + q for q in queries], normalize_embeddings=True)
p_embeddings = model.encode(passages, normalize_embeddings=True)
scores = q_embeddings @ p_embeddings.T
```

#### Using Langchain

You can use `bge` in LangChain like this:
```python
from langchain.embeddings import HuggingFaceBgeEmbeddings

model_name = "BAAI/bge-large-en-v1.5"
model_kwargs = {'device': 'cuda'}
encode_kwargs = {'normalize_embeddings': True}  # set True to compute cosine similarity
model = HuggingFaceBgeEmbeddings(
    model_name=model_name,
    model_kwargs=model_kwargs,
    encode_kwargs=encode_kwargs,
    query_instruction="为这个句子生成表示以用于检索相关文章:"
)
model.query_instruction = "为这个句子生成表示以用于检索相关文章:"
```

#### Using HuggingFace Transformers

With the transformers package, you can use the model like this: first, pass your input through the transformer model; then, select the last hidden state of the first token (i.e., [CLS]) as the sentence embedding.
```python
from transformers import AutoTokenizer, AutoModel
import torch

# Sentences we want sentence embeddings for
sentences = ["样例数据-1", "样例数据-2"]

# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-large-zh-v1.5')
model = AutoModel.from_pretrained('BAAI/bge-large-zh-v1.5')
model.eval()

# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# For an s2p (short query to long passage) retrieval task, add an instruction to each query
# (no instruction is added to the passages):
# encoded_input = tokenizer([instruction + q for q in queries], padding=True, truncation=True, return_tensors='pt')

# Compute token embeddings
with torch.no_grad():
    model_output = model(**encoded_input)
    # Perform pooling. In this case, cls pooling.
    sentence_embeddings = model_output[0][:, 0]
# normalize embeddings
sentence_embeddings = torch.nn.functional.normalize(sentence_embeddings, p=2, dim=1)
print("Sentence embeddings:", sentence_embeddings)
```

### Usage for Reranker

Unlike an embedding model, a reranker takes a question and a document as input and directly outputs a similarity score instead of an embedding. You can get a relevance score by feeding a query and a passage to the reranker. The reranker is optimized with a cross-entropy loss, so the relevance score is not bounded to a specific range.
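Because the raw reranker score is an unbounded logit, one common post-processing choice (an assumption here, not part of the official API) is to squash it through a sigmoid when a score in (0, 1) is needed; since the sigmoid is monotonic, the ranking order is unchanged.

```python
import math

def sigmoid(x):
    """Map an unbounded cross-encoder logit to the interval (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical reranker logits (illustrative values, not real model output).
logits = [-2.3, 0.0, 4.1]
probs = [sigmoid(x) for x in logits]

# The sigmoid is monotonic, so the relative ranking is preserved.
assert probs == sorted(probs)
```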
#### Using FlagEmbedding
```
pip install -U FlagEmbedding
```

Get relevance scores (higher scores indicate more relevance):
```python
from FlagEmbedding import FlagReranker

reranker = FlagReranker('BAAI/bge-reranker-large', use_fp16=True)  # setting use_fp16 to True speeds up computation with a slight performance degradation

score = reranker.compute_score(['query', 'passage'])
print(score)

scores = reranker.compute_score([['what is panda?', 'hi'], ['what is panda?', 'The giant panda (Ailuropoda melanoleuca), sometimes called a panda bear or simply panda, is a bear species endemic to China.']])
print(scores)
```

#### Using Huggingface transformers
```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-reranker-large')
model = AutoModelForSequenceClassification.from_pretrained('BAAI/bge-reranker-large')
model.eval()

pairs = [['what is panda?', 'hi'], ['what is panda?', 'The giant panda (Ailuropoda melanoleuca), sometimes called a panda bear or simply panda, is a bear species endemic to China.']]
with torch.no_grad():
    inputs = tokenizer(pairs, padding=True, truncation=True, return_tensors='pt', max_length=512)
    scores = model(**inputs, return_dict=True).logits.view(-1, ).float()
    print(scores)
```

## Evaluation

`baai-general-embedding` models achieve **state-of-the-art performance on both the MTEB and C-MTEB leaderboards!**
For more details and evaluation tools, see our [scripts](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/README.md).
- **MTEB**:

| Model Name | Dimension | Sequence Length | Average (56) | Retrieval (15) | Clustering (11) | Pair Classification (3) | Reranking (4) | STS (10) | Summarization (1) | Classification (12) |
|:----:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | 1024 | 512 | **64.23** | **54.29** | 46.08 | 87.12 | 60.03 | 83.11 | 31.61 | 75.97 |
| [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | 768 | 512 | 63.55 | 53.25 | 45.77 | 86.55 | 58.86 | 82.4 | 31.07 | 75.53 |
| [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) | 384 | 512 | 62.17 | 51.68 | 43.82 | 84.92 | 58.36 | 81.59 | 30.12 | 74.14 |
| [bge-large-en](https://huggingface.co/BAAI/bge-large-en) | 1024 | 512 | 63.98 | 53.9 | 46.98 | 85.8 | 59.48 | 81.56 | 32.06 | 76.21 |
| [bge-base-en](https://huggingface.co/BAAI/bge-base-en) | 768 | 512 | 63.36 | 53.0 | 46.32 | 85.86 | 58.7 | 81.84 | 29.27 | 75.27 |
| [gte-large](https://huggingface.co/thenlper/gte-large) | 1024 | 512 | 63.13 | 52.22 | 46.84 | 85.00 | 59.13 | 83.35 | 31.66 | 73.33 |
| [gte-base](https://huggingface.co/thenlper/gte-base) | 768 | 512 | 62.39 | 51.14 | 46.2 | 84.57 | 58.61 | 82.3 | 31.17 | 73.01 |
| [e5-large-v2](https://huggingface.co/intfloat/e5-large-v2) | 1024 | 512 | 62.25 | 50.56 | 44.49 | 86.03 | 56.61 | 82.05 | 30.19 | 75.24 |
| [bge-small-en](https://huggingface.co/BAAI/bge-small-en) | 384 | 512 | 62.11 | 51.82 | 44.31 | 83.78 | 57.97 | 80.72 | 30.53 | 74.37 |
| [instructor-xl](https://huggingface.co/hkunlp/instructor-xl) | 768 | 512 | 61.79 | 49.26 | 44.74 | 86.62 | 57.29 | 83.06 | 32.32 | 61.79 |
| [e5-base-v2](https://huggingface.co/intfloat/e5-base-v2) | 768 | 512 | 61.5 | 50.29 | 43.80 | 85.73 | 55.91 | 81.05 | 30.28 | 73.84 |
| [gte-small](https://huggingface.co/thenlper/gte-small) | 384 | 512 | 61.36 | 49.46 | 44.89 | 83.54 | 57.7 | 82.07 | 30.42 | 72.31 |
| [text-embedding-ada-002](https://platform.openai.com/docs/guides/embeddings) | 1536 | 8192 | 60.99 | 49.25 | 45.9 | 84.89 | 56.32 | 80.97 | 30.8 | 70.93 |
| [e5-small-v2](https://huggingface.co/intfloat/e5-small-v2) | 384 | 512 | 59.93 | 49.04 | 39.92 | 84.67 | 54.32 | 80.39 | 31.16 | 72.94 |
| [sentence-t5-xxl](https://huggingface.co/sentence-transformers/sentence-t5-xxl) | 768 | 512 | 59.51 | 42.24 | 43.72 | 85.06 | 56.42 | 82.63 | 30.08 | 73.42 |
| [all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2) | 768 | 514 | 57.78 | 43.81 | 43.69 | 83.04 | 59.36 | 80.28 | 27.49 | 65.07 |
| [sgpt-bloom-7b1-msmarco](https://huggingface.co/bigscience/sgpt-bloom-7b1-msmarco) | 4096 | 2048 | 57.59 | 48.22 | 38.93 | 81.9 | 55.65 | 77.74 | 33.6 | 66.19 |

- **C-MTEB**:
We created the C-MTEB benchmark for Chinese text embeddings, consisting of 31 datasets from 6 tasks. Please refer to [C_MTEB](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/README.md) for a detailed introduction.
| Model | Embedding dimension | Avg | Retrieval | STS | PairClassification | Classification | Reranking | Clustering |
|:-------------------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|
| [**BAAI/bge-large-zh-v1.5**](https://huggingface.co/BAAI/bge-large-zh-v1.5) | 1024 | **64.53** | 70.46 | 56.25 | 81.6 | 69.13 | 65.84 | 48.99 |
| [BAAI/bge-base-zh-v1.5](https://huggingface.co/BAAI/bge-base-zh-v1.5) | 768 | 63.13 | 69.49 | 53.72 | 79.75 | 68.07 | 65.39 | 47.53 |
| [BAAI/bge-small-zh-v1.5](https://huggingface.co/BAAI/bge-small-zh-v1.5) | 512 | 57.82 | 61.77 | 49.11 | 70.41 | 63.96 | 60.92 | 44.18 |
| [BAAI/bge-large-zh](https://huggingface.co/BAAI/bge-large-zh) | 1024 | 64.20 | 71.53 | 54.98 | 78.94 | 68.32 | 65.11 | 48.39 |
| [bge-large-zh-noinstruct](https://huggingface.co/BAAI/bge-large-zh-noinstruct) | 1024 | 63.53 | 70.55 | 53 | 76.77 | 68.58 | 64.91 | 50.01 |
| [BAAI/bge-base-zh](https://huggingface.co/BAAI/bge-base-zh) | 768 | 62.96 | 69.53 | 54.12 | 77.5 | 67.07 | 64.91 | 47.63 |
| [multilingual-e5-large](https://huggingface.co/intfloat/multilingual-e5-large) | 1024 | 58.79 | 63.66 | 48.44 | 69.89 | 67.34 | 56.00 | 48.23 |
| [BAAI/bge-small-zh](https://huggingface.co/BAAI/bge-small-zh) | 512 | 58.27 | 63.07 | 49.45 | 70.35 | 63.64 | 61.48 | 45.09 |
| [m3e-base](https://huggingface.co/moka-ai/m3e-base) | 768 | 57.10 | 56.91 | 50.47 | 63.99 | 67.52 | 59.34 | 47.68 |
| [m3e-large](https://huggingface.co/moka-ai/m3e-large) | 1024 | 57.05 | 54.75 | 50.42 | 64.3 | 68.2 | 59.66 | 48.88 |
| [multilingual-e5-base](https://huggingface.co/intfloat/multilingual-e5-base) | 768 | 55.48 | 61.63 | 46.49 | 67.07 | 65.35 | 54.35 | 40.68 |
| [multilingual-e5-small](https://huggingface.co/intfloat/multilingual-e5-small) | 384 | 55.38 | 59.95 | 45.27 | 66.45 | 65.85 | 53.86 | 45.26 |
| [text-embedding-ada-002(OpenAI)](https://platform.openai.com/docs/guides/embeddings/what-are-embeddings) | 1536 | 53.02 | 52.0 | 43.35 | 69.56 | 64.31 | 54.28 | 45.68 |
| [luotuo](https://huggingface.co/silk-road/luotuo-bert-medium) | 1024 | 49.37 | 44.4 | 42.78 | 66.62 | 61 | 49.25 | 44.39 |
| [text2vec-base](https://huggingface.co/shibing624/text2vec-base-chinese) | 768 | 47.63 | 38.79 | 43.41 | 67.41 | 62.19 | 49.45 | 37.66 |
| [text2vec-large](https://huggingface.co/GanymedeNil/text2vec-large-chinese) | 1024 | 47.36 | 41.94 | 44.97 | 70.86 | 60.66 | 49.16 | 30.02 |

- **Reranking**: See [C_MTEB](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/) for the evaluation script.

| Model | T2Reranking | T2RerankingZh2En\* | T2RerankingEn2Zh\* | MMarcoReranking | CMedQAv1 | CMedQAv2 | Avg |
|:-------------------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|
| text2vec-base-multilingual | 64.66 | 62.94 | 62.51 | 14.37 | 48.46 | 48.6 | 50.26 |
| multilingual-e5-small | 65.62 | 60.94 | 56.41 | 29.91 | 67.26 | 66.54 | 57.78 |
| multilingual-e5-large | 64.55 | 61.61 | 54.28 | 28.6 | 67.42 | 67.92 | 57.4 |
| multilingual-e5-base | 64.21 | 62.13 | 54.68 | 29.5 | 66.23 | 66.98 | 57.29 |
| m3e-base | 66.03 | 62.74 | 56.07 | 17.51 | 77.05 | 76.76 | 59.36 |
| m3e-large | 66.13 | 62.72 | 56.1 | 16.46 | 77.76 | 78.27 | 59.57 |
| bge-base-zh-v1.5 | 66.49 | 63.25 | 57.02 | 29.74 | 80.47 | 84.88 | 63.64 |
| bge-large-zh-v1.5 | 65.74 | 63.39 | 57.03 | 28.74 | 83.45 | 85.44 | 63.97 |
| [BAAI/bge-reranker-base](https://huggingface.co/BAAI/bge-reranker-base) | 67.28 | 63.95 | 60.45 | 35.46 | 81.26 | 84.1 | 65.42 |
| [BAAI/bge-reranker-large](https://huggingface.co/BAAI/bge-reranker-large) | 67.6 | 64.03 | 61.44 | 37.16 | 82.15 | 84.18 | 66.09 |

\* : T2RerankingZh2En and T2RerankingEn2Zh are cross-language retrieval tasks.

## Train

### BAAI Embedding

We pre-train the models using [RetroMAE](https://github.com/staoxiao/RetroMAE) and train them on large-scale pair data using contrastive learning.
**You can fine-tune the embedding model on your data following our [examples](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune).**

We also provide a [pre-train example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/pretrain). Note that the goal of pre-training is to reconstruct the text; the pre-trained model cannot be used for similarity calculation directly and needs to be fine-tuned. For more training details for bge, see [baai_general_embedding](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md).

### BGE Reranker

A cross-encoder performs full attention over the input pair, which is more accurate than an embedding model (i.e., a bi-encoder) but also more time-consuming. It can therefore be used to re-rank the top-k documents returned by the embedding model. We train the cross-encoder on multilingual pair data. The data format is the same as for the embedding model, so you can fine-tune it easily following our [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker). For more details, please refer to [./FlagEmbedding/reranker/README.md](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/reranker).

## Contact

If you have any questions or suggestions related to this project, feel free to open an issue or pull request. You can also email Shitao Xiao (stxiao@baai.ac.cn) and Zheng Liu (liuzheng@baai.ac.cn).

## Citation

If you find this repository useful, please consider giving it a star :star: and a citation:

```
@misc{bge_embedding,
  title={C-Pack: Packaged Resources To Advance General Chinese Embedding},
  author={Shitao Xiao and Zheng Liu and Peitian Zhang and Niklas Muennighoff},
  year={2023},
  eprint={2309.07597},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
```

## License

FlagEmbedding is licensed under the [MIT License](https://github.com/FlagOpen/FlagEmbedding/blob/master/LICENSE).
The released models can be used for commercial purposes free of charge.
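The retrieve-then-rerank flow described in the BGE Reranker section can be sketched end to end. In this minimal sketch the bi-encoder and cross-encoder are replaced by toy bag-of-words scorers so the example runs without downloading any model; in practice you would swap in `FlagModel` and `FlagReranker` from FlagEmbedding.

```python
import math
from collections import Counter

def embed(text):
    """Toy bi-encoder: a bag-of-words vector (stand-in for a BGE embedding)."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def cross_score(query, doc):
    """Toy cross-encoder: word-overlap ratio (stand-in for bge-reranker)."""
    q, d = set(query.lower().split()), set(doc.lower().split())
    return len(q & d) / len(q | d)

def retrieve_then_rerank(query, corpus, k=3):
    # Stage 1: cheap bi-encoder retrieval of the top-k candidates.
    q_vec = embed(query)
    candidates = sorted(corpus, key=lambda d: cosine(q_vec, embed(d)), reverse=True)[:k]
    # Stage 2: more expensive cross-encoder re-ranking of just those k documents.
    return sorted(candidates, key=lambda d: cross_score(query, d), reverse=True)

corpus = [
    "bge is a family of text embedding models",
    "the reranker is a cross-encoder trained on multilingual pairs",
    "convnext is an image classification backbone",
    "embedding models map text to dense vectors",
]
print(retrieve_then_rerank("text embedding models", corpus, k=2))
```

The point of the two stages is cost: the bi-encoder scores every document with a cheap vector comparison, while the cross-encoder's full attention is spent only on the k survivors.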
timm/convnext_small.fb_in22k
2023-03-31T22:34:14.000Z
[ "timm", "pytorch", "safetensors", "image-classification", "dataset:imagenet-22k", "arxiv:2201.03545", "license:apache-2.0", "region:us" ]
image-classification
timm
null
null
timm/convnext_small.fb_in22k
0
169,191
timm
2022-12-13T07:13:23
---
tags:
- image-classification
- timm
library_tag: timm
license: apache-2.0
datasets:
- imagenet-22k
---
# Model card for convnext_small.fb_in22k

A ConvNeXt image classification model. Pretrained on ImageNet-22k by paper authors.

## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
  - Params (M): 66.3
  - GMACs: 8.7
  - Activations (M): 21.6
  - Image size: 224 x 224
- **Papers:**
  - A ConvNet for the 2020s: https://arxiv.org/abs/2201.03545
- **Original:** https://github.com/facebookresearch/ConvNeXt
- **Dataset:** ImageNet-22k

## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch

img = Image.open(urlopen(
    'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))

model = timm.create_model('convnext_small.fb_in22k', pretrained=True)
model = model.eval()

# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)

output = model(transforms(img).unsqueeze(0))  # unsqueeze single image into batch of 1

top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```

### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm

img = Image.open(urlopen(
    'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))

model = timm.create_model(
    'convnext_small.fb_in22k',
    pretrained=True,
    features_only=True,
)
model = model.eval()

# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)

output = model(transforms(img).unsqueeze(0))  # unsqueeze single image into batch of 1

for o in output:
    # print shape of each feature map in output
    # e.g.:
    #  torch.Size([1, 96, 56, 56])
    #  torch.Size([1, 192, 28, 28])
    #  torch.Size([1, 384, 14, 14])
    #  torch.Size([1, 768, 7, 7])
    print(o.shape)
```

### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm

img = Image.open(urlopen(
    'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))

model = timm.create_model(
    'convnext_small.fb_in22k',
    pretrained=True,
    num_classes=0,  # remove classifier nn.Linear
)
model = model.eval()

# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)

output = model(transforms(img).unsqueeze(0))  # output is (batch_size, num_features) shaped tensor

# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 768, 7, 7) shaped tensor

output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```

## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).

All timing numbers from eager model PyTorch 1.13 on RTX 3090 w/ AMP.
| model | top1 | top5 | img_size | param_count | gmacs | macts | samples_per_sec | batch_size |
|---|---|---|---|---|---|---|---|---|
| [convnextv2_huge.fcmae_ft_in22k_in1k_512](https://huggingface.co/timm/convnextv2_huge.fcmae_ft_in22k_in1k_512) | 88.848 | 98.742 | 512 | 660.29 | 600.81 | 413.07 | 28.58 | 48 |
| [convnextv2_huge.fcmae_ft_in22k_in1k_384](https://huggingface.co/timm/convnextv2_huge.fcmae_ft_in22k_in1k_384) | 88.668 | 98.738 | 384 | 660.29 | 337.96 | 232.35 | 50.56 | 64 |
| [convnext_xxlarge.clip_laion2b_soup_ft_in1k](https://huggingface.co/timm/convnext_xxlarge.clip_laion2b_soup_ft_in1k) | 88.612 | 98.704 | 256 | 846.47 | 198.09 | 124.45 | 122.45 | 256 |
| [convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_384](https://huggingface.co/timm/convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_384) | 88.312 | 98.578 | 384 | 200.13 | 101.11 | 126.74 | 196.84 | 256 |
| [convnextv2_large.fcmae_ft_in22k_in1k_384](https://huggingface.co/timm/convnextv2_large.fcmae_ft_in22k_in1k_384) | 88.196 | 98.532 | 384 | 197.96 | 101.1 | 126.74 | 128.94 | 128 |
| [convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_320](https://huggingface.co/timm/convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_320) | 87.968 | 98.47 | 320 | 200.13 | 70.21 | 88.02 | 283.42 | 256 |
| [convnext_xlarge.fb_in22k_ft_in1k_384](https://huggingface.co/timm/convnext_xlarge.fb_in22k_ft_in1k_384) | 87.75 | 98.556 | 384 | 350.2 | 179.2 | 168.99 | 124.85 | 192 |
| [convnextv2_base.fcmae_ft_in22k_in1k_384](https://huggingface.co/timm/convnextv2_base.fcmae_ft_in22k_in1k_384) | 87.646 | 98.422 | 384 | 88.72 | 45.21 | 84.49 | 209.51 | 256 |
| [convnext_large.fb_in22k_ft_in1k_384](https://huggingface.co/timm/convnext_large.fb_in22k_ft_in1k_384) | 87.476 | 98.382 | 384 | 197.77 | 101.1 | 126.74 | 194.66 | 256 |
| [convnext_large_mlp.clip_laion2b_augreg_ft_in1k](https://huggingface.co/timm/convnext_large_mlp.clip_laion2b_augreg_ft_in1k) | 87.344 | 98.218 | 256 | 200.13 | 44.94 | 56.33 | 438.08 | 256 |
| [convnextv2_large.fcmae_ft_in22k_in1k](https://huggingface.co/timm/convnextv2_large.fcmae_ft_in22k_in1k) | 87.26 | 98.248 | 224 | 197.96 | 34.4 | 43.13 | 376.84 | 256 |
| [convnext_base.clip_laion2b_augreg_ft_in12k_in1k_384](https://huggingface.co/timm/convnext_base.clip_laion2b_augreg_ft_in12k_in1k_384) | 87.138 | 98.212 | 384 | 88.59 | 45.21 | 84.49 | 365.47 | 256 |
| [convnext_xlarge.fb_in22k_ft_in1k](https://huggingface.co/timm/convnext_xlarge.fb_in22k_ft_in1k) | 87.002 | 98.208 | 224 | 350.2 | 60.98 | 57.5 | 368.01 | 256 |
| [convnext_base.fb_in22k_ft_in1k_384](https://huggingface.co/timm/convnext_base.fb_in22k_ft_in1k_384) | 86.796 | 98.264 | 384 | 88.59 | 45.21 | 84.49 | 366.54 | 256 |
| [convnextv2_base.fcmae_ft_in22k_in1k](https://huggingface.co/timm/convnextv2_base.fcmae_ft_in22k_in1k) | 86.74 | 98.022 | 224 | 88.72 | 15.38 | 28.75 | 624.23 | 256 |
| [convnext_large.fb_in22k_ft_in1k](https://huggingface.co/timm/convnext_large.fb_in22k_ft_in1k) | 86.636 | 98.028 | 224 | 197.77 | 34.4 | 43.13 | 581.43 | 256 |
| [convnext_base.clip_laiona_augreg_ft_in1k_384](https://huggingface.co/timm/convnext_base.clip_laiona_augreg_ft_in1k_384) | 86.504 | 97.97 | 384 | 88.59 | 45.21 | 84.49 | 368.14 | 256 |
| [convnext_base.clip_laion2b_augreg_ft_in12k_in1k](https://huggingface.co/timm/convnext_base.clip_laion2b_augreg_ft_in12k_in1k) | 86.344 | 97.97 | 256 | 88.59 | 20.09 | 37.55 | 816.14 | 256 |
| [convnextv2_huge.fcmae_ft_in1k](https://huggingface.co/timm/convnextv2_huge.fcmae_ft_in1k) | 86.256 | 97.75 | 224 | 660.29 | 115.0 | 79.07 | 154.72 | 256 |
| [convnext_small.in12k_ft_in1k_384](https://huggingface.co/timm/convnext_small.in12k_ft_in1k_384) | 86.182 | 97.92 | 384 | 50.22 | 25.58 | 63.37 | 516.19 | 256 |
| [convnext_base.clip_laion2b_augreg_ft_in1k](https://huggingface.co/timm/convnext_base.clip_laion2b_augreg_ft_in1k) | 86.154 | 97.68 | 256 | 88.59 | 20.09 | 37.55 | 819.86 | 256 |
| [convnext_base.fb_in22k_ft_in1k](https://huggingface.co/timm/convnext_base.fb_in22k_ft_in1k) | 85.822 | 97.866 | 224 | 88.59 | 15.38 | 28.75 | 1037.66 | 256 |
| [convnext_small.fb_in22k_ft_in1k_384](https://huggingface.co/timm/convnext_small.fb_in22k_ft_in1k_384) | 85.778 | 97.886 | 384 | 50.22 | 25.58 | 63.37 | 518.95 | 256 |
| [convnextv2_large.fcmae_ft_in1k](https://huggingface.co/timm/convnextv2_large.fcmae_ft_in1k) | 85.742 | 97.584 | 224 | 197.96 | 34.4 | 43.13 | 375.23 | 256 |
| [convnext_small.in12k_ft_in1k](https://huggingface.co/timm/convnext_small.in12k_ft_in1k) | 85.174 | 97.506 | 224 | 50.22 | 8.71 | 21.56 | 1474.31 | 256 |
| [convnext_tiny.in12k_ft_in1k_384](https://huggingface.co/timm/convnext_tiny.in12k_ft_in1k_384) | 85.118 | 97.608 | 384 | 28.59 | 13.14 | 39.48 | 856.76 | 256 |
| [convnextv2_tiny.fcmae_ft_in22k_in1k_384](https://huggingface.co/timm/convnextv2_tiny.fcmae_ft_in22k_in1k_384) | 85.112 | 97.63 | 384 | 28.64 | 13.14 | 39.48 | 491.32 | 256 |
| [convnextv2_base.fcmae_ft_in1k](https://huggingface.co/timm/convnextv2_base.fcmae_ft_in1k) | 84.874 | 97.09 | 224 | 88.72 | 15.38 | 28.75 | 625.33 | 256 |
| [convnext_small.fb_in22k_ft_in1k](https://huggingface.co/timm/convnext_small.fb_in22k_ft_in1k) | 84.562 | 97.394 | 224 | 50.22 | 8.71 | 21.56 | 1478.29 | 256 |
| [convnext_large.fb_in1k](https://huggingface.co/timm/convnext_large.fb_in1k) | 84.282 | 96.892 | 224 | 197.77 | 34.4 | 43.13 | 584.28 | 256 |
| [convnext_tiny.in12k_ft_in1k](https://huggingface.co/timm/convnext_tiny.in12k_ft_in1k) | 84.186 | 97.124 | 224 | 28.59 | 4.47 | 13.44 | 2433.7 | 256 |
| [convnext_tiny.fb_in22k_ft_in1k_384](https://huggingface.co/timm/convnext_tiny.fb_in22k_ft_in1k_384) | 84.084 | 97.14 | 384 | 28.59 | 13.14 | 39.48 | 862.95 | 256 |
| [convnextv2_tiny.fcmae_ft_in22k_in1k](https://huggingface.co/timm/convnextv2_tiny.fcmae_ft_in22k_in1k) | 83.894 | 96.964 | 224 | 28.64 | 4.47 | 13.44 | 1452.72 | 256 |
| [convnext_base.fb_in1k](https://huggingface.co/timm/convnext_base.fb_in1k) | 83.82 | 96.746 | 224 | 88.59 | 15.38 | 28.75 | 1054.0 | 256 |
| [convnextv2_nano.fcmae_ft_in22k_in1k_384](https://huggingface.co/timm/convnextv2_nano.fcmae_ft_in22k_in1k_384) | 83.37 | 96.742 | 384 | 15.62 | 7.22 | 24.61 | 801.72 | 256 |
| [convnext_small.fb_in1k](https://huggingface.co/timm/convnext_small.fb_in1k) | 83.142 | 96.434 | 224 | 50.22 | 8.71 | 21.56 | 1464.0 | 256 |
| [convnextv2_tiny.fcmae_ft_in1k](https://huggingface.co/timm/convnextv2_tiny.fcmae_ft_in1k) | 82.92 | 96.284 | 224 | 28.64 | 4.47 | 13.44 | 1425.62 | 256 |
| [convnext_tiny.fb_in22k_ft_in1k](https://huggingface.co/timm/convnext_tiny.fb_in22k_ft_in1k) | 82.898 | 96.616 | 224 | 28.59 | 4.47 | 13.44 | 2480.88 | 256 |
| [convnext_nano.in12k_ft_in1k](https://huggingface.co/timm/convnext_nano.in12k_ft_in1k) | 82.282 | 96.344 | 224 | 15.59 | 2.46 | 8.37 | 3926.52 | 256 |
| [convnext_tiny_hnf.a2h_in1k](https://huggingface.co/timm/convnext_tiny_hnf.a2h_in1k) | 82.216 | 95.852 | 224 | 28.59 | 4.47 | 13.44 | 2529.75 | 256 |
| [convnext_tiny.fb_in1k](https://huggingface.co/timm/convnext_tiny.fb_in1k) | 82.066 | 95.854 | 224 | 28.59 | 4.47 | 13.44 | 2346.26 | 256 |
| [convnextv2_nano.fcmae_ft_in22k_in1k](https://huggingface.co/timm/convnextv2_nano.fcmae_ft_in22k_in1k) | 82.03 | 96.166 | 224 | 15.62 | 2.46 | 8.37 | 2300.18 | 256 |
| [convnextv2_nano.fcmae_ft_in1k](https://huggingface.co/timm/convnextv2_nano.fcmae_ft_in1k) | 81.83 | 95.738 | 224 | 15.62 | 2.46 | 8.37 | 2321.48 | 256 |
| [convnext_nano_ols.d1h_in1k](https://huggingface.co/timm/convnext_nano_ols.d1h_in1k) | 80.866 | 95.246 | 224 | 15.65 | 2.65 | 9.38 | 3523.85 | 256 |
| [convnext_nano.d1h_in1k](https://huggingface.co/timm/convnext_nano.d1h_in1k) | 80.768 | 95.334 | 224 | 15.59 | 2.46 | 8.37 | 3915.58 | 256 |
| [convnextv2_pico.fcmae_ft_in1k](https://huggingface.co/timm/convnextv2_pico.fcmae_ft_in1k) | 80.304 | 95.072 | 224 | 9.07 | 1.37 | 6.1 | 3274.57 | 256 |
| [convnext_pico.d1_in1k](https://huggingface.co/timm/convnext_pico.d1_in1k) | 79.526 | 94.558 | 224 | 9.05 | 1.37 | 6.1 | 5686.88 | 256 |
| [convnext_pico_ols.d1_in1k](https://huggingface.co/timm/convnext_pico_ols.d1_in1k) | 79.522 | 94.692 | 224 | 9.06 | 1.43 | 6.5 | 5422.46 | 256 |
| [convnextv2_femto.fcmae_ft_in1k](https://huggingface.co/timm/convnextv2_femto.fcmae_ft_in1k) | 78.488 | 93.98 | 224 | 5.23 | 0.79 | 4.57 | 4264.2 | 256 |
| [convnext_femto_ols.d1_in1k](https://huggingface.co/timm/convnext_femto_ols.d1_in1k) | 77.86 | 93.83 | 224 | 5.23 | 0.82 | 4.87 | 6910.6 | 256 |
| [convnext_femto.d1_in1k](https://huggingface.co/timm/convnext_femto.d1_in1k) | 77.454 | 93.68 | 224 | 5.22 | 0.79 | 4.57 | 7189.92 | 256 |
| [convnextv2_atto.fcmae_ft_in1k](https://huggingface.co/timm/convnextv2_atto.fcmae_ft_in1k) | 76.664 | 93.044 | 224 | 3.71 | 0.55 | 3.81 | 4728.91 | 256 |
| [convnext_atto_ols.a2_in1k](https://huggingface.co/timm/convnext_atto_ols.a2_in1k) | 75.88 | 92.846 | 224 | 3.7 | 0.58 | 4.11 | 7963.16 | 256 |
| [convnext_atto.d2_in1k](https://huggingface.co/timm/convnext_atto.d2_in1k) | 75.664 | 92.9 | 224 | 3.7 | 0.55 | 3.81 | 8439.22 | 256 |

## Citation
```bibtex
@article{liu2022convnet,
  author  = {Zhuang Liu and Hanzi Mao and Chao-Yuan Wu and Christoph Feichtenhofer and Trevor Darrell and Saining Xie},
  title   = {A ConvNet for the 2020s},
  journal = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  year    = {2022},
}
```
```bibtex
@misc{rw2019timm,
  author = {Ross Wightman},
  title = {PyTorch Image Models},
  year = {2019},
  publisher = {GitHub},
  journal = {GitHub repository},
  doi = {10.5281/zenodo.4414861},
  howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
15,600
[ [ -0.06671142578125, -0.03302001953125, -0.0028743743896484375, 0.037750244140625, -0.03143310546875, -0.014373779296875, -0.01308441162109375, -0.03533935546875, 0.065673828125, 0.0171966552734375, -0.04345703125, -0.04144287109375, -0.050506591796875, -0.0026683807373046875, 0.00743865966796875, 0.06817626953125, -0.003414154052734375, -0.01053619384765625, 0.0194854736328125, -0.0272674560546875, -0.01617431640625, -0.0281219482421875, -0.06298828125, -0.01512908935546875, 0.019287109375, 0.0225372314453125, 0.058380126953125, 0.045989990234375, 0.0293426513671875, 0.04107666015625, -0.018463134765625, 0.01337432861328125, -0.0141754150390625, -0.0262603759765625, 0.04058837890625, -0.032196044921875, -0.06805419921875, 0.016937255859375, 0.062347412109375, 0.04034423828125, 0.004627227783203125, 0.01544189453125, 0.0258331298828125, 0.035186767578125, 0.002777099609375, -0.004848480224609375, -0.006988525390625, 0.012786865234375, -0.0206451416015625, 0.0015497207641601562, 0.004146575927734375, -0.0538330078125, 0.0240478515625, -0.04205322265625, 0.0017347335815429688, -0.0006103515625, 0.10260009765625, -0.00921630859375, -0.016845703125, 0.0011167526245117188, 0.0106201171875, 0.053924560546875, -0.059783935546875, 0.023223876953125, 0.03216552734375, -0.011383056640625, -0.015289306640625, -0.04730224609375, -0.045989990234375, -0.002979278564453125, -0.0278778076171875, 0.017333984375, -0.027679443359375, -0.00630950927734375, 0.0413818359375, 0.034271240234375, -0.03668212890625, -0.00659942626953125, -0.025421142578125, -0.008270263671875, 0.060394287109375, -0.006195068359375, 0.04754638671875, -0.026519775390625, -0.046783447265625, -0.020294189453125, -0.01560211181640625, 0.03533935546875, 0.01678466796875, -0.0021266937255859375, -0.0740966796875, 0.03973388671875, 0.0074005126953125, 0.0204620361328125, 0.0279083251953125, -0.0157470703125, 0.060028076171875, -0.018280029296875, -0.0404052734375, -0.02593994140625, 0.0899658203125, 
0.05340576171875, 0.0301055908203125, 0.0103607177734375, 0.004756927490234375, -0.006805419921875, -0.036041259765625, -0.07476806640625, -0.0117950439453125, 0.0279998779296875, -0.042388916015625, -0.00909423828125, 0.026580810546875, -0.061614990234375, 0.010284423828125, -0.00878143310546875, 0.014251708984375, -0.0621337890625, -0.02752685546875, -0.0086822509765625, -0.026519775390625, 0.0299835205078125, 0.0218505859375, -0.02630615234375, 0.0223388671875, 0.0209808349609375, 0.07403564453125, 0.02203369140625, -0.0116119384765625, -0.032989501953125, -0.009857177734375, -0.0272064208984375, 0.0263519287109375, 0.00997161865234375, -0.01299285888671875, -0.0194854736328125, 0.034088134765625, -0.01305389404296875, -0.03045654296875, 0.031890869140625, 0.022857666015625, 0.0069580078125, -0.0290374755859375, -0.0259552001953125, -0.0206451416015625, 0.027069091796875, -0.037994384765625, 0.0791015625, 0.035186767578125, -0.07806396484375, 0.023101806640625, -0.033935546875, -0.0042877197265625, -0.0215606689453125, 0.00499725341796875, -0.05682373046875, -0.008026123046875, 0.01953125, 0.054595947265625, -0.009002685546875, -0.01174163818359375, -0.02935791015625, -0.0047149658203125, 0.0270538330078125, 0.0090789794921875, 0.0723876953125, 0.01329803466796875, -0.0341796875, 0.0005979537963867188, -0.047576904296875, 0.0238494873046875, 0.0291748046875, -0.003894805908203125, -0.00562286376953125, -0.06048583984375, 0.002582550048828125, 0.0389404296875, 0.01485443115234375, -0.03912353515625, 0.020843505859375, -0.01763916015625, 0.028900146484375, 0.048431396484375, -0.004241943359375, 0.02313232421875, -0.045166015625, 0.041656494140625, 0.005496978759765625, 0.0196685791015625, -0.0028228759765625, -0.0305023193359375, -0.05499267578125, -0.053070068359375, 0.01558685302734375, 0.035858154296875, -0.0335693359375, 0.055755615234375, 0.01309967041015625, -0.0460205078125, -0.056365966796875, 0.0167083740234375, 0.037841796875, 0.018096923828125, 
0.01555633544921875, -0.0268402099609375, -0.04901123046875, -0.069580078125, -0.00919342041015625, 0.004589080810546875, -0.00269317626953125, 0.048065185546875, 0.0283203125, -0.00856781005859375, 0.04132080078125, -0.033599853515625, -0.021087646484375, -0.0093994140625, -0.0062408447265625, 0.03302001953125, 0.05755615234375, 0.0865478515625, -0.0645751953125, -0.06976318359375, 0.002223968505859375, -0.0836181640625, 0.00029277801513671875, -0.0032634735107421875, -0.032470703125, 0.0211181640625, 0.0198211669921875, -0.07489013671875, 0.0526123046875, 0.0270843505859375, -0.045623779296875, 0.034027099609375, -0.0209503173828125, 0.024688720703125, -0.07305908203125, 0.0171356201171875, 0.0200958251953125, -0.0233306884765625, -0.039276123046875, 0.0054779052734375, -0.006622314453125, 0.0123443603515625, -0.047882080078125, 0.06878662109375, -0.050689697265625, 0.007434844970703125, 0.0026798248291015625, 0.0099945068359375, 0.0012912750244140625, 0.036529541015625, -0.0028629302978515625, 0.033905029296875, 0.0572509765625, -0.02069091796875, 0.03466796875, 0.03985595703125, -0.00070953369140625, 0.059173583984375, -0.046234130859375, 0.008941650390625, 0.00839996337890625, 0.036865234375, -0.0677490234375, -0.03277587890625, 0.04443359375, -0.05682373046875, 0.035247802734375, -0.0186309814453125, -0.0259246826171875, -0.0615234375, -0.066162109375, 0.019073486328125, 0.04339599609375, -0.046905517578125, 0.01316070556640625, 0.0214385986328125, 0.00321197509765625, -0.045989990234375, -0.049560546875, -0.00508880615234375, -0.03216552734375, -0.06549072265625, 0.03173828125, 0.00586700439453125, -0.0088348388671875, 0.0016384124755859375, -0.001171112060546875, -0.002956390380859375, -0.01389312744140625, 0.039581298828125, 0.031463623046875, -0.017578125, -0.02752685546875, -0.023651123046875, -0.00811004638671875, 0.0007524490356445312, -0.00780487060546875, 0.041778564453125, -0.026519775390625, 0.0106964111328125, -0.079345703125, 0.0162506103515625, 
0.04736328125, -0.002590179443359375, 0.06781005859375, 0.07720947265625, -0.03387451171875, 0.01076507568359375, -0.0286102294921875, -0.0100555419921875, -0.037933349609375, -0.009857177734375, -0.040130615234375, -0.048583984375, 0.06268310546875, 0.0145416259765625, -0.007221221923828125, 0.053436279296875, 0.0257110595703125, -0.018402099609375, 0.0645751953125, 0.040069580078125, -0.006801605224609375, 0.044769287109375, -0.06671142578125, 0.0025634765625, -0.06292724609375, -0.046844482421875, -0.009368896484375, -0.04144287109375, -0.055206298828125, -0.03045654296875, 0.0197296142578125, 0.03680419921875, -0.00856781005859375, 0.049591064453125, -0.04193115234375, -0.007167816162109375, 0.03631591796875, 0.025299072265625, -0.019256591796875, -0.0182342529296875, -0.0114593505859375, -0.0160064697265625, -0.040252685546875, -0.01100921630859375, 0.0518798828125, 0.04901123046875, 0.0298004150390625, -0.0007486343383789062, 0.04034423828125, -0.0048675537109375, 0.021728515625, -0.037750244140625, 0.05499267578125, -0.004032135009765625, -0.038299560546875, -0.01507568359375, -0.034393310546875, -0.0732421875, 0.010467529296875, -0.0270843505859375, -0.06494140625, -0.010162353515625, 0.01482391357421875, -0.023284912109375, 0.041107177734375, -0.050018310546875, 0.0548095703125, -0.00550079345703125, -0.0372314453125, 0.00792694091796875, -0.064697265625, 0.0183563232421875, 0.0300445556640625, -0.003368377685546875, -0.0128021240234375, 0.01110076904296875, 0.061004638671875, -0.0634765625, 0.036865234375, -0.028839111328125, 0.00258636474609375, 0.041046142578125, -0.00504302978515625, 0.032470703125, 0.01168060302734375, 0.000484466552734375, 0.0008630752563476562, 0.01096343994140625, -0.048583984375, -0.02783203125, 0.049163818359375, -0.049835205078125, -0.029144287109375, -0.0406494140625, -0.02349853515625, 0.01314544677734375, 0.0017108917236328125, 0.04754638671875, 0.04156494140625, -0.010101318359375, 0.01540374755859375, 0.038665771484375, 
-0.028228759765625, 0.038848876953125, -0.01459503173828125, -0.0027065277099609375, -0.040008544921875, 0.058502197265625, 0.0035552978515625, 0.007297515869140625, 0.0024509429931640625, 0.005496978759765625, -0.0308990478515625, -0.0115966796875, -0.0108489990234375, 0.052337646484375, -0.0172882080078125, -0.02716064453125, -0.0472412109375, -0.032958984375, -0.04461669921875, -0.02642822265625, -0.029144287109375, -0.0198516845703125, -0.02593994140625, 0.00518798828125, 0.0548095703125, 0.041534423828125, -0.03045654296875, 0.033233642578125, -0.04901123046875, 0.0253448486328125, 0.00543212890625, 0.030731201171875, -0.0215911865234375, -0.043914794921875, 0.00200653076171875, 0.0027103424072265625, -0.0175933837890625, -0.05865478515625, 0.048065185546875, 0.011016845703125, 0.0290679931640625, 0.039886474609375, -0.02337646484375, 0.05975341796875, -0.0068817138671875, 0.037841796875, 0.04217529296875, -0.06536865234375, 0.0321044921875, -0.0316162109375, 0.006267547607421875, 0.01214599609375, 0.0272674560546875, -0.037017822265625, -0.025390625, -0.07293701171875, -0.044921875, 0.052154541015625, 0.0110931396484375, -0.0004360675811767578, 0.00555419921875, 0.04620361328125, -0.006900787353515625, 0.01055908203125, -0.04168701171875, -0.053497314453125, -0.0152130126953125, -0.0111083984375, -0.006397247314453125, -0.00360107421875, -0.0033969879150390625, -0.050567626953125, 0.03656005859375, -0.0106201171875, 0.043701171875, 0.0182647705078125, 0.0003514289855957031, -0.003452301025390625, -0.0237884521484375, 0.041046142578125, 0.026702880859375, -0.023162841796875, -0.009857177734375, 0.028289794921875, -0.03753662109375, 0.0011968612670898438, 0.0213165283203125, 0.00476837158203125, 0.0144195556640625, 0.0240325927734375, 0.044464111328125, 0.01910400390625, -0.01287841796875, 0.04400634765625, -0.015625, -0.0301513671875, -0.0229949951171875, -0.0031299591064453125, 0.01496124267578125, 0.035797119140625, 0.016326904296875, 0.00403594970703125, 
-0.0230255126953125, -0.044708251953125, 0.04132080078125, 0.058746337890625, -0.033477783203125, -0.041534423828125, 0.04803466796875, -0.00820159912109375, -0.00809478759765625, 0.039093017578125, -0.005939483642578125, -0.053680419921875, 0.07366943359375, 0.0205535888671875, 0.04248046875, -0.042724609375, 0.019561767578125, 0.06549072265625, -0.0014600753784179688, 0.00928497314453125, 0.0270843505859375, 0.0289154052734375, -0.031951904296875, 0.00466156005859375, -0.0482177734375, 0.01342010498046875, 0.042388916015625, -0.03533935546875, 0.02642822265625, -0.057342529296875, -0.02813720703125, 0.0146331787109375, 0.0318603515625, -0.062042236328125, 0.0238189697265625, 0.004413604736328125, 0.081787109375, -0.0601806640625, 0.06597900390625, 0.055511474609375, -0.02862548828125, -0.0714111328125, -0.01019287109375, 0.0167388916015625, -0.058197021484375, 0.029327392578125, 0.017333984375, 0.0171356201171875, -0.017791748046875, -0.0452880859375, -0.03759765625, 0.08984375, 0.035919189453125, -0.009490966796875, 0.007415771484375, -0.0263519287109375, 0.029449462890625, -0.020599365234375, 0.035308837890625, 0.03985595703125, 0.03997802734375, 0.0160369873046875, -0.06915283203125, 0.027496337890625, -0.0308074951171875, -0.0137176513671875, 0.0216522216796875, -0.10394287109375, 0.07977294921875, -0.026214599609375, -0.0016794204711914062, 0.01433563232421875, 0.061920166015625, 0.030364990234375, 0.0040740966796875, 0.0288848876953125, 0.053924560546875, 0.036102294921875, -0.0147705078125, 0.077880859375, 0.0010633468627929688, 0.031036376953125, 0.0189056396484375, 0.039031982421875, 0.03143310546875, 0.0280914306640625, -0.032958984375, 0.01000213623046875, 0.06512451171875, -0.01494598388671875, 0.0083770751953125, 0.01541900634765625, -0.0124664306640625, -0.0096893310546875, -0.017425537109375, -0.046173095703125, 0.0325927734375, 0.01192474365234375, -0.020233154296875, 0.0014638900756835938, -0.007595062255859375, 0.0389404296875, 
0.0006499290466308594, -0.01186370849609375, 0.03289794921875, 0.02020263671875, -0.042205810546875, 0.039215087890625, -0.005359649658203125, 0.0738525390625, -0.0256500244140625, 0.0017004013061523438, -0.0232696533203125, 0.02459716796875, -0.020263671875, -0.0875244140625, 0.0243377685546875, -0.0105133056640625, 0.01450347900390625, -0.004932403564453125, 0.047088623046875, -0.03387451171875, -0.0190277099609375, 0.04052734375, 0.0260772705078125, 0.02923583984375, 0.006206512451171875, -0.0870361328125, 0.0178070068359375, 0.01141357421875, -0.041351318359375, 0.031768798828125, 0.037017822265625, 0.0184173583984375, 0.050994873046875, 0.031890869140625, 0.015045166015625, 0.00690460205078125, -0.02783203125, 0.058929443359375, -0.04937744140625, -0.035797119140625, -0.06524658203125, 0.03289794921875, -0.0247955322265625, -0.0462646484375, 0.060302734375, 0.033416748046875, 0.03857421875, 0.0078125, 0.0394287109375, -0.036468505859375, 0.0292510986328125, -0.0325927734375, 0.053924560546875, -0.060546875, -0.024139404296875, -0.034027099609375, -0.061431884765625, -0.02178955078125, 0.055816650390625, 0.0030364990234375, 0.0179443359375, 0.0265045166015625, 0.0439453125, -0.0027980804443359375, -0.018768310546875, -0.005809783935546875, 0.0191497802734375, 0.0041961669921875, 0.061920166015625, 0.04046630859375, -0.055511474609375, 0.015716552734375, -0.04754638671875, -0.024383544921875, -0.02734375, -0.05523681640625, -0.08251953125, -0.05963134765625, -0.036346435546875, -0.04937744140625, -0.024078369140625, 0.08404541015625, 0.07049560546875, -0.041107177734375, -0.01273345947265625, 0.0242156982421875, 0.00809478759765625, -0.0158233642578125, -0.019775390625, 0.0406494140625, 0.025482177734375, -0.07708740234375, -0.020050048828125, 0.006908416748046875, 0.042694091796875, 0.0260772705078125, -0.030120849609375, -0.01898193359375, -0.00377655029296875, 0.0301666259765625, 0.0628662109375, -0.052978515625, -0.035675048828125, 0.003086090087890625, 
-0.020233154296875, 0.018951416015625, 0.021026611328125, -0.0287933349609375, -0.0079345703125, 0.04107666015625, 0.0099639892578125, 0.056640625, 0.0115203857421875, 0.0184783935546875, -0.04571533203125, 0.04986572265625, -0.00405120849609375, 0.0277099609375, 0.0276641845703125, -0.0298919677734375, 0.055755615234375, 0.037933349609375, -0.03375244140625, -0.0736083984375, -0.0221405029296875, -0.10711669921875, 0.0006251335144042969, 0.05859375, -0.01535797119140625, -0.04052734375, 0.040679931640625, -0.026580810546875, 0.039947509765625, -0.0204010009765625, 0.019744873046875, 0.0272216796875, -0.02606201171875, -0.034393310546875, -0.04156494140625, 0.054351806640625, 0.0249176025390625, -0.050506591796875, -0.0264892578125, -0.001758575439453125, 0.035919189453125, 0.01837158203125, 0.05975341796875, -0.0151214599609375, 0.01160430908203125, 0.001239776611328125, 0.01084136962890625, 0.003875732421875, 0.0015430450439453125, -0.0119781494140625, -0.0162200927734375, -0.02392578125, -0.04400634765625 ] ]
cross-encoder/ms-marco-MiniLM-L-4-v2
2021-08-05T08:39:32.000Z
[ "transformers", "pytorch", "jax", "bert", "text-classification", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
text-classification
cross-encoder
null
null
cross-encoder/ms-marco-MiniLM-L-4-v2
1
164,634
transformers
2022-03-02T23:29:05
---
license: apache-2.0
---

# Cross-Encoder for MS Marco

This model was trained on the [MS Marco Passage Ranking](https://github.com/microsoft/MSMARCO-Passage-Ranking) task.

The model can be used for Information Retrieval: given a query, score the query against all candidate passages (e.g. retrieved with ElasticSearch), then sort the passages in decreasing order of score. See [SBERT.net Retrieve & Re-rank](https://www.sbert.net/examples/applications/retrieve_rerank/README.html) for more details. The training code is available here: [SBERT.net Training MS Marco](https://github.com/UKPLab/sentence-transformers/tree/master/examples/training/ms_marco)

## Usage with Transformers

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

model = AutoModelForSequenceClassification.from_pretrained('model_name')
tokenizer = AutoTokenizer.from_pretrained('model_name')

features = tokenizer(['How many people live in Berlin?', 'How many people live in Berlin?'],
                     ['Berlin has a population of 3,520,031 registered inhabitants in an area of 891.82 square kilometers.',
                      'New York City is famous for the Metropolitan Museum of Art.'],
                     padding=True, truncation=True, return_tensors="pt")

model.eval()
with torch.no_grad():
    scores = model(**features).logits
    print(scores)
```

## Usage with SentenceTransformers

The usage becomes easier when you have [SentenceTransformers](https://www.sbert.net/) installed.
Then you can use the pre-trained models like this:

```python
from sentence_transformers import CrossEncoder

model = CrossEncoder('model_name', max_length=512)
scores = model.predict([('Query', 'Paragraph1'), ('Query', 'Paragraph2'), ('Query', 'Paragraph3')])
```

## Performance

In the following table, we provide various pre-trained Cross-Encoders together with their performance on the [TREC Deep Learning 2019](https://microsoft.github.io/TREC-2019-Deep-Learning/) and the [MS Marco Passage Reranking](https://github.com/microsoft/MSMARCO-Passage-Ranking/) dataset.

| Model-Name | NDCG@10 (TREC DL 19) | MRR@10 (MS Marco Dev) | Docs / Sec |
| ------------- |:-------------| -----| --- |
| **Version 2 models** | | | |
| cross-encoder/ms-marco-TinyBERT-L-2-v2 | 69.84 | 32.56 | 9000 |
| cross-encoder/ms-marco-MiniLM-L-2-v2 | 71.01 | 34.85 | 4100 |
| cross-encoder/ms-marco-MiniLM-L-4-v2 | 73.04 | 37.70 | 2500 |
| cross-encoder/ms-marco-MiniLM-L-6-v2 | 74.30 | 39.01 | 1800 |
| cross-encoder/ms-marco-MiniLM-L-12-v2 | 74.31 | 39.02 | 960 |
| **Version 1 models** | | | |
| cross-encoder/ms-marco-TinyBERT-L-2 | 67.43 | 30.15 | 9000 |
| cross-encoder/ms-marco-TinyBERT-L-4 | 68.09 | 34.50 | 2900 |
| cross-encoder/ms-marco-TinyBERT-L-6 | 69.57 | 36.13 | 680 |
| cross-encoder/ms-marco-electra-base | 71.99 | 36.41 | 340 |
| **Other models** | | | |
| nboost/pt-tinybert-msmarco | 63.63 | 28.80 | 2900 |
| nboost/pt-bert-base-uncased-msmarco | 70.94 | 34.75 | 340 |
| nboost/pt-bert-large-msmarco | 73.36 | 36.48 | 100 |
| Capreolus/electra-base-msmarco | 71.23 | 36.89 | 340 |
| amberoad/bert-multilingual-passage-reranking-msmarco | 68.40 | 35.54 | 330 |
| sebastian-hofstaetter/distilbert-cat-margin_mse-T2-msmarco | 72.82 | 37.88 | 720 |

Note: Runtime was computed on a V100 GPU.
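The reranking recipe this card describes (score every query-passage pair, then sort the passages by score in decreasing order) can be sketched without loading the model; the `scores` list below is a hypothetical stand-in for the output of `model.predict(...)`:

```python
# Candidate passages, e.g. retrieved with ElasticSearch for the query
# "How many people live in Berlin?".
passages = [
    "New York City is famous for the Metropolitan Museum of Art.",
    "Berlin has a population of 3,520,031 registered inhabitants.",
    "Berlin is the capital of Germany.",
]

# Hypothetical cross-encoder logits, one per (query, passage) pair.
scores = [-8.2, 9.1, 2.4]

# Pair each score with its passage and sort by score, highest first.
reranked = [p for s, p in sorted(zip(scores, passages),
                                 key=lambda t: t[0], reverse=True)]
print(reranked[0])  # the most relevant passage for the query
```

The top of `reranked` is what would be shown to the user or passed to a downstream reader model.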
3,233
[ [ -0.03228759765625, -0.043670654296875, 0.0250396728515625, 0.01168060302734375, -0.0127105712890625, 0.01073455810546875, -0.01338958740234375, -0.038543701171875, 0.025146484375, 0.0255889892578125, -0.041229248046875, -0.051055908203125, -0.058013916015625, 0.003063201904296875, -0.0333251953125, 0.059326171875, -0.0015039443969726562, 0.0123291015625, -0.01372528076171875, -0.00827789306640625, -0.0193939208984375, -0.0308074951171875, -0.04132080078125, -0.0218048095703125, 0.035980224609375, 0.01593017578125, 0.05816650390625, 0.02978515625, 0.0419921875, 0.032958984375, -0.0084075927734375, 0.007022857666015625, -0.01415252685546875, 0.00011080503463745117, 0.005214691162109375, -0.0286865234375, -0.041717529296875, -0.0091400146484375, 0.033416748046875, 0.0266265869140625, 0.0005674362182617188, 0.019989013671875, -0.00020313262939453125, 0.04327392578125, -0.0293731689453125, -0.00376129150390625, -0.02587890625, 0.0184478759765625, -0.0149383544921875, -0.018768310546875, -0.035186767578125, -0.0162811279296875, 0.01331329345703125, -0.043914794921875, 0.0299072265625, 0.01174163818359375, 0.09521484375, 0.026275634765625, -0.01641845703125, -0.0196533203125, -0.035400390625, 0.05419921875, -0.051483154296875, 0.053314208984375, 0.01374053955078125, 0.01324462890625, 0.0088958740234375, -0.07318115234375, -0.033660888671875, -0.0162506103515625, -0.01439666748046875, 0.01922607421875, -0.031829833984375, -0.006427764892578125, 0.0313720703125, 0.0309600830078125, -0.07501220703125, -0.00608062744140625, -0.053802490234375, -0.0092010498046875, 0.04913330078125, 0.0202484130859375, 0.020172119140625, -0.0188446044921875, -0.02423095703125, -0.01071929931640625, -0.037811279296875, 0.016326904296875, 0.0207366943359375, 0.0008802413940429688, -0.015472412109375, 0.03082275390625, -0.01776123046875, 0.059600830078125, 0.0084075927734375, 0.007080078125, 0.058074951171875, -0.019500732421875, -0.0178985595703125, 0.00186920166015625, 0.07366943359375, 
0.021514892578125, 0.00785064697265625, -0.0095367431640625, -0.0169830322265625, -0.01285552978515625, 0.0303497314453125, -0.066162109375, -0.020111083984375, 0.0219879150390625, -0.040313720703125, -0.0100860595703125, 0.01227569580078125, -0.06396484375, 0.01207733154296875, -0.0098876953125, 0.045501708984375, -0.030059814453125, 0.0024127960205078125, 0.01776123046875, -0.0106964111328125, 0.02166748046875, 0.01345062255859375, -0.0557861328125, 0.0010099411010742188, 0.0260009765625, 0.070556640625, -0.00864410400390625, -0.028472900390625, -0.01212310791015625, -0.002803802490234375, -0.01251220703125, 0.042694091796875, -0.03546142578125, -0.0235137939453125, -0.005451202392578125, 0.021514892578125, -0.01129150390625, -0.0229034423828125, 0.0535888671875, -0.034820556640625, 0.038299560546875, -0.00946807861328125, -0.0261993408203125, -0.01183319091796875, 0.017730712890625, -0.059112548828125, 0.091064453125, 0.002994537353515625, -0.06390380859375, 0.01236724853515625, -0.052947998046875, -0.0256805419921875, -0.01207733154296875, 0.0030956268310546875, -0.05743408203125, 0.0034542083740234375, 0.030609130859375, 0.019378662109375, -0.0241241455078125, 0.007495880126953125, -0.0130767822265625, -0.0340576171875, 0.012298583984375, -0.031219482421875, 0.08184814453125, 0.0298309326171875, -0.0369873046875, 0.0040435791015625, -0.0506591796875, 0.00890350341796875, 0.0214385986328125, -0.031890869140625, -0.0004322528839111328, -0.021514892578125, 0.01038360595703125, 0.0301055908203125, 0.032958984375, -0.037689208984375, 0.007785797119140625, -0.02099609375, 0.036376953125, 0.034698486328125, -0.008209228515625, 0.0258941650390625, -0.0226593017578125, 0.05029296875, 0.0095062255859375, 0.03265380859375, 0.00067901611328125, -0.047607421875, -0.06622314453125, -0.01019287109375, 0.038299560546875, 0.044158935546875, -0.055450439453125, 0.04083251953125, -0.038970947265625, -0.0531005859375, -0.062042236328125, -0.007534027099609375, 0.031524658203125, 
0.0255126953125, 0.049713134765625, -0.006671905517578125, -0.055084228515625, -0.07501220703125, -0.025146484375, 0.0017757415771484375, 0.002986907958984375, 0.0179595947265625, 0.048370361328125, -0.019775390625, 0.055328369140625, -0.040008544921875, -0.0163116455078125, -0.034423828125, 0.0002512931823730469, 0.019012451171875, 0.050018310546875, 0.04730224609375, -0.052642822265625, -0.0408935546875, -0.0142669677734375, -0.052215576171875, 0.005352020263671875, 0.0027637481689453125, -0.0104522705078125, 0.02032470703125, 0.04638671875, -0.051971435546875, 0.051177978515625, 0.037109375, -0.034393310546875, 0.0279998779296875, -0.03302001953125, 0.0218658447265625, -0.09063720703125, 0.007625579833984375, -0.0025157928466796875, -0.01189422607421875, -0.038787841796875, -0.0119476318359375, 0.006992340087890625, -0.0018863677978515625, -0.0261077880859375, 0.0248565673828125, -0.04547119140625, -0.0025920867919921875, 0.009185791015625, 0.00579833984375, 0.01277923583984375, 0.047637939453125, 0.024566650390625, 0.058349609375, 0.039031982421875, -0.0266571044921875, 0.0180816650390625, 0.0276031494140625, -0.04620361328125, 0.0284881591796875, -0.0693359375, -0.0006427764892578125, -0.00978851318359375, 0.00811767578125, -0.07489013671875, 0.01265716552734375, 0.0178985595703125, -0.0655517578125, 0.0234832763671875, -0.0102691650390625, -0.0296478271484375, -0.049591064453125, -0.013458251953125, 0.0246734619140625, 0.037750244140625, -0.03564453125, 0.043487548828125, 0.0255279541015625, 0.0005202293395996094, -0.052947998046875, -0.091552734375, 0.01369476318359375, -0.00421905517578125, -0.055267333984375, 0.04754638671875, -0.0153961181640625, 0.01108551025390625, 0.002925872802734375, -0.00334930419921875, -0.00313568115234375, -0.00838470458984375, 0.01454925537109375, 0.0247650146484375, -0.0140838623046875, 0.0011796951293945312, 0.0009307861328125, -0.01641845703125, 0.00524139404296875, -0.01568603515625, 0.04803466796875, -0.01324462890625, 
-0.0095062255859375, -0.018951416015625, 0.01490020751953125, 0.037322998046875, -0.042388916015625, 0.054046630859375, 0.061004638671875, -0.024322509765625, -0.00823974609375, -0.031402587890625, -0.007595062255859375, -0.037811279296875, 0.03387451171875, -0.043548583984375, -0.057952880859375, 0.039886474609375, 0.0227813720703125, 0.0020294189453125, 0.038360595703125, 0.03662109375, -0.0015134811401367188, 0.07733154296875, 0.036346435546875, -0.003635406494140625, 0.04937744140625, -0.053619384765625, 0.022308349609375, -0.058074951171875, -0.044281005859375, -0.049591064453125, -0.033447265625, -0.051300048828125, -0.0264129638671875, 0.0228729248046875, -0.0100860595703125, -0.01702880859375, 0.05230712890625, -0.05633544921875, 0.024322509765625, 0.0546875, 0.0208892822265625, 0.00777435302734375, 0.01097869873046875, -0.0191802978515625, -0.00926971435546875, -0.06256103515625, -0.024505615234375, 0.097900390625, 0.012603759765625, 0.0526123046875, 0.0012807846069335938, 0.057952880859375, 0.023101806640625, -0.00286102294921875, -0.03240966796875, 0.032989501953125, -0.01132965087890625, -0.05841064453125, -0.017303466796875, -0.0316162109375, -0.08074951171875, 0.0255584716796875, -0.0159912109375, -0.043701171875, 0.038543701171875, -0.00670623779296875, -0.029266357421875, 0.02374267578125, -0.0419921875, 0.09814453125, -0.031280517578125, -0.0268707275390625, -0.007358551025390625, -0.0555419921875, 0.0127410888671875, 0.01555633544921875, 0.0025653839111328125, 0.006885528564453125, -0.0124664306640625, 0.05682373046875, -0.02764892578125, 0.0261077880859375, -0.010955810546875, 0.01145172119140625, 0.014068603515625, -0.007411956787109375, 0.02874755859375, -0.0006070137023925781, -0.00780487060546875, 0.0250091552734375, -0.0032405853271484375, -0.029876708984375, -0.03192138671875, 0.060943603515625, -0.06878662109375, -0.0313720703125, -0.040771484375, -0.0272674560546875, -0.002227783203125, 0.01551055908203125, 0.057769775390625, 
0.031707763671875, 0.00033092498779296875, 0.0323486328125, 0.05584716796875, -0.0235748291015625, 0.04302978515625, 0.0284271240234375, -0.003986358642578125, -0.055389404296875, 0.058074951171875, 0.02294921875, 0.012481689453125, 0.0430908203125, -0.0136566162109375, -0.0357666015625, -0.040557861328125, -0.0266571044921875, 0.01265716552734375, -0.039947509765625, -0.01678466796875, -0.05487060546875, -0.0307769775390625, -0.037567138671875, -0.005542755126953125, -0.03125, -0.032012939453125, -0.0180816650390625, -0.01311492919921875, 0.0164794921875, 0.045745849609375, 0.00982666015625, 0.01531219482421875, -0.046051025390625, 0.01605224609375, 0.00049591064453125, 0.011810302734375, -0.0079498291015625, -0.0657958984375, -0.03411865234375, -0.005157470703125, -0.0308380126953125, -0.06207275390625, 0.051055908203125, -0.006534576416015625, 0.054901123046875, 0.0113983154296875, 0.004344940185546875, 0.056427001953125, -0.029205322265625, 0.06744384765625, 0.0123291015625, -0.064697265625, 0.050018310546875, 0.0019779205322265625, 0.02935791015625, 0.04730224609375, 0.04193115234375, -0.040008544921875, -0.0193939208984375, -0.057891845703125, -0.071044921875, 0.0675048828125, 0.0224609375, -0.0082244873046875, 0.005519866943359375, 0.00151824951171875, -0.00902557373046875, 0.021209716796875, -0.07275390625, -0.036865234375, -0.034027099609375, -0.028533935546875, -0.023681640625, -0.01242828369140625, 0.01543426513671875, -0.047027587890625, 0.0582275390625, 0.01316070556640625, 0.042999267578125, 0.0455322265625, -0.031036376953125, 0.006595611572265625, 0.00827789306640625, 0.05181884765625, 0.048370361328125, -0.0203857421875, -0.0017566680908203125, 0.01593017578125, -0.038360595703125, -0.01056671142578125, 0.01776123046875, -0.034820556640625, 0.02899169921875, 0.025360107421875, 0.07562255859375, 0.0168609619140625, -0.0290374755859375, 0.04888916015625, 0.0038509368896484375, -0.020904541015625, -0.0377197265625, -0.0149383544921875, 
0.0015134811401367188, 0.028717041015625, 0.0185089111328125, 0.00479888916015625, 0.0191497802734375, -0.031036376953125, 0.01174163818359375, 0.0269012451171875, -0.044158935546875, -0.015411376953125, 0.0682373046875, 0.01306915283203125, -0.032073974609375, 0.0516357421875, 0.0017795562744140625, -0.061553955078125, 0.03900146484375, 0.0273590087890625, 0.078857421875, -0.0213623046875, 0.01332855224609375, 0.052215576171875, 0.051361083984375, 0.005550384521484375, 0.0262603759765625, -0.011871337890625, -0.039886474609375, -0.0010747909545898438, -0.0413818359375, -0.00887298583984375, -0.005031585693359375, -0.050811767578125, 0.022308349609375, -0.01331329345703125, -0.0245361328125, -0.01439666748046875, 0.02044677734375, -0.0631103515625, 0.012481689453125, 0.004119873046875, 0.083740234375, -0.04107666015625, 0.0804443359375, 0.043487548828125, -0.066162109375, -0.043731689453125, -0.00977325439453125, -0.0299835205078125, -0.052215576171875, 0.04278564453125, 0.009490966796875, 0.00878143310546875, -0.000026702880859375, -0.0267486572265625, -0.061767578125, 0.111083984375, 0.0152740478515625, -0.05120849609375, -0.01378631591796875, 0.033111572265625, 0.03857421875, -0.02593994140625, 0.050933837890625, 0.032623291015625, 0.037200927734375, -0.0145721435546875, -0.0706787109375, 0.01125335693359375, -0.03704833984375, -0.0032634735107421875, 0.006015777587890625, -0.062286376953125, 0.0794677734375, -0.0166473388671875, 0.01275634765625, 0.0125732421875, 0.04534912109375, 0.01544952392578125, 0.025848388671875, 0.0264434814453125, 0.0634765625, 0.05157470703125, -0.0298309326171875, 0.0665283203125, -0.042327880859375, 0.042877197265625, 0.06787109375, 0.015289306640625, 0.06707763671875, 0.032379150390625, -0.024627685546875, 0.05596923828125, 0.054779052734375, -0.016387939453125, 0.038787841796875, 0.0029277801513671875, 0.0012254714965820312, -0.0310211181640625, 0.0289459228515625, -0.051025390625, 0.01800537109375, 0.011810302734375, 
-0.060302734375, -0.005710601806640625, -0.00397491455078125, -0.008331298828125, -0.0123291015625, -0.0185546875, 0.03387451171875, -0.0058135986328125, -0.0430908203125, 0.051513671875, 0.0022754669189453125, 0.056427001953125, -0.04998779296875, 0.0139312744140625, -0.0194549560546875, 0.020294189453125, -0.017059326171875, -0.06591796875, 0.00714111328125, -0.0037097930908203125, -0.011322021484375, -0.020843505859375, 0.036865234375, -0.043975830078125, -0.043060302734375, 0.0309600830078125, 0.0240631103515625, 0.0159912109375, -0.007198333740234375, -0.07806396484375, 0.0166778564453125, 0.0159759521484375, -0.037750244140625, 0.0084381103515625, 0.031829833984375, 0.0098114013671875, 0.0504150390625, 0.03662109375, -0.0091400146484375, 0.031524658203125, 0.0025482177734375, 0.053253173828125, -0.06591796875, -0.039642333984375, -0.04345703125, 0.045806884765625, -0.0219879150390625, -0.040008544921875, 0.0679931640625, 0.07818603515625, 0.0748291015625, -0.0242156982421875, 0.0504150390625, -0.01110076904296875, 0.0186767578125, -0.029449462890625, 0.05877685546875, -0.064208984375, 0.0188446044921875, -0.0164794921875, -0.0625, -0.013214111328125, 0.04803466796875, -0.033111572265625, 0.0194549560546875, 0.050384521484375, 0.07061767578125, 0.0005860328674316406, -0.001922607421875, 0.018524169921875, 0.011932373046875, 0.0135498046875, 0.06591796875, 0.048919677734375, -0.06964111328125, 0.075927734375, -0.03289794921875, 0.01218414306640625, -0.0166473388671875, -0.031829833984375, -0.06414794921875, -0.043914794921875, -0.024871826171875, -0.03173828125, 0.0123291015625, 0.0626220703125, 0.054840087890625, -0.0562744140625, -0.01555633544921875, -0.0017461776733398438, 0.00749969482421875, -0.01042938232421875, -0.0171966552734375, 0.0325927734375, -0.020721435546875, -0.0718994140625, 0.02520751953125, 0.0010089874267578125, 0.000698089599609375, -0.0185089111328125, -0.03289794921875, -0.0222015380859375, 0.003265380859375, 0.034393310546875, 
0.00815582275390625, -0.05499267578125, -0.00936126708984375, 0.01422119140625, -0.022430419921875, 0.0221710205078125, 0.045806884765625, -0.058868408203125, 0.0172576904296875, 0.062286376953125, 0.031524658203125, 0.068359375, -0.01568603515625, 0.0208282470703125, -0.0311279296875, -0.0030803680419921875, 0.01171112060546875, 0.04345703125, 0.0107879638671875, -0.01445770263671875, 0.04534912109375, 0.0296478271484375, -0.04547119140625, -0.061798095703125, -0.0135955810546875, -0.086669921875, -0.026580810546875, 0.06787109375, -0.01015472412109375, -0.033416748046875, 0.01340484619140625, -0.0111083984375, 0.017822265625, -0.0283355712890625, 0.035552978515625, 0.04962158203125, 0.00440216064453125, -0.0204925537109375, -0.043365478515625, 0.0309906005859375, 0.0176544189453125, -0.052276611328125, -0.01372528076171875, 0.01335906982421875, 0.03564453125, 0.01515960693359375, 0.03326416015625, -0.0309600830078125, 0.0238189697265625, 0.0120086669921875, 0.031036376953125, -0.02178955078125, -0.031280517578125, -0.0252685546875, 0.01305389404296875, -0.031219482421875, -0.038604736328125 ] ]
google/pix2struct-textcaps-base
2023-09-07T18:57:01.000Z
[ "transformers", "pytorch", "safetensors", "pix2struct", "text2text-generation", "image-to-text", "en", "fr", "ro", "de", "multilingual", "arxiv:2210.03347", "license:apache-2.0", "autotrain_compatible", "has_space", "region:us" ]
image-to-text
google
null
null
google/pix2struct-textcaps-base
23
163,798
transformers
2023-03-01T09:07:41
--- language: - en - fr - ro - de - multilingual pipeline_tag: image-to-text inference: false license: apache-2.0 --- # Model card for Pix2Struct - Finetuned on TextCaps ![model_image](https://s3.amazonaws.com/moonup/production/uploads/1678713353867-62441d1d9fdefb55a0b7d12c.png) # Table of Contents 0. [TL;DR](#TL;DR) 1. [Using the model](#using-the-model) 2. [Contribution](#contribution) 3. [Citation](#citation) # TL;DR Pix2Struct is an image encoder - text decoder model that is trained on image-text pairs for various tasks, including image captioning and visual question answering. The full list of available models can be found in Table 1 of the paper: ![Table 1 - paper](https://s3.amazonaws.com/moonup/production/uploads/1678712985040-62441d1d9fdefb55a0b7d12c.png) The abstract of the paper states: > Visually-situated language is ubiquitous—sources range from textbooks with diagrams to web pages with images and tables, to mobile apps with buttons and forms. Perhaps due to this diversity, previous work has typically relied on domain-specific recipes with limited sharing of the underlying data, model architectures, and objectives. We present Pix2Struct, a pretrained image-to-text model for purely visual language understanding, which can be finetuned on tasks containing visually-situated language. Pix2Struct is pretrained by learning to parse masked screenshots of web pages into simplified HTML. The web, with its richness of visual elements cleanly reflected in the HTML structure, provides a large source of pretraining data well suited to the diversity of downstream tasks. Intuitively, this objective subsumes common pretraining signals such as OCR, language modeling, image captioning. In addition to the novel pretraining strategy, we introduce a variable-resolution input representation and a more flexible integration of language and vision inputs, where language prompts such as questions are rendered directly on top of the input image. 
For the first time, we show that a single pretrained model can achieve state-of-the-art results in six out of nine tasks across four domains: documents, illustrations, user interfaces, and natural images. # Using the model ## Converting from T5X to Hugging Face You can use the [`convert_pix2struct_checkpoint_to_pytorch.py`](https://github.com/huggingface/transformers/blob/main/src/transformers/models/pix2struct/convert_pix2struct_checkpoint_to_pytorch.py) script as follows: ```bash python convert_pix2struct_checkpoint_to_pytorch.py --t5x_checkpoint_path PATH_TO_T5X_CHECKPOINTS --pytorch_dump_path PATH_TO_SAVE ``` If you are converting a large model, run: ```bash python convert_pix2struct_checkpoint_to_pytorch.py --t5x_checkpoint_path PATH_TO_T5X_CHECKPOINTS --pytorch_dump_path PATH_TO_SAVE --use-large ``` Once saved, you can push your converted model with the following snippet: ```python from transformers import Pix2StructForConditionalGeneration, Pix2StructProcessor model = Pix2StructForConditionalGeneration.from_pretrained(PATH_TO_SAVE) processor = Pix2StructProcessor.from_pretrained(PATH_TO_SAVE) model.push_to_hub("USERNAME/MODEL_NAME") processor.push_to_hub("USERNAME/MODEL_NAME") ``` ## Running the model ### In full precision, on CPU: You can run the model in full precision on CPU: ```python import requests from PIL import Image from transformers import Pix2StructForConditionalGeneration, Pix2StructProcessor url = "https://www.ilankelman.org/stopsigns/australia.jpg" image = Image.open(requests.get(url, stream=True).raw) model = Pix2StructForConditionalGeneration.from_pretrained("google/pix2struct-textcaps-base") processor = Pix2StructProcessor.from_pretrained("google/pix2struct-textcaps-base") # image only inputs = processor(images=image, return_tensors="pt") predictions = model.generate(**inputs) print(processor.decode(predictions[0], skip_special_tokens=True)) >>> A stop sign is on a street corner. 
``` ### In full precision, on GPU: You can run the model in full precision on GPU: ```python import requests from PIL import Image from transformers import Pix2StructForConditionalGeneration, Pix2StructProcessor url = "https://www.ilankelman.org/stopsigns/australia.jpg" image = Image.open(requests.get(url, stream=True).raw) model = Pix2StructForConditionalGeneration.from_pretrained("google/pix2struct-textcaps-base").to("cuda") processor = Pix2StructProcessor.from_pretrained("google/pix2struct-textcaps-base") # image only inputs = processor(images=image, return_tensors="pt").to("cuda") predictions = model.generate(**inputs) print(processor.decode(predictions[0], skip_special_tokens=True)) >>> A stop sign is on a street corner. ``` ### In half precision, on GPU: You can run the model in half precision on GPU: ```python import requests import torch from PIL import Image from transformers import Pix2StructForConditionalGeneration, Pix2StructProcessor url = "https://www.ilankelman.org/stopsigns/australia.jpg" image = Image.open(requests.get(url, stream=True).raw) model = Pix2StructForConditionalGeneration.from_pretrained("google/pix2struct-textcaps-base", torch_dtype=torch.bfloat16).to("cuda") processor = Pix2StructProcessor.from_pretrained("google/pix2struct-textcaps-base") # image only inputs = processor(images=image, return_tensors="pt").to("cuda", torch.bfloat16) predictions = model.generate(**inputs) print(processor.decode(predictions[0], skip_special_tokens=True)) >>> A stop sign is on a street corner. ``` ### Use a different sequence length This model has been trained on a sequence length of `2048`. You can try to reduce the sequence length for more memory-efficient inference, but you may observe some performance degradation for small sequence lengths (<512). 
Just pass `max_patches` when calling the processor: ```python inputs = processor(images=image, return_tensors="pt", max_patches=512) ``` ### Conditional generation You can also prepend some input text to perform conditional generation: ```python import requests from PIL import Image from transformers import Pix2StructForConditionalGeneration, Pix2StructProcessor url = "https://www.ilankelman.org/stopsigns/australia.jpg" image = Image.open(requests.get(url, stream=True).raw) text = "A picture of" model = Pix2StructForConditionalGeneration.from_pretrained("google/pix2struct-textcaps-base") processor = Pix2StructProcessor.from_pretrained("google/pix2struct-textcaps-base") # image + text inputs = processor(images=image, text=text, return_tensors="pt") predictions = model.generate(**inputs) print(processor.decode(predictions[0], skip_special_tokens=True)) >>> A picture of a stop sign that says yes. ``` # Contribution This model was originally contributed by Kenton Lee, Mandar Joshi et al. and added to the Hugging Face ecosystem by [Younes Belkada](https://huggingface.co/ybelkada). # Citation If you want to cite this work, please consider citing the original paper: ``` @misc{https://doi.org/10.48550/arxiv.2210.03347, doi = {10.48550/ARXIV.2210.03347}, url = {https://arxiv.org/abs/2210.03347}, author = {Lee, Kenton and Joshi, Mandar and Turc, Iulia and Hu, Hexiang and Liu, Fangyu and Eisenschlos, Julian and Khandelwal, Urvashi and Shaw, Peter and Chang, Ming-Wei and Toutanova, Kristina}, keywords = {Computation and Language (cs.CL), Computer Vision and Pattern Recognition (cs.CV), FOS: Computer and information sciences, FOS: Computer and information sciences}, title = {Pix2Struct: Screenshot Parsing as Pretraining for Visual Language Understanding}, publisher = {arXiv}, year = {2022}, copyright = {Creative Commons Attribution 4.0 International} } ```
7,674
[ [ -0.02825927734375, -0.05853271484375, 0.035797119140625, 0.0053253173828125, -0.0182647705078125, -0.03350830078125, -0.02325439453125, -0.04150390625, -0.01030731201171875, 0.0195465087890625, -0.04132080078125, -0.0189971923828125, -0.06329345703125, -0.0099029541015625, -0.003894805908203125, 0.06005859375, -0.004581451416015625, 0.0024547576904296875, -0.0232086181640625, 0.014312744140625, -0.016082763671875, -0.0190582275390625, -0.05975341796875, -0.026885986328125, 0.019287109375, 0.0103912353515625, 0.05548095703125, 0.044708251953125, 0.0394287109375, 0.0235443115234375, -0.00951385498046875, 0.0004887580871582031, -0.027496337890625, -0.0210113525390625, -0.00766754150390625, -0.04718017578125, -0.025238037109375, 0.01239776611328125, 0.042724609375, 0.0294036865234375, 0.0153045654296875, 0.006999969482421875, 0.00728607177734375, 0.035675048828125, -0.0347900390625, 0.02947998046875, -0.0323486328125, 0.0007252693176269531, 0.0006847381591796875, 0.0213165283203125, -0.0260162353515625, -0.0095672607421875, -0.0014095306396484375, -0.047454833984375, 0.0228118896484375, -0.0066986083984375, 0.0926513671875, 0.02362060546875, -0.00392913818359375, -0.006923675537109375, -0.02197265625, 0.04852294921875, -0.051177978515625, 0.018310546875, 0.03564453125, 0.007843017578125, -0.00537109375, -0.08038330078125, -0.050567626953125, -0.00019621849060058594, -0.01410675048828125, 0.0172882080078125, -0.0311737060546875, -0.01551055908203125, 0.029754638671875, 0.01134490966796875, -0.04644775390625, -0.00487518310546875, -0.047454833984375, -0.01517486572265625, 0.053558349609375, -0.0106658935546875, 0.03759765625, -0.022674560546875, -0.032440185546875, -0.042083740234375, -0.0323486328125, 0.033599853515625, 0.01035308837890625, 0.00244140625, -0.037353515625, 0.043365478515625, 0.004261016845703125, 0.042724609375, 0.0201416015625, -0.0006413459777832031, 0.0257568359375, -0.01340484619140625, -0.014739990234375, -0.0182342529296875, 0.0921630859375, 
0.04107666015625, 0.0236663818359375, 0.0050048828125, 0.0005402565002441406, 0.002948760986328125, -0.002956390380859375, -0.0859375, -0.045196533203125, 0.024627685546875, -0.0255889892578125, -0.01358795166015625, 0.0204925537109375, -0.04730224609375, -0.006072998046875, -0.01552581787109375, 0.037811279296875, -0.04852294921875, -0.040496826171875, -0.00336456298828125, -0.01617431640625, 0.01806640625, 0.0218353271484375, -0.05706787109375, 0.002838134765625, 0.037567138671875, 0.08074951171875, 0.006374359130859375, -0.02679443359375, -0.024505615234375, -0.02191162109375, -0.02947998046875, 0.0576171875, -0.0174407958984375, -0.01303863525390625, -0.0300445556640625, 0.0256500244140625, -0.016021728515625, -0.0301361083984375, 0.006412506103515625, -0.0227508544921875, 0.0257720947265625, 0.0013895034790039062, -0.023162841796875, -0.0323486328125, 0.009002685546875, -0.036590576171875, 0.089599609375, 0.024444580078125, -0.05609130859375, 0.002410888671875, -0.039276123046875, -0.0200042724609375, -0.005435943603515625, -0.0096588134765625, -0.07257080078125, -0.0031681060791015625, 0.0193634033203125, 0.037750244140625, -0.01242828369140625, 0.022003173828125, -0.028289794921875, -0.015380859375, 0.021270751953125, -0.00556182861328125, 0.072509765625, 0.01800537109375, -0.053009033203125, 0.005954742431640625, -0.0302581787109375, 0.017822265625, 0.026763916015625, -0.00577545166015625, 0.0094451904296875, -0.0124969482421875, 0.0211029052734375, 0.029327392578125, 0.01849365234375, -0.031829833984375, 0.009063720703125, -0.0189971923828125, 0.05023193359375, 0.036651611328125, -0.0228118896484375, 0.0279541015625, -0.00505828857421875, 0.01216888427734375, 0.0075225830078125, 0.00726318359375, -0.02777099609375, -0.0653076171875, -0.05084228515625, -0.0144805908203125, 0.005580902099609375, 0.0330810546875, -0.05767822265625, 0.03289794921875, -0.017486572265625, -0.0447998046875, -0.006267547607421875, -0.00661468505859375, 0.046295166015625, 
0.040557861328125, 0.0290679931640625, -0.031707763671875, -0.0318603515625, -0.04876708984375, 0.0087738037109375, -0.01200103759765625, -0.01155853271484375, 0.0225830078125, 0.061767578125, -0.027496337890625, 0.06768798828125, -0.035919189453125, -0.0085906982421875, -0.01366424560546875, 0.01284027099609375, 0.01800537109375, 0.0584716796875, 0.041259765625, -0.0662841796875, -0.044647216796875, -0.0024585723876953125, -0.06707763671875, 0.01328277587890625, -0.004581451416015625, -0.019378662109375, 0.025299072265625, 0.0302581787109375, -0.044464111328125, 0.03662109375, 0.036865234375, -0.04962158203125, 0.050506591796875, -0.0159149169921875, 0.0006842613220214844, -0.08380126953125, 0.027374267578125, 0.011993408203125, -0.029876708984375, -0.046112060546875, 0.008941650390625, 0.03424072265625, -0.0250701904296875, -0.047698974609375, 0.05377197265625, -0.038116455078125, -0.009002685546875, -0.0195770263671875, -0.01666259765625, 0.00530242919921875, 0.052093505859375, 0.038238525390625, 0.055511474609375, 0.06292724609375, -0.04840087890625, 0.0255126953125, 0.03704833984375, -0.02642822265625, 0.03131103515625, -0.05975341796875, 0.0271759033203125, -0.00435638427734375, 0.007480621337890625, -0.06524658203125, -0.0321044921875, 0.038116455078125, -0.059234619140625, 0.03082275390625, -0.0201568603515625, -0.0271453857421875, -0.04156494140625, -0.01117706298828125, 0.0406494140625, 0.050506591796875, -0.0421142578125, 0.0287017822265625, 0.01540374755859375, -0.00749969482421875, -0.0216827392578125, -0.065185546875, -0.0002238750457763672, 0.005504608154296875, -0.06640625, 0.032073974609375, -0.001407623291015625, -0.00473785400390625, 0.0113525390625, 0.0209197998046875, 0.00286865234375, -0.012115478515625, 0.04156494140625, 0.033721923828125, -0.0125274658203125, -0.01508331298828125, -0.0134429931640625, -0.0241851806640625, 0.00220489501953125, -0.027435302734375, 0.058624267578125, -0.02734375, -0.004184722900390625, -0.07379150390625, 
0.007381439208984375, 0.039276123046875, -0.0177459716796875, 0.04290771484375, 0.0584716796875, -0.039215087890625, 0.0176849365234375, -0.037994384765625, -0.0190887451171875, -0.033660888671875, 0.048309326171875, -0.0325927734375, -0.034759521484375, 0.03155517578125, 0.00907135009765625, -0.0102691650390625, 0.050994873046875, 0.049560546875, -0.0166168212890625, 0.07110595703125, 0.048675537109375, 0.02154541015625, 0.042449951171875, -0.05712890625, -0.0001322031021118164, -0.06524658203125, -0.040679931640625, -0.01085662841796875, -0.018585205078125, -0.024810791015625, -0.0482177734375, 0.031494140625, 0.0272979736328125, -0.04083251953125, 0.040191650390625, -0.047454833984375, 0.0191192626953125, 0.057586669921875, 0.04083251953125, -0.021484375, 0.03253173828125, 0.005382537841796875, -0.007534027099609375, -0.044219970703125, -0.0242767333984375, 0.07598876953125, 0.0322265625, 0.0369873046875, -0.0204010009765625, 0.0189361572265625, -0.00705718994140625, 0.00341796875, -0.04736328125, 0.026641845703125, -0.01038360595703125, -0.037750244140625, -0.0112762451171875, -0.0262908935546875, -0.057037353515625, 0.00843048095703125, -0.02313232421875, -0.0604248046875, 0.0177459716796875, 0.01175689697265625, -0.03607177734375, 0.044647216796875, -0.060028076171875, 0.07958984375, -0.0291748046875, -0.0592041015625, 0.007598876953125, -0.045166015625, 0.0200042724609375, 0.015380859375, -0.00234222412109375, -0.0005993843078613281, 0.017425537109375, 0.06903076171875, -0.0400390625, 0.059844970703125, -0.0161590576171875, 0.0165252685546875, 0.041748046875, -0.006114959716796875, 0.033538818359375, -0.0032596588134765625, -0.00921630859375, 0.04083251953125, 0.01102447509765625, -0.0318603515625, -0.0294036865234375, 0.030487060546875, -0.08489990234375, -0.01849365234375, -0.03643798828125, -0.036834716796875, 0.02117919921875, 0.02630615234375, 0.05621337890625, 0.0267181396484375, 0.0157012939453125, 0.00569915771484375, 0.03729248046875, 
-0.0211029052734375, 0.048248291015625, -0.0081634521484375, -0.0120391845703125, -0.04925537109375, 0.0537109375, -0.0070648193359375, 0.0215606689453125, 0.005237579345703125, 0.016143798828125, -0.0450439453125, -0.01355743408203125, -0.04205322265625, 0.03790283203125, -0.037322998046875, -0.01971435546875, -0.0438232421875, -0.020843505859375, -0.05621337890625, -0.01751708984375, -0.048736572265625, -0.0201873779296875, -0.04290771484375, 0.025909423828125, 0.029754638671875, 0.04083251953125, -0.0212860107421875, 0.0400390625, -0.028717041015625, 0.0266876220703125, 0.025238037109375, 0.0341796875, -0.01422882080078125, -0.049713134765625, -0.0008788108825683594, 0.0169830322265625, -0.035247802734375, -0.045928955078125, 0.044647216796875, 0.01200103759765625, 0.01483154296875, 0.040374755859375, -0.01178741455078125, 0.059661865234375, -0.02117919921875, 0.052490234375, 0.05938720703125, -0.0577392578125, 0.05706787109375, -0.007076263427734375, 0.01641845703125, 0.0198974609375, 0.0255126953125, -0.037567138671875, -0.00026345252990722656, -0.05950927734375, -0.05230712890625, 0.0714111328125, 0.0281829833984375, 0.0031280517578125, 0.019927978515625, 0.044708251953125, -0.007640838623046875, -0.0010213851928710938, -0.0709228515625, -0.0102081298828125, -0.0426025390625, -0.01143646240234375, -0.009033203125, -0.02496337890625, 0.006031036376953125, -0.034881591796875, 0.057647705078125, -0.01142120361328125, 0.057647705078125, 0.030364990234375, -0.0227813720703125, -0.005382537841796875, -0.026031494140625, 0.038421630859375, 0.04156494140625, -0.01068115234375, 0.0161285400390625, -0.012054443359375, -0.047637939453125, -0.004932403564453125, 0.006061553955078125, -0.0266265869140625, -0.00400543212890625, 0.0244598388671875, 0.0850830078125, 0.0027866363525390625, -0.0227203369140625, 0.041229248046875, -0.004604339599609375, -0.0289154052734375, -0.03607177734375, 0.00728607177734375, 0.00772857666015625, 0.02490234375, 0.021484375, 
0.01727294921875, -0.0259246826171875, -0.03839111328125, 0.0216827392578125, 0.03472900390625, -0.028533935546875, -0.033538818359375, 0.08056640625, -0.0050201416015625, -0.0200653076171875, 0.06744384765625, -0.0160369873046875, -0.045745849609375, 0.07379150390625, 0.04034423828125, 0.051513671875, 0.0024662017822265625, 0.00885009765625, 0.07830810546875, 0.01082611083984375, -0.020416259765625, 0.006824493408203125, 0.0010776519775390625, -0.04779052734375, -0.013214111328125, -0.0462646484375, -0.01384735107421875, 0.01415252685546875, -0.037872314453125, 0.03485107421875, -0.04925537109375, -0.0218353271484375, -0.0036525726318359375, 0.01410675048828125, -0.05181884765625, 0.0265350341796875, 0.035736083984375, 0.054779052734375, -0.054290771484375, 0.056793212890625, 0.051361083984375, -0.04571533203125, -0.05938720703125, -0.0205230712890625, -0.005634307861328125, -0.0699462890625, 0.045623779296875, 0.05010986328125, 0.0092926025390625, 0.0177001953125, -0.0631103515625, -0.055908203125, 0.094482421875, 0.0213165283203125, -0.042633056640625, 0.0008230209350585938, 0.0205841064453125, 0.0154571533203125, -0.01238250732421875, 0.052581787109375, 0.02490234375, 0.035491943359375, 0.0235137939453125, -0.0587158203125, 0.007633209228515625, -0.04254150390625, 0.01079559326171875, -0.015899658203125, -0.048675537109375, 0.08441162109375, -0.0350341796875, -0.020751953125, 0.0158538818359375, 0.054107666015625, 0.0248870849609375, 0.02435302734375, 0.026824951171875, 0.05657958984375, 0.0452880859375, -0.0297393798828125, 0.0794677734375, -0.0237274169921875, 0.05584716796875, 0.05548095703125, 0.0228424072265625, 0.06146240234375, 0.0255126953125, -0.02496337890625, 0.03082275390625, 0.070068359375, -0.032318115234375, 0.030120849609375, -0.012603759765625, -0.0012350082397460938, -0.025299072265625, 0.0149383544921875, -0.0372314453125, 0.0239715576171875, 0.019622802734375, -0.0260772705078125, -0.00963592529296875, 0.0088043212890625, 0.0215301513671875, 
-0.00601959228515625, -0.028350830078125, 0.032928466796875, 0.005954742431640625, -0.058319091796875, 0.062744140625, 0.01361846923828125, 0.07037353515625, -0.033111572265625, 0.01064300537109375, -0.0122528076171875, 0.030120849609375, -0.0237274169921875, -0.07586669921875, 0.0237274169921875, -0.0032711029052734375, -0.0177764892578125, -0.0181121826171875, 0.0421142578125, -0.0286865234375, -0.07012939453125, 0.012847900390625, -0.0033130645751953125, 0.0303497314453125, -0.0396728515625, -0.06640625, 0.0125274658203125, 0.006195068359375, -0.02777099609375, 0.01305389404296875, 0.0036296844482421875, -0.0014858245849609375, 0.040557861328125, 0.05914306640625, -0.006946563720703125, 0.0166473388671875, -0.02899169921875, 0.055816650390625, -0.04229736328125, -0.038604736328125, -0.059722900390625, 0.057403564453125, -0.003582000732421875, -0.025726318359375, 0.0284271240234375, 0.056182861328125, 0.070068359375, -0.0092620849609375, 0.054962158203125, -0.03564453125, -0.0027637481689453125, -0.04254150390625, 0.0738525390625, -0.0474853515625, -0.0228424072265625, -0.00891876220703125, -0.0587158203125, -0.0301361083984375, 0.06781005859375, -0.035247802734375, 0.0013895034790039062, 0.044464111328125, 0.0826416015625, -0.021728515625, -0.0247650146484375, 0.014617919921875, 0.007785797119140625, 0.022705078125, 0.062469482421875, 0.0294952392578125, -0.06231689453125, 0.03912353515625, -0.0562744140625, -0.028961181640625, -0.00505828857421875, -0.048828125, -0.06988525390625, -0.056732177734375, -0.04925537109375, -0.03875732421875, -0.00719451904296875, 0.0426025390625, 0.07305908203125, -0.055755615234375, -0.0106048583984375, -0.025238037109375, -0.006664276123046875, -0.01168060302734375, -0.01629638671875, 0.050750732421875, -0.016876220703125, -0.081787109375, -0.0136566162109375, 0.00099945068359375, 0.0218353271484375, 0.00911712646484375, -0.0230712890625, -0.01432037353515625, -0.03497314453125, 0.040374755859375, 0.0408935546875, 
-0.048065185546875, -0.0014505386352539062, 0.0012159347534179688, -0.0089569091796875, 0.0225067138671875, 0.03179931640625, -0.0477294921875, 0.04150390625, 0.031829833984375, 0.026092529296875, 0.06610107421875, -0.00024509429931640625, 0.00948333740234375, -0.0205230712890625, 0.04150390625, -0.00809478759765625, 0.0287933349609375, 0.031463623046875, -0.0340576171875, 0.034515380859375, 0.04388427734375, -0.0247344970703125, -0.04107666015625, 0.00402069091796875, -0.07891845703125, -0.031646728515625, 0.1029052734375, -0.026763916015625, -0.049591064453125, 0.025909423828125, -0.017913818359375, 0.0384521484375, -0.00868988037109375, 0.04071044921875, 0.0188446044921875, -0.017578125, -0.050750732421875, -0.0275115966796875, 0.0232086181640625, 0.024261474609375, -0.045806884765625, -0.0206146240234375, 0.0305328369140625, 0.044647216796875, 0.0289459228515625, 0.034515380859375, -0.02166748046875, 0.0382080078125, 0.01131439208984375, 0.04052734375, -0.035247802734375, -0.011322021484375, -0.015777587890625, -0.00047969818115234375, -0.025238037109375, -0.0258026123046875 ] ]
microsoft/trocr-large-stage1
2023-03-31T18:38:51.000Z
[ "transformers", "pytorch", "vision-encoder-decoder", "trocr", "image-to-text", "arxiv:2109.10282", "endpoints_compatible", "has_space", "region:us" ]
image-to-text
microsoft
null
null
microsoft/trocr-large-stage1
10
163,433
transformers
2022-03-02T23:29:05
--- tags: - trocr - image-to-text --- # TrOCR (large-sized model, pre-trained only) Pre-trained-only TrOCR model (the stage-1 checkpoint, not fine-tuned on a downstream OCR dataset). It was introduced in the paper [TrOCR: Transformer-based Optical Character Recognition with Pre-trained Models](https://arxiv.org/abs/2109.10282) by Li et al. and first released in [this repository](https://github.com/microsoft/unilm/tree/master/trocr). Disclaimer: The team releasing TrOCR did not write a model card for this model, so this model card has been written by the Hugging Face team. ## Model description The TrOCR model is an encoder-decoder model, consisting of an image Transformer as encoder and a text Transformer as decoder. The image encoder was initialized from the weights of BEiT, while the text decoder was initialized from the weights of RoBERTa. Images are presented to the model as a sequence of fixed-size patches (resolution 16x16), which are linearly embedded. Absolute position embeddings are also added before feeding the sequence to the layers of the Transformer encoder. Next, the Transformer text decoder autoregressively generates tokens. ## Intended uses & limitations You can use the raw model for optical character recognition (OCR) on single text-line images. See the [model hub](https://huggingface.co/models?search=microsoft/trocr) to look for fine-tuned versions on a task that interests you. 
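The patchification step described above (a fixed 16x16 grid, each patch flattened into one token of the input sequence) can be sketched in a few lines of NumPy. This is an illustration only, not the library's implementation, and the 384x384 input resolution is assumed for the example:

```python
import numpy as np

def patchify(image: np.ndarray, patch: int = 16) -> np.ndarray:
    """Split an H x W x C image into a sequence of flattened patch x patch patches."""
    h, w, c = image.shape
    assert h % patch == 0 and w % patch == 0, "dims must be multiples of the patch size"
    # (H//p, p, W//p, p, C) -> (H//p, W//p, p, p, C) -> (num_patches, p*p*C)
    x = image.reshape(h // patch, patch, w // patch, patch, c)
    x = x.transpose(0, 2, 1, 3, 4)
    return x.reshape(-1, patch * patch * c)

# A 384x384 RGB image yields a 24x24 grid, i.e. 576 patches of 16*16*3 = 768 values;
# in the real model each flattened patch is then linearly projected to the hidden size.
seq = patchify(np.zeros((384, 384, 3)))
print(seq.shape)  # (576, 768)
```

Each row of `seq` corresponds to one input token of the image encoder before the linear embedding and position embeddings are applied.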
### How to use Here is how to use this model in PyTorch: ```python import torch from transformers import TrOCRProcessor, VisionEncoderDecoderModel from PIL import Image import requests # load image from the IAM database url = 'https://fki.tic.heia-fr.ch/static/img/a01-122-02-00.jpg' image = Image.open(requests.get(url, stream=True).raw).convert("RGB") processor = TrOCRProcessor.from_pretrained('microsoft/trocr-large-stage1') model = VisionEncoderDecoderModel.from_pretrained('microsoft/trocr-large-stage1') # training pixel_values = processor(image, return_tensors="pt").pixel_values # Batch size 1 decoder_input_ids = torch.tensor([[model.config.decoder.decoder_start_token_id]]) outputs = model(pixel_values=pixel_values, decoder_input_ids=decoder_input_ids) ``` ### BibTeX entry and citation info ```bibtex @misc{li2021trocr, title={TrOCR: Transformer-based Optical Character Recognition with Pre-trained Models}, author={Minghao Li and Tengchao Lv and Lei Cui and Yijuan Lu and Dinei Florencio and Cha Zhang and Zhoujun Li and Furu Wei}, year={2021}, eprint={2109.10282}, archivePrefix={arXiv}, primaryClass={cs.CL} } ```
2,520
[ [ -0.0203094482421875, -0.02545166015625, 0.006443023681640625, -0.0208892822265625, -0.0284271240234375, -0.00409698486328125, -0.0016317367553710938, -0.0537109375, 0.01171112060546875, 0.042388916015625, -0.0306549072265625, -0.0237884521484375, -0.049774169921875, 0.0118560791015625, -0.0268402099609375, 0.082763671875, -0.01354217529296875, 0.0003936290740966797, 0.01262664794921875, -0.0226593017578125, -0.01102447509765625, -0.044525146484375, -0.046600341796875, -0.0218353271484375, 0.031494140625, 0.015838623046875, 0.046356201171875, 0.03997802734375, 0.06536865234375, 0.0302581787109375, -0.019012451171875, 0.0015172958374023438, -0.02886962890625, -0.0299224853515625, 0.021087646484375, -0.032958984375, -0.041229248046875, 0.007244110107421875, 0.047454833984375, 0.01020050048828125, -0.00441741943359375, 0.009765625, 0.004364013671875, 0.0277557373046875, -0.0248260498046875, -0.00669097900390625, -0.031890869140625, 0.03387451171875, -0.00015211105346679688, -0.00036787986755371094, -0.036651611328125, -0.0208587646484375, 0.018951416015625, -0.05072021484375, 0.044189453125, 0.001384735107421875, 0.09130859375, -0.0067596435546875, -0.0280609130859375, -0.0265655517578125, -0.058258056640625, 0.049072265625, -0.0418701171875, 0.034881591796875, -0.007762908935546875, 0.0184478759765625, 0.010101318359375, -0.081787109375, -0.06439208984375, -0.033477783203125, -0.0194549560546875, 0.001071929931640625, -0.0241546630859375, 0.02294921875, 0.039093017578125, 0.0298004150390625, -0.051513671875, -0.01001739501953125, -0.059722900390625, -0.0294647216796875, 0.02227783203125, -0.01119232177734375, 0.01873779296875, 0.007076263427734375, -0.043701171875, -0.035797119140625, -0.01314544677734375, -0.006591796875, -0.00311279296875, -0.003047943115234375, -0.0270538330078125, 0.04998779296875, 0.0200347900390625, 0.057830810546875, 0.0235595703125, -0.0192718505859375, 0.046661376953125, -0.0015935897827148438, -0.01337432861328125, 0.01004791259765625, 
0.08203125, 0.0156707763671875, 0.0276336669921875, -0.01131439208984375, -0.0251617431640625, 0.0222930908203125, 0.0098724365234375, -0.0693359375, -0.004360198974609375, -0.00003439188003540039, -0.036346435546875, -0.02105712890625, 0.01401519775390625, -0.062042236328125, -0.014862060546875, -0.003078460693359375, 0.044769287109375, -0.027252197265625, -0.00823974609375, -0.00841522216796875, -0.0088043212890625, 0.02203369140625, 0.01751708984375, -0.03387451171875, 0.0052337646484375, 0.01129913330078125, 0.081787109375, -0.00958251953125, -0.032135009765625, -0.032867431640625, -0.005741119384765625, -0.022003173828125, 0.035400390625, -0.0192413330078125, -0.017364501953125, -0.0006670951843261719, 0.0159759521484375, -0.0026721954345703125, -0.039520263671875, 0.036102294921875, -0.0484619140625, 0.016143798828125, 0.00850677490234375, -0.00437164306640625, -0.0037403106689453125, 0.023895263671875, -0.064453125, 0.0828857421875, 0.0176849365234375, -0.0631103515625, 0.019439697265625, -0.054473876953125, -0.01611328125, 0.004344940185546875, 0.00818634033203125, -0.051116943359375, 0.01160430908203125, 0.00205230712890625, 0.009063720703125, -0.01258087158203125, 0.0040435791015625, -0.0072479248046875, -0.029388427734375, 0.01258087158203125, -0.0150909423828125, 0.054168701171875, 0.0235443115234375, -0.031494140625, 0.005718231201171875, -0.067138671875, 0.0112457275390625, 0.01824951171875, -0.017242431640625, 0.007259368896484375, -0.0262908935546875, 0.0260467529296875, 0.0408935546875, 0.0299224853515625, -0.049468994140625, 0.0188140869140625, -0.0180816650390625, 0.0609130859375, 0.035552978515625, -0.0276336669921875, 0.0311737060546875, -0.0088958740234375, 0.0350341796875, 0.024505615234375, 0.01312255859375, -0.00685882568359375, -0.0184478759765625, -0.0701904296875, -0.0108795166015625, 0.01406097412109375, 0.0433349609375, -0.07855224609375, 0.0279693603515625, -0.02545166015625, -0.044342041015625, -0.0264892578125, -0.00543975830078125, 
0.041839599609375, 0.05169677734375, 0.0262451171875, -0.038482666015625, -0.041168212890625, -0.043701171875, 0.00238037109375, -0.018341064453125, 0.0022106170654296875, 0.01244354248046875, 0.058868408203125, -0.019073486328125, 0.06170654296875, -0.0276947021484375, -0.045501708984375, -0.0296630859375, 0.01105499267578125, 0.0170135498046875, 0.052886962890625, 0.02972412109375, -0.053955078125, -0.0295867919921875, -0.0009369850158691406, -0.057037353515625, 0.014129638671875, -0.0001042485237121582, -0.01107025146484375, 0.03070068359375, 0.0269622802734375, -0.062042236328125, 0.056365966796875, 0.041961669921875, -0.028228759765625, 0.044219970703125, -0.04144287109375, 0.007251739501953125, -0.0762939453125, 0.0204315185546875, 0.0024852752685546875, -0.0197906494140625, -0.062255859375, 0.015655517578125, 0.0110015869140625, -0.028228759765625, -0.038604736328125, 0.048614501953125, -0.051971435546875, -0.00875091552734375, -0.00783538818359375, 0.01200103759765625, 0.009521484375, 0.0511474609375, 0.037139892578125, 0.055145263671875, 0.0219573974609375, -0.038238525390625, 0.0302581787109375, 0.0310211181640625, -0.02886962890625, 0.03289794921875, -0.0841064453125, 0.03594970703125, 0.0006642341613769531, 0.007648468017578125, -0.05560302734375, 0.006320953369140625, 0.0256195068359375, -0.0345458984375, 0.0304107666015625, -0.005077362060546875, -0.037567138671875, -0.0701904296875, 0.0007648468017578125, 0.043487548828125, 0.039703369140625, -0.04840087890625, 0.0673828125, 0.0071563720703125, 0.03314208984375, -0.03863525390625, -0.083740234375, -0.007564544677734375, -0.0087738037109375, -0.052978515625, 0.034423828125, -0.01384735107421875, 0.022918701171875, -0.01093292236328125, -0.000018775463104248047, -0.0217132568359375, -0.03253173828125, 0.004413604736328125, 0.037628173828125, -0.0219573974609375, -0.01151275634765625, -0.043182373046875, -0.01328277587890625, -0.0208587646484375, -0.0224609375, 0.053375244140625, -0.0205078125, 
-0.0016222000122070312, -0.048126220703125, 0.0133209228515625, 0.053497314453125, -0.04400634765625, 0.055999755859375, 0.057464599609375, -0.0262908935546875, 0.006526947021484375, -0.04473876953125, -0.01319122314453125, -0.038055419921875, 0.039764404296875, -0.0239105224609375, -0.049102783203125, 0.055908203125, 0.0290679931640625, -0.011077880859375, 0.0294647216796875, 0.0280609130859375, 0.0089111328125, 0.07257080078125, 0.058135986328125, 0.007427215576171875, 0.062164306640625, -0.045867919921875, 0.0189208984375, -0.06829833984375, -0.028717041015625, -0.0273284912109375, -0.02880859375, -0.041015625, -0.0205841064453125, 0.0286712646484375, 0.0025482177734375, -0.031951904296875, 0.03985595703125, -0.07586669921875, 0.016571044921875, 0.051116943359375, 0.037139892578125, 0.00554656982421875, 0.0137176513671875, -0.01384735107421875, 0.00848388671875, -0.03765869140625, -0.0283203125, 0.0550537109375, 0.0187225341796875, 0.060516357421875, -0.0207366943359375, 0.0399169921875, 0.0099029541015625, -0.001018524169921875, -0.060516357421875, 0.046875, -0.0079193115234375, -0.049224853515625, -0.00017392635345458984, -0.035400390625, -0.07086181640625, 0.0025005340576171875, -0.030792236328125, -0.059722900390625, 0.045440673828125, 0.031341552734375, -0.00812530517578125, 0.045867919921875, -0.05035400390625, 0.076171875, -0.031982421875, -0.0186920166015625, 0.01953125, -0.060546875, 0.00811767578125, 0.00901031494140625, -0.0123138427734375, 0.03564453125, 0.01557159423828125, 0.07916259765625, -0.052215576171875, 0.04827880859375, -0.0166168212890625, 0.0055084228515625, 0.0360107421875, -0.004364013671875, 0.049591064453125, -0.037933349609375, -0.0034656524658203125, 0.0404052734375, 0.00859832763671875, -0.02001953125, -0.0250244140625, 0.0188446044921875, -0.07684326171875, -0.0225982666015625, -0.062042236328125, -0.044403076171875, 0.0168609619140625, 0.03662109375, 0.061004638671875, 0.044525146484375, -0.002597808837890625, 
0.0018529891967773438, 0.04742431640625, -0.01270294189453125, 0.039459228515625, 0.013275146484375, -0.01187896728515625, -0.04693603515625, 0.0572509765625, 0.01036834716796875, 0.02911376953125, 0.03125, 0.01097869873046875, -0.01221466064453125, -0.0276947021484375, -0.019134521484375, 0.038604736328125, -0.04437255859375, -0.01947021484375, -0.033355712890625, -0.031890869140625, -0.026092529296875, -0.0136566162109375, -0.0220489501953125, -0.01015472412109375, -0.04315185546875, 0.013916015625, 0.02508544921875, 0.04571533203125, 0.00585174560546875, 0.0662841796875, -0.06475830078125, 0.0280303955078125, 0.01314544677734375, 0.0201263427734375, 0.0025806427001953125, -0.04901123046875, -0.017059326171875, 0.00890350341796875, -0.030242919921875, -0.059539794921875, 0.056121826171875, 0.0273284912109375, 0.02227783203125, 0.04693603515625, -0.0025043487548828125, 0.05755615234375, -0.043548583984375, 0.0501708984375, 0.046844482421875, -0.072998046875, 0.0303955078125, 0.009979248046875, 0.0179443359375, 0.0225372314453125, 0.0159759521484375, -0.037353515625, 0.00226593017578125, -0.0430908203125, -0.0350341796875, 0.0859375, 0.00321197509765625, 0.0021839141845703125, 0.0212860107421875, 0.03173828125, -0.0218048095703125, 0.0207977294921875, -0.08074951171875, -0.0164031982421875, -0.0316162109375, -0.031707763671875, -0.0160980224609375, -0.0254974365234375, 0.004802703857421875, -0.0216217041015625, 0.031402587890625, 0.0010061264038085938, 0.0638427734375, 0.038299560546875, -0.04022216796875, -0.006412506103515625, -0.00894927978515625, 0.043975830078125, 0.0379638671875, -0.01505279541015625, 0.010955810546875, -0.0019159317016601562, -0.0804443359375, 0.0016946792602539062, 0.00664520263671875, -0.0164337158203125, 0.0072174072265625, 0.037078857421875, 0.08697509765625, -0.0102691650390625, -0.032012939453125, 0.048675537109375, -0.0006537437438964844, -0.0192718505859375, -0.02239990234375, -0.009124755859375, -0.0246734619140625, 
0.0203094482421875, 0.03900146484375, 0.00995635986328125, -0.01140594482421875, -0.04107666015625, 0.003391265869140625, 0.03961181640625, -0.0462646484375, -0.02764892578125, 0.051116943359375, -0.00848388671875, -0.033416748046875, 0.0662841796875, -0.0009407997131347656, -0.06915283203125, 0.05755615234375, 0.05035400390625, 0.051971435546875, -0.0266265869140625, 0.01439666748046875, 0.040069580078125, 0.03302001953125, -0.006046295166015625, 0.01166534423828125, -0.0035533905029296875, -0.059906005859375, 0.01264190673828125, -0.04376220703125, -0.00782012939453125, 0.002994537353515625, -0.050628662109375, 0.0396728515625, -0.041015625, -0.0279693603515625, -0.01080322265625, 0.01861572265625, -0.0576171875, 0.035247802734375, 0.00799560546875, 0.07275390625, -0.04156494140625, 0.061004638671875, 0.043365478515625, -0.035308837890625, -0.0596923828125, -0.016998291015625, -0.02105712890625, -0.08538818359375, 0.053131103515625, 0.02215576171875, 0.0029296875, 0.0129241943359375, -0.0523681640625, -0.05487060546875, 0.088134765625, 0.02313232421875, -0.045654296875, -0.040069580078125, 0.02203369140625, 0.0430908203125, -0.031585693359375, 0.05316162109375, 0.0157623291015625, 0.0184478759765625, 0.0513916015625, -0.0618896484375, 0.0042572021484375, -0.019927978515625, 0.0189666748046875, 0.01079559326171875, -0.048187255859375, 0.0804443359375, -0.039031982421875, -0.01557159423828125, 0.0273284912109375, 0.051513671875, 0.0192108154296875, 0.01549530029296875, 0.035186767578125, 0.04669189453125, 0.0462646484375, -0.01385498046875, 0.065673828125, -0.0276947021484375, 0.04022216796875, 0.0556640625, -0.0016393661499023438, 0.054107666015625, 0.0305938720703125, -0.00247955322265625, 0.040435791015625, 0.0396728515625, -0.031341552734375, 0.039703369140625, -0.007228851318359375, 0.00597381591796875, 0.00004887580871582031, 0.00733184814453125, -0.0369873046875, 0.0247039794921875, 0.00649261474609375, -0.05059814453125, -0.0009918212890625, 
0.0238494873046875, -0.005031585693359375, -0.039093017578125, -0.03717041015625, 0.046295166015625, 0.000629425048828125, -0.03619384765625, 0.0582275390625, -0.01342010498046875, 0.0665283203125, -0.0634765625, -0.0013189315795898438, -0.004505157470703125, 0.037078857421875, -0.019744873046875, -0.047088623046875, 0.0018529891967773438, -0.00185394287109375, -0.01209259033203125, 0.01485443115234375, 0.04486083984375, -0.039581298828125, -0.0665283203125, 0.0290069580078125, -0.0002536773681640625, 0.0185394287109375, 0.00873565673828125, -0.052886962890625, 0.019622802734375, -0.00403594970703125, -0.0179290771484375, 0.005855560302734375, 0.033721923828125, 0.007495880126953125, 0.040252685546875, 0.037689208984375, 0.010345458984375, 0.02484130859375, -0.021209716796875, 0.060699462890625, -0.0443115234375, -0.035858154296875, -0.044342041015625, 0.04107666015625, 0.007793426513671875, -0.0369873046875, 0.037841796875, 0.036773681640625, 0.0550537109375, -0.020599365234375, 0.025421142578125, -0.0157318115234375, 0.006107330322265625, -0.0234375, 0.066162109375, -0.05511474609375, -0.0158843994140625, -0.03826904296875, -0.06597900390625, -0.0293731689453125, 0.07354736328125, -0.0157623291015625, 0.03155517578125, 0.051422119140625, 0.07354736328125, -0.01416778564453125, -0.02325439453125, 0.0216217041015625, 0.031768798828125, 0.010894775390625, 0.06060791015625, 0.03887939453125, -0.06414794921875, 0.060821533203125, -0.0197906494140625, -0.0171966552734375, -0.01617431640625, -0.055206298828125, -0.0765380859375, -0.045257568359375, -0.029296875, -0.05352783203125, -0.007476806640625, 0.051971435546875, 0.0665283203125, -0.0660400390625, -0.01126861572265625, -0.01593017578125, 0.00028395652770996094, -0.009063720703125, -0.016510009765625, 0.03466796875, 0.00920867919921875, -0.05291748046875, -0.03375244140625, -0.0207366943359375, 0.02447509765625, -0.0009617805480957031, -0.0174102783203125, -0.01064300537109375, -0.01032257080078125, 
0.036102294921875, 0.039459228515625, -0.0408935546875, -0.0121917724609375, 0.00890350341796875, -0.010894775390625, 0.042022705078125, 0.0443115234375, -0.05718994140625, 0.03118896484375, 0.0272216796875, 0.00789642333984375, 0.0623779296875, -0.019134521484375, 0.00994873046875, -0.0309906005859375, 0.032440185546875, 0.015228271484375, 0.037811279296875, 0.0245208740234375, -0.02880859375, 0.0231475830078125, 0.0280303955078125, -0.04290771484375, -0.0711669921875, -0.005428314208984375, -0.09906005859375, 0.0103912353515625, 0.0650634765625, -0.0160064697265625, -0.038543701171875, 0.0287322998046875, -0.0203399658203125, 0.036895751953125, -0.023529052734375, 0.02978515625, 0.0225830078125, 0.00868988037109375, -0.04901123046875, -0.00698089599609375, 0.017181396484375, -0.018096923828125, -0.042694091796875, -0.01526641845703125, 0.02410888671875, 0.0255584716796875, 0.0509033203125, 0.03131103515625, -0.0213623046875, 0.0149383544921875, 0.005680084228515625, 0.0462646484375, -0.01399993896484375, -0.025787353515625, -0.033294677734375, 0.00818634033203125, -0.014129638671875, -0.0228729248046875 ] ]
liuhaotian/llava-v1.5-13b
2023-10-16T21:53:56.000Z
[ "transformers", "pytorch", "llava", "text-generation", "has_space", "region:us" ]
text-generation
liuhaotian
null
null
liuhaotian/llava-v1.5-13b
257
163,328
transformers
2023-10-05T18:27:40
---
inference: false
---

<br>
<br>

# LLaVA Model Card

## Model details

**Model type:**
LLaVA is an open-source chatbot trained by fine-tuning LLaMA/Vicuna on GPT-generated multimodal instruction-following data.
It is an auto-regressive language model, based on the transformer architecture.

**Model date:**
LLaVA-v1.5-13B was trained in September 2023.

**Paper or resources for more information:**
https://llava-vl.github.io/

## License
Llama 2 is licensed under the LLAMA 2 Community License,
Copyright (c) Meta Platforms, Inc. All Rights Reserved.

**Where to send questions or comments about the model:**
https://github.com/haotian-liu/LLaVA/issues

## Intended use
**Primary intended uses:**
The primary use of LLaVA is research on large multimodal models and chatbots.

**Primary intended users:**
The primary intended users of the model are researchers and hobbyists in computer vision, natural language processing, machine learning, and artificial intelligence.

## Training dataset
- 558K filtered image-text pairs from LAION/CC/SBU, captioned by BLIP.
- 158K GPT-generated multimodal instruction-following data.
- 450K academic-task-oriented VQA data mixture.
- 40K ShareGPT data.

## Evaluation dataset
A collection of 12 benchmarks, including 5 academic VQA benchmarks and 7 recent benchmarks specifically proposed for instruction-following LMMs.
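The card describes the model but includes no usage snippet. As an illustration of the single-turn conversation format LLaVA-v1.5 expects, here is a minimal sketch of building a prompt; the system message and the `USER:`/`ASSISTANT:`/`<image>` template are assumptions based on the `vicuna_v1` conversation mode in the LLaVA repository, not something stated in this card.

```python
def build_llava_prompt(question: str) -> str:
    """Build a single-turn LLaVA-v1.5 prompt for one image.

    Template assumed from the LLaVA repo's `vicuna_v1` conversation mode;
    the `<image>` placeholder is replaced by image tokens at inference time.
    """
    system = (
        "A chat between a curious user and an artificial intelligence assistant. "
        "The assistant gives helpful, detailed, and polite answers to the user's questions."
    )
    return f"{system} USER: <image>\n{question} ASSISTANT:"


prompt = build_llava_prompt("What is shown in this image?")
```

The model's generation is then expected to continue after the trailing `ASSISTANT:` marker.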
1,365
[ [ -0.00019431114196777344, -0.07061767578125, 0.023284912109375, 0.0189666748046875, -0.031005859375, 0.0160675048828125, -0.001163482666015625, -0.0345458984375, 0.0186767578125, 0.041534423828125, -0.04302978515625, -0.04156494140625, -0.041961669921875, -0.0103759765625, -0.0313720703125, 0.0711669921875, 0.004058837890625, -0.007648468017578125, -0.0223541259765625, 0.01113128662109375, -0.059234619140625, -0.031219482421875, -0.042816162109375, -0.0240478515625, 0.04302978515625, 0.042327880859375, 0.043487548828125, 0.037994384765625, 0.03143310546875, 0.02581787109375, -0.00186920166015625, 0.018585205078125, -0.04638671875, 0.0018453598022460938, 0.02154541015625, -0.05426025390625, -0.055145263671875, -0.016143798828125, 0.040435791015625, -0.007167816162109375, -0.0259857177734375, 0.024627685546875, 0.000408172607421875, 0.0252532958984375, -0.0172576904296875, 0.04742431640625, -0.06243896484375, -0.0179901123046875, -0.0298614501953125, -0.00720977783203125, -0.03460693359375, -0.018707275390625, -0.0228271484375, -0.041229248046875, -0.0200653076171875, 0.00812530517578125, 0.07708740234375, 0.03985595703125, -0.0265960693359375, -0.015869140625, -0.04998779296875, 0.053558349609375, -0.049346923828125, 0.0209808349609375, 0.03515625, 0.055572509765625, -0.0114898681640625, -0.04937744140625, -0.04803466796875, -0.0189666748046875, 0.002750396728515625, 0.01186370849609375, -0.035400390625, 0.0005588531494140625, 0.005115509033203125, 0.02490234375, -0.0323486328125, 0.0196990966796875, -0.04443359375, -0.00168609619140625, 0.0430908203125, 0.0251312255859375, 0.010711669921875, -0.01396942138671875, -0.03436279296875, -0.00954437255859375, -0.037322998046875, -0.0013742446899414062, 0.0433349609375, 0.01666259765625, -0.031402587890625, 0.057098388671875, -0.01531982421875, 0.0298614501953125, -0.0017490386962890625, -0.031158447265625, 0.035552978515625, -0.009185791015625, -0.037567138671875, -0.0221099853515625, 0.07073974609375, 
0.0188751220703125, 0.01568603515625, 0.015655517578125, -0.0185089111328125, 0.00742340087890625, 0.02032470703125, -0.03826904296875, -0.0062103271484375, 0.01461029052734375, -0.03070068359375, -0.03802490234375, -0.049957275390625, -0.05303955078125, -0.02508544921875, -0.0157470703125, 0.01654052734375, -0.0270233154296875, -0.0205535888671875, -0.01331329345703125, 0.031768798828125, 0.042816162109375, 0.035125732421875, -0.065185546875, 0.01010894775390625, 0.039276123046875, 0.050445556640625, -0.0082855224609375, -0.0155792236328125, 0.00624847412109375, -0.0073394775390625, -0.007282257080078125, 0.08203125, -0.049163818359375, -0.03204345703125, 0.0016307830810546875, 0.007289886474609375, -0.001781463623046875, -0.019317626953125, 0.05450439453125, -0.04290771484375, 0.0175933837890625, 0.01213836669921875, -0.0369873046875, -0.0136260986328125, 0.024993896484375, -0.048797607421875, 0.08453369140625, -0.0047149658203125, -0.04571533203125, 0.0012607574462890625, -0.046295166015625, -0.0001398324966430664, 0.01432037353515625, -0.012664794921875, -0.0292816162109375, -0.011962890625, 0.01396942138671875, 0.020355224609375, -0.0435791015625, 0.039306640625, -0.00653839111328125, -0.0225830078125, 0.0113983154296875, -0.06060791015625, 0.0577392578125, 0.02130126953125, 0.002105712890625, 0.0193023681640625, -0.0699462890625, -0.018157958984375, 0.0272979736328125, -0.0286102294921875, -0.0047454833984375, 0.0015172958374023438, 0.0035247802734375, -0.003124237060546875, 0.050048828125, -0.03582763671875, 0.0377197265625, -0.00879669189453125, 0.00862884521484375, 0.06671142578125, -0.0178985595703125, 0.0164642333984375, -0.0227203369140625, 0.05047607421875, -0.004016876220703125, 0.04046630859375, -0.0187530517578125, -0.068359375, -0.06744384765625, -0.025543212890625, 0.005161285400390625, 0.07940673828125, -0.052886962890625, 0.0217742919921875, -0.020843505859375, -0.055084228515625, -0.059051513671875, 0.02130126953125, 0.030059814453125, 
0.036956787109375, 0.0200653076171875, -0.01320648193359375, -0.047119140625, -0.081787109375, 0.00559234619140625, -0.036529541015625, 0.0032596588134765625, 0.03179931640625, 0.032073974609375, -0.03839111328125, 0.0537109375, -0.025634765625, -0.0303802490234375, -0.02001953125, -0.006755828857421875, 0.0202178955078125, 0.0163116455078125, 0.0257110595703125, -0.037322998046875, -0.03759765625, 0.0019550323486328125, -0.06390380859375, -0.00788116455078125, -0.0056610107421875, -0.0209503173828125, 0.02410888671875, 0.0222625732421875, -0.044769287109375, 0.0482177734375, 0.0662841796875, -0.01007080078125, 0.032501220703125, 0.0002732276916503906, -0.005344390869140625, -0.0882568359375, -0.0101776123046875, -0.01364898681640625, -0.0131683349609375, -0.04107666015625, -0.0089263916015625, -0.00872802734375, 0.00795745849609375, -0.04400634765625, 0.047454833984375, -0.0171966552734375, 0.00681304931640625, -0.029296875, 0.00251007080078125, -0.01141357421875, 0.055572509765625, -0.00809478759765625, 0.0706787109375, 0.035369873046875, -0.0291290283203125, 0.04058837890625, 0.038604736328125, -0.0189666748046875, 0.03656005859375, -0.06781005859375, 0.0178985595703125, -0.001544952392578125, 0.0118560791015625, -0.08905029296875, -0.0200042724609375, 0.041656494140625, -0.0408935546875, 0.0234832763671875, -0.00647735595703125, -0.050537109375, -0.0179290771484375, -0.006984710693359375, 0.0203857421875, 0.06201171875, -0.033416748046875, 0.058135986328125, 0.03472900390625, 0.0146026611328125, -0.05535888671875, -0.05572509765625, 0.0019435882568359375, -0.0177459716796875, -0.039764404296875, 0.005130767822265625, -0.0164031982421875, -0.010528564453125, -0.00988006591796875, 0.02227783203125, -0.01387786865234375, -0.00571441650390625, 0.033416748046875, 0.036468505859375, -0.00311279296875, 0.018341064453125, 0.00036907196044921875, -0.00662994384765625, -0.00814056396484375, 0.0249176025390625, 0.0516357421875, -0.0194091796875, -0.0245819091796875, 
-0.0594482421875, 0.0005140304565429688, 0.0225830078125, 0.005466461181640625, 0.05291748046875, 0.043701171875, -0.006191253662109375, 0.0211181640625, -0.05157470703125, 0.002666473388671875, -0.039947509765625, 0.031829833984375, -0.0240020751953125, -0.05047607421875, 0.041900634765625, 0.00374603271484375, 0.03369140625, 0.0294036865234375, 0.0587158203125, -0.0159912109375, 0.048858642578125, 0.05718994140625, -0.014923095703125, 0.058990478515625, -0.00594329833984375, -0.0035457611083984375, -0.06201171875, -0.0294036865234375, -0.017822265625, -0.01473236083984375, -0.05706787109375, -0.045806884765625, -0.0012369155883789062, -0.0163116455078125, -0.024139404296875, 0.03167724609375, -0.03131103515625, 0.0316162109375, 0.047821044921875, 0.00628662109375, 0.02276611328125, 0.01148223876953125, 0.01146697998046875, 0.005931854248046875, -0.03729248046875, -0.058258056640625, 0.10235595703125, 0.050048828125, 0.07550048828125, 0.0041351318359375, 0.0533447265625, 0.0247802734375, 0.03192138671875, -0.05377197265625, 0.048980712890625, 0.002307891845703125, -0.05084228515625, -0.0210418701171875, -0.00864410400390625, -0.0753173828125, 0.005527496337890625, -0.00954437255859375, -0.05126953125, -0.004543304443359375, 0.02215576171875, 0.0143585205078125, 0.028533935546875, -0.056793212890625, 0.04541015625, -0.0294036865234375, -0.01116180419921875, -0.01036834716796875, -0.033538818359375, 0.04693603515625, -0.00327301025390625, 0.017730712890625, -0.019989013671875, -0.0024852752685546875, 0.038604736328125, -0.00989532470703125, 0.10919189453125, -0.0039215087890625, -0.023712158203125, 0.03936767578125, 0.0014438629150390625, 0.038330078125, -0.00295257568359375, 0.0123291015625, 0.03759765625, -0.0145111083984375, -0.0311126708984375, -0.027191162109375, 0.047821044921875, -0.0966796875, -0.052734375, -0.0208740234375, -0.039154052734375, 0.0031833648681640625, 0.0009274482727050781, 0.01019287109375, 0.007160186767578125, -0.011688232421875, 
0.0017595291137695312, 0.040283203125, -0.0296173095703125, 0.0163726806640625, 0.0267791748046875, -0.0255889892578125, -0.033966064453125, 0.0640869140625, -0.01419830322265625, 0.01531982421875, 0.03558349609375, -0.002285003662109375, -0.0120391845703125, -0.0179595947265625, -0.032928466796875, 0.0308837890625, -0.05914306640625, -0.02850341796875, -0.033172607421875, -0.033599853515625, -0.0208282470703125, 0.018280029296875, -0.0433349609375, -0.0201263427734375, -0.045318603515625, -0.001216888427734375, 0.04937744140625, 0.055877685546875, 0.015869140625, 0.05438232421875, -0.0360107421875, 0.0171356201171875, 0.035125732421875, 0.027313232421875, -0.0078887939453125, -0.068115234375, 0.00030875205993652344, -0.00010722875595092773, -0.039398193359375, -0.057891845703125, 0.039337158203125, 0.0156097412109375, 0.053192138671875, 0.0103912353515625, -0.0198516845703125, 0.053131103515625, -0.04339599609375, 0.063232421875, 0.0110015869140625, -0.046844482421875, 0.048248291015625, -0.01351165771484375, 0.0264434814453125, 0.0258331298828125, 0.016571044921875, -0.0294036865234375, -0.0294342041015625, -0.041015625, -0.053466796875, 0.040863037109375, 0.017242431640625, 0.03692626953125, -0.0026378631591796875, 0.02008056640625, 0.012451171875, 0.0125732421875, -0.08233642578125, -0.039703369140625, -0.0249786376953125, -0.0284881591796875, 0.0113525390625, -0.0447998046875, -0.00853729248046875, -0.0074615478515625, 0.04791259765625, -0.00859832763671875, 0.05242919921875, -0.0189666748046875, -0.006557464599609375, 0.001567840576171875, 0.0172271728515625, 0.062103271484375, 0.0234527587890625, -0.016204833984375, -0.0250396728515625, 0.02178955078125, -0.05572509765625, 0.001079559326171875, -0.0032367706298828125, -0.01096343994140625, -0.006145477294921875, 0.03070068359375, 0.08380126953125, 0.01055145263671875, -0.052093505859375, 0.028594970703125, -0.0257110595703125, -0.019989013671875, -0.055206298828125, 0.00376129150390625, 0.0084228515625, 
0.041748046875, 0.015716552734375, -0.01313018798828125, -0.00024044513702392578, -0.0218505859375, -0.01136016845703125, 0.023712158203125, -0.016510009765625, -0.0270843505859375, 0.05377197265625, 0.0147552490234375, -0.039703369140625, 0.045684814453125, 0.00446319580078125, -0.0083160400390625, 0.03460693359375, 0.039306640625, 0.059051513671875, -0.021728515625, 0.0142669677734375, 0.034912109375, 0.0263671875, 0.0095977783203125, 0.0302581787109375, -0.005237579345703125, -0.045867919921875, -0.0285491943359375, -0.04400634765625, -0.0450439453125, 0.01097869873046875, -0.031829833984375, 0.03839111328125, -0.0264434814453125, -0.0169677734375, -0.02105712890625, 0.0012073516845703125, -0.06097412109375, -0.0034503936767578125, 0.0249481201171875, 0.0653076171875, -0.056304931640625, 0.0924072265625, 0.0295257568359375, -0.0408935546875, -0.048065185546875, -0.0272369384765625, 0.003662109375, -0.1075439453125, 0.062744140625, -0.00560760498046875, 0.00038051605224609375, -0.020843505859375, -0.0648193359375, -0.08221435546875, 0.1060791015625, 0.019927978515625, -0.075439453125, -0.01212310791015625, 0.00922393798828125, 0.04876708984375, -0.0284271240234375, 0.040191650390625, 0.03436279296875, 0.022613525390625, 0.0283050537109375, -0.08251953125, -0.01922607421875, -0.0279083251953125, 0.002811431884765625, -0.01479339599609375, -0.07708740234375, 0.0616455078125, 0.004367828369140625, -0.01015472412109375, 0.01126861572265625, 0.060760498046875, 0.0291595458984375, 0.01447296142578125, 0.031707763671875, 0.0252227783203125, 0.053558349609375, 0.007167816162109375, 0.0726318359375, -0.0291595458984375, 0.007904052734375, 0.0869140625, -0.0085296630859375, 0.07080078125, 0.0307769775390625, -0.0146026611328125, 0.057891845703125, 0.038177490234375, -0.004756927490234375, 0.043914794921875, -0.00543212890625, 0.00823974609375, -0.00939178466796875, 0.00732421875, -0.024810791015625, 0.050079345703125, 0.0426025390625, -0.03179931640625, 
-0.0010042190551757812, -0.011016845703125, 0.00431060791015625, -0.0078582763671875, -0.004199981689453125, 0.056427001953125, -0.0024566650390625, -0.022308349609375, 0.060302734375, -0.0147552490234375, 0.065185546875, -0.0440673828125, -0.0164642333984375, -0.040435791015625, 0.0034198760986328125, 0.00254058837890625, -0.05303955078125, 0.017578125, 0.01461029052734375, 0.01058197021484375, -0.0038242340087890625, 0.055755615234375, -0.0190582275390625, -0.043243408203125, 0.0171051025390625, 0.03326416015625, 0.03472900390625, 0.03558349609375, -0.0677490234375, 0.03497314453125, 0.00901031494140625, -0.0296783447265625, 0.0196380615234375, 0.029693603515625, -0.0162506103515625, 0.07720947265625, 0.03411865234375, -0.0093994140625, 0.006572723388671875, 0.00955963134765625, 0.08404541015625, -0.034698486328125, -0.012542724609375, -0.056427001953125, 0.04302978515625, -0.0013675689697265625, -0.037506103515625, 0.0498046875, 0.026123046875, 0.040283203125, -0.00001811981201171875, 0.049224853515625, 0.0079803466796875, 0.03460693359375, -0.028076171875, 0.0276336669921875, -0.0460205078125, 0.035308837890625, -0.005268096923828125, -0.05389404296875, -0.01517486572265625, 0.0439453125, -0.012420654296875, -0.00846099853515625, 0.0208740234375, 0.059722900390625, 0.010284423828125, -0.01436614990234375, 0.04339599609375, 0.0197601318359375, 0.042816162109375, 0.043548583984375, 0.06414794921875, -0.04736328125, 0.06402587890625, -0.00801849365234375, -0.0193328857421875, -0.0372314453125, -0.05377197265625, -0.09259033203125, -0.043548583984375, -0.0160369873046875, -0.00811767578125, 0.0100860595703125, 0.05810546875, 0.034210205078125, -0.03472900390625, -0.031494140625, 0.01027679443359375, 0.007274627685546875, 0.00787353515625, -0.011810302734375, 0.01329803466796875, -0.0199127197265625, -0.05926513671875, 0.02178955078125, -0.0057373046875, 0.0146636962890625, -0.03997802734375, -0.00432586669921875, -0.016693115234375, 0.016815185546875, 
0.04351806640625, 0.02996826171875, -0.07293701171875, -0.017822265625, 0.014434814453125, -0.01395416259765625, 0.0230865478515625, 0.00992584228515625, -0.054473876953125, 0.0263214111328125, 0.017669677734375, 0.0214385986328125, 0.037445068359375, -0.01213836669921875, 0.0201416015625, -0.054901123046875, 0.0300750732421875, 0.004093170166015625, 0.0174407958984375, 0.02362060546875, -0.0309295654296875, 0.035400390625, -0.0047149658203125, -0.0599365234375, -0.06298828125, 0.01355743408203125, -0.0804443359375, 0.007904052734375, 0.0994873046875, 0.006397247314453125, -0.049652099609375, 0.00620269775390625, -0.037384033203125, 0.0147552490234375, -0.049346923828125, 0.0556640625, 0.03057861328125, -0.0032501220703125, -0.03656005859375, -0.0633544921875, 0.0145263671875, -0.00946807861328125, -0.069580078125, -0.00392913818359375, 0.03143310546875, 0.0185699462890625, 0.002223968505859375, 0.0657958984375, 0.0004711151123046875, 0.002788543701171875, 0.01493072509765625, 0.03692626953125, -0.00780487060546875, -0.0223541259765625, -0.014892578125, -0.0185394287109375, 0.0099639892578125, -0.0362548828125 ] ]
facebook/maskformer-swin-large-ade
2023-02-27T15:08:57.000Z
[ "transformers", "pytorch", "maskformer", "vision", "image-segmentation", "dataset:scene_parse_150", "arxiv:2107.06278", "license:other", "endpoints_compatible", "has_space", "region:us" ]
image-segmentation
facebook
null
null
facebook/maskformer-swin-large-ade
38
163,031
transformers
2022-03-02T23:29:05
---
license: other
tags:
- vision
- image-segmentation
datasets:
- scene_parse_150
widget:
- src: https://huggingface.co/datasets/hf-internal-testing/fixtures_ade20k/resolve/main/ADE_val_00000001.jpg
  example_title: House
- src: https://huggingface.co/datasets/hf-internal-testing/fixtures_ade20k/resolve/main/ADE_val_00000002.jpg
  example_title: Castle
---

# MaskFormer

MaskFormer model trained on ADE20k semantic segmentation (large-sized version, Swin backbone). It was introduced in the paper [Per-Pixel Classification is Not All You Need for Semantic Segmentation](https://arxiv.org/abs/2107.06278) and first released in [this repository](https://github.com/facebookresearch/MaskFormer/blob/da3e60d85fdeedcb31476b5edd7d328826ce56cc/mask_former/modeling/criterion.py#L169).

Disclaimer: The team releasing MaskFormer did not write a model card for this model so this model card has been written by the Hugging Face team.

## Model description

MaskFormer addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation.

![model image](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/maskformer_architecture.png)

## Intended uses & limitations

You can use this particular checkpoint for semantic segmentation. See the [model hub](https://huggingface.co/models?search=maskformer) to look for other fine-tuned versions on a task that interests you.

### How to use

Here is how to use this model:

```python
from transformers import MaskFormerImageProcessor, MaskFormerForInstanceSegmentation
from PIL import Image
import requests

url = "https://huggingface.co/datasets/hf-internal-testing/fixtures_ade20k/resolve/main/ADE_val_00000001.jpg"
image = Image.open(requests.get(url, stream=True).raw)

processor = MaskFormerImageProcessor.from_pretrained("facebook/maskformer-swin-large-ade")
inputs = processor(images=image, return_tensors="pt")

model = MaskFormerForInstanceSegmentation.from_pretrained("facebook/maskformer-swin-large-ade")
outputs = model(**inputs)
# model predicts class_queries_logits of shape `(batch_size, num_queries)`
# and masks_queries_logits of shape `(batch_size, num_queries, height, width)`
class_queries_logits = outputs.class_queries_logits
masks_queries_logits = outputs.masks_queries_logits

# you can pass them to processor for postprocessing
# we refer to the demo notebooks for visualization (see "Resources" section in the MaskFormer docs)
predicted_semantic_map = processor.post_process_semantic_segmentation(outputs, target_sizes=[image.size[::-1]])[0]
```

For more code examples, we refer to the [documentation](https://huggingface.co/docs/transformers/master/en/model_doc/maskformer).
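The card's snippet stops at `predicted_semantic_map`, a per-pixel map of ADE20k class indices. A quick way to turn such a map into an RGB image for inspection is to index into a color palette; the deterministic random palette below is a placeholder assumption for illustration (the official ADE20k palette would be used for real visualizations):

```python
import numpy as np


def colorize_segmentation(seg_map: np.ndarray, num_classes: int = 150, seed: int = 0) -> np.ndarray:
    """Map an (H, W) array of class ids to an (H, W, 3) uint8 RGB image.

    Uses a deterministic random palette as a stand-in for the official
    ADE20k palette (an assumption for illustration only).
    """
    rng = np.random.default_rng(seed)
    palette = rng.integers(0, 256, size=(num_classes, 3), dtype=np.uint8)
    # fancy indexing: one RGB triple per pixel
    return palette[seg_map]


# toy 2x2 "segmentation map" with three classes
seg = np.array([[0, 1], [1, 2]])
rgb = colorize_segmentation(seg)
```

`PIL.Image.fromarray(rgb)` then yields an image ready for display alongside the input photo.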
2,804