| Column | Type | Min–Max |
|---|---|---|
| modelId | string | length 4–111 |
| lastModified | string | length 24–24 |
| tags | list | |
| pipeline_tag | string | length 5–30 |
| author | string | length 2–34 |
| config | null | |
| securityStatus | null | |
| id | string | length 4–111 |
| likes | int64 | 0–9.53k |
| downloads | int64 | 2–73.6M |
| library_name | string | length 2–84 |
| created | timestamp[us] | |
| card | string | length 101–901k |
| card_len | int64 | 101–901k |
| embeddings | list | |
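A row of this dump can be sketched as a plain Python record. The sketch below is illustrative only: the field names and types come from the schema table, but the `ModelRow` class name is hypothetical, and the comment on `card_len` is an assumption (it appears to be the length of `card`, since both columns share the same 101–901k range).

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class ModelRow:
    # One row of the dump; field names mirror the schema table above.
    modelId: str                    # 4-111 chars
    lastModified: str               # ISO-8601 timestamp string, always 24 chars
    tags: list[str]
    pipeline_tag: str
    author: str
    config: Optional[dict]          # null in the rows shown
    securityStatus: Optional[dict]  # null in the rows shown
    id: str                         # same range as modelId; duplicates it in the rows shown
    likes: int
    downloads: int
    library_name: str
    created: datetime
    card: str                       # raw model-card markdown (README)
    card_len: int                   # assumption: len(card)
    embeddings: list[list[float]]   # one or more float vectors per card

# First row of the dump, with card/embeddings elided.
row = ModelRow(
    modelId="Salesforce/blip2-opt-2.7b",
    lastModified="2023-09-11T13:01:16.000Z",
    tags=["transformers", "pytorch", "blip-2", "image-to-text"],
    pipeline_tag="image-to-text",
    author="Salesforce",
    config=None,
    securityStatus=None,
    id="Salesforce/blip2-opt-2.7b",
    likes=162,
    downloads=162_652,
    library_name="transformers",
    created=datetime(2023, 2, 6, 16, 21, 49),
    card="...",
    card_len=6_637,
    embeddings=[[]],
)
print(row.id)
```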
- modelId: Salesforce/blip2-opt-2.7b
- lastModified: 2023-09-11T13:01:16.000Z
- tags: transformers, pytorch, blip-2, visual-question-answering, vision, image-to-text, image-captioning, en, arxiv:2301.12597, license:mit, endpoints_compatible, has_space, region:us
- pipeline_tag: image-to-text
- author: Salesforce
- config: null
- securityStatus: null
- id: Salesforce/blip2-opt-2.7b
- likes: 162
- downloads: 162,652
- library_name: transformers
- created: 2023-02-06T16:21:49
- card:
---
language: en
license: mit
tags:
- vision
- image-to-text
- image-captioning
- visual-question-answering
pipeline_tag: image-to-text
---

# BLIP-2, OPT-2.7b, pre-trained only

BLIP-2 model, leveraging [OPT-2.7b](https://huggingface.co/facebook/opt-2.7b) (a large language model with 2.7 billion parameters).
It was introduced in the paper [BLIP-2: Bootstrapping Language-Image Pre-training with Frozen Image Encoders and Large Language Models](https://arxiv.org/abs/2301.12597) by Li et al. and first released in [this repository](https://github.com/salesforce/LAVIS/tree/main/projects/blip2).

Disclaimer: The team releasing BLIP-2 did not write a model card for this model, so this model card has been written by the Hugging Face team.

## Model description

BLIP-2 consists of 3 models: a CLIP-like image encoder, a Querying Transformer (Q-Former) and a large language model.

The authors initialize the weights of the image encoder and large language model from pre-trained checkpoints and keep them frozen while training the Querying Transformer, a BERT-like Transformer encoder that maps a set of "query tokens" to query embeddings, which bridge the gap between the embedding space of the image encoder and that of the large language model.

The model's training objective is simply to predict the next text token, given the query embeddings and the previous text.

<img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/blip2_architecture.jpg" alt="drawing" width="600"/>

This allows the model to be used for tasks like:

- image captioning
- visual question answering (VQA)
- chat-like conversations, by feeding the image and the previous conversation as prompt to the model

## Direct Use and Downstream Use

You can use the raw model for conditional text generation given an image and optional text. See the [model hub](https://huggingface.co/models?search=Salesforce/blip) to look for fine-tuned versions on a task that interests you.

## Bias, Risks, Limitations, and Ethical Considerations

BLIP2-OPT uses the off-the-shelf OPT as its language model. It inherits the same risks and limitations mentioned in Meta's model card:

> Like other large language models for which the diversity (or lack thereof) of training data induces downstream impact on the quality of our model, OPT-175B has limitations in terms of bias and safety. OPT-175B can also have quality issues in terms of generation diversity and hallucination. In general, OPT-175B is not immune from the plethora of issues that plague modern large language models.

BLIP2 is fine-tuned on image-text datasets (e.g. [LAION](https://laion.ai/blog/laion-400-open-dataset/)) collected from the internet. As a result, the model itself is potentially vulnerable to generating equivalently inappropriate content or replicating inherent biases in the underlying data.

BLIP2 has not been tested in real-world applications. It should not be directly deployed in any application. Researchers should first carefully assess the safety and fairness of the model in relation to the specific context in which it will be deployed.

### How to use

For code examples, we refer to the [documentation](https://huggingface.co/docs/transformers/main/en/model_doc/blip-2#transformers.Blip2ForConditionalGeneration.forward.example).

#### Running the model on CPU

<details>
<summary> Click to expand </summary>

```python
import requests
from PIL import Image
from transformers import Blip2Processor, Blip2ForConditionalGeneration

processor = Blip2Processor.from_pretrained("Salesforce/blip2-opt-2.7b")
model = Blip2ForConditionalGeneration.from_pretrained("Salesforce/blip2-opt-2.7b")

img_url = 'https://storage.googleapis.com/sfr-vision-language-research/BLIP/demo.jpg'
raw_image = Image.open(requests.get(img_url, stream=True).raw).convert('RGB')

question = "how many dogs are in the picture?"
inputs = processor(raw_image, question, return_tensors="pt")

out = model.generate(**inputs)
print(processor.decode(out[0], skip_special_tokens=True).strip())
```
</details>

#### Running the model on GPU

##### In full precision

<details>
<summary> Click to expand </summary>

```python
# pip install accelerate
import requests
from PIL import Image
from transformers import Blip2Processor, Blip2ForConditionalGeneration

processor = Blip2Processor.from_pretrained("Salesforce/blip2-opt-2.7b")
model = Blip2ForConditionalGeneration.from_pretrained("Salesforce/blip2-opt-2.7b", device_map="auto")

img_url = 'https://storage.googleapis.com/sfr-vision-language-research/BLIP/demo.jpg'
raw_image = Image.open(requests.get(img_url, stream=True).raw).convert('RGB')

question = "how many dogs are in the picture?"
inputs = processor(raw_image, question, return_tensors="pt").to("cuda")

out = model.generate(**inputs)
print(processor.decode(out[0], skip_special_tokens=True).strip())
```
</details>

##### In half precision (`float16`)

<details>
<summary> Click to expand </summary>

```python
# pip install accelerate
import torch
import requests
from PIL import Image
from transformers import Blip2Processor, Blip2ForConditionalGeneration

processor = Blip2Processor.from_pretrained("Salesforce/blip2-opt-2.7b")
model = Blip2ForConditionalGeneration.from_pretrained("Salesforce/blip2-opt-2.7b", torch_dtype=torch.float16, device_map="auto")

img_url = 'https://storage.googleapis.com/sfr-vision-language-research/BLIP/demo.jpg'
raw_image = Image.open(requests.get(img_url, stream=True).raw).convert('RGB')

question = "how many dogs are in the picture?"
inputs = processor(raw_image, question, return_tensors="pt").to("cuda", torch.float16)

out = model.generate(**inputs)
print(processor.decode(out[0], skip_special_tokens=True).strip())
```
</details>

##### In 8-bit precision (`int8`)

<details>
<summary> Click to expand </summary>

```python
# pip install accelerate bitsandbytes
import torch
import requests
from PIL import Image
from transformers import Blip2Processor, Blip2ForConditionalGeneration

processor = Blip2Processor.from_pretrained("Salesforce/blip2-opt-2.7b")
model = Blip2ForConditionalGeneration.from_pretrained("Salesforce/blip2-opt-2.7b", load_in_8bit=True, device_map="auto")

img_url = 'https://storage.googleapis.com/sfr-vision-language-research/BLIP/demo.jpg'
raw_image = Image.open(requests.get(img_url, stream=True).raw).convert('RGB')

question = "how many dogs are in the picture?"
inputs = processor(raw_image, question, return_tensors="pt").to("cuda", torch.float16)

out = model.generate(**inputs)
print(processor.decode(out[0], skip_special_tokens=True).strip())
```
</details>
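The card lists image captioning among the supported tasks, but its snippets all pass a question. A minimal sketch of plain captioning on CPU, following the same pattern as the card's own examples (this snippet is not part of the original card): passing only the image to the processor makes `generate` produce a caption unconditionally.

```python
import requests
from PIL import Image
from transformers import Blip2Processor, Blip2ForConditionalGeneration

processor = Blip2Processor.from_pretrained("Salesforce/blip2-opt-2.7b")
model = Blip2ForConditionalGeneration.from_pretrained("Salesforce/blip2-opt-2.7b")

img_url = 'https://storage.googleapis.com/sfr-vision-language-research/BLIP/demo.jpg'
raw_image = Image.open(requests.get(img_url, stream=True).raw).convert('RGB')

# No text prompt: the model generates a caption for the image alone.
inputs = processor(raw_image, return_tensors="pt")
out = model.generate(**inputs)
caption = processor.decode(out[0], skip_special_tokens=True).strip()
print(caption)
```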
- card_len: 6,637
- embeddings: [float vector omitted]
- modelId: asahi417/tner-xlm-roberta-base-all-english
- lastModified: 2021-02-12T23:31:37.000Z
- tags: transformers, pytorch, xlm-roberta, token-classification, autotrain_compatible, endpoints_compatible, region:us
- pipeline_tag: token-classification
- author: asahi417
- config: null
- securityStatus: null
- id: asahi417/tner-xlm-roberta-base-all-english
- likes: 0
- downloads: 162,337
- library_name: transformers
- created: 2022-03-02T23:29:05
- card:
# XLM-RoBERTa for NER

XLM-RoBERTa fine-tuned on NER. For more details, see the [TNER repository](https://github.com/asahi417/tner).

## Usage

```python
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("asahi417/tner-xlm-roberta-base-all-english")
model = AutoModelForTokenClassification.from_pretrained("asahi417/tner-xlm-roberta-base-all-english")
```
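The card only shows how to load the tokenizer and model. For end-to-end inference, the checkpoint can also be driven through the Transformers `pipeline` API; a sketch, not from the original card (the example sentence is arbitrary, and `aggregation_strategy="simple"` is one of several standard strategies for merging sub-word predictions into whole entities):

```python
from transformers import pipeline

# Token-classification pipeline over the same checkpoint.
ner = pipeline(
    "token-classification",
    model="asahi417/tner-xlm-roberta-base-all-english",
    aggregation_strategy="simple",  # merge sub-word pieces into whole entities
)

entities = ner("Jacob Collier is a Grammy awarded artist from London.")
for e in entities:
    # Each aggregated prediction carries the entity type, surface text, and confidence.
    print(e["entity_group"], e["word"], round(float(e["score"]), 3))
```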
- card_len: 419
- embeddings: [float vector omitted]
-0.003047943115234375, -0.0006246566772460938, 0.0198822021484375, 0.0006899833679199219, -0.02642822265625, 0.06396484375, 0.0252227783203125, -0.057037353515625, 0.03594970703125, 0.0085296630859375, 0.0499267578125, -0.057525634765625, 0.01145172119140625, -0.012664794921875, 0.033111572265625, -0.00594329833984375, -0.0156097412109375, 0.043121337890625, -0.003513336181640625, -0.00921630859375, -0.0030059814453125, 0.038299560546875, -0.0311126708984375, -0.0533447265625, 0.0234375, 0.0255126953125, 0.010101318359375, 0.020538330078125, -0.05859375, -0.0234527587890625, -0.0033054351806640625, -0.0157470703125, 0.0157012939453125, 0.0215911865234375, 0.0191802978515625, 0.039581298828125, 0.044281005859375, -0.00096893310546875, 0.00286865234375, 0.028900146484375, 0.035614013671875, -0.04644775390625, -0.037689208984375, -0.0562744140625, 0.040618896484375, 0.002513885498046875, -0.03692626953125, 0.07763671875, 0.0621337890625, 0.04791259765625, -0.035919189453125, 0.040252685546875, -0.005168914794921875, 0.035858154296875, -0.037628173828125, 0.060455322265625, -0.036834716796875, -0.01184844970703125, -0.006923675537109375, -0.0799560546875, -0.0312347412109375, 0.0650634765625, -0.00812530517578125, 0.01461029052734375, 0.056671142578125, 0.057098388671875, -0.0175933837890625, -0.01690673828125, 0.0201416015625, 0.0416259765625, 0.00733184814453125, 0.013641357421875, 0.05816650390625, -0.053863525390625, 0.04132080078125, -0.03240966796875, -0.0034942626953125, -0.0102996826171875, -0.07525634765625, -0.06561279296875, -0.0377197265625, -0.033599853515625, -0.0288543701171875, -0.0179443359375, 0.08746337890625, 0.07110595703125, -0.0819091796875, -0.023406982421875, -0.0176239013671875, -0.0050506591796875, -0.004535675048828125, -0.0182952880859375, 0.0294952392578125, -0.03045654296875, -0.07452392578125, 0.0215911865234375, -0.004405975341796875, 0.036407470703125, -0.002613067626953125, -0.0316162109375, 0.0111541748046875, 0.0015869140625, 
0.003589630126953125, 0.015472412109375, -0.038726806640625, -0.00506591796875, 0.0033893585205078125, -0.01137542724609375, 0.019439697265625, 0.01861572265625, -0.030029296875, 0.005031585693359375, 0.02587890625, -0.00974273681640625, 0.05255126953125, -0.013763427734375, 0.0494384765625, -0.035919189453125, 0.0276641845703125, 0.014862060546875, 0.05743408203125, 0.0307159423828125, -0.0118865966796875, 0.0224761962890625, 0.0014438629150390625, -0.053131103515625, -0.043243408203125, -0.003131866455078125, -0.07501220703125, -0.00018489360809326172, 0.0692138671875, -0.0125274658203125, -0.023101806640625, -0.0177154541015625, -0.0275421142578125, 0.043304443359375, -0.0168914794921875, 0.051239013671875, 0.031341552734375, 0.0184783935546875, -0.0186309814453125, -0.027984619140625, 0.025604248046875, 0.03302001953125, -0.048553466796875, -0.02777099609375, 0.00914764404296875, 0.04736328125, 0.0247955322265625, 0.0300445556640625, -0.0266265869140625, 0.025390625, -0.01177215576171875, 0.0252685546875, -0.0238037109375, -0.005710601806640625, -0.02685546875, -0.037322998046875, 0.00614166259765625, -0.000640869140625 ] ]
darkstorm2150/Protogen_x5.8_Official_Release
2023-03-21T18:20:14.000Z
[ "diffusers", "stable-diffusion", "stable-diffusion-diffusers", "text-to-image", "art", "artistic", "en", "license:creativeml-openrail-m", "endpoints_compatible", "has_space", "diffusers:StableDiffusionPipeline", "region:us" ]
text-to-image
darkstorm2150
null
null
darkstorm2150/Protogen_x5.8_Official_Release
193
161,893
diffusers
2023-01-06T01:18:34
--- language: - en tags: - stable-diffusion - stable-diffusion-diffusers - text-to-image - art - artistic - diffusers inference: true license: creativeml-openrail-m --- <center><img src="https://huggingface.co/darkstorm2150/Protogen_x5.8_Official_Release/resolve/main/Protogen%20x5.8-512.png" style="height:690px; border-radius: 8%; border: 10px solid #663380; padding-top:0px;" title="Protogen x5.8 Raw Output"></center> <center><h1>Protogen x5.8 (Scifi-Anime) Official Release</h1></center> <center><p><em>Research Model by <a href="https://instagram.com/officialvictorespinoza">darkstorm2150</a></em></p></center> </div> ## Table of contents * [General info](#general-info) * [Granular Adaptive Learning](#granular-adaptive-learning) * [Trigger Words](#trigger-words) * [Setup](#setup) * [Space](#space) * [CompVis](#compvis) * [Diffusers](#🧨-diffusers) * [Checkpoint Merging Data Reference](#checkpoint-merging-data-reference) * [License](#license) ## General info Protogen x5.8 was warm-started with [Stable Diffusion v1-5](https://huggingface.co/runwayml/stable-diffusion-v1-5) and rebuilt using dreamlikePhotoRealV2.ckpt as a core, with small amounts of other checkpoints added during merging. ## Granular Adaptive Learning Granular adaptive learning is a machine learning technique that adjusts the learning process at a fine-grained level, rather than making global adjustments to the model. This approach allows the model to adapt to specific patterns or features in the data, rather than making assumptions based on general trends. Granular adaptive learning can be achieved through techniques such as active learning, which allows the model to select the data it wants to learn from, or through reinforcement learning, where the model receives feedback on its performance and adapts based on that feedback. It can also be achieved through online learning, where the model adjusts itself as it receives more data.
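The "online learning" idea mentioned above can be illustrated with a toy example. This is a generic sketch of per-sample (streaming) updates — the model name, data, and learning rate are all made up for illustration, and this is not anything from Protogen's actual training:

```python
# Toy online learner: a one-parameter linear model y = w * x,
# updated one sample at a time as data "streams" in (online learning).

def online_sgd(stream, lr=0.05):
    w = 0.0
    for x, y in stream:
        pred = w * x
        # Gradient of the squared error (pred - y)^2 with respect to w is 2 * (pred - y) * x
        w -= lr * 2 * (pred - y) * x
    return w

# The true relationship is y = 2x; the parameter adapts as each sample arrives.
stream = [(x, 2.0 * x) for x in [1.0, 2.0, 3.0, 1.5, 2.5] * 20]
w = online_sgd(stream)
print(round(w, 2))  # converges to 2.0
```

The point is only that the update happens per sample rather than over a fixed, pre-collected dataset.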
Granular adaptive learning is often used in situations where the data is highly diverse or non-stationary and the model needs to adapt quickly to changing patterns. This is often the case in dynamic environments such as robotics, financial markets, and natural language processing. ## Trigger Words modelshoot style, analog style, mdjrny-v4 style, nousr robot Trigger words are also available for hassan1.4 and f222; you might have to google them :) ## Setup To run this model, download the .ckpt or .safetensors file and place it in your "stable-diffusion-webui\models\Stable-diffusion" directory. ## Space We support a [Gradio](https://github.com/gradio-app/gradio) Web UI: [![Open In Spaces](https://camo.githubusercontent.com/00380c35e60d6b04be65d3d94a58332be5cc93779f630bcdfc18ab9a3a7d3388/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f25463025394625413425393725323048756767696e67253230466163652d5370616365732d626c7565)](https://huggingface.co/spaces/darkstorm2150/Stable-Diffusion-Protogen-webui) ## CompVis ## CKPT [Download ProtoGen x5.8.ckpt (7.7 GB)](https://huggingface.co/darkstorm2150/Protogen_x5.8_Official_Release/resolve/main/ProtoGen_X5.8.ckpt) [Download ProtoGen X5.8-pruned-fp16.ckpt (1.72 GB)](https://huggingface.co/darkstorm2150/Protogen_x5.8_Official_Release/resolve/main/ProtoGen_X5.8-pruned-fp16.ckpt) ## Safetensors [Download ProtoGen x5.8.safetensors (7.7 GB)](https://huggingface.co/darkstorm2150/Protogen_x5.8_Official_Release/resolve/main/ProtoGen_X5.8.safetensors) [Download ProtoGen x5.8-pruned-fp16.safetensors (1.72 GB)](https://huggingface.co/darkstorm2150/Protogen_x5.8_Official_Release/resolve/main/ProtoGen_X5.8-pruned-fp16.safetensors) ### 🧨 Diffusers This model can be used just like any other Stable Diffusion model. For more information, please have a look at the [Stable Diffusion Pipeline](https://huggingface.co/docs/diffusers/api/pipelines/stable_diffusion).
```python
from diffusers import StableDiffusionPipeline, DPMSolverMultistepScheduler
import torch

prompt = (
    "modelshoot style, (extremely detailed CG unity 8k wallpaper), full shot body photo of the most beautiful artwork in the world, "
    "english medieval witch, black silk vale, pale skin, black silk robe, black cat, necromancy magic, medieval era, "
    "photorealistic painting by Ed Blinkey, Atey Ghailan, Studio Ghibli, by Jeremy Mann, Greg Manchess, Antonio Moro, trending on ArtStation, "
    "trending on CGSociety, Intricate, High Detail, Sharp focus, dramatic, photorealistic painting art by midjourney and greg rutkowski"
)

model_id = "darkstorm2150/Protogen_x5.8_Official_Release"
pipe = StableDiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.float16)
pipe.scheduler = DPMSolverMultistepScheduler.from_config(pipe.scheduler.config)
pipe = pipe.to("cuda")

image = pipe(prompt, num_inference_steps=25).images[0]
image.save("./result.jpg")
```

## PENDING DATA FOR MERGE, RPGv2 not accounted for
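The percentages in the merge table below describe weighted combinations of source checkpoints. As an illustration only — the parameter names, tensor values, and merge weights here are made up, and this is not the actual Protogen merge recipe — a weighted merge of checkpoint state dicts looks like:

```python
# Toy weighted checkpoint merge: each output parameter is a convex combination
# of the corresponding parameters from several source checkpoints.

def merge_checkpoints(checkpoints, weights):
    assert abs(sum(weights) - 1.0) < 1e-9, "merge weights should sum to 1"
    merged = {}
    for key in checkpoints[0]:
        merged[key] = sum(w * ckpt[key] for ckpt, w in zip(checkpoints, weights))
    return merged

# Two tiny "checkpoints" with a single scalar parameter each (hypothetical values).
ckpt_a = {"unet.layer0": 1.0}
ckpt_b = {"unet.layer0": 3.0}

merged = merge_checkpoints([ckpt_a, ckpt_b], [0.75, 0.25])
print(merged["unet.layer0"])  # 0.75 * 1.0 + 0.25 * 3.0 = 1.5
```

A real merge would do the same elementwise combination over torch tensors for every key in the state dicts.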
## Checkpoint Merging Data Reference <style> .myTable { border-collapse:collapse; } .myTable th { background-color:#663380; color:white; } .myTable td, .myTable th { padding:5px; border:1px solid #663380; } </style> <table class="myTable"> <tr> <th>Models</th> <th>Protogen v2.2 (Anime)</th> <th>Protogen x3.4 (Photo)</th> <th>Protogen x5.3 (Photo)</th> <th>Protogen x5.8 (Sci-fi/Anime)</th> <th>Protogen x5.9 (Dragon)</th> <th>Protogen x7.4 (Eclipse)</th> <th>Protogen x8.0 (Nova)</th> <th>Protogen x8.6 (Infinity)</th> </tr> <tr> <td>seek_art_mega v1</td> <td>52.50%</td> <td>42.76%</td> <td>42.63%</td> <td></td> <td></td> <td></td> <td>25.21%</td> <td>14.83%</td> </tr> <tr> <td>modelshoot v1</td> <td>30.00%</td> <td>24.44%</td> <td>24.37%</td> <td>2.56%</td> <td>2.05%</td> <td>3.48%</td> <td>22.91%</td> <td>13.48%</td> </tr> <tr> <td>elldreth v1</td> <td>12.64%</td> <td>10.30%</td> <td>10.23%</td> <td></td> <td></td> <td></td> <td>6.06%</td> <td>3.57%</td> </tr> <tr> <td>photoreal v2</td> <td></td> <td></td> <td>10.00%</td> <td>48.64%</td> <td>38.91%</td> <td>66.33%</td> <td>20.49%</td> <td>12.06%</td> </tr> <tr> <td>analogdiffusion v1</td> <td></td> <td>4.75%</td> <td>4.50%</td> <td></td> <td></td> <td></td> <td>1.75%</td> <td>1.03%</td> </tr> <tr> <td>openjourney v2</td> <td></td> <td>4.51%</td> <td>4.28%</td> <td></td> <td></td> <td>4.75%</td> <td>2.26%</td> <td>1.33%</td> </tr> <tr> <td>hassan1.4</td> <td>2.63%</td> <td>2.14%</td> <td>2.13%</td> <td></td> <td></td> <td></td> <td>1.26%</td> <td>0.74%</td> </tr> <tr> <td>f222</td> <td>2.23%</td> <td>1.82%</td> <td>1.81%</td> <td></td> <td></td> <td></td> <td>1.07%</td> <td>0.63%</td> </tr> <tr> <td>hasdx</td> <td></td> <td></td> <td></td> <td>20.00%</td> <td>16.00%</td> <td>4.07%</td> <td>5.01%</td> <td>2.95%</td> </tr> <tr> <td>moistmix</td> <td></td> <td></td> <td></td> <td>16.00%</td> <td>12.80%</td> <td>3.86%</td> <td>4.08%</td> <td>2.40%</td> </tr> <tr> <td>roboDiffusion v1</td> <td></td> <td>4.29%</td> 
<td></td> <td>12.80%</td> <td>10.24%</td> <td>3.67%</td> <td>4.41%</td> <td>2.60%</td> </tr> <tr> <td>RPG v3</td> <td></td> <td>5.00%</td> <td></td> <td></td> <td>20.00%</td> <td>4.29%</td> <td>4.29%</td> <td>2.52%</td> </tr> <tr> <td>anything&everything</td> <td></td> <td></td> <td></td> <td></td> <td></td> <td>4.51%</td> <td>0.56%</td> <td>0.33%</td> </tr> <tr> <td>dreamlikediff v1</td> <td></td> <td></td> <td></td> <td></td> <td></td> <td>5.0%</td> <td>0.63%</td> <td>0.37%</td> </tr> <tr> <td>sci-fidiff v1</td> <td></td> <td></td> <td></td> <td></td> <td></td> <td></td> <td></td> <td>3.10%</td> </tr> <tr> <td>synthwavepunk v2</td> <td></td> <td></td> <td></td> <td></td> <td></td> <td></td> <td></td> <td>3.26%</td> </tr> <tr> <td>mashupv2</td> <td></td> <td></td> <td></td> <td></td> <td></td> <td></td> <td></td> <td>11.51%</td> </tr> <tr> <td>dreamshaper 252</td> <td></td> <td></td> <td></td> <td></td> <td></td> <td></td> <td></td> <td>4.04%</td> </tr> <tr> <td>comicdiff v2</td> <td></td> <td></td> <td></td> <td></td> <td></td> <td></td> <td></td> <td>4.25%</td> </tr> <tr> <td>artEros</td> <td></td> <td></td> <td></td> <td></td> <td></td> <td></td> <td></td> <td>15.00%</td> </tr> </table> ## License This model is licensed under a modified CreativeML OpenRAIL-M license. You are not allowed to host, finetune, or do inference with the model or its derivatives on websites/apps/etc. If you want to, please email us at contact@dreamlike.art You are free to host the model card and files (without any actual inference or finetuning) on both commercial and non-commercial websites/apps/etc.
Please state the full model name (Dreamlike Photoreal 2.0) and include the license as well as a link to the model card (https://huggingface.co/dreamlike-art/dreamlike-photoreal-2.0) You are free to use the outputs (images) of the model for commercial purposes in teams of 10 or less You can't use the model to deliberately produce or share illegal or harmful outputs or content The authors claim no rights on the outputs you generate; you are free to use them and are accountable for their use, which must not go against the provisions set in the license You may re-distribute the weights. If you do, please be aware you have to include the same use restrictions as the ones in the license and share a copy of the modified CreativeML OpenRAIL-M with all your users (please read the license entirely and carefully) Please read the full license here: https://huggingface.co/dreamlike-art/dreamlike-photoreal-2.0/blob/main/LICENSE.md
9,504
[ [ -0.05096435546875, -0.047637939453125, 0.01291656494140625, 0.034759521484375, -0.01207733154296875, 0.005352020263671875, 0.010955810546875, -0.03173828125, 0.027740478515625, 0.00682830810546875, -0.048370361328125, -0.02740478515625, -0.04388427734375, 0.0008158683776855469, -0.00885009765625, 0.061798095703125, -0.0016069412231445312, -0.0162811279296875, 0.00467681884765625, 0.00818634033203125, -0.0177001953125, -0.002239227294921875, -0.028564453125, -0.026214599609375, 0.0225830078125, 0.034088134765625, 0.061309814453125, 0.041656494140625, 0.026641845703125, 0.0262451171875, -0.0264129638671875, 0.0029010772705078125, -0.0355224609375, -0.0027446746826171875, 0.0032958984375, -0.019561767578125, -0.034881591796875, -0.0023212432861328125, 0.025177001953125, 0.032684326171875, -0.0118560791015625, 0.0247039794921875, 0.0232391357421875, 0.058349609375, -0.039398193359375, 0.01226043701171875, -0.00449371337890625, 0.0220794677734375, -0.0028018951416015625, -0.003383636474609375, -0.0034427642822265625, -0.04278564453125, -0.004283905029296875, -0.0634765625, 0.01474761962890625, -0.001682281494140625, 0.0833740234375, -0.0048980712890625, -0.0184783935546875, -0.0022754669189453125, -0.053497314453125, 0.057708740234375, -0.054290771484375, 0.027679443359375, 0.0135345458984375, 0.01020050048828125, -0.015472412109375, -0.058135986328125, -0.0672607421875, 0.0175018310546875, 0.000823974609375, 0.046966552734375, -0.0300445556640625, -0.02783203125, 0.01204681396484375, 0.02191162109375, -0.0538330078125, -0.0099945068359375, -0.034576416015625, -0.0120391845703125, 0.043212890625, 0.014068603515625, 0.0268707275390625, -0.01145172119140625, -0.044403076171875, -0.0191497802734375, -0.01654052734375, 0.032012939453125, 0.0219879150390625, -0.0026416778564453125, -0.039337158203125, 0.03363037109375, -0.0076904296875, 0.048370361328125, 0.0186920166015625, -0.0297088623046875, 0.03900146484375, -0.0357666015625, -0.025177001953125, -0.0134124755859375, 
0.055908203125, 0.046478271484375, -0.006237030029296875, 0.00223541259765625, 0.0091094970703125, 0.01078033447265625, 0.0017032623291015625, -0.07415771484375, -0.022735595703125, 0.03814697265625, -0.027557373046875, -0.035980224609375, 0.00034427642822265625, -0.07916259765625, -0.00478363037109375, 0.00936126708984375, 0.0235443115234375, -0.03662109375, -0.031341552734375, 0.01556396484375, -0.0248565673828125, 0.01312255859375, 0.0248565673828125, -0.050048828125, 0.02587890625, 0.025054931640625, 0.074951171875, -0.0019121170043945312, -0.00983428955078125, 0.0258026123046875, 0.0233917236328125, -0.036895751953125, 0.05718994140625, -0.01220703125, -0.04791259765625, -0.0399169921875, 0.0258331298828125, -0.014556884765625, -0.005199432373046875, 0.046234130859375, -0.0253448486328125, 0.028594970703125, -0.0217132568359375, -0.043609619140625, -0.0167388916015625, 0.01416015625, -0.0443115234375, 0.06427001953125, 0.01300811767578125, -0.0716552734375, 0.0279541015625, -0.05767822265625, -0.00140380859375, -0.0172271728515625, 0.0021076202392578125, -0.060333251953125, -0.0255126953125, 0.01275634765625, 0.020233154296875, -0.0161590576171875, -0.0230712890625, -0.0355224609375, -0.0250091552734375, 0.00681304931640625, -0.020355224609375, 0.081298828125, 0.036163330078125, -0.049163818359375, -0.0030975341796875, -0.057525634765625, 0.00351715087890625, 0.046478271484375, -0.02069091796875, 0.01050567626953125, -0.031646728515625, 0.0013456344604492188, 0.0284423828125, 0.01107025146484375, -0.04327392578125, 0.01708984375, -0.0267486572265625, 0.0305328369140625, 0.07098388671875, 0.0251617431640625, 0.03582763671875, -0.0616455078125, 0.04046630859375, 0.0251312255859375, 0.008331298828125, 0.0268096923828125, -0.0406494140625, -0.05126953125, -0.027679443359375, 0.0191650390625, 0.03741455078125, -0.032012939453125, 0.03302001953125, -0.01873779296875, -0.07012939453125, -0.0187835693359375, -0.00507354736328125, 0.0252227783203125, 0.049530029296875, 
0.022064208984375, -0.00421905517578125, -0.041046142578125, -0.053497314453125, 0.0182037353515625, -0.01233673095703125, 0.0217132568359375, 0.044525146484375, 0.05438232421875, -0.0226287841796875, 0.0604248046875, -0.068115234375, -0.021636962890625, -0.0237274169921875, -0.006710052490234375, 0.035736083984375, 0.05548095703125, 0.06292724609375, -0.07196044921875, -0.044281005859375, 0.005199432373046875, -0.049713134765625, -0.0011339187622070312, 0.0079498291015625, -0.02825927734375, -0.005634307861328125, 0.01004791259765625, -0.048797607421875, 0.036163330078125, 0.04241943359375, -0.053741455078125, 0.059173583984375, -0.032684326171875, 0.0293426513671875, -0.0855712890625, 0.0236968994140625, 0.0029430389404296875, -0.0056915283203125, -0.05816650390625, 0.01096343994140625, -0.006046295166015625, -0.0008935928344726562, -0.05218505859375, 0.051727294921875, -0.04400634765625, 0.021392822265625, 0.0014657974243164062, 0.00452423095703125, 0.006313323974609375, 0.04095458984375, -0.01299285888671875, 0.072998046875, 0.044708251953125, -0.0513916015625, 0.019378662109375, 0.0221405029296875, -0.03533935546875, 0.015167236328125, -0.04669189453125, -0.005733489990234375, -0.0188140869140625, 0.01349639892578125, -0.08197021484375, -0.014251708984375, 0.02117919921875, -0.044769287109375, 0.00226593017578125, 0.0037860870361328125, -0.024871826171875, -0.0562744140625, -0.027252197265625, 0.00792694091796875, 0.06500244140625, -0.01190948486328125, 0.033233642578125, 0.014007568359375, 0.0164642333984375, -0.0364990234375, -0.04327392578125, -0.016448974609375, -0.028472900390625, -0.0751953125, 0.05413818359375, -0.0164794921875, -0.01287841796875, -0.00243377685546875, -0.0189666748046875, -0.00997161865234375, 0.00778961181640625, 0.026123046875, 0.0082855224609375, 0.0048828125, -0.0231475830078125, -0.0248870849609375, -0.008758544921875, -0.01434326171875, -0.00991058349609375, 0.050201416015625, 0.0036945343017578125, -0.017333984375, 
-0.0421142578125, -0.00351715087890625, 0.059478759765625, 0.002239227294921875, 0.05450439453125, 0.052337646484375, -0.01702880859375, -0.01397705078125, -0.01763916015625, -0.0134124755859375, -0.035400390625, 0.003139495849609375, -0.0250091552734375, -0.043975830078125, 0.056182861328125, 0.005527496337890625, -0.00445556640625, 0.04498291015625, 0.0360107421875, -0.0231475830078125, 0.067626953125, 0.0377197265625, 0.0004603862762451172, 0.0308837890625, -0.06915283203125, -0.0081939697265625, -0.0604248046875, -0.038330078125, -0.0281982421875, -0.047576904296875, -0.035247802734375, -0.052734375, 0.042449951171875, 0.0169219970703125, -0.046295166015625, 0.02777099609375, -0.05023193359375, 0.007762908935546875, 0.036712646484375, 0.050445556640625, 0.0004680156707763672, -0.01302337646484375, -0.036163330078125, -0.01148223876953125, -0.038787841796875, -0.0252532958984375, 0.0618896484375, 0.00738525390625, 0.040435791015625, 0.036773681640625, 0.05596923828125, 0.01207733154296875, -0.0009245872497558594, -0.01959228515625, 0.03411865234375, 0.00946044921875, -0.07427978515625, -0.0232086181640625, -0.0229949951171875, -0.08197021484375, 0.0245819091796875, -0.0352783203125, -0.064453125, 0.032989501953125, 0.01322174072265625, -0.04205322265625, 0.0350341796875, -0.04510498046875, 0.073486328125, -0.00811004638671875, -0.05792236328125, 0.019744873046875, -0.0538330078125, 0.01355743408203125, 0.007556915283203125, 0.0465087890625, -0.01556396484375, -0.0196685791015625, 0.061492919921875, -0.049041748046875, 0.053375244140625, -0.016754150390625, 0.0207672119140625, 0.04241943359375, 0.00724029541015625, 0.048370361328125, 0.0046539306640625, -0.0026187896728515625, -0.01248931884765625, -0.01241302490234375, -0.047088623046875, -0.029083251953125, 0.05908203125, -0.07965087890625, -0.041412353515625, -0.060333251953125, -0.0199127197265625, 0.01186370849609375, 0.02777099609375, 0.033172607421875, 0.01268768310546875, -0.004413604736328125, 
0.005931854248046875, 0.0555419921875, -0.01287078857421875, 0.051727294921875, 0.0210723876953125, -0.0282745361328125, -0.04058837890625, 0.0748291015625, 0.0238494873046875, 0.0239410400390625, 0.0186920166015625, 0.050445556640625, -0.019500732421875, -0.052032470703125, -0.0196380615234375, 0.007175445556640625, -0.0287628173828125, -0.018096923828125, -0.06304931640625, -0.007633209228515625, -0.039825439453125, -0.034393310546875, -0.00963592529296875, -0.0367431640625, -0.030120849609375, -0.0224151611328125, 0.043914794921875, 0.02838134765625, -0.0263671875, 0.00844573974609375, -0.036285400390625, 0.028717041015625, 0.03265380859375, 0.01654052734375, 0.00493621826171875, -0.035247802734375, 0.01055145263671875, 0.0175018310546875, -0.0440673828125, -0.0828857421875, 0.049102783203125, -0.001312255859375, 0.0345458984375, 0.0198516845703125, -0.005626678466796875, 0.08416748046875, -0.020782470703125, 0.07049560546875, 0.03179931640625, -0.058929443359375, 0.042938232421875, -0.0465087890625, 0.0300445556640625, 0.032012939453125, 0.038177490234375, -0.024810791015625, -0.0248870849609375, -0.0665283203125, -0.06866455078125, 0.0426025390625, 0.041351318359375, -0.0198974609375, 0.016265869140625, -0.0018930435180664062, -0.01343536376953125, 0.007389068603515625, -0.06695556640625, -0.047882080078125, -0.019561767578125, 0.002056121826171875, -0.005985260009765625, 0.01296234130859375, -0.0001609325408935547, -0.042205810546875, 0.062255859375, 0.0160064697265625, 0.04644775390625, 0.02752685546875, 0.01163482666015625, -0.01383209228515625, 0.0131683349609375, 0.040924072265625, 0.0498046875, -0.0301055908203125, 0.0007252693176269531, 0.0101470947265625, -0.052276611328125, 0.02825927734375, -0.01184844970703125, -0.04022216796875, -0.005207061767578125, 0.0238494873046875, 0.035400390625, 0.0162200927734375, -0.00977325439453125, 0.03314208984375, 0.0029449462890625, -0.029632568359375, -0.039886474609375, 0.0265350341796875, 0.0294036865234375, 
0.0236053466796875, 0.016845703125, 0.031341552734375, 0.006046295166015625, -0.0419921875, 0.019989013671875, 0.02392578125, -0.03955078125, -0.005573272705078125, 0.08160400390625, 0.004558563232421875, 0.0008001327514648438, 0.01172637939453125, -0.0096435546875, -0.041656494140625, 0.0767822265625, 0.05499267578125, 0.037322998046875, -0.01221466064453125, 0.0218048095703125, 0.05242919921875, 0.0012388229370117188, -0.0117340087890625, 0.0413818359375, 0.017547607421875, -0.0399169921875, 0.002994537353515625, -0.060791015625, -0.01467132568359375, 0.005054473876953125, -0.0247650146484375, 0.0352783203125, -0.042572021484375, -0.0232086181640625, 0.005218505859375, 0.0132598876953125, -0.038177490234375, 0.042449951171875, -0.00701904296875, 0.07232666015625, -0.0606689453125, 0.051483154296875, 0.044281005859375, -0.052581787109375, -0.0789794921875, -0.0081939697265625, 0.01297760009765625, -0.055572509765625, 0.036773681640625, -0.0091705322265625, 0.005809783935546875, 0.0150604248046875, -0.043914794921875, -0.083251953125, 0.10772705078125, 0.0072021484375, -0.02392578125, 0.004730224609375, -0.0007143020629882812, 0.049224853515625, -0.00818634033203125, 0.0379638671875, 0.028564453125, 0.043426513671875, 0.032958984375, -0.04962158203125, 0.00997161865234375, -0.0411376953125, 0.0118560791015625, 0.01003265380859375, -0.07635498046875, 0.07208251953125, -0.02532958984375, -0.0156402587890625, -0.0029754638671875, 0.053924560546875, 0.036865234375, 0.01434326171875, 0.03436279296875, 0.07232666015625, 0.027557373046875, -0.030517578125, 0.0709228515625, -0.0279541015625, 0.0477294921875, 0.0506591796875, 0.01055908203125, 0.043609619140625, 0.022430419921875, -0.0457763671875, 0.03192138671875, 0.05316162109375, -0.0017642974853515625, 0.052001953125, 0.02142333984375, -0.020751953125, 0.0010395050048828125, 0.01197052001953125, -0.048583984375, 0.01543426513671875, 0.010101318359375, -0.0210418701171875, 0.0075225830078125, 0.0018453598022460938, 
0.02288818359375, 0.008270263671875, -0.021514892578125, 0.05450439453125, -0.01026153564453125, -0.034942626953125, 0.040191650390625, -0.0087890625, 0.04986572265625, -0.040283203125, 0.0070037841796875, -0.0184478759765625, 0.0302886962890625, -0.036865234375, -0.07647705078125, 0.006145477294921875, -0.009185791015625, -0.005901336669921875, -0.016387939453125, 0.0307159423828125, -0.0096435546875, -0.06024169921875, 0.0229644775390625, 0.01103973388671875, 0.0038013458251953125, 0.032257080078125, -0.0731201171875, 0.0164794921875, 0.018829345703125, -0.033538818359375, 0.016143798828125, 0.03021240234375, 0.0225067138671875, 0.0621337890625, 0.058837890625, 0.0198822021484375, 0.01959228515625, -0.035491943359375, 0.07354736328125, -0.059906005859375, -0.046173095703125, -0.0572509765625, 0.060821533203125, -0.0195159912109375, -0.02587890625, 0.0806884765625, 0.05242919921875, 0.05047607421875, -0.01114654541015625, 0.066650390625, -0.0426025390625, 0.034698486328125, -0.01004791259765625, 0.0501708984375, -0.049072265625, -0.009796142578125, -0.0501708984375, -0.0657958984375, -0.01580810546875, 0.056396484375, -0.006946563720703125, 0.0265960693359375, 0.036285400390625, 0.0582275390625, -0.0089874267578125, -0.003971099853515625, 0.0050811767578125, 0.031158447265625, 0.0264892578125, 0.04962158203125, 0.0244140625, -0.04522705078125, 0.030975341796875, -0.0570068359375, -0.02349853515625, -0.0160980224609375, -0.05242919921875, -0.032470703125, -0.03814697265625, -0.036285400390625, -0.049407958984375, 0.008514404296875, 0.06268310546875, 0.041839599609375, -0.053436279296875, -0.02923583984375, -0.0208892822265625, 0.01061248779296875, -0.0343017578125, -0.01947021484375, 0.0297393798828125, 0.01244354248046875, -0.05694580078125, -0.004634857177734375, 0.0144195556640625, 0.04205322265625, 0.004550933837890625, -0.01537322998046875, -0.03082275390625, 0.0069732666015625, 0.02069091796875, 0.0211639404296875, -0.0296173095703125, -0.01549530029296875, 
-0.00333404541015625, -0.00702667236328125, 0.0227508544921875, 0.00957489013671875, -0.0306243896484375, 0.03326416015625, 0.049652099609375, 0.005725860595703125, 0.0638427734375, 0.00923919677734375, 0.00756072998046875, -0.0168914794921875, 0.0138702392578125, 0.020050048828125, 0.02264404296875, -0.001461029052734375, -0.023590087890625, 0.05242919921875, 0.03411865234375, -0.05828857421875, -0.049957275390625, -0.004352569580078125, -0.09478759765625, -0.027252197265625, 0.09210205078125, -0.0016450881958007812, -0.04437255859375, 0.0094757080078125, -0.022552490234375, 0.0198211669921875, -0.046051025390625, 0.037109375, 0.03375244140625, -0.034332275390625, -0.013397216796875, -0.04400634765625, 0.03643798828125, 0.0027751922607421875, -0.058837890625, -0.0106964111328125, 0.0294189453125, 0.034637451171875, 0.029754638671875, 0.05322265625, -0.0283355712890625, 0.01103973388671875, 0.0167999267578125, -0.00146484375, 0.0031108856201171875, -0.0005459785461425781, -0.0025119781494140625, 0.030242919921875, -0.01016998291015625, -0.015350341796875 ] ]
microsoft/trocr-base-printed
2023-01-24T16:57:27.000Z
[ "transformers", "pytorch", "vision-encoder-decoder", "trocr", "image-to-text", "arxiv:2109.10282", "endpoints_compatible", "has_space", "region:us" ]
image-to-text
microsoft
null
null
microsoft/trocr-base-printed
100
160,292
transformers
2022-03-02T23:29:05
--- tags: - trocr - image-to-text widget: - src: https://layoutlm.blob.core.windows.net/trocr/dataset/SROIE2019Task2Crop/train/X00016469612_1.jpg example_title: Printed 1 - src: https://layoutlm.blob.core.windows.net/trocr/dataset/SROIE2019Task2Crop/train/X51005255805_7.jpg example_title: Printed 2 - src: https://layoutlm.blob.core.windows.net/trocr/dataset/SROIE2019Task2Crop/train/X51005745214_6.jpg example_title: Printed 3 --- # TrOCR (base-sized model, fine-tuned on SROIE) TrOCR model fine-tuned on the [SROIE dataset](https://rrc.cvc.uab.es/?ch=13). It was introduced in the paper [TrOCR: Transformer-based Optical Character Recognition with Pre-trained Models](https://arxiv.org/abs/2109.10282) by Li et al. and first released in [this repository](https://github.com/microsoft/unilm/tree/master/trocr). Disclaimer: The team releasing TrOCR did not write a model card for this model so this model card has been written by the Hugging Face team. ## Model description The TrOCR model is an encoder-decoder model, consisting of an image Transformer as encoder, and a text Transformer as decoder. The image encoder was initialized from the weights of BEiT, while the text decoder was initialized from the weights of RoBERTa. Images are presented to the model as a sequence of fixed-size patches (resolution 16x16), which are linearly embedded. One also adds absolute position embeddings before feeding the sequence to the layers of the Transformer encoder. Next, the Transformer text decoder autoregressively generates tokens. ## Intended uses & limitations You can use the raw model for optical character recognition (OCR) on single text-line images. See the [model hub](https://huggingface.co/models?search=microsoft/trocr) to look for fine-tuned versions on a task that interests you. 
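The patch-sequence length implied by the description above is easy to compute. Assuming a square input resolution of 384×384 (an assumption here — check the processor config for the exact value used by this checkpoint):

```python
# Number of 16x16 patches the image encoder sees for a square input image.
def num_patches(image_size, patch_size=16):
    per_side = image_size // patch_size
    return per_side * per_side

print(num_patches(384))  # 24 * 24 = 576 patch embeddings (before any special tokens)
```

Each of these patches is linearly embedded and, together with the absolute position embeddings, forms the sequence fed to the Transformer encoder.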
### How to use

Here is how to use this model in PyTorch:

```python
from transformers import TrOCRProcessor, VisionEncoderDecoderModel
from PIL import Image
import requests

# load image from the IAM database (actually this model is meant to be used on printed text)
url = 'https://fki.tic.heia-fr.ch/static/img/a01-122-02-00.jpg'
image = Image.open(requests.get(url, stream=True).raw).convert("RGB")

processor = TrOCRProcessor.from_pretrained('microsoft/trocr-base-printed')
model = VisionEncoderDecoderModel.from_pretrained('microsoft/trocr-base-printed')
pixel_values = processor(images=image, return_tensors="pt").pixel_values

generated_ids = model.generate(pixel_values)
generated_text = processor.batch_decode(generated_ids, skip_special_tokens=True)[0]
```

### BibTeX entry and citation info

```bibtex
@misc{li2021trocr,
      title={TrOCR: Transformer-based Optical Character Recognition with Pre-trained Models},
      author={Minghao Li and Tengchao Lv and Lei Cui and Yijuan Lu and Dinei Florencio and Cha Zhang and Zhoujun Li and Furu Wei},
      year={2021},
      eprint={2109.10282},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```
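The autoregressive generation step that `model.generate` performs for the text decoder can be sketched with a toy stand-in — the `step_fn` scoring function below is hypothetical, not the real model, and only illustrates the greedy loop:

```python
# Minimal sketch of greedy autoregressive decoding, assuming a step function
# that maps the tokens so far to the next token id (hypothetical stand-in).
def greedy_decode(step_fn, bos_id: int, eos_id: int, max_len: int = 10) -> list[int]:
    """Repeatedly pick the next token until EOS or the length limit."""
    tokens = [bos_id]
    for _ in range(max_len):
        next_id = step_fn(tokens)
        tokens.append(next_id)
        if next_id == eos_id:
            break
    return tokens

# toy stand-in for the decoder: emits 5, 6, then EOS (id 2)
script = {1: 5, 2: 6, 3: 2}
toy_step = lambda toks: script[len(toks)]
print(greedy_decode(toy_step, bos_id=0, eos_id=2))  # [0, 5, 6, 2]
```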
2,976
[ [ -0.02001953125, -0.02392578125, 0.0034999847412109375, -0.0284423828125, -0.032257080078125, -0.0055084228515625, 0.00023031234741210938, -0.06390380859375, 0.01084136962890625, 0.04510498046875, -0.031402587890625, -0.0286407470703125, -0.046051025390625, 0.0177154541015625, -0.0193328857421875, 0.07989501953125, -0.0114898681640625, -0.0032100677490234375, 0.00925445556640625, -0.034149169921875, -0.00832366943359375, -0.041290283203125, -0.03826904296875, -0.0178375244140625, 0.032135009765625, 0.021270751953125, 0.04876708984375, 0.0521240234375, 0.07080078125, 0.0282745361328125, -0.024749755859375, 0.012725830078125, -0.025238037109375, -0.02813720703125, 0.01123046875, -0.03277587890625, -0.040008544921875, 0.006526947021484375, 0.0474853515625, 0.013153076171875, -0.005672454833984375, 0.009002685546875, 0.007396697998046875, 0.038543701171875, -0.0207672119140625, -0.006900787353515625, -0.0266265869140625, 0.0268402099609375, 0.0029048919677734375, -0.004734039306640625, -0.034942626953125, -0.024810791015625, 0.0195159912109375, -0.042999267578125, 0.0452880859375, 0.002033233642578125, 0.09820556640625, -0.009368896484375, -0.0230560302734375, -0.031402587890625, -0.052032470703125, 0.048492431640625, -0.046905517578125, 0.03851318359375, -0.004955291748046875, 0.01055145263671875, 0.0075531005859375, -0.08062744140625, -0.06005859375, -0.0308074951171875, -0.024322509765625, 0.007587432861328125, -0.0220184326171875, 0.017181396484375, 0.031097412109375, 0.037353515625, -0.0494384765625, -0.020111083984375, -0.05682373046875, -0.0245208740234375, 0.0235595703125, -0.0048065185546875, 0.025909423828125, -0.002025604248046875, -0.037078857421875, -0.032684326171875, -0.006435394287109375, -0.002887725830078125, -0.00012922286987304688, 0.003940582275390625, -0.02984619140625, 0.0521240234375, 0.01904296875, 0.0611572265625, 0.0239410400390625, -0.0211944580078125, 0.041656494140625, -0.01074981689453125, -0.00579071044921875, -0.0007443428039550781, 
0.0810546875, 0.022705078125, 0.0273284912109375, -0.01207733154296875, -0.0206756591796875, 0.021240234375, 0.012054443359375, -0.06280517578125, -0.003116607666015625, -0.007648468017578125, -0.042755126953125, -0.0248565673828125, 0.01361846923828125, -0.0623779296875, -0.0118560791015625, -0.0098876953125, 0.03173828125, -0.03277587890625, 0.0006575584411621094, -0.01117706298828125, -0.014129638671875, 0.01212310791015625, 0.0174713134765625, -0.038909912109375, 0.0038318634033203125, 0.00907135009765625, 0.08587646484375, -0.012481689453125, -0.0261077880859375, -0.0261688232421875, -0.0027294158935546875, -0.018646240234375, 0.0506591796875, -0.024993896484375, -0.025299072265625, -0.004604339599609375, 0.01317596435546875, -0.0009093284606933594, -0.0379638671875, 0.039031982421875, -0.037353515625, 0.02154541015625, 0.0076446533203125, -0.0052032470703125, -0.00522613525390625, 0.02972412109375, -0.06768798828125, 0.087158203125, 0.0223541259765625, -0.06610107421875, 0.0130615234375, -0.048736572265625, -0.0193634033203125, 0.005260467529296875, 0.00946807861328125, -0.0587158203125, 0.007701873779296875, 0.0017766952514648438, 0.0007085800170898438, -0.0142822265625, -0.00041675567626953125, -0.0086822509765625, -0.0268707275390625, 0.008575439453125, -0.01436614990234375, 0.05804443359375, 0.0253753662109375, -0.0266876220703125, -0.00244140625, -0.0692138671875, 0.0112762451171875, 0.01096343994140625, -0.0256195068359375, -0.002429962158203125, -0.01898193359375, 0.024993896484375, 0.029083251953125, 0.0308837890625, -0.04095458984375, 0.0207977294921875, -0.0250396728515625, 0.058868408203125, 0.029632568359375, -0.0193023681640625, 0.037109375, -0.0126495361328125, 0.03472900390625, 0.0195465087890625, 0.0115814208984375, -0.008575439453125, -0.0234222412109375, -0.06634521484375, -0.0141143798828125, 0.0172119140625, 0.054779052734375, -0.07720947265625, 0.0235443115234375, -0.0274810791015625, -0.051361083984375, -0.025177001953125, 
-0.0101318359375, 0.041168212890625, 0.060943603515625, 0.027191162109375, -0.04559326171875, -0.03326416015625, -0.046539306640625, -0.01056671142578125, -0.023956298828125, 0.00757598876953125, 0.0261688232421875, 0.049591064453125, -0.0211029052734375, 0.062469482421875, -0.0260162353515625, -0.044036865234375, -0.025146484375, 0.020172119140625, 0.01042938232421875, 0.05291748046875, 0.03021240234375, -0.056427001953125, -0.037078857421875, 0.00513458251953125, -0.055816650390625, 0.0032749176025390625, -0.002849578857421875, -0.007221221923828125, 0.0386962890625, 0.0215606689453125, -0.052490234375, 0.06005859375, 0.03680419921875, -0.033355712890625, 0.0460205078125, -0.0367431640625, 0.01290130615234375, -0.0836181640625, 0.0207061767578125, 0.007030487060546875, -0.0218505859375, -0.064208984375, 0.0165557861328125, 0.01561737060546875, -0.023895263671875, -0.031890869140625, 0.043121337890625, -0.051055908203125, -0.004520416259765625, -0.004230499267578125, 0.008575439453125, 0.016754150390625, 0.047332763671875, 0.03936767578125, 0.06317138671875, 0.01446533203125, -0.0330810546875, 0.01432037353515625, 0.0265655517578125, -0.0244293212890625, 0.0391845703125, -0.0791015625, 0.034637451171875, -0.0023365020751953125, -0.0013256072998046875, -0.056121826171875, 0.01593017578125, 0.03314208984375, -0.0352783203125, 0.0268402099609375, -0.004367828369140625, -0.036346435546875, -0.05535888671875, 0.001659393310546875, 0.04046630859375, 0.031036376953125, -0.041839599609375, 0.07696533203125, 0.01328277587890625, 0.024139404296875, -0.036956787109375, -0.08636474609375, -0.00010311603546142578, -0.006061553955078125, -0.056243896484375, 0.041473388671875, -0.0165863037109375, 0.0205230712890625, -0.0048980712890625, 0.0021190643310546875, -0.0159454345703125, -0.03594970703125, 0.00461578369140625, 0.03802490234375, -0.0231170654296875, -0.014739990234375, -0.0379638671875, -0.0126190185546875, -0.02825927734375, -0.0205230712890625, 0.042572021484375, 
-0.0273895263671875, 0.005138397216796875, -0.045623779296875, 0.018218994140625, 0.048004150390625, -0.041168212890625, 0.0469970703125, 0.04779052734375, -0.0255279541015625, 0.0081939697265625, -0.04315185546875, -0.01271820068359375, -0.0377197265625, 0.031890869140625, -0.0255889892578125, -0.053802490234375, 0.058685302734375, 0.0304412841796875, -0.010589599609375, 0.031829833984375, 0.02386474609375, 0.00302886962890625, 0.0679931640625, 0.064208984375, 0.012664794921875, 0.06658935546875, -0.038665771484375, 0.02264404296875, -0.064697265625, -0.0288848876953125, -0.031829833984375, -0.0250396728515625, -0.049041748046875, -0.02178955078125, 0.0245208740234375, 0.0015916824340820312, -0.0179443359375, 0.049041748046875, -0.08233642578125, 0.0196075439453125, 0.052337646484375, 0.0328369140625, 0.011199951171875, 0.0200653076171875, -0.0174713134765625, 0.0004088878631591797, -0.03680419921875, -0.03887939453125, 0.04046630859375, 0.015472412109375, 0.0596923828125, -0.0190277099609375, 0.0384521484375, 0.0189361572265625, 0.003997802734375, -0.057769775390625, 0.046173095703125, -0.0225372314453125, -0.037139892578125, 0.00507354736328125, -0.03448486328125, -0.06268310546875, 0.0028533935546875, -0.0254058837890625, -0.0604248046875, 0.06121826171875, 0.0323486328125, -0.004642486572265625, 0.042388916015625, -0.053253173828125, 0.0728759765625, -0.025238037109375, -0.022613525390625, 0.0206298828125, -0.05718994140625, -0.001056671142578125, 0.023712158203125, -0.01230621337890625, 0.02734375, 0.00826263427734375, 0.07220458984375, -0.0521240234375, 0.05596923828125, -0.006175994873046875, 0.0033435821533203125, 0.03955078125, -0.0006918907165527344, 0.0478515625, -0.036590576171875, -0.01215362548828125, 0.048095703125, 0.0101470947265625, -0.015625, -0.0233001708984375, 0.0185394287109375, -0.0704345703125, -0.0166778564453125, -0.06298828125, -0.049407958984375, 0.0182647705078125, 0.041046142578125, 0.061004638671875, 0.045379638671875, 
-0.0019855499267578125, -0.004528045654296875, 0.043121337890625, -0.007076263427734375, 0.03277587890625, 0.026336669921875, -0.00072479248046875, -0.058441162109375, 0.06103515625, 0.01436614990234375, 0.0279693603515625, 0.037506103515625, 0.0129852294921875, -0.01482391357421875, -0.0216064453125, -0.0240020751953125, 0.03570556640625, -0.04949951171875, -0.0214996337890625, -0.033111572265625, -0.028839111328125, -0.0295867919921875, -0.0173187255859375, -0.0183258056640625, -0.017242431640625, -0.0540771484375, 0.016693115234375, 0.028228759765625, 0.045013427734375, -0.00019872188568115234, 0.058746337890625, -0.06280517578125, 0.0308685302734375, 0.00555419921875, 0.027099609375, -0.005222320556640625, -0.049407958984375, -0.0155487060546875, 0.00843048095703125, -0.0287933349609375, -0.061187744140625, 0.06085205078125, 0.0271148681640625, 0.020965576171875, 0.0335693359375, 0.0021495819091796875, 0.061676025390625, -0.0440673828125, 0.0517578125, 0.0396728515625, -0.07177734375, 0.033203125, 0.00589752197265625, 0.020965576171875, 0.025146484375, 0.005939483642578125, -0.041839599609375, -0.0085296630859375, -0.04248046875, -0.042266845703125, 0.08367919921875, 0.0058135986328125, -0.004283905029296875, 0.0188446044921875, 0.03936767578125, -0.0216217041015625, 0.01343536376953125, -0.07666015625, -0.02081298828125, -0.03460693359375, -0.0418701171875, -0.00634002685546875, -0.0307464599609375, 0.012939453125, -0.017242431640625, 0.032318115234375, -0.0040740966796875, 0.060699462890625, 0.04754638671875, -0.035614013671875, -0.0135650634765625, -0.0052642822265625, 0.05328369140625, 0.033355712890625, -0.017364501953125, 0.01558685302734375, -0.003551483154296875, -0.0809326171875, -0.007648468017578125, 0.007671356201171875, -0.0261688232421875, 0.009765625, 0.03253173828125, 0.0833740234375, -0.0135498046875, -0.034332275390625, 0.033843994140625, -0.006023406982421875, -0.0194854736328125, -0.0178375244140625, -0.00818634033203125, -0.02288818359375, 
0.01453399658203125, 0.036834716796875, 0.0178985595703125, -0.0006914138793945312, -0.036163330078125, 0.00978851318359375, 0.03790283203125, -0.0555419921875, -0.0245208740234375, 0.04974365234375, -0.00957489013671875, -0.0416259765625, 0.058380126953125, -0.00040435791015625, -0.06854248046875, 0.05816650390625, 0.053497314453125, 0.051300048828125, -0.0186309814453125, 0.01273345947265625, 0.044158935546875, 0.0408935546875, -0.00008678436279296875, 0.0213165283203125, -0.0030689239501953125, -0.059722900390625, 0.019866943359375, -0.041778564453125, -0.01522064208984375, -0.001201629638671875, -0.054412841796875, 0.03472900390625, -0.037750244140625, -0.029815673828125, -0.00649261474609375, 0.0164794921875, -0.049407958984375, 0.033203125, 0.006053924560546875, 0.07720947265625, -0.038665771484375, 0.05682373046875, 0.046478271484375, -0.03887939453125, -0.055511474609375, -0.01207733154296875, -0.0181732177734375, -0.07928466796875, 0.048095703125, 0.0216064453125, -0.0081787109375, 0.01309967041015625, -0.0484619140625, -0.05523681640625, 0.08868408203125, 0.01904296875, -0.04669189453125, -0.025970458984375, 0.0292816162109375, 0.04754638671875, -0.035247802734375, 0.047088623046875, 0.0163726806640625, 0.0179595947265625, 0.035064697265625, -0.05792236328125, 0.003734588623046875, -0.0311279296875, 0.0261993408203125, 0.004619598388671875, -0.05059814453125, 0.0689697265625, -0.036376953125, -0.02142333984375, 0.029571533203125, 0.054962158203125, 0.019805908203125, 0.01922607421875, 0.033416748046875, 0.05029296875, 0.048187255859375, -0.012908935546875, 0.0657958984375, -0.0309600830078125, 0.039947509765625, 0.05938720703125, -0.0016241073608398438, 0.057098388671875, 0.0340576171875, 0.0013275146484375, 0.049224853515625, 0.0377197265625, -0.033660888671875, 0.0401611328125, -0.0184326171875, 0.0087738037109375, 0.0027561187744140625, -0.0020008087158203125, -0.029571533203125, 0.01493072509765625, 0.01143646240234375, -0.05352783203125, 
-0.0007004737854003906, 0.0122833251953125, -0.0147857666015625, -0.0261077880859375, -0.0304718017578125, 0.051544189453125, 0.0023345947265625, -0.0386962890625, 0.054656982421875, -0.0047607421875, 0.0648193359375, -0.053192138671875, -0.0027942657470703125, -0.00876617431640625, 0.03814697265625, -0.01268768310546875, -0.05767822265625, 0.01275634765625, 0.0003223419189453125, -0.0248870849609375, 0.01070404052734375, 0.05755615234375, -0.0428466796875, -0.06671142578125, 0.024017333984375, -0.006046295166015625, 0.01342010498046875, 0.02301025390625, -0.06048583984375, 0.0166778564453125, 0.00331878662109375, -0.018341064453125, -0.0000388026237487793, 0.035064697265625, 0.005340576171875, 0.041534423828125, 0.0400390625, 0.006061553955078125, 0.0197296142578125, -0.0156402587890625, 0.053985595703125, -0.0489501953125, -0.0421142578125, -0.045074462890625, 0.046112060546875, 0.004177093505859375, -0.038787841796875, 0.03985595703125, 0.03985595703125, 0.04931640625, -0.0213623046875, 0.032989501953125, -0.0171051025390625, 0.0146331787109375, -0.018890380859375, 0.07122802734375, -0.05963134765625, -0.01145172119140625, -0.033111572265625, -0.0531005859375, -0.037841796875, 0.07403564453125, -0.0168304443359375, 0.0228729248046875, 0.0478515625, 0.08355712890625, -0.012420654296875, -0.0203399658203125, 0.0142822265625, 0.0195159912109375, 0.00740814208984375, 0.04302978515625, 0.03448486328125, -0.069091796875, 0.06201171875, -0.0295562744140625, -0.01374053955078125, -0.019012451171875, -0.0650634765625, -0.0751953125, -0.0489501953125, -0.033416748046875, -0.051666259765625, -0.0008053779602050781, 0.041412353515625, 0.06292724609375, -0.0723876953125, -0.0108489990234375, -0.0193939208984375, 0.0056915283203125, -0.0164337158203125, -0.019805908203125, 0.03448486328125, 0.00876617431640625, -0.0555419921875, -0.03277587890625, -0.00948333740234375, 0.0290679931640625, 0.00909423828125, -0.0179901123046875, -0.01338958740234375, -0.0019664764404296875, 
0.03289794921875, 0.043853759765625, -0.045745849609375, -0.00942230224609375, 0.012542724609375, -0.0215911865234375, 0.039459228515625, 0.042266845703125, -0.05267333984375, 0.02886962890625, 0.042205810546875, 0.005657196044921875, 0.04632568359375, -0.00847625732421875, 0.0106353759765625, -0.0330810546875, 0.0262603759765625, 0.013885498046875, 0.03814697265625, 0.027862548828125, -0.0360107421875, 0.029571533203125, 0.034576416015625, -0.04095458984375, -0.0712890625, -0.007137298583984375, -0.101806640625, 0.010223388671875, 0.06292724609375, -0.01065826416015625, -0.040618896484375, 0.0247955322265625, -0.022613525390625, 0.0345458984375, -0.030242919921875, 0.03448486328125, 0.020477294921875, 0.009429931640625, -0.056243896484375, -0.002407073974609375, 0.0168914794921875, -0.012237548828125, -0.039337158203125, -0.0110931396484375, 0.0245208740234375, 0.0266265869140625, 0.052764892578125, 0.03521728515625, -0.0187530517578125, 0.01544952392578125, 0.00588226318359375, 0.0478515625, -0.01337432861328125, -0.01861572265625, -0.027191162109375, 0.004665374755859375, -0.017425537109375, -0.0155487060546875 ] ]
flair/upos-english
2023-04-07T09:34:50.000Z
[ "flair", "pytorch", "token-classification", "sequence-tagger-model", "en", "dataset:ontonotes", "region:us" ]
token-classification
flair
null
null
flair/upos-english
3
160,268
flair
2022-03-02T23:29:05
---
tags:
- flair
- token-classification
- sequence-tagger-model
language: en
datasets:
- ontonotes
widget:
- text: "I love Berlin."
---

## English Universal Part-of-Speech Tagging in Flair (default model)

This is the standard universal part-of-speech tagging model for English that ships with [Flair](https://github.com/flairNLP/flair/).

F1-Score: **98.6** (Ontonotes)

Predicts universal POS tags:

| **tag** | **meaning** |
|---------|-------------|
| ADJ | adjective |
| ADP | adposition |
| ADV | adverb |
| AUX | auxiliary |
| CCONJ | coordinating conjunction |
| DET | determiner |
| INTJ | interjection |
| NOUN | noun |
| NUM | numeral |
| PART | particle |
| PRON | pronoun |
| PROPN | proper noun |
| PUNCT | punctuation |
| SCONJ | subordinating conjunction |
| SYM | symbol |
| VERB | verb |
| X | other |

Based on [Flair embeddings](https://www.aclweb.org/anthology/C18-1139/) and LSTM-CRF.

---

### Demo: How to use in Flair

Requires: **[Flair](https://github.com/flairNLP/flair/)** (`pip install flair`)

```python
from flair.data import Sentence
from flair.models import SequenceTagger

# load tagger
tagger = SequenceTagger.load("flair/upos-english")

# make example sentence
sentence = Sentence("I love Berlin.")

# predict POS tags
tagger.predict(sentence)

# print sentence
print(sentence)

# print predicted POS spans
print('The following POS tags are found:')

# iterate over tagged spans and print
for entity in sentence.get_spans('pos'):
    print(entity)
```

This yields the following output:

```
Span [1]: "I"   [− Labels: PRON (0.9996)]
Span [2]: "love"   [− Labels: VERB (1.0)]
Span [3]: "Berlin"   [− Labels: PROPN (0.9986)]
Span [4]: "."   [− Labels: PUNCT (1.0)]
```

So, the word "*I*" is labeled as a **pronoun** (PRON), "*love*" is labeled as a **verb** (VERB), and "*Berlin*" is labeled as a **proper noun** (PROPN) in the sentence "*I love Berlin*".
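The CRF layer in the LSTM-CRF architecture mentioned above decodes the best tag sequence with the Viterbi algorithm. A minimal max-sum Viterbi sketch with toy scores (hypothetical numbers, not the real model's weights):

```python
# Plain max-sum Viterbi over per-token emission scores and tag-to-tag
# transition scores, as a CRF decoder would run it (toy example).
def viterbi(emissions, transitions):
    """Return the highest-scoring tag sequence (list of tag indices)."""
    n_tags = len(emissions[0])
    score = list(emissions[0])  # best score of any path ending in each tag
    back = []                   # back-pointers for path recovery
    for emit in emissions[1:]:
        prev = score[:]
        back.append([])
        score = []
        for t in range(n_tags):
            best_prev = max(range(n_tags), key=lambda p: prev[p] + transitions[p][t])
            back[-1].append(best_prev)
            score.append(prev[best_prev] + transitions[best_prev][t] + emit[t])
    # backtrack from the best final tag
    best = max(range(n_tags), key=lambda t: score[t])
    path = [best]
    for pointers in reversed(back):
        path.append(pointers[path[-1]])
    return list(reversed(path))

# 2 tags (0=NOUN, 1=VERB) over 3 tokens; transitions discourage VERB->VERB
emissions = [[2.0, 0.5], [0.1, 1.5], [1.0, 0.9]]
transitions = [[0.0, 0.5], [0.5, -1.0]]
print(viterbi(emissions, transitions))  # [0, 1, 0] -> NOUN VERB NOUN
```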
---

### Training: Script to train this model

The following Flair script was used to train this model:

```python
from flair.data import Corpus
from flair.datasets import ColumnCorpus
from flair.embeddings import StackedEmbeddings, FlairEmbeddings

# 1. load the corpus (Ontonotes does not ship with Flair; you need to download
# and reformat it into column format yourself)
corpus: Corpus = ColumnCorpus(
    "resources/tasks/onto-ner",
    column_format={0: "text", 1: "pos", 2: "upos", 3: "ner"},
    tag_to_bioes="ner",
)

# 2. what tag do we want to predict?
tag_type = 'upos'

# 3. make the tag dictionary from the corpus
tag_dictionary = corpus.make_tag_dictionary(tag_type=tag_type)

# 4. initialize each embedding we use
embedding_types = [

    # contextual string embeddings, forward
    FlairEmbeddings('news-forward'),

    # contextual string embeddings, backward
    FlairEmbeddings('news-backward'),
]

# embedding stack consists of the forward and backward Flair embeddings
embeddings = StackedEmbeddings(embeddings=embedding_types)

# 5. initialize sequence tagger
from flair.models import SequenceTagger

tagger = SequenceTagger(hidden_size=256,
                        embeddings=embeddings,
                        tag_dictionary=tag_dictionary,
                        tag_type=tag_type)

# 6. initialize trainer
from flair.trainers import ModelTrainer

trainer = ModelTrainer(tagger, corpus)

# 7. run training
trainer.train('resources/taggers/upos-english',
              train_with_dev=True,
              max_epochs=150)
```

---

### Cite

Please cite the following paper when using this model.

```
@inproceedings{akbik2018coling,
  title={Contextual String Embeddings for Sequence Labeling},
  author={Akbik, Alan and Blythe, Duncan and Vollgraf, Roland},
  booktitle={{COLING} 2018, 27th International Conference on Computational Linguistics},
  pages={1638--1649},
  year={2018}
}
```

---

### Issues?

The Flair issue tracker is available [here](https://github.com/flairNLP/flair/issues/).
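The `column_format` mapping in the training script above expects one whitespace-separated token per line, with each column index named. A minimal parser sketch for that layout (the sample line is hypothetical, assuming the 4-column format used in the script):

```python
# Sketch: map one token line of a column-format corpus file to named fields,
# mirroring the column_format argument passed to ColumnCorpus.
def parse_column_line(line: str, column_format: dict[int, str]) -> dict[str, str]:
    """Split a whitespace-separated token line into named columns."""
    fields = line.split()
    return {name: fields[idx] for idx, name in column_format.items()}

column_format = {0: "text", 1: "pos", 2: "upos", 3: "ner"}
print(parse_column_line("Berlin NNP PROPN S-LOC", column_format))
# {'text': 'Berlin', 'pos': 'NNP', 'upos': 'PROPN', 'ner': 'S-LOC'}
```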
4,092
[ [ -0.02789306640625, -0.040496826171875, 0.008544921875, 0.01593017578125, -0.0251007080078125, -0.01242828369140625, -0.0157318115234375, -0.0258636474609375, 0.049774169921875, 0.0180511474609375, -0.029754638671875, -0.046783447265625, -0.0304718017578125, 0.0218658447265625, 0.0001010894775390625, 0.0819091796875, 0.00600433349609375, 0.029144287109375, -0.014984130859375, -0.0037784576416015625, -0.030914306640625, -0.049346923828125, -0.0350341796875, -0.017242431640625, 0.0382080078125, 0.026031494140625, 0.039337158203125, 0.05841064453125, 0.025909423828125, 0.01983642578125, -0.021270751953125, 0.00731658935546875, -0.005405426025390625, -0.0019550323486328125, -0.0142364501953125, -0.029052734375, -0.053009033203125, 0.0107421875, 0.050201416015625, 0.038299560546875, 0.0123748779296875, 0.0031223297119140625, -0.005657196044921875, 0.013397216796875, -0.0185089111328125, 0.027862548828125, -0.04681396484375, -0.0187530517578125, -0.0214691162109375, -0.0033931732177734375, -0.0208892822265625, -0.017730712890625, 0.0057373046875, -0.03900146484375, -0.00043511390686035156, 0.0167083740234375, 0.1005859375, 0.01047515869140625, -0.030120849609375, -0.020721435546875, -0.030548095703125, 0.059295654296875, -0.06317138671875, 0.0242462158203125, 0.0252838134765625, -0.016510009765625, -0.01349639892578125, -0.047332763671875, -0.05279541015625, -0.0185089111328125, -0.012939453125, 0.017730712890625, -0.00902557373046875, -0.0033245086669921875, 0.0088043212890625, 0.0138092041015625, -0.050567626953125, -0.0046234130859375, -0.01100921630859375, -0.0190887451171875, 0.05596923828125, 0.0097503662109375, 0.022705078125, -0.0399169921875, -0.03424072265625, -0.007648468017578125, -0.02618408203125, 0.00323486328125, 0.0069732666015625, 0.039794921875, -0.01108551025390625, 0.0457763671875, 0.001068115234375, 0.05267333984375, 0.004062652587890625, -0.017822265625, 0.04046630859375, -0.028167724609375, -0.0181884765625, -0.00848388671875, 0.0736083984375, 
0.0224609375, 0.01120758056640625, -0.010986328125, -0.00513458251953125, 0.023834228515625, -0.01320648193359375, -0.03631591796875, -0.0189361572265625, 0.0198822021484375, -0.00962066650390625, -0.0228424072265625, 0.0026531219482421875, -0.05377197265625, -0.00835418701171875, -0.00634002685546875, 0.0419921875, -0.04345703125, -0.006435394287109375, 0.01433563232421875, -0.0234375, 0.01309967041015625, 0.0030536651611328125, -0.05889892578125, -0.0033359527587890625, 0.0299072265625, 0.041595458984375, 0.0146942138671875, -0.0263671875, -0.01739501953125, -0.005126953125, -0.01053619384765625, 0.056304931640625, -0.032501220703125, -0.0222625732421875, 0.00328826904296875, 0.01111602783203125, -0.028411865234375, -0.01375579833984375, 0.057647705078125, -0.040252685546875, 0.034271240234375, -0.0187225341796875, -0.06329345703125, -0.024505615234375, 0.0152130126953125, -0.037017822265625, 0.0716552734375, -0.00044536590576171875, -0.0867919921875, 0.029510498046875, -0.037017822265625, -0.0341796875, 0.00215911865234375, -0.005207061767578125, -0.0304412841796875, -0.0149383544921875, 0.0084228515625, 0.05889892578125, -0.010162353515625, 0.0198822021484375, -0.028564453125, -0.002094268798828125, 0.017822265625, 0.015106201171875, 0.06573486328125, 0.006134033203125, -0.01345062255859375, 0.0093536376953125, -0.06512451171875, -0.0185394287109375, 0.016448974609375, -0.03472900390625, -0.0302276611328125, 0.00119781494140625, 0.00829315185546875, 0.0166473388671875, 0.01271820068359375, -0.049224853515625, 0.0439453125, -0.045684814453125, 0.031463623046875, 0.035308837890625, -0.0012044906616210938, 0.048126220703125, -0.032684326171875, 0.033447265625, 0.0112457275390625, -0.0213775634765625, -0.0090789794921875, -0.05084228515625, -0.05352783203125, -0.0341796875, 0.04766845703125, 0.056304931640625, -0.050384521484375, 0.06231689453125, -0.0273590087890625, -0.04913330078125, -0.040191650390625, -0.0181427001953125, 0.022796630859375, 0.0498046875, 
0.03546142578125, -0.01407623291015625, -0.07000732421875, -0.0518798828125, -0.016357421875, -0.008941650390625, 0.0212860107421875, 0.0123748779296875, 0.062744140625, -0.0195770263671875, 0.0673828125, -0.0352783203125, -0.033538818359375, -0.0250091552734375, 0.0120086669921875, 0.031402587890625, 0.0460205078125, 0.032501220703125, -0.051605224609375, -0.04296875, -0.0187835693359375, -0.034515380859375, 0.0087738037109375, -0.007843017578125, -0.0028057098388671875, 0.027557373046875, 0.0218505859375, -0.04339599609375, 0.0300140380859375, 0.0216064453125, -0.05120849609375, 0.0462646484375, 0.0008711814880371094, -0.01071929931640625, -0.1160888671875, 0.0186767578125, 0.019073486328125, -0.01953125, -0.048065185546875, -0.01229095458984375, -0.0026645660400390625, 0.025299072265625, -0.028564453125, 0.05718994140625, -0.0296630859375, 0.01033782958984375, 0.0015745162963867188, 0.0120086669921875, 0.006427764892578125, 0.031585693359375, 0.018341064453125, 0.037017822265625, 0.044708251953125, -0.046478271484375, 0.0226287841796875, 0.03668212890625, -0.0257568359375, 0.005054473876953125, -0.031585693359375, -0.015350341796875, -0.01555633544921875, 0.02105712890625, -0.0916748046875, -0.0227813720703125, 0.033416748046875, -0.061859130859375, 0.040740966796875, 0.0029163360595703125, -0.037567138671875, -0.031646728515625, -0.0196685791015625, 0.0038623809814453125, 0.0333251953125, -0.026580810546875, 0.033111572265625, 0.036041259765625, 0.00974273681640625, -0.051055908203125, -0.04962158203125, -0.01544952392578125, -0.0217132568359375, -0.052093505859375, 0.043914794921875, -0.00862884521484375, -0.00504302978515625, 0.0099639892578125, 0.01239776611328125, -0.00521087646484375, 0.0162811279296875, 0.016815185546875, 0.035675048828125, -0.0102081298828125, 0.0179901123046875, -0.0147552490234375, 0.004665374755859375, -0.00559234619140625, -0.01502227783203125, 0.05926513671875, -0.00902557373046875, 0.0182952880859375, -0.036712646484375, 
0.0090789794921875, 0.01690673828125, -0.0238494873046875, 0.05615234375, 0.06463623046875, -0.03466796875, -0.0059051513671875, -0.0216522216796875, -0.00748443603515625, -0.0274200439453125, 0.03778076171875, -0.036102294921875, -0.0601806640625, 0.04132080078125, 0.0128021240234375, 0.005016326904296875, 0.061279296875, 0.03857421875, -0.0148162841796875, 0.081298828125, 0.04888916015625, -0.0224609375, 0.0273284912109375, -0.036895751953125, 0.0037746429443359375, -0.060394287109375, -0.01219940185546875, -0.042327880859375, -0.004848480224609375, -0.054656982421875, -0.023529052734375, 0.0105743408203125, 0.03436279296875, -0.0262298583984375, 0.043609619140625, -0.042938232421875, 0.01523590087890625, 0.045806884765625, -0.0103302001953125, 0.007465362548828125, -0.003841400146484375, -0.03173828125, -0.01561737060546875, -0.052398681640625, -0.037200927734375, 0.0655517578125, 0.03472900390625, 0.04571533203125, -0.00269317626953125, 0.06256103515625, 0.005779266357421875, 0.0201263427734375, -0.06884765625, 0.037750244140625, -0.017120361328125, -0.059112548828125, -0.0059661865234375, -0.01239776611328125, -0.07586669921875, 0.01390838623046875, -0.0232086181640625, -0.07550048828125, 0.01551055908203125, 0.012115478515625, -0.037017822265625, 0.026824951171875, -0.0237579345703125, 0.068603515625, 0.005657196044921875, -0.0151824951171875, 0.0137176513671875, -0.05615234375, 0.0189361572265625, 0.0128173828125, 0.0306396484375, -0.017364501953125, -0.00328826904296875, 0.08099365234375, -0.0157012939453125, 0.07281494140625, 0.00627899169921875, 0.0103607177734375, 0.0225830078125, 0.00261688232421875, 0.017578125, 0.007598876953125, -0.0033111572265625, 0.005687713623046875, 0.003917694091796875, -0.0143890380859375, -0.0061798095703125, 0.04620361328125, -0.056396484375, -0.022705078125, -0.06231689453125, -0.0258026123046875, -0.009063720703125, 0.0165863037109375, 0.054229736328125, 0.036376953125, -0.0161285400390625, -0.01233673095703125, 
0.03466796875, -0.01153564453125, 0.049835205078125, 0.030914306640625, -0.034912109375, -0.05426025390625, 0.06689453125, 0.007335662841796875, -0.01387786865234375, 0.039581298828125, 0.0165863037109375, -0.032867431640625, -0.004611968994140625, -0.026763916015625, 0.046661376953125, -0.041534423828125, -0.0281982421875, -0.0478515625, -0.01082611083984375, -0.07012939453125, -0.00450897216796875, -0.0185089111328125, -0.041595458984375, -0.0550537109375, 0.001697540283203125, 0.026611328125, 0.056732177734375, -0.0272064208984375, 0.0230712890625, -0.057708740234375, -0.008697509765625, 0.0010061264038085938, 0.006031036376953125, -0.0171051025390625, -0.0701904296875, -0.0249176025390625, 0.00075531005859375, -0.0270233154296875, -0.0819091796875, 0.0673828125, 0.0211639404296875, 0.028839111328125, 0.02838134765625, -0.007312774658203125, 0.031890869140625, -0.032440185546875, 0.07525634765625, 0.00843048095703125, -0.07421875, 0.039398193359375, -0.029022216796875, 0.0096588134765625, 0.01153564453125, 0.06890869140625, -0.04449462890625, -0.004680633544921875, -0.059478759765625, -0.06658935546875, 0.049224853515625, -0.004566192626953125, 0.003200531005859375, -0.0269012451171875, 0.020538330078125, -0.014556884765625, 0.0092315673828125, -0.0738525390625, -0.04193115234375, -0.00948333740234375, -0.018707275390625, -0.0195159912109375, -0.0184326171875, 0.0008916854858398438, -0.04656982421875, 0.08544921875, 0.0003387928009033203, 0.041259765625, 0.036407470703125, 0.0019521713256835938, 0.0038242340087890625, 0.0186614990234375, 0.047515869140625, 0.013885498046875, -0.03094482421875, -0.00373077392578125, 0.007564544677734375, -0.0192413330078125, -0.0090179443359375, 0.01468658447265625, -0.007564544677734375, 0.0214691162109375, 0.0306396484375, 0.06365966796875, 0.01084136962890625, -0.02545166015625, 0.04595947265625, 0.0026378631591796875, -0.013397216796875, -0.036102294921875, -0.0179901123046875, 0.01435089111328125, 0.0085601806640625, 
0.005756378173828125, 0.005306243896484375, -0.00254058837890625, -0.04193115234375, 0.0161895751953125, 0.033172607421875, -0.036163330078125, -0.045257568359375, 0.06427001953125, -0.00038123130798339844, -0.009521484375, 0.017364501953125, -0.04510498046875, -0.06622314453125, 0.04254150390625, 0.050811767578125, 0.056610107421875, -0.018890380859375, 0.0142974853515625, 0.04833984375, 0.0086669921875, 0.00021541118621826172, 0.059051513671875, 0.02410888671875, -0.0802001953125, -0.029693603515625, -0.0721435546875, 0.0021820068359375, 0.01285552978515625, -0.04351806640625, 0.028472900390625, -0.0223541259765625, -0.0288543701171875, 0.0262451171875, 0.0112457275390625, -0.05474853515625, 0.0271453857421875, 0.03271484375, 0.08056640625, -0.0736083984375, 0.08221435546875, 0.09063720703125, -0.06231689453125, -0.079833984375, -0.01329803466796875, -0.00478363037109375, -0.0521240234375, 0.0599365234375, 0.0189971923828125, 0.028411865234375, 0.01355743408203125, -0.052734375, -0.08929443359375, 0.07196044921875, -0.01367950439453125, -0.02069091796875, -0.0155792236328125, -0.00519561767578125, 0.03741455078125, -0.040496826171875, 0.0311279296875, 0.041900634765625, 0.0330810546875, 0.0022220611572265625, -0.07275390625, -0.00013399124145507812, -0.0174102783203125, -0.00228118896484375, 0.005458831787109375, -0.053192138671875, 0.0811767578125, -0.01280975341796875, -0.016510009765625, 0.019683837890625, 0.06964111328125, -0.0005884170532226562, 0.0020351409912109375, 0.022369384765625, 0.0682373046875, 0.051422119140625, -0.0162811279296875, 0.061065673828125, -0.028900146484375, 0.04644775390625, 0.0833740234375, -0.0011568069458007812, 0.07574462890625, 0.02520751953125, -0.01175689697265625, 0.041961669921875, 0.059906005859375, -0.00891876220703125, 0.0364990234375, 0.0156707763671875, -0.006816864013671875, -0.0239715576171875, -0.022857666015625, -0.028717041015625, 0.053009033203125, 0.0275726318359375, -0.03515625, 0.005184173583984375, 
0.00269317626953125, 0.045074462890625, -0.003192901611328125, -0.023834228515625, 0.05712890625, 0.0003705024719238281, -0.0423583984375, 0.04742431640625, 0.0108642578125, 0.0750732421875, -0.031036376953125, 0.006160736083984375, -0.00916290283203125, 0.01360321044921875, -0.012969970703125, -0.0535888671875, 0.01282501220703125, -0.0241546630859375, -0.01113128662109375, 0.0005125999450683594, 0.052398681640625, -0.054412841796875, -0.02288818359375, 0.024658203125, 0.0303192138671875, 0.0164947509765625, 0.002643585205078125, -0.052398681640625, -0.00931549072265625, 0.0181884765625, -0.030364990234375, 0.01497650146484375, 0.01003265380859375, 0.013702392578125, 0.03424072265625, 0.027618408203125, 0.01546478271484375, 0.0048828125, -0.0171966552734375, 0.06622314453125, -0.060028076171875, -0.0284576416015625, -0.060394287109375, 0.055419921875, 0.00212860107421875, -0.03985595703125, 0.057159423828125, 0.051025390625, 0.06732177734375, -0.01352691650390625, 0.06195068359375, -0.0282135009765625, 0.06170654296875, -0.0160064697265625, 0.051422119140625, -0.0584716796875, -0.000005662441253662109, -0.01534271240234375, -0.047607421875, -0.038177490234375, 0.048553466796875, -0.0214691162109375, -0.0218505859375, 0.0435791015625, 0.059173583984375, 0.0117034912109375, -0.005374908447265625, 0.0171966552734375, 0.034637451171875, 0.003101348876953125, 0.02752685546875, 0.04339599609375, -0.04510498046875, 0.02069091796875, -0.0452880859375, -0.01837158203125, -0.017913818359375, -0.06353759765625, -0.06646728515625, -0.06951904296875, -0.032806396484375, -0.060211181640625, -0.0144805908203125, 0.09295654296875, 0.029937744140625, -0.0684814453125, -0.019378662109375, 0.012664794921875, -0.002887725830078125, 0.005580902099609375, -0.0207366943359375, 0.033599853515625, -0.0247650146484375, -0.055633544921875, 0.0269775390625, -0.0146636962890625, 0.0122528076171875, 0.005832672119140625, 0.01082611083984375, -0.053375244140625, 0.00931549072265625, 
0.032318115234375, 0.0289154052734375, -0.055145263671875, -0.0210113525390625, 0.00588226318359375, -0.0248260498046875, 0.0145111083984375, 0.01470947265625, -0.04852294921875, 0.0176239013671875, 0.057220458984375, 0.0198822021484375, 0.0149688720703125, -0.0008831024169921875, 0.0247802734375, -0.05621337890625, 0.00213623046875, 0.0257110595703125, 0.048980712890625, 0.019317626953125, -0.01206207275390625, 0.032073974609375, 0.041595458984375, -0.05950927734375, -0.049407958984375, 0.00040078163146972656, -0.0833740234375, -0.0192413330078125, 0.0948486328125, -0.0158538818359375, -0.036285400390625, 0.00589752197265625, -0.0160980224609375, 0.049346923828125, -0.035858154296875, 0.0284576416015625, 0.03533935546875, -0.01003265380859375, 0.0217437744140625, -0.01137542724609375, 0.062255859375, 0.03399658203125, -0.03271484375, -0.0227508544921875, 0.020904541015625, 0.0408935546875, 0.0306549072265625, 0.042327880859375, 0.01049041748046875, 0.0006284713745117188, -0.0038509368896484375, 0.04351806640625, 0.015167236328125, -0.01334381103515625, -0.0408935546875, -0.007526397705078125, -0.005664825439453125, -0.02911376953125 ] ]
kredor/punctuate-all
2022-04-28T05:26:05.000Z
[ "transformers", "pytorch", "xlm-roberta", "token-classification", "autotrain_compatible", "endpoints_compatible", "region:us" ]
token-classification
kredor
null
null
kredor/punctuate-all
9
158,413
transformers
2022-04-09T12:05:11
This is based on [Oliver Guhr's work](https://huggingface.co/oliverguhr/fullstop-punctuation-multilang-large). The difference is that this model finetunes xlm-roberta-base instead of xlm-roberta-large, and on twelve languages instead of four. The languages are: English, German, French, Spanish, Bulgarian, Italian, Polish, Dutch, Czech, Portuguese, Slovak, Slovenian.

----- report -----

              precision    recall  f1-score    support

           0       0.99      0.99      0.99   73317475
           .       0.94      0.95      0.95    4484845
           ,       0.86      0.86      0.86    6100650
           ?       0.88      0.85      0.86     136479
           -       0.60      0.29      0.39     233630
           :       0.71      0.49      0.58     152424

    accuracy                           0.98   84425503
   macro avg       0.83      0.74      0.77   84425503
weighted avg       0.98      0.98      0.98   84425503

----- confusion matrix -----

    t/p    0    .    ,    ?    -    :
    0    1.0  0.0  0.0  0.0  0.0  0.0
    .    0.0  1.0  0.0  0.0  0.0  0.0
    ,    0.1  0.0  0.9  0.0  0.0  0.0
    ?    0.0  0.1  0.0  0.8  0.0  0.0
    -    0.1  0.1  0.5  0.0  0.3  0.0
    :    0.0  0.3  0.1  0.0  0.0  0.5
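A minimal sketch of how a token-classification output from this model could be mapped back onto punctuated text, assuming the label scheme shown in the report (`0` meaning "no punctuation after this word", otherwise the punctuation mark itself). The `preds` list below is a hypothetical stand-in for what `pipeline("token-classification", model="kredor/punctuate-all")` might return, not a real model run:

```python
from typing import Dict, List


def apply_punctuation(predictions: List[Dict[str, str]]) -> str:
    """Rejoin words with their predicted punctuation.

    Each prediction is assumed to carry the word and its label,
    where the label "0" means no punctuation follows the word.
    """
    parts = []
    for pred in predictions:
        word = pred["word"]
        label = pred["entity"]
        parts.append(word if label == "0" else word + label)
    return " ".join(parts)


# Hypothetical pipeline output for the unpunctuated input "hello how are you"
preds = [
    {"word": "hello", "entity": ","},
    {"word": "how", "entity": "0"},
    {"word": "are", "entity": "0"},
    {"word": "you", "entity": "?"},
]
print(apply_punctuation(preds))  # hello, how are you?
```

In practice you would also need to merge subword pieces before this step, since xlm-roberta tokenizes into subwords.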
1,290
[ [ -0.01190185546875, -0.051361083984375, 0.052642822265625, 0.048248291015625, -0.010406494140625, 0.01274871826171875, -0.033172607421875, -0.033050537109375, 0.018310546875, 0.030059814453125, -0.0210723876953125, -0.040924072265625, -0.0374755859375, 0.032989501953125, -0.01108551025390625, 0.06329345703125, -0.018402099609375, 0.005275726318359375, 0.0181884765625, -0.022430419921875, -0.026611328125, -0.0474853515625, -0.0401611328125, -0.01421356201171875, 0.0242462158203125, 0.039276123046875, 0.04522705078125, 0.0285491943359375, 0.022735595703125, 0.01430511474609375, -0.0140228271484375, -0.0034313201904296875, -0.03546142578125, -0.0158843994140625, 0.007801055908203125, -0.0283966064453125, -0.021270751953125, -0.007843017578125, 0.054534912109375, 0.0248870849609375, -0.01568603515625, 0.000881195068359375, 0.00917816162109375, 0.0738525390625, -0.03570556640625, 0.0194091796875, -0.018524169921875, 0.011383056640625, -0.01299285888671875, 0.0107269287109375, -0.04156494140625, -0.00893402099609375, -0.0087127685546875, -0.05352783203125, -0.015716552734375, 0.015869140625, 0.072265625, -0.00809478759765625, -0.04534912109375, -0.016082763671875, -0.053009033203125, 0.06982421875, -0.029144287109375, 0.0284576416015625, 0.014251708984375, 0.0166168212890625, -0.00004476308822631836, -0.042510986328125, -0.06109619140625, 0.0030727386474609375, -0.00952911376953125, 0.0197601318359375, -0.018310546875, -0.00466156005859375, 0.0288238525390625, 0.03387451171875, -0.051239013671875, -0.0201263427734375, -0.034149169921875, -0.004730224609375, 0.03076171875, -0.00690460205078125, 0.0012159347534179688, 0.004852294921875, -0.035919189453125, -0.0080718994140625, -0.059478759765625, -0.0003483295440673828, 0.017913818359375, 0.02764892578125, -0.0295257568359375, 0.0419921875, -0.0102081298828125, 0.05621337890625, -0.0010814666748046875, -0.019317626953125, 0.05633544921875, -0.0213623046875, -0.0308837890625, -0.01898193359375, 0.07318115234375, 
0.0222320556640625, 0.05426025390625, 0.003818511962890625, -0.0006508827209472656, -0.01171875, -0.0009517669677734375, -0.050201416015625, -0.018707275390625, 0.01555633544921875, -0.02410888671875, 0.00406646728515625, 0.017242431640625, -0.036651611328125, 0.0107879638671875, -0.0307159423828125, 0.031829833984375, -0.06817626953125, 0.01007080078125, 0.018341064453125, -0.00487518310546875, 0.01739501953125, 0.0254364013671875, -0.0369873046875, 0.0135955810546875, 0.033477783203125, 0.06964111328125, -0.0010089874267578125, -0.0271759033203125, -0.036651611328125, 0.00974273681640625, -0.015869140625, 0.060150146484375, -0.03851318359375, -0.027618408203125, 0.01678466796875, 0.0209808349609375, -0.0228118896484375, -0.0325927734375, 0.06060791015625, -0.01459503173828125, 0.031494140625, -0.033111572265625, -0.02154541015625, -0.0244293212890625, 0.0311737060546875, -0.06048583984375, 0.0830078125, 0.01221466064453125, -0.03717041015625, 0.020599365234375, -0.042999267578125, -0.048614501953125, -0.005710601806640625, -0.0172576904296875, -0.033050537109375, -0.0182037353515625, 0.0086517333984375, 0.005641937255859375, -0.0177154541015625, 0.0126800537109375, 0.0021228790283203125, -0.032470703125, 0.015411376953125, -0.01873779296875, 0.09033203125, 0.02301025390625, -0.0248870849609375, 0.00009953975677490234, -0.078369140625, 0.023101806640625, -0.004375457763671875, -0.053497314453125, -0.0086212158203125, -0.0289764404296875, 0.01776123046875, 0.045684814453125, 0.0257415771484375, -0.040863037109375, 0.0171966552734375, 0.00885772705078125, 0.032745361328125, 0.03826904296875, -0.0006422996520996094, 0.01088714599609375, -0.034088134765625, 0.0465087890625, 0.007297515869140625, 0.0211334228515625, -0.02191162109375, -0.04754638671875, -0.05859375, -0.0465087890625, 0.023956298828125, 0.07861328125, -0.042266845703125, 0.05841064453125, -0.004058837890625, -0.0176849365234375, -0.0276336669921875, 0.01611328125, 0.049957275390625, 0.029632568359375, 
0.041656494140625, -0.0047760009765625, -0.0377197265625, -0.063720703125, 0.01033782958984375, -0.0123748779296875, 0.007472991943359375, -0.0018024444580078125, 0.049957275390625, 0.0007948875427246094, 0.0406494140625, -0.046600341796875, -0.032745361328125, -0.00542449951171875, -0.0005750656127929688, 0.04974365234375, 0.03399658203125, 0.05877685546875, -0.031341552734375, -0.0736083984375, -0.0008378028869628906, -0.042999267578125, 0.010101318359375, -0.0069427490234375, -0.0291290283203125, 0.0272369384765625, 0.0242919921875, -0.060028076171875, 0.034393310546875, 0.03411865234375, -0.032928466796875, 0.07147216796875, -0.039764404296875, 0.01910400390625, -0.081298828125, 0.035247802734375, -0.006549835205078125, -0.01134490966796875, -0.046234130859375, 0.019287109375, 0.04010009765625, 0.004329681396484375, -0.033721923828125, 0.039764404296875, -0.050506591796875, 0.0204315185546875, 0.014251708984375, -0.0103302001953125, 0.01422119140625, 0.041839599609375, -0.0051422119140625, 0.067138671875, 0.02874755859375, -0.026336669921875, 0.01357269287109375, 0.0362548828125, -0.03424072265625, 0.05743408203125, -0.04449462890625, -0.006389617919921875, 0.02154541015625, 0.005130767822265625, -0.0789794921875, 0.00994873046875, 0.00391387939453125, -0.032257080078125, 0.0257568359375, -0.00809478759765625, -0.04815673828125, -0.0293121337890625, -0.034942626953125, 0.0239105224609375, 0.033782958984375, -0.0257415771484375, 0.027313232421875, 0.01081085205078125, -0.020599365234375, -0.052734375, -0.06317138671875, 0.01971435546875, -0.00839996337890625, -0.05621337890625, 0.01342010498046875, -0.035858154296875, -0.0295257568359375, -0.0251007080078125, 0.014373779296875, -0.017242431640625, -0.01084136962890625, -0.003307342529296875, 0.0238189697265625, -0.021514892578125, 0.004825592041015625, -0.01485443115234375, -0.000965118408203125, -0.026458740234375, -0.0012788772583007812, 0.060638427734375, -0.014404296875, -0.0008511543273925781, 
-0.029083251953125, 0.04315185546875, 0.036346435546875, -0.0225830078125, 0.040771484375, 0.0406494140625, -0.00798797607421875, 0.01666259765625, -0.044158935546875, 0.01282501220703125, -0.02935791015625, 0.014129638671875, -0.03961181640625, -0.058624267578125, 0.06524658203125, 0.03277587890625, -0.01102447509765625, 0.052581787109375, 0.030364990234375, -0.004467010498046875, 0.05548095703125, 0.047149658203125, -0.0260162353515625, 0.025390625, -0.011993408203125, 0.00672149658203125, -0.05548095703125, -0.0229339599609375, -0.050018310546875, -0.01654052734375, -0.0266265869140625, -0.0232391357421875, 0.0225067138671875, 0.0022258758544921875, -0.04827880859375, 0.0274200439453125, -0.02032470703125, 0.038482666015625, 0.05987548828125, -0.0037708282470703125, 0.04351806640625, 0.02667236328125, -0.040130615234375, -0.0162506103515625, -0.024169921875, -0.0234222412109375, 0.0814208984375, -0.0084381103515625, 0.04278564453125, 0.02874755859375, 0.036346435546875, 0.0057830810546875, -0.002452850341796875, -0.056243896484375, 0.0230712890625, -0.0167999267578125, -0.061614990234375, -0.0171661376953125, -0.027618408203125, -0.07745361328125, 0.0138092041015625, -0.0306854248046875, -0.062286376953125, 0.00873565673828125, -0.0077362060546875, -0.045989990234375, 0.0105133056640625, -0.050384521484375, 0.05352783203125, -0.0159759521484375, -0.01434326171875, -0.010040283203125, -0.045257568359375, 0.022186279296875, -0.01202392578125, 0.0303192138671875, 0.00809478759765625, 0.0233612060546875, 0.046966552734375, -0.039825439453125, 0.039642333984375, 0.0006837844848632812, -0.0097503662109375, 0.0299835205078125, 0.0136566162109375, 0.039215087890625, 0.005023956298828125, -0.006015777587890625, 0.01898193359375, -0.0120697021484375, -0.0227813720703125, -0.00962066650390625, 0.036468505859375, -0.058013916015625, -0.0261993408203125, -0.06024169921875, -0.052276611328125, -0.00576019287109375, 0.047821044921875, 0.006816864013671875, 0.0202484130859375, 
-0.016387939453125, 0.0204010009765625, 0.0304412841796875, -0.0163116455078125, 0.0335693359375, 0.061004638671875, -0.015594482421875, -0.06414794921875, 0.03271484375, 0.01535797119140625, 0.0112152099609375, 0.051116943359375, -0.0030422210693359375, -0.0064697265625, -0.0121002197265625, -0.0244903564453125, 0.048858642578125, -0.03460693359375, -0.00927734375, -0.036651611328125, -0.01519775390625, -0.046783447265625, -0.0158843994140625, -0.005764007568359375, -0.05224609375, -0.00806427001953125, -0.0096588134765625, 0.0236358642578125, 0.05010986328125, -0.01084136962890625, 0.04156494140625, -0.0550537109375, 0.006954193115234375, 0.0171356201171875, 0.0158233642578125, -0.0233917236328125, -0.054779052734375, -0.032196044921875, -0.0025691986083984375, -0.03045654296875, -0.054290771484375, 0.037322998046875, 0.0285186767578125, 0.0419921875, 0.05389404296875, -0.013885498046875, 0.0809326171875, -0.047149658203125, 0.05462646484375, 0.04327392578125, -0.07574462890625, 0.01055908203125, -0.0202789306640625, 0.031768798828125, 0.05316162109375, 0.027587890625, -0.06573486328125, -0.031951904296875, -0.055389404296875, -0.07696533203125, 0.05615234375, 0.0194549560546875, -0.0183563232421875, -0.01035308837890625, 0.002391815185546875, 0.01340484619140625, 0.0207366943359375, -0.053955078125, -0.029449462890625, 0.004291534423828125, -0.0247650146484375, -0.020660400390625, -0.032318115234375, -0.03515625, -0.031494140625, 0.04547119140625, -0.0022125244140625, 0.030303955078125, 0.0081939697265625, -0.0259552001953125, -0.001659393310546875, 0.0196533203125, 0.0826416015625, 0.0743408203125, -0.030426025390625, 0.01119232177734375, -0.001190185546875, -0.051513671875, 0.00235748291015625, 0.005733489990234375, -0.0162506103515625, 0.01244354248046875, 0.04986572265625, 0.050140380859375, 0.007171630859375, -0.056488037109375, 0.040313720703125, 0.0016164779663085938, -0.0296478271484375, -0.0679931640625, -0.0109710693359375, 0.0083465576171875, 
0.0186004638671875, 0.05126953125, 0.00603485107421875, -0.0107574462890625, -0.050537109375, 0.0233154296875, 0.02325439453125, -0.0167083740234375, -0.027313232421875, 0.033355712890625, 0.0030155181884765625, -0.0097808837890625, 0.054840087890625, -0.0189361572265625, -0.03570556640625, 0.04730224609375, 0.048675537109375, 0.03143310546875, -0.0308380126953125, 0.0059661865234375, 0.05615234375, 0.034515380859375, -0.021240234375, 0.06414794921875, 0.01029205322265625, -0.05572509765625, -0.0382080078125, -0.0279541015625, -0.0225830078125, 0.017425537109375, -0.0616455078125, 0.0200653076171875, -0.039825439453125, -0.00736236572265625, -0.008453369140625, 0.0030727386474609375, -0.05853271484375, 0.0211334228515625, -0.00812530517578125, 0.0738525390625, -0.06732177734375, 0.07720947265625, 0.049041748046875, -0.03411865234375, -0.053009033203125, -0.0120391845703125, -0.00232696533203125, -0.09979248046875, 0.045379638671875, 0.018798828125, 0.0301971435546875, -0.0262298583984375, -0.02301025390625, -0.074462890625, 0.06805419921875, 0.0268707275390625, -0.059478759765625, -0.0104522705078125, 0.0104522705078125, 0.044097900390625, -0.0243988037109375, 0.03204345703125, 0.040283203125, 0.059051513671875, 0.0089874267578125, -0.09112548828125, 0.004970550537109375, -0.0246429443359375, -0.0233154296875, 0.01482391357421875, -0.0711669921875, 0.1011962890625, -0.022247314453125, -0.01043701171875, 0.0045166015625, 0.0228424072265625, 0.0308837890625, -0.0047454833984375, 0.0285797119140625, 0.06463623046875, 0.058380126953125, -0.01406097412109375, 0.056488037109375, -0.052032470703125, 0.040924072265625, 0.08343505859375, -0.0240478515625, 0.06866455078125, 0.04736328125, -0.047637939453125, 0.053009033203125, 0.018402099609375, -0.007358551025390625, 0.0258636474609375, -0.01513671875, -0.00897216796875, -0.01543426513671875, 0.0347900390625, -0.03045654296875, 0.0176544189453125, 0.024932861328125, -0.041473388671875, -0.007190704345703125, 
-0.0181732177734375, 0.051300048828125, -0.00855255126953125, 0.0024356842041015625, 0.032318115234375, 0.01403045654296875, -0.048736572265625, 0.06573486328125, 0.000804901123046875, 0.03778076171875, -0.041259765625, 0.00605010986328125, -0.0177459716796875, 0.03155517578125, -0.00653076171875, -0.084716796875, 0.01025390625, -0.004199981689453125, -0.0177764892578125, -0.00019669532775878906, 0.0148468017578125, -0.051239013671875, -0.04754638671875, 0.030517578125, 0.036956787109375, 0.006504058837890625, -0.0191497802734375, -0.051300048828125, -0.0007481575012207031, 0.025390625, -0.02044677734375, 0.002704620361328125, 0.0419921875, -0.0063018798828125, 0.0372314453125, 0.026580810546875, 0.01141357421875, -0.00470733642578125, -0.002666473388671875, 0.03515625, -0.076171875, -0.05853271484375, -0.057159423828125, 0.04742431640625, -0.026458740234375, -0.042572021484375, 0.04754638671875, 0.067138671875, 0.04425048828125, -0.031524658203125, 0.06256103515625, -0.00899505615234375, 0.03302001953125, -0.045989990234375, 0.040069580078125, -0.0287628173828125, -0.0244293212890625, -0.0228729248046875, -0.07659912109375, -0.04876708984375, 0.072509765625, -0.006481170654296875, -0.0020198822021484375, 0.08819580078125, 0.03277587890625, 0.00122833251953125, -0.0213165283203125, 0.035400390625, 0.04010009765625, 0.0093231201171875, 0.06976318359375, 0.0236358642578125, -0.022430419921875, 0.0650634765625, -0.01047515869140625, -0.01702880859375, -0.0182952880859375, -0.060150146484375, -0.0528564453125, -0.0399169921875, -0.01515960693359375, -0.0309600830078125, -0.0016374588012695312, 0.057769775390625, 0.0297393798828125, -0.06927490234375, -0.03155517578125, 0.0214996337890625, -0.009918212890625, -0.0193939208984375, -0.02081298828125, 0.0347900390625, -0.01020050048828125, -0.053497314453125, 0.0274200439453125, 0.0184326171875, 0.00115203857421875, 0.01222991943359375, -0.016326904296875, -0.039520263671875, -0.005893707275390625, 0.052276611328125, 
0.0235137939453125, -0.0255279541015625, -0.041839599609375, -0.0036792755126953125, -0.009552001953125, 0.03875732421875, 0.036407470703125, -0.023590087890625, 0.022430419921875, 0.0232086181640625, -0.005817413330078125, 0.044769287109375, -0.006656646728515625, 0.02935791015625, -0.050872802734375, 0.03900146484375, -0.00797271728515625, 0.056915283203125, 0.021270751953125, -0.0186767578125, 0.039764404296875, 0.033050537109375, -0.049407958984375, -0.035247802734375, 0.01221466064453125, -0.09716796875, -0.0091400146484375, 0.09600830078125, -0.02288818359375, -0.02752685546875, -0.0119171142578125, -0.02264404296875, 0.01043701171875, -0.06201171875, 0.059478759765625, 0.07025146484375, 0.0005326271057128906, 0.00787353515625, -0.035400390625, 0.01071929931640625, 0.0306854248046875, -0.03778076171875, -0.00746917724609375, 0.049896240234375, 0.048675537109375, 0.0357666015625, 0.059112548828125, -0.029449462890625, 0.0210723876953125, -0.01116180419921875, 0.054473876953125, -0.0027675628662109375, -0.027618408203125, -0.01548004150390625, 0.0099945068359375, 0.003765106201171875, -0.003116607666015625 ] ]
juierror/flan-t5-text2sql-with-schema
2023-07-31T15:15:31.000Z
[ "transformers", "pytorch", "t5", "text2text-generation", "en", "dataset:wikisql", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
text2text-generation
juierror
null
null
juierror/flan-t5-text2sql-with-schema
28
155,736
transformers
2023-01-26T06:59:13
---
language: en
datasets:
- wikisql
widget:
- text: 'question: get people name with age equal 25 table: id, name, age'
license: apache-2.0
---

There is an upgraded version that supports multiple tables and the "<" sign [here](https://huggingface.co/juierror/flan-t5-text2sql-with-schema-v2).

# How to use

```python
from typing import List

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("juierror/flan-t5-text2sql-with-schema")
model = AutoModelForSeq2SeqLM.from_pretrained("juierror/flan-t5-text2sql-with-schema")

def prepare_input(question: str, table: List[str]):
    table_prefix = "table:"
    question_prefix = "question:"
    join_table = ",".join(table)
    inputs = f"{question_prefix} {question} {table_prefix} {join_table}"
    input_ids = tokenizer(inputs, max_length=512, return_tensors="pt").input_ids
    return input_ids

def inference(question: str, table: List[str]) -> str:
    input_data = prepare_input(question=question, table=table)
    input_data = input_data.to(model.device)
    outputs = model.generate(inputs=input_data, num_beams=10, top_k=10, max_length=700)
    result = tokenizer.decode(token_ids=outputs[0], skip_special_tokens=True)
    return result

print(inference(question="get people name with age equal 25", table=["id", "name", "age"]))
```
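The prompt layout that `prepare_input` feeds to the model can be checked without loading any weights; this standalone sketch reproduces only the string-construction step (the `build_prompt` name is ours, not part of the published API):

```python
from typing import List


def build_prompt(question: str, table: List[str]) -> str:
    # Mirrors the string layout used in prepare_input above:
    # "question: <question> table: <col1>,<col2>,..."
    return f"question: {question} table: {','.join(table)}"


prompt = build_prompt("get people name with age equal 25", ["id", "name", "age"])
print(prompt)  # question: get people name with age equal 25 table: id,name,age
```

Keeping this format exactly (including the single spaces around the prefixes) matters, since the model was finetuned on prompts of this shape.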
1,345
[ [ -0.0087432861328125, -0.04620361328125, 0.0187835693359375, 0.023956298828125, -0.017120361328125, -0.01520538330078125, -0.004085540771484375, -0.0182342529296875, 0.01184844970703125, 0.02789306640625, -0.0279998779296875, -0.0294952392578125, -0.03631591796875, 0.0237274169921875, -0.0311126708984375, 0.0709228515625, -0.00350189208984375, 0.015869140625, 0.005977630615234375, -0.0283660888671875, -0.00994873046875, -0.04296875, -0.0543212890625, -0.007419586181640625, 0.035858154296875, 0.03375244140625, 0.059967041015625, 0.0227508544921875, 0.03155517578125, 0.0235748291015625, 0.00591278076171875, 0.0244140625, 0.0091705322265625, 0.01708984375, -0.01080322265625, -0.0308837890625, -0.042510986328125, -0.0262603759765625, 0.049652099609375, 0.0352783203125, -0.0032501220703125, 0.0026035308837890625, -0.022796630859375, 0.034332275390625, -0.035125732421875, 0.022552490234375, -0.04901123046875, 0.0177154541015625, -0.0213165283203125, 0.006076812744140625, -0.029541015625, -0.033966064453125, -0.00984954833984375, -0.050689697265625, -0.0164947509765625, 0.01776123046875, 0.08544921875, 0.0146636962890625, -0.07183837890625, -0.0082855224609375, -0.04595947265625, 0.04034423828125, -0.06463623046875, 0.031707763671875, 0.0279998779296875, 0.0482177734375, -0.02313232421875, -0.0732421875, -0.068359375, -0.01261138916015625, -0.04638671875, -0.00350189208984375, 0.0125885009765625, 0.0174713134765625, 0.04583740234375, 0.03448486328125, -0.04736328125, -0.0111083984375, -0.042755126953125, -0.030242919921875, 0.054779052734375, 0.01395416259765625, 0.0223236083984375, -0.013153076171875, -0.0308074951171875, -0.005672454833984375, -0.0249481201171875, -0.0039825439453125, 0.008331298828125, 0.0242462158203125, -0.0077667236328125, 0.05902099609375, -0.03399658203125, 0.048492431640625, -0.01442718505859375, 0.00681304931640625, 0.02960205078125, -0.043212890625, -0.016448974609375, -0.0161895751953125, 0.0845947265625, 0.01122283935546875, 
0.023101806640625, 0.00363922119140625, -0.00504302978515625, 0.0056304931640625, -0.0164337158203125, -0.048614501953125, -0.022705078125, 0.038238525390625, -0.035003662109375, -0.022979736328125, 0.021148681640625, -0.07391357421875, -0.04559326171875, 0.0192413330078125, 0.047119140625, -0.0296630859375, -0.04803466796875, -0.00392913818359375, -0.041473388671875, 0.0249481201171875, 0.0266876220703125, -0.08624267578125, 0.039306640625, 0.030029296875, 0.053314208984375, -0.0008616447448730469, -0.02813720703125, -0.039764404296875, 0.012176513671875, -0.0125732421875, 0.045257568359375, -0.017608642578125, -0.0170745849609375, 0.02001953125, 0.0005402565002441406, -0.01702880859375, -0.028076171875, 0.0457763671875, -0.00988006591796875, 0.0400390625, -0.0261993408203125, -0.043060302734375, -0.0029582977294921875, -0.004138946533203125, -0.048370361328125, 0.10223388671875, 0.046539306640625, -0.052825927734375, 0.044708251953125, -0.057098388671875, -0.06707763671875, 0.00495147705078125, -0.01885986328125, -0.047332763671875, -0.002193450927734375, 0.01271820068359375, 0.0207672119140625, -0.00249481201171875, 0.0038776397705078125, -0.038909912109375, -0.0198974609375, 0.0290374755859375, -0.0005121231079101562, 0.05377197265625, 0.01776123046875, -0.03338623046875, 0.022979736328125, -0.065185546875, 0.0020961761474609375, 0.007659912109375, -0.0297698974609375, -0.00281524658203125, 0.0012950897216796875, 0.0225677490234375, 0.04119873046875, 0.016510009765625, -0.035919189453125, 0.03692626953125, -0.0309906005859375, 0.057891845703125, 0.0090789794921875, 0.00827789306640625, 0.03082275390625, -0.0248260498046875, 0.01534271240234375, 0.0239410400390625, 0.006805419921875, -0.003063201904296875, -0.0281982421875, -0.0841064453125, -0.00894927978515625, 0.012176513671875, 0.05078125, -0.06439208984375, 0.040313720703125, -0.007457733154296875, -0.03753662109375, -0.015655517578125, -0.0225982666015625, 0.004138946533203125, 0.0216064453125, 
0.053131103515625, 0.01215362548828125, -0.0628662109375, -0.05810546875, -0.037139892578125, -0.0187530517578125, -0.0016117095947265625, 0.0178985595703125, 0.059539794921875, -0.00028634071350097656, 0.04473876953125, -0.0209197998046875, -0.04656982421875, -0.02252197265625, 0.0026454925537109375, 0.02801513671875, 0.05657958984375, 0.045135498046875, -0.0416259765625, -0.040008544921875, -0.0017080307006835938, -0.040496826171875, 0.019500732421875, -0.0272369384765625, -0.03460693359375, 0.0272369384765625, 0.0017547607421875, -0.06854248046875, 0.040618896484375, 0.0004870891571044922, -0.061614990234375, 0.040924072265625, -0.0029582977294921875, 0.0162353515625, -0.103515625, -0.01163482666015625, -0.00923919677734375, -0.025604248046875, -0.055633544921875, 0.006893157958984375, 0.01885986328125, 0.010345458984375, -0.0360107421875, 0.049163818359375, -0.03472900390625, -0.01605224609375, 0.0110931396484375, -0.005542755126953125, 0.01061248779296875, 0.0164337158203125, 0.018524169921875, 0.048126220703125, 0.04852294921875, -0.050445556640625, 0.040008544921875, 0.03265380859375, -0.00518035888671875, 0.03369140625, -0.040252685546875, -0.00737762451171875, 0.0155487060546875, 0.037017822265625, -0.085693359375, -0.0297393798828125, 0.037811279296875, -0.03271484375, 0.0131072998046875, -0.0028247833251953125, -0.0394287109375, -0.0533447265625, -0.0233154296875, 0.003017425537109375, 0.0292816162109375, -0.0294647216796875, 0.0838623046875, 0.005237579345703125, 0.0101470947265625, -0.037109375, -0.10211181640625, -0.006412506103515625, -0.033935546875, -0.055816650390625, 0.0204925537109375, 0.01427459716796875, -0.00626373291015625, -0.026214599609375, -0.01488494873046875, -0.031524658203125, -0.00434112548828125, 0.00545501708984375, 0.01425933837890625, -0.0135650634765625, 0.005641937255859375, 0.033538818359375, -0.007801055908203125, -0.00018548965454101562, -0.0362548828125, 0.04736328125, -0.00707244873046875, 0.0163421630859375, 
-0.05224609375, 0.016448974609375, 0.052001953125, -0.0322265625, 0.06878662109375, 0.0631103515625, -0.06048583984375, -0.0116119384765625, -0.041534423828125, -0.0191650390625, -0.03436279296875, 0.039764404296875, -0.06329345703125, -0.057464599609375, 0.06597900390625, 0.01446533203125, 0.019683837890625, 0.0606689453125, 0.040008544921875, -0.010711669921875, 0.062744140625, -0.003368377685546875, 0.0224151611328125, 0.0210418701171875, -0.007747650146484375, 0.00978851318359375, -0.04144287109375, -0.020111083984375, -0.041473388671875, 0.00007992982864379883, -0.0308685302734375, -0.0106658935546875, 0.020416259765625, 0.0014553070068359375, -0.030914306640625, 0.04400634765625, -0.0487060546875, 0.01605224609375, 0.044830322265625, 0.0038509368896484375, 0.0177764892578125, 0.001171112060546875, -0.002063751220703125, -0.0034046173095703125, -0.027313232421875, -0.013458251953125, 0.072265625, 0.019622802734375, 0.04205322265625, 0.0257415771484375, 0.0614013671875, 0.008056640625, 0.00455474853515625, -0.054962158203125, 0.0318603515625, -0.007389068603515625, -0.07318115234375, -0.026580810546875, -0.0374755859375, -0.03497314453125, 0.0478515625, -0.0208282470703125, -0.06195068359375, 0.0256195068359375, 0.0139007568359375, -0.037353515625, 0.0135650634765625, -0.033203125, 0.0765380859375, -0.0193328857421875, -0.018157958984375, -0.002887725830078125, -0.0413818359375, 0.044281005859375, 0.0181732177734375, 0.01375579833984375, -0.0130462646484375, -0.0070648193359375, 0.0638427734375, -0.05560302734375, 0.0226287841796875, -0.0274810791015625, 0.00811767578125, 0.0101470947265625, 0.015838623046875, 0.036834716796875, 0.041473388671875, -0.032196044921875, 0.006275177001953125, 0.047943115234375, -0.00331878662109375, -0.0452880859375, 0.0379638671875, -0.045867919921875, -0.0237884521484375, -0.058074951171875, -0.0267791748046875, -0.0014743804931640625, 0.04351806640625, 0.0189361572265625, 0.032073974609375, 0.0170135498046875, 
] ]
thenlper/gte-small
2023-10-11T01:00:04.000Z
[ "sentence-transformers", "pytorch", "safetensors", "bert", "mteb", "sentence-similarity", "Sentence Transformers", "en", "arxiv:2308.03281", "license:mit", "model-index", "endpoints_compatible", "has_space", "region:us" ]
sentence-similarity
thenlper
null
null
thenlper/gte-small
54
154,697
sentence-transformers
2023-07-27T10:14:55
--- tags: - mteb - sentence-similarity - sentence-transformers - Sentence Transformers model-index: - name: gte-small results: - task: type: Classification dataset: type: mteb/amazon_counterfactual name: MTEB AmazonCounterfactualClassification (en) config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 73.22388059701493 - type: ap value: 36.09895941426988 - type: f1 value: 67.3205651539195 - task: type: Classification dataset: type: mteb/amazon_polarity name: MTEB AmazonPolarityClassification config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 91.81894999999999 - type: ap value: 88.5240138417305 - type: f1 value: 91.80367382706962 - task: type: Classification dataset: type: mteb/amazon_reviews_multi name: MTEB AmazonReviewsClassification (en) config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 48.032 - type: f1 value: 47.4490665674719 - task: type: Retrieval dataset: type: arguana name: MTEB ArguAna config: default split: test revision: None metrics: - type: map_at_1 value: 30.725 - type: map_at_10 value: 46.604 - type: map_at_100 value: 47.535 - type: map_at_1000 value: 47.538000000000004 - type: map_at_3 value: 41.833 - type: map_at_5 value: 44.61 - type: mrr_at_1 value: 31.223 - type: mrr_at_10 value: 46.794000000000004 - type: mrr_at_100 value: 47.725 - type: mrr_at_1000 value: 47.727000000000004 - type: mrr_at_3 value: 42.07 - type: mrr_at_5 value: 44.812000000000005 - type: ndcg_at_1 value: 30.725 - type: ndcg_at_10 value: 55.440999999999995 - type: ndcg_at_100 value: 59.134 - type: ndcg_at_1000 value: 59.199 - type: ndcg_at_3 value: 45.599000000000004 - type: ndcg_at_5 value: 50.637 - type: precision_at_1 value: 30.725 - type: precision_at_10 value: 8.364 - type: precision_at_100 value: 0.991 - type: precision_at_1000 value: 0.1 - type: precision_at_3 value: 18.848000000000003 - type: 
precision_at_5 value: 13.77 - type: recall_at_1 value: 30.725 - type: recall_at_10 value: 83.64200000000001 - type: recall_at_100 value: 99.14699999999999 - type: recall_at_1000 value: 99.644 - type: recall_at_3 value: 56.543 - type: recall_at_5 value: 68.848 - task: type: Clustering dataset: type: mteb/arxiv-clustering-p2p name: MTEB ArxivClusteringP2P config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: v_measure value: 47.90178078197678 - task: type: Clustering dataset: type: mteb/arxiv-clustering-s2s name: MTEB ArxivClusteringS2S config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: v_measure value: 40.25728393431922 - task: type: Reranking dataset: type: mteb/askubuntudupquestions-reranking name: MTEB AskUbuntuDupQuestions config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: map value: 61.720297062897764 - type: mrr value: 75.24139295607439 - task: type: STS dataset: type: mteb/biosses-sts name: MTEB BIOSSES config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cos_sim_pearson value: 89.43527309184616 - type: cos_sim_spearman value: 88.17128615100206 - type: euclidean_pearson value: 87.89922623089282 - type: euclidean_spearman value: 87.96104039655451 - type: manhattan_pearson value: 87.9818290932077 - type: manhattan_spearman value: 88.00923426576885 - task: type: Classification dataset: type: mteb/banking77 name: MTEB Banking77Classification config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 84.0844155844156 - type: f1 value: 84.01485017302213 - task: type: Clustering dataset: type: mteb/biorxiv-clustering-p2p name: MTEB BiorxivClusteringP2P config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: v_measure value: 38.36574769259432 - task: type: Clustering dataset: type: 
mteb/biorxiv-clustering-s2s name: MTEB BiorxivClusteringS2S config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: v_measure value: 35.4857033165287 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackAndroidRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 30.261 - type: map_at_10 value: 42.419000000000004 - type: map_at_100 value: 43.927 - type: map_at_1000 value: 44.055 - type: map_at_3 value: 38.597 - type: map_at_5 value: 40.701 - type: mrr_at_1 value: 36.91 - type: mrr_at_10 value: 48.02 - type: mrr_at_100 value: 48.658 - type: mrr_at_1000 value: 48.708 - type: mrr_at_3 value: 44.945 - type: mrr_at_5 value: 46.705000000000005 - type: ndcg_at_1 value: 36.91 - type: ndcg_at_10 value: 49.353 - type: ndcg_at_100 value: 54.456 - type: ndcg_at_1000 value: 56.363 - type: ndcg_at_3 value: 43.483 - type: ndcg_at_5 value: 46.150999999999996 - type: precision_at_1 value: 36.91 - type: precision_at_10 value: 9.700000000000001 - type: precision_at_100 value: 1.557 - type: precision_at_1000 value: 0.202 - type: precision_at_3 value: 21.078 - type: precision_at_5 value: 15.421999999999999 - type: recall_at_1 value: 30.261 - type: recall_at_10 value: 63.242 - type: recall_at_100 value: 84.09100000000001 - type: recall_at_1000 value: 96.143 - type: recall_at_3 value: 46.478 - type: recall_at_5 value: 53.708 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackEnglishRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 31.145 - type: map_at_10 value: 40.996 - type: map_at_100 value: 42.266999999999996 - type: map_at_1000 value: 42.397 - type: map_at_3 value: 38.005 - type: map_at_5 value: 39.628 - type: mrr_at_1 value: 38.344 - type: mrr_at_10 value: 46.827000000000005 - type: mrr_at_100 value: 47.446 - type: mrr_at_1000 value: 47.489 - type: mrr_at_3 value: 44.448 - type: mrr_at_5 value: 45.747 - type: ndcg_at_1 
value: 38.344 - type: ndcg_at_10 value: 46.733000000000004 - type: ndcg_at_100 value: 51.103 - type: ndcg_at_1000 value: 53.075 - type: ndcg_at_3 value: 42.366 - type: ndcg_at_5 value: 44.242 - type: precision_at_1 value: 38.344 - type: precision_at_10 value: 8.822000000000001 - type: precision_at_100 value: 1.417 - type: precision_at_1000 value: 0.187 - type: precision_at_3 value: 20.403 - type: precision_at_5 value: 14.306 - type: recall_at_1 value: 31.145 - type: recall_at_10 value: 56.909 - type: recall_at_100 value: 75.274 - type: recall_at_1000 value: 87.629 - type: recall_at_3 value: 43.784 - type: recall_at_5 value: 49.338 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackGamingRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 38.83 - type: map_at_10 value: 51.553000000000004 - type: map_at_100 value: 52.581 - type: map_at_1000 value: 52.638 - type: map_at_3 value: 48.112 - type: map_at_5 value: 50.095 - type: mrr_at_1 value: 44.513999999999996 - type: mrr_at_10 value: 54.998000000000005 - type: mrr_at_100 value: 55.650999999999996 - type: mrr_at_1000 value: 55.679 - type: mrr_at_3 value: 52.602000000000004 - type: mrr_at_5 value: 53.931 - type: ndcg_at_1 value: 44.513999999999996 - type: ndcg_at_10 value: 57.67400000000001 - type: ndcg_at_100 value: 61.663999999999994 - type: ndcg_at_1000 value: 62.743 - type: ndcg_at_3 value: 51.964 - type: ndcg_at_5 value: 54.773 - type: precision_at_1 value: 44.513999999999996 - type: precision_at_10 value: 9.423 - type: precision_at_100 value: 1.2309999999999999 - type: precision_at_1000 value: 0.13699999999999998 - type: precision_at_3 value: 23.323 - type: precision_at_5 value: 16.163 - type: recall_at_1 value: 38.83 - type: recall_at_10 value: 72.327 - type: recall_at_100 value: 89.519 - type: recall_at_1000 value: 97.041 - type: recall_at_3 value: 57.206 - type: recall_at_5 value: 63.88399999999999 - task: type: Retrieval dataset: type: 
BeIR/cqadupstack name: MTEB CQADupstackGisRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 25.484 - type: map_at_10 value: 34.527 - type: map_at_100 value: 35.661 - type: map_at_1000 value: 35.739 - type: map_at_3 value: 32.199 - type: map_at_5 value: 33.632 - type: mrr_at_1 value: 27.458 - type: mrr_at_10 value: 36.543 - type: mrr_at_100 value: 37.482 - type: mrr_at_1000 value: 37.543 - type: mrr_at_3 value: 34.256 - type: mrr_at_5 value: 35.618 - type: ndcg_at_1 value: 27.458 - type: ndcg_at_10 value: 39.396 - type: ndcg_at_100 value: 44.742 - type: ndcg_at_1000 value: 46.708 - type: ndcg_at_3 value: 34.817 - type: ndcg_at_5 value: 37.247 - type: precision_at_1 value: 27.458 - type: precision_at_10 value: 5.976999999999999 - type: precision_at_100 value: 0.907 - type: precision_at_1000 value: 0.11100000000000002 - type: precision_at_3 value: 14.878 - type: precision_at_5 value: 10.35 - type: recall_at_1 value: 25.484 - type: recall_at_10 value: 52.317 - type: recall_at_100 value: 76.701 - type: recall_at_1000 value: 91.408 - type: recall_at_3 value: 40.043 - type: recall_at_5 value: 45.879 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackMathematicaRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 16.719 - type: map_at_10 value: 25.269000000000002 - type: map_at_100 value: 26.442 - type: map_at_1000 value: 26.557 - type: map_at_3 value: 22.56 - type: map_at_5 value: 24.082 - type: mrr_at_1 value: 20.896 - type: mrr_at_10 value: 29.982999999999997 - type: mrr_at_100 value: 30.895 - type: mrr_at_1000 value: 30.961 - type: mrr_at_3 value: 27.239 - type: mrr_at_5 value: 28.787000000000003 - type: ndcg_at_1 value: 20.896 - type: ndcg_at_10 value: 30.814000000000004 - type: ndcg_at_100 value: 36.418 - type: ndcg_at_1000 value: 39.182 - type: ndcg_at_3 value: 25.807999999999996 - type: ndcg_at_5 value: 28.143 - type: precision_at_1 value: 20.896 - type: precision_at_10 
value: 5.821 - type: precision_at_100 value: 0.991 - type: precision_at_1000 value: 0.136 - type: precision_at_3 value: 12.562000000000001 - type: precision_at_5 value: 9.254 - type: recall_at_1 value: 16.719 - type: recall_at_10 value: 43.155 - type: recall_at_100 value: 67.831 - type: recall_at_1000 value: 87.617 - type: recall_at_3 value: 29.259 - type: recall_at_5 value: 35.260999999999996 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackPhysicsRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 29.398999999999997 - type: map_at_10 value: 39.876 - type: map_at_100 value: 41.205999999999996 - type: map_at_1000 value: 41.321999999999996 - type: map_at_3 value: 36.588 - type: map_at_5 value: 38.538 - type: mrr_at_1 value: 35.9 - type: mrr_at_10 value: 45.528 - type: mrr_at_100 value: 46.343 - type: mrr_at_1000 value: 46.388 - type: mrr_at_3 value: 42.862 - type: mrr_at_5 value: 44.440000000000005 - type: ndcg_at_1 value: 35.9 - type: ndcg_at_10 value: 45.987 - type: ndcg_at_100 value: 51.370000000000005 - type: ndcg_at_1000 value: 53.400000000000006 - type: ndcg_at_3 value: 40.841 - type: ndcg_at_5 value: 43.447 - type: precision_at_1 value: 35.9 - type: precision_at_10 value: 8.393 - type: precision_at_100 value: 1.283 - type: precision_at_1000 value: 0.166 - type: precision_at_3 value: 19.538 - type: precision_at_5 value: 13.975000000000001 - type: recall_at_1 value: 29.398999999999997 - type: recall_at_10 value: 58.361 - type: recall_at_100 value: 81.081 - type: recall_at_1000 value: 94.004 - type: recall_at_3 value: 43.657000000000004 - type: recall_at_5 value: 50.519999999999996 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackProgrammersRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 21.589 - type: map_at_10 value: 31.608999999999998 - type: map_at_100 value: 33.128 - type: map_at_1000 value: 33.247 - type: map_at_3 value: 
28.671999999999997 - type: map_at_5 value: 30.233999999999998 - type: mrr_at_1 value: 26.712000000000003 - type: mrr_at_10 value: 36.713 - type: mrr_at_100 value: 37.713 - type: mrr_at_1000 value: 37.771 - type: mrr_at_3 value: 34.075 - type: mrr_at_5 value: 35.451 - type: ndcg_at_1 value: 26.712000000000003 - type: ndcg_at_10 value: 37.519999999999996 - type: ndcg_at_100 value: 43.946000000000005 - type: ndcg_at_1000 value: 46.297 - type: ndcg_at_3 value: 32.551 - type: ndcg_at_5 value: 34.660999999999994 - type: precision_at_1 value: 26.712000000000003 - type: precision_at_10 value: 7.066 - type: precision_at_100 value: 1.216 - type: precision_at_1000 value: 0.157 - type: precision_at_3 value: 15.906 - type: precision_at_5 value: 11.437999999999999 - type: recall_at_1 value: 21.589 - type: recall_at_10 value: 50.090999999999994 - type: recall_at_100 value: 77.43900000000001 - type: recall_at_1000 value: 93.35900000000001 - type: recall_at_3 value: 36.028999999999996 - type: recall_at_5 value: 41.698 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 25.121666666666663 - type: map_at_10 value: 34.46258333333334 - type: map_at_100 value: 35.710499999999996 - type: map_at_1000 value: 35.82691666666666 - type: map_at_3 value: 31.563249999999996 - type: map_at_5 value: 33.189750000000004 - type: mrr_at_1 value: 29.66441666666667 - type: mrr_at_10 value: 38.5455 - type: mrr_at_100 value: 39.39566666666667 - type: mrr_at_1000 value: 39.45325 - type: mrr_at_3 value: 36.003333333333345 - type: mrr_at_5 value: 37.440916666666666 - type: ndcg_at_1 value: 29.66441666666667 - type: ndcg_at_10 value: 39.978416666666675 - type: ndcg_at_100 value: 45.278666666666666 - type: ndcg_at_1000 value: 47.52275 - type: ndcg_at_3 value: 35.00058333333334 - type: ndcg_at_5 value: 37.34908333333333 - type: precision_at_1 value: 29.66441666666667 - type: precision_at_10 value: 
7.094500000000001 - type: precision_at_100 value: 1.1523333333333332 - type: precision_at_1000 value: 0.15358333333333332 - type: precision_at_3 value: 16.184166666666663 - type: precision_at_5 value: 11.6005 - type: recall_at_1 value: 25.121666666666663 - type: recall_at_10 value: 52.23975000000001 - type: recall_at_100 value: 75.48408333333333 - type: recall_at_1000 value: 90.95316666666668 - type: recall_at_3 value: 38.38458333333333 - type: recall_at_5 value: 44.39933333333333 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackStatsRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 23.569000000000003 - type: map_at_10 value: 30.389 - type: map_at_100 value: 31.396 - type: map_at_1000 value: 31.493 - type: map_at_3 value: 28.276 - type: map_at_5 value: 29.459000000000003 - type: mrr_at_1 value: 26.534000000000002 - type: mrr_at_10 value: 33.217999999999996 - type: mrr_at_100 value: 34.054 - type: mrr_at_1000 value: 34.12 - type: mrr_at_3 value: 31.058000000000003 - type: mrr_at_5 value: 32.330999999999996 - type: ndcg_at_1 value: 26.534000000000002 - type: ndcg_at_10 value: 34.608 - type: ndcg_at_100 value: 39.391999999999996 - type: ndcg_at_1000 value: 41.837999999999994 - type: ndcg_at_3 value: 30.564999999999998 - type: ndcg_at_5 value: 32.509 - type: precision_at_1 value: 26.534000000000002 - type: precision_at_10 value: 5.414 - type: precision_at_100 value: 0.847 - type: precision_at_1000 value: 0.11399999999999999 - type: precision_at_3 value: 12.986 - type: precision_at_5 value: 9.202 - type: recall_at_1 value: 23.569000000000003 - type: recall_at_10 value: 44.896 - type: recall_at_100 value: 66.476 - type: recall_at_1000 value: 84.548 - type: recall_at_3 value: 33.79 - type: recall_at_5 value: 38.512 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackTexRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 16.36 - type: map_at_10 
value: 23.57 - type: map_at_100 value: 24.698999999999998 - type: map_at_1000 value: 24.834999999999997 - type: map_at_3 value: 21.093 - type: map_at_5 value: 22.418 - type: mrr_at_1 value: 19.718 - type: mrr_at_10 value: 27.139999999999997 - type: mrr_at_100 value: 28.097 - type: mrr_at_1000 value: 28.177999999999997 - type: mrr_at_3 value: 24.805 - type: mrr_at_5 value: 26.121 - type: ndcg_at_1 value: 19.718 - type: ndcg_at_10 value: 28.238999999999997 - type: ndcg_at_100 value: 33.663 - type: ndcg_at_1000 value: 36.763 - type: ndcg_at_3 value: 23.747 - type: ndcg_at_5 value: 25.796000000000003 - type: precision_at_1 value: 19.718 - type: precision_at_10 value: 5.282 - type: precision_at_100 value: 0.9390000000000001 - type: precision_at_1000 value: 0.13899999999999998 - type: precision_at_3 value: 11.264000000000001 - type: precision_at_5 value: 8.341 - type: recall_at_1 value: 16.36 - type: recall_at_10 value: 38.669 - type: recall_at_100 value: 63.184 - type: recall_at_1000 value: 85.33800000000001 - type: recall_at_3 value: 26.214 - type: recall_at_5 value: 31.423000000000002 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackUnixRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 25.618999999999996 - type: map_at_10 value: 34.361999999999995 - type: map_at_100 value: 35.534 - type: map_at_1000 value: 35.634 - type: map_at_3 value: 31.402 - type: map_at_5 value: 32.815 - type: mrr_at_1 value: 30.037000000000003 - type: mrr_at_10 value: 38.284 - type: mrr_at_100 value: 39.141999999999996 - type: mrr_at_1000 value: 39.2 - type: mrr_at_3 value: 35.603 - type: mrr_at_5 value: 36.867 - type: ndcg_at_1 value: 30.037000000000003 - type: ndcg_at_10 value: 39.87 - type: ndcg_at_100 value: 45.243 - type: ndcg_at_1000 value: 47.507 - type: ndcg_at_3 value: 34.371 - type: ndcg_at_5 value: 36.521 - type: precision_at_1 value: 30.037000000000003 - type: precision_at_10 value: 6.819 - type: precision_at_100 value: 
1.0699999999999998 - type: precision_at_1000 value: 0.13699999999999998 - type: precision_at_3 value: 15.392 - type: precision_at_5 value: 10.821 - type: recall_at_1 value: 25.618999999999996 - type: recall_at_10 value: 52.869 - type: recall_at_100 value: 76.395 - type: recall_at_1000 value: 92.19500000000001 - type: recall_at_3 value: 37.943 - type: recall_at_5 value: 43.342999999999996 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackWebmastersRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 23.283 - type: map_at_10 value: 32.155 - type: map_at_100 value: 33.724 - type: map_at_1000 value: 33.939 - type: map_at_3 value: 29.018 - type: map_at_5 value: 30.864000000000004 - type: mrr_at_1 value: 28.063 - type: mrr_at_10 value: 36.632 - type: mrr_at_100 value: 37.606 - type: mrr_at_1000 value: 37.671 - type: mrr_at_3 value: 33.992 - type: mrr_at_5 value: 35.613 - type: ndcg_at_1 value: 28.063 - type: ndcg_at_10 value: 38.024 - type: ndcg_at_100 value: 44.292 - type: ndcg_at_1000 value: 46.818 - type: ndcg_at_3 value: 32.965 - type: ndcg_at_5 value: 35.562 - type: precision_at_1 value: 28.063 - type: precision_at_10 value: 7.352 - type: precision_at_100 value: 1.514 - type: precision_at_1000 value: 0.23800000000000002 - type: precision_at_3 value: 15.481 - type: precision_at_5 value: 11.542 - type: recall_at_1 value: 23.283 - type: recall_at_10 value: 49.756 - type: recall_at_100 value: 78.05 - type: recall_at_1000 value: 93.854 - type: recall_at_3 value: 35.408 - type: recall_at_5 value: 42.187000000000005 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackWordpressRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 19.201999999999998 - type: map_at_10 value: 26.826 - type: map_at_100 value: 27.961000000000002 - type: map_at_1000 value: 28.066999999999997 - type: map_at_3 value: 24.237000000000002 - type: map_at_5 value: 25.811 - type: 
mrr_at_1 value: 20.887 - type: mrr_at_10 value: 28.660000000000004 - type: mrr_at_100 value: 29.660999999999998 - type: mrr_at_1000 value: 29.731 - type: mrr_at_3 value: 26.155 - type: mrr_at_5 value: 27.68 - type: ndcg_at_1 value: 20.887 - type: ndcg_at_10 value: 31.523 - type: ndcg_at_100 value: 37.055 - type: ndcg_at_1000 value: 39.579 - type: ndcg_at_3 value: 26.529000000000003 - type: ndcg_at_5 value: 29.137 - type: precision_at_1 value: 20.887 - type: precision_at_10 value: 5.065 - type: precision_at_100 value: 0.856 - type: precision_at_1000 value: 0.11900000000000001 - type: precision_at_3 value: 11.399 - type: precision_at_5 value: 8.392 - type: recall_at_1 value: 19.201999999999998 - type: recall_at_10 value: 44.285000000000004 - type: recall_at_100 value: 69.768 - type: recall_at_1000 value: 88.302 - type: recall_at_3 value: 30.804 - type: recall_at_5 value: 37.039 - task: type: Retrieval dataset: type: climate-fever name: MTEB ClimateFEVER config: default split: test revision: None metrics: - type: map_at_1 value: 11.244 - type: map_at_10 value: 18.956 - type: map_at_100 value: 20.674 - type: map_at_1000 value: 20.863 - type: map_at_3 value: 15.923000000000002 - type: map_at_5 value: 17.518 - type: mrr_at_1 value: 25.080999999999996 - type: mrr_at_10 value: 35.94 - type: mrr_at_100 value: 36.969 - type: mrr_at_1000 value: 37.013 - type: mrr_at_3 value: 32.617000000000004 - type: mrr_at_5 value: 34.682 - type: ndcg_at_1 value: 25.080999999999996 - type: ndcg_at_10 value: 26.539 - type: ndcg_at_100 value: 33.601 - type: ndcg_at_1000 value: 37.203 - type: ndcg_at_3 value: 21.695999999999998 - type: ndcg_at_5 value: 23.567 - type: precision_at_1 value: 25.080999999999996 - type: precision_at_10 value: 8.143 - type: precision_at_100 value: 1.5650000000000002 - type: precision_at_1000 value: 0.22300000000000003 - type: precision_at_3 value: 15.983 - type: precision_at_5 value: 12.417 - type: recall_at_1 value: 11.244 - type: recall_at_10 value: 31.457 - type: 
recall_at_100 value: 55.92 - type: recall_at_1000 value: 76.372 - type: recall_at_3 value: 19.784 - type: recall_at_5 value: 24.857000000000003 - task: type: Retrieval dataset: type: dbpedia-entity name: MTEB DBPedia config: default split: test revision: None metrics: - type: map_at_1 value: 8.595 - type: map_at_10 value: 18.75 - type: map_at_100 value: 26.354 - type: map_at_1000 value: 27.912 - type: map_at_3 value: 13.794 - type: map_at_5 value: 16.021 - type: mrr_at_1 value: 65.75 - type: mrr_at_10 value: 73.837 - type: mrr_at_100 value: 74.22800000000001 - type: mrr_at_1000 value: 74.234 - type: mrr_at_3 value: 72.5 - type: mrr_at_5 value: 73.387 - type: ndcg_at_1 value: 52.625 - type: ndcg_at_10 value: 39.101 - type: ndcg_at_100 value: 43.836000000000006 - type: ndcg_at_1000 value: 51.086 - type: ndcg_at_3 value: 44.229 - type: ndcg_at_5 value: 41.555 - type: precision_at_1 value: 65.75 - type: precision_at_10 value: 30.45 - type: precision_at_100 value: 9.81 - type: precision_at_1000 value: 2.045 - type: precision_at_3 value: 48.667 - type: precision_at_5 value: 40.8 - type: recall_at_1 value: 8.595 - type: recall_at_10 value: 24.201 - type: recall_at_100 value: 50.096 - type: recall_at_1000 value: 72.677 - type: recall_at_3 value: 15.212 - type: recall_at_5 value: 18.745 - task: type: Classification dataset: type: mteb/emotion name: MTEB EmotionClassification config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 46.565 - type: f1 value: 41.49914329345582 - task: type: Retrieval dataset: type: fever name: MTEB FEVER config: default split: test revision: None metrics: - type: map_at_1 value: 66.60000000000001 - type: map_at_10 value: 76.838 - type: map_at_100 value: 77.076 - type: map_at_1000 value: 77.09 - type: map_at_3 value: 75.545 - type: map_at_5 value: 76.39 - type: mrr_at_1 value: 71.707 - type: mrr_at_10 value: 81.514 - type: mrr_at_100 value: 81.64099999999999 - type: mrr_at_1000 value: 81.645 
- type: mrr_at_3 value: 80.428 - type: mrr_at_5 value: 81.159 - type: ndcg_at_1 value: 71.707 - type: ndcg_at_10 value: 81.545 - type: ndcg_at_100 value: 82.477 - type: ndcg_at_1000 value: 82.73899999999999 - type: ndcg_at_3 value: 79.292 - type: ndcg_at_5 value: 80.599 - type: precision_at_1 value: 71.707 - type: precision_at_10 value: 10.035 - type: precision_at_100 value: 1.068 - type: precision_at_1000 value: 0.11100000000000002 - type: precision_at_3 value: 30.918 - type: precision_at_5 value: 19.328 - type: recall_at_1 value: 66.60000000000001 - type: recall_at_10 value: 91.353 - type: recall_at_100 value: 95.21 - type: recall_at_1000 value: 96.89999999999999 - type: recall_at_3 value: 85.188 - type: recall_at_5 value: 88.52 - task: type: Retrieval dataset: type: fiqa name: MTEB FiQA2018 config: default split: test revision: None metrics: - type: map_at_1 value: 19.338 - type: map_at_10 value: 31.752000000000002 - type: map_at_100 value: 33.516 - type: map_at_1000 value: 33.694 - type: map_at_3 value: 27.716 - type: map_at_5 value: 29.67 - type: mrr_at_1 value: 38.117000000000004 - type: mrr_at_10 value: 47.323 - type: mrr_at_100 value: 48.13 - type: mrr_at_1000 value: 48.161 - type: mrr_at_3 value: 45.062000000000005 - type: mrr_at_5 value: 46.358 - type: ndcg_at_1 value: 38.117000000000004 - type: ndcg_at_10 value: 39.353 - type: ndcg_at_100 value: 46.044000000000004 - type: ndcg_at_1000 value: 49.083 - type: ndcg_at_3 value: 35.891 - type: ndcg_at_5 value: 36.661 - type: precision_at_1 value: 38.117000000000004 - type: precision_at_10 value: 11.187999999999999 - type: precision_at_100 value: 1.802 - type: precision_at_1000 value: 0.234 - type: precision_at_3 value: 24.126 - type: precision_at_5 value: 17.562 - type: recall_at_1 value: 19.338 - type: recall_at_10 value: 45.735 - type: recall_at_100 value: 71.281 - type: recall_at_1000 value: 89.537 - type: recall_at_3 value: 32.525 - type: recall_at_5 value: 37.671 - task: type: Retrieval dataset: type: 
hotpotqa name: MTEB HotpotQA config: default split: test revision: None metrics: - type: map_at_1 value: 36.995 - type: map_at_10 value: 55.032000000000004 - type: map_at_100 value: 55.86 - type: map_at_1000 value: 55.932 - type: map_at_3 value: 52.125 - type: map_at_5 value: 53.884 - type: mrr_at_1 value: 73.991 - type: mrr_at_10 value: 80.096 - type: mrr_at_100 value: 80.32000000000001 - type: mrr_at_1000 value: 80.331 - type: mrr_at_3 value: 79.037 - type: mrr_at_5 value: 79.719 - type: ndcg_at_1 value: 73.991 - type: ndcg_at_10 value: 63.786 - type: ndcg_at_100 value: 66.78 - type: ndcg_at_1000 value: 68.255 - type: ndcg_at_3 value: 59.501000000000005 - type: ndcg_at_5 value: 61.82299999999999 - type: precision_at_1 value: 73.991 - type: precision_at_10 value: 13.157 - type: precision_at_100 value: 1.552 - type: precision_at_1000 value: 0.17500000000000002 - type: precision_at_3 value: 37.519999999999996 - type: precision_at_5 value: 24.351 - type: recall_at_1 value: 36.995 - type: recall_at_10 value: 65.78699999999999 - type: recall_at_100 value: 77.583 - type: recall_at_1000 value: 87.421 - type: recall_at_3 value: 56.279999999999994 - type: recall_at_5 value: 60.878 - task: type: Classification dataset: type: mteb/imdb name: MTEB ImdbClassification config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 86.80239999999999 - type: ap value: 81.97305141128378 - type: f1 value: 86.76976305549273 - task: type: Retrieval dataset: type: msmarco name: MTEB MSMARCO config: default split: dev revision: None metrics: - type: map_at_1 value: 21.166 - type: map_at_10 value: 33.396 - type: map_at_100 value: 34.588 - type: map_at_1000 value: 34.637 - type: map_at_3 value: 29.509999999999998 - type: map_at_5 value: 31.719 - type: mrr_at_1 value: 21.762 - type: mrr_at_10 value: 33.969 - type: mrr_at_100 value: 35.099000000000004 - type: mrr_at_1000 value: 35.141 - type: mrr_at_3 value: 30.148000000000003 - type: 
mrr_at_5 value: 32.324000000000005 - type: ndcg_at_1 value: 21.776999999999997 - type: ndcg_at_10 value: 40.306999999999995 - type: ndcg_at_100 value: 46.068 - type: ndcg_at_1000 value: 47.3 - type: ndcg_at_3 value: 32.416 - type: ndcg_at_5 value: 36.345 - type: precision_at_1 value: 21.776999999999997 - type: precision_at_10 value: 6.433 - type: precision_at_100 value: 0.932 - type: precision_at_1000 value: 0.104 - type: precision_at_3 value: 13.897 - type: precision_at_5 value: 10.324 - type: recall_at_1 value: 21.166 - type: recall_at_10 value: 61.587 - type: recall_at_100 value: 88.251 - type: recall_at_1000 value: 97.727 - type: recall_at_3 value: 40.196 - type: recall_at_5 value: 49.611 - task: type: Classification dataset: type: mteb/mtop_domain name: MTEB MTOPDomainClassification (en) config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 93.04605563155496 - type: f1 value: 92.78007303978372 - task: type: Classification dataset: type: mteb/mtop_intent name: MTEB MTOPIntentClassification (en) config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 69.65116279069767 - type: f1 value: 52.75775172527262 - task: type: Classification dataset: type: mteb/amazon_massive_intent name: MTEB MassiveIntentClassification (en) config: en split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 70.34633490248822 - type: f1 value: 68.15345065392562 - task: type: Classification dataset: type: mteb/amazon_massive_scenario name: MTEB MassiveScenarioClassification (en) config: en split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 75.63887020847343 - type: f1 value: 76.08074680233685 - task: type: Clustering dataset: type: mteb/medrxiv-clustering-p2p name: MTEB MedrxivClusteringP2P config: default split: test revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73 metrics: - type: v_measure value: 
33.77933406071333 - task: type: Clustering dataset: type: mteb/medrxiv-clustering-s2s name: MTEB MedrxivClusteringS2S config: default split: test revision: 35191c8c0dca72d8ff3efcd72aa802307d469663 metrics: - type: v_measure value: 32.06504927238196 - task: type: Reranking dataset: type: mteb/mind_small name: MTEB MindSmallReranking config: default split: test revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69 metrics: - type: map value: 32.20682480490871 - type: mrr value: 33.41462721527003 - task: type: Retrieval dataset: type: nfcorpus name: MTEB NFCorpus config: default split: test revision: None metrics: - type: map_at_1 value: 5.548 - type: map_at_10 value: 13.086999999999998 - type: map_at_100 value: 16.698 - type: map_at_1000 value: 18.151999999999997 - type: map_at_3 value: 9.576 - type: map_at_5 value: 11.175 - type: mrr_at_1 value: 44.272 - type: mrr_at_10 value: 53.635999999999996 - type: mrr_at_100 value: 54.228 - type: mrr_at_1000 value: 54.26499999999999 - type: mrr_at_3 value: 51.754 - type: mrr_at_5 value: 53.086 - type: ndcg_at_1 value: 42.724000000000004 - type: ndcg_at_10 value: 34.769 - type: ndcg_at_100 value: 32.283 - type: ndcg_at_1000 value: 40.843 - type: ndcg_at_3 value: 39.852 - type: ndcg_at_5 value: 37.858999999999995 - type: precision_at_1 value: 44.272 - type: precision_at_10 value: 26.068 - type: precision_at_100 value: 8.328000000000001 - type: precision_at_1000 value: 2.1 - type: precision_at_3 value: 37.874 - type: precision_at_5 value: 33.065 - type: recall_at_1 value: 5.548 - type: recall_at_10 value: 16.936999999999998 - type: recall_at_100 value: 33.72 - type: recall_at_1000 value: 64.348 - type: recall_at_3 value: 10.764999999999999 - type: recall_at_5 value: 13.361 - task: type: Retrieval dataset: type: nq name: MTEB NQ config: default split: test revision: None metrics: - type: map_at_1 value: 28.008 - type: map_at_10 value: 42.675000000000004 - type: map_at_100 value: 43.85 - type: map_at_1000 value: 43.884 - type: 
map_at_3 value: 38.286 - type: map_at_5 value: 40.78 - type: mrr_at_1 value: 31.518 - type: mrr_at_10 value: 45.015 - type: mrr_at_100 value: 45.924 - type: mrr_at_1000 value: 45.946999999999996 - type: mrr_at_3 value: 41.348 - type: mrr_at_5 value: 43.428 - type: ndcg_at_1 value: 31.489 - type: ndcg_at_10 value: 50.285999999999994 - type: ndcg_at_100 value: 55.291999999999994 - type: ndcg_at_1000 value: 56.05 - type: ndcg_at_3 value: 41.976 - type: ndcg_at_5 value: 46.103 - type: precision_at_1 value: 31.489 - type: precision_at_10 value: 8.456 - type: precision_at_100 value: 1.125 - type: precision_at_1000 value: 0.12 - type: precision_at_3 value: 19.09 - type: precision_at_5 value: 13.841000000000001 - type: recall_at_1 value: 28.008 - type: recall_at_10 value: 71.21499999999999 - type: recall_at_100 value: 92.99 - type: recall_at_1000 value: 98.578 - type: recall_at_3 value: 49.604 - type: recall_at_5 value: 59.094 - task: type: Retrieval dataset: type: quora name: MTEB QuoraRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 70.351 - type: map_at_10 value: 84.163 - type: map_at_100 value: 84.785 - type: map_at_1000 value: 84.801 - type: map_at_3 value: 81.16 - type: map_at_5 value: 83.031 - type: mrr_at_1 value: 80.96 - type: mrr_at_10 value: 87.241 - type: mrr_at_100 value: 87.346 - type: mrr_at_1000 value: 87.347 - type: mrr_at_3 value: 86.25699999999999 - type: mrr_at_5 value: 86.907 - type: ndcg_at_1 value: 80.97 - type: ndcg_at_10 value: 88.017 - type: ndcg_at_100 value: 89.241 - type: ndcg_at_1000 value: 89.34299999999999 - type: ndcg_at_3 value: 85.053 - type: ndcg_at_5 value: 86.663 - type: precision_at_1 value: 80.97 - type: precision_at_10 value: 13.358 - type: precision_at_100 value: 1.525 - type: precision_at_1000 value: 0.157 - type: precision_at_3 value: 37.143 - type: precision_at_5 value: 24.451999999999998 - type: recall_at_1 value: 70.351 - type: recall_at_10 value: 95.39800000000001 - type: recall_at_100 
value: 99.55199999999999 - type: recall_at_1000 value: 99.978 - type: recall_at_3 value: 86.913 - type: recall_at_5 value: 91.448 - task: type: Clustering dataset: type: mteb/reddit-clustering name: MTEB RedditClustering config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: v_measure value: 55.62406719814139 - task: type: Clustering dataset: type: mteb/reddit-clustering-p2p name: MTEB RedditClusteringP2P config: default split: test revision: 282350215ef01743dc01b456c7f5241fa8937f16 metrics: - type: v_measure value: 61.386700035141736 - task: type: Retrieval dataset: type: scidocs name: MTEB SCIDOCS config: default split: test revision: None metrics: - type: map_at_1 value: 4.618 - type: map_at_10 value: 12.920000000000002 - type: map_at_100 value: 15.304 - type: map_at_1000 value: 15.656999999999998 - type: map_at_3 value: 9.187 - type: map_at_5 value: 10.937 - type: mrr_at_1 value: 22.8 - type: mrr_at_10 value: 35.13 - type: mrr_at_100 value: 36.239 - type: mrr_at_1000 value: 36.291000000000004 - type: mrr_at_3 value: 31.917 - type: mrr_at_5 value: 33.787 - type: ndcg_at_1 value: 22.8 - type: ndcg_at_10 value: 21.382 - type: ndcg_at_100 value: 30.257 - type: ndcg_at_1000 value: 36.001 - type: ndcg_at_3 value: 20.43 - type: ndcg_at_5 value: 17.622 - type: precision_at_1 value: 22.8 - type: precision_at_10 value: 11.26 - type: precision_at_100 value: 2.405 - type: precision_at_1000 value: 0.377 - type: precision_at_3 value: 19.633 - type: precision_at_5 value: 15.68 - type: recall_at_1 value: 4.618 - type: recall_at_10 value: 22.811999999999998 - type: recall_at_100 value: 48.787000000000006 - type: recall_at_1000 value: 76.63799999999999 - type: recall_at_3 value: 11.952 - type: recall_at_5 value: 15.892000000000001 - task: type: STS dataset: type: mteb/sickr-sts name: MTEB SICK-R config: default split: test revision: a6ea5a8cab320b040a23452cc28066d9beae2cee metrics: - type: cos_sim_pearson value: 84.01529458252244 - type: 
cos_sim_spearman value: 77.92985224770254 - type: euclidean_pearson value: 81.04251429422487 - type: euclidean_spearman value: 77.92838490549133 - type: manhattan_pearson value: 80.95892251458979 - type: manhattan_spearman value: 77.81028089705941 - task: type: STS dataset: type: mteb/sts12-sts name: MTEB STS12 config: default split: test revision: a0d554a64d88156834ff5ae9920b964011b16384 metrics: - type: cos_sim_pearson value: 83.97885282534388 - type: cos_sim_spearman value: 75.1221970851712 - type: euclidean_pearson value: 80.34455956720097 - type: euclidean_spearman value: 74.5894274239938 - type: manhattan_pearson value: 80.38999766325465 - type: manhattan_spearman value: 74.68524557166975 - task: type: STS dataset: type: mteb/sts13-sts name: MTEB STS13 config: default split: test revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca metrics: - type: cos_sim_pearson value: 82.95746064915672 - type: cos_sim_spearman value: 85.08683458043946 - type: euclidean_pearson value: 84.56699492836385 - type: euclidean_spearman value: 85.66089116133713 - type: manhattan_pearson value: 84.47553323458541 - type: manhattan_spearman value: 85.56142206781472 - task: type: STS dataset: type: mteb/sts14-sts name: MTEB STS14 config: default split: test revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 metrics: - type: cos_sim_pearson value: 82.71377893595067 - type: cos_sim_spearman value: 81.03453291428589 - type: euclidean_pearson value: 82.57136298308613 - type: euclidean_spearman value: 81.15839961890875 - type: manhattan_pearson value: 82.55157879373837 - type: manhattan_spearman value: 81.1540163767054 - task: type: STS dataset: type: mteb/sts15-sts name: MTEB STS15 config: default split: test revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 metrics: - type: cos_sim_pearson value: 86.64197832372373 - type: cos_sim_spearman value: 88.31966852492485 - type: euclidean_pearson value: 87.98692129976983 - type: euclidean_spearman value: 88.6247340837856 - type: manhattan_pearson 
value: 87.90437827826412 - type: manhattan_spearman value: 88.56278787131457 - task: type: STS dataset: type: mteb/sts16-sts name: MTEB STS16 config: default split: test revision: 4d8694f8f0e0100860b497b999b3dbed754a0513 metrics: - type: cos_sim_pearson value: 81.84159950146693 - type: cos_sim_spearman value: 83.90678384140168 - type: euclidean_pearson value: 83.19005018860221 - type: euclidean_spearman value: 84.16260415876295 - type: manhattan_pearson value: 83.05030612994494 - type: manhattan_spearman value: 83.99605629718336 - task: type: STS dataset: type: mteb/sts17-crosslingual-sts name: MTEB STS17 (en-en) config: en-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 87.49935350176666 - type: cos_sim_spearman value: 87.59086606735383 - type: euclidean_pearson value: 88.06537181129983 - type: euclidean_spearman value: 87.6687448086014 - type: manhattan_pearson value: 87.96599131972935 - type: manhattan_spearman value: 87.63295748969642 - task: type: STS dataset: type: mteb/sts22-crosslingual-sts name: MTEB STS22 (en) config: en split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 67.68232799482763 - type: cos_sim_spearman value: 67.99930378085793 - type: euclidean_pearson value: 68.50275360001696 - type: euclidean_spearman value: 67.81588179309259 - type: manhattan_pearson value: 68.5892154749763 - type: manhattan_spearman value: 67.84357259640682 - task: type: STS dataset: type: mteb/stsbenchmark-sts name: MTEB STSBenchmark config: default split: test revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 metrics: - type: cos_sim_pearson value: 84.37049618406554 - type: cos_sim_spearman value: 85.57014313159492 - type: euclidean_pearson value: 85.57469513908282 - type: euclidean_spearman value: 85.661948135258 - type: manhattan_pearson value: 85.36866831229028 - type: manhattan_spearman value: 85.5043455368843 - task: type: Reranking dataset: type: 
mteb/scidocs-reranking name: MTEB SciDocsRR config: default split: test revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab metrics: - type: map value: 84.83259065376154 - type: mrr value: 95.58455433455433 - task: type: Retrieval dataset: type: scifact name: MTEB SciFact config: default split: test revision: None metrics: - type: map_at_1 value: 58.817 - type: map_at_10 value: 68.459 - type: map_at_100 value: 68.951 - type: map_at_1000 value: 68.979 - type: map_at_3 value: 65.791 - type: map_at_5 value: 67.583 - type: mrr_at_1 value: 61.667 - type: mrr_at_10 value: 69.368 - type: mrr_at_100 value: 69.721 - type: mrr_at_1000 value: 69.744 - type: mrr_at_3 value: 67.278 - type: mrr_at_5 value: 68.611 - type: ndcg_at_1 value: 61.667 - type: ndcg_at_10 value: 72.70100000000001 - type: ndcg_at_100 value: 74.928 - type: ndcg_at_1000 value: 75.553 - type: ndcg_at_3 value: 68.203 - type: ndcg_at_5 value: 70.804 - type: precision_at_1 value: 61.667 - type: precision_at_10 value: 9.533 - type: precision_at_100 value: 1.077 - type: precision_at_1000 value: 0.11299999999999999 - type: precision_at_3 value: 26.444000000000003 - type: precision_at_5 value: 17.599999999999998 - type: recall_at_1 value: 58.817 - type: recall_at_10 value: 84.789 - type: recall_at_100 value: 95.0 - type: recall_at_1000 value: 99.667 - type: recall_at_3 value: 72.8 - type: recall_at_5 value: 79.294 - task: type: PairClassification dataset: type: mteb/sprintduplicatequestions-pairclassification name: MTEB SprintDuplicateQuestions config: default split: test revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 metrics: - type: cos_sim_accuracy value: 99.8108910891089 - type: cos_sim_ap value: 95.5743678558349 - type: cos_sim_f1 value: 90.43133366385722 - type: cos_sim_precision value: 89.67551622418878 - type: cos_sim_recall value: 91.2 - type: dot_accuracy value: 99.75841584158415 - type: dot_ap value: 94.00786363627253 - type: dot_f1 value: 87.51910341314316 - type: dot_precision value: 
89.20041536863967 - type: dot_recall value: 85.9 - type: euclidean_accuracy value: 99.81485148514851 - type: euclidean_ap value: 95.4752113136905 - type: euclidean_f1 value: 90.44334975369456 - type: euclidean_precision value: 89.126213592233 - type: euclidean_recall value: 91.8 - type: manhattan_accuracy value: 99.81584158415842 - type: manhattan_ap value: 95.5163172682464 - type: manhattan_f1 value: 90.51987767584097 - type: manhattan_precision value: 92.3076923076923 - type: manhattan_recall value: 88.8 - type: max_accuracy value: 99.81584158415842 - type: max_ap value: 95.5743678558349 - type: max_f1 value: 90.51987767584097 - task: type: Clustering dataset: type: mteb/stackexchange-clustering name: MTEB StackExchangeClustering config: default split: test revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 metrics: - type: v_measure value: 62.63235986949449 - task: type: Clustering dataset: type: mteb/stackexchange-clustering-p2p name: MTEB StackExchangeClusteringP2P config: default split: test revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: v_measure value: 36.334795589585575 - task: type: Reranking dataset: type: mteb/stackoverflowdupquestions-reranking name: MTEB StackOverflowDupQuestions config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - type: map value: 52.02955214518782 - type: mrr value: 52.8004838298956 - task: type: Summarization dataset: type: mteb/summeval name: MTEB SummEval config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cos_sim_pearson value: 30.63769566275453 - type: cos_sim_spearman value: 30.422379185989335 - type: dot_pearson value: 26.88493071882256 - type: dot_spearman value: 26.505249740971305 - task: type: Retrieval dataset: type: trec-covid name: MTEB TRECCOVID config: default split: test revision: None metrics: - type: map_at_1 value: 0.21 - type: map_at_10 value: 1.654 - type: map_at_100 value: 10.095 - type: map_at_1000 value: 
25.808999999999997 - type: map_at_3 value: 0.594 - type: map_at_5 value: 0.9289999999999999 - type: mrr_at_1 value: 78.0 - type: mrr_at_10 value: 87.019 - type: mrr_at_100 value: 87.019 - type: mrr_at_1000 value: 87.019 - type: mrr_at_3 value: 86.333 - type: mrr_at_5 value: 86.733 - type: ndcg_at_1 value: 73.0 - type: ndcg_at_10 value: 66.52900000000001 - type: ndcg_at_100 value: 53.433 - type: ndcg_at_1000 value: 51.324000000000005 - type: ndcg_at_3 value: 72.02199999999999 - type: ndcg_at_5 value: 69.696 - type: precision_at_1 value: 78.0 - type: precision_at_10 value: 70.39999999999999 - type: precision_at_100 value: 55.46 - type: precision_at_1000 value: 22.758 - type: precision_at_3 value: 76.667 - type: precision_at_5 value: 74.0 - type: recall_at_1 value: 0.21 - type: recall_at_10 value: 1.8849999999999998 - type: recall_at_100 value: 13.801 - type: recall_at_1000 value: 49.649 - type: recall_at_3 value: 0.632 - type: recall_at_5 value: 1.009 - task: type: Retrieval dataset: type: webis-touche2020 name: MTEB Touche2020 config: default split: test revision: None metrics: - type: map_at_1 value: 1.797 - type: map_at_10 value: 9.01 - type: map_at_100 value: 14.682 - type: map_at_1000 value: 16.336000000000002 - type: map_at_3 value: 4.546 - type: map_at_5 value: 5.9270000000000005 - type: mrr_at_1 value: 24.490000000000002 - type: mrr_at_10 value: 41.156 - type: mrr_at_100 value: 42.392 - type: mrr_at_1000 value: 42.408 - type: mrr_at_3 value: 38.775999999999996 - type: mrr_at_5 value: 40.102 - type: ndcg_at_1 value: 21.429000000000002 - type: ndcg_at_10 value: 22.222 - type: ndcg_at_100 value: 34.405 - type: ndcg_at_1000 value: 46.599000000000004 - type: ndcg_at_3 value: 25.261 - type: ndcg_at_5 value: 22.695999999999998 - type: precision_at_1 value: 24.490000000000002 - type: precision_at_10 value: 19.796 - type: precision_at_100 value: 7.306 - type: precision_at_1000 value: 1.5350000000000001 - type: precision_at_3 value: 27.211000000000002 - type: 
precision_at_5 value: 22.857 - type: recall_at_1 value: 1.797 - type: recall_at_10 value: 15.706000000000001 - type: recall_at_100 value: 46.412 - type: recall_at_1000 value: 83.159 - type: recall_at_3 value: 6.1370000000000005 - type: recall_at_5 value: 8.599 - task: type: Classification dataset: type: mteb/toxic_conversations_50k name: MTEB ToxicConversationsClassification config: default split: test revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c metrics: - type: accuracy value: 70.3302 - type: ap value: 14.169121204575601 - type: f1 value: 54.229345975274235 - task: type: Classification dataset: type: mteb/tweet_sentiment_extraction name: MTEB TweetSentimentExtractionClassification config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 58.22297679683077 - type: f1 value: 58.62984908377875 - task: type: Clustering dataset: type: mteb/twentynewsgroups-clustering name: MTEB TwentyNewsgroupsClustering config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: v_measure value: 49.952922428464255 - task: type: PairClassification dataset: type: mteb/twittersemeval2015-pairclassification name: MTEB TwitterSemEval2015 config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 84.68140907194373 - type: cos_sim_ap value: 70.12180123666836 - type: cos_sim_f1 value: 65.77501791258658 - type: cos_sim_precision value: 60.07853403141361 - type: cos_sim_recall value: 72.66490765171504 - type: dot_accuracy value: 81.92167848840674 - type: dot_ap value: 60.49837581423469 - type: dot_f1 value: 58.44186046511628 - type: dot_precision value: 52.24532224532224 - type: dot_recall value: 66.3060686015831 - type: euclidean_accuracy value: 84.73505394289802 - type: euclidean_ap value: 70.3278904593286 - type: euclidean_f1 value: 65.98851124940161 - type: euclidean_precision value: 60.38107752956636 - type: euclidean_recall value: 
72.74406332453826 - type: manhattan_accuracy value: 84.73505394289802 - type: manhattan_ap value: 70.00737738537337 - type: manhattan_f1 value: 65.80150784822642 - type: manhattan_precision value: 61.892583120204606 - type: manhattan_recall value: 70.23746701846966 - type: max_accuracy value: 84.73505394289802 - type: max_ap value: 70.3278904593286 - type: max_f1 value: 65.98851124940161 - task: type: PairClassification dataset: type: mteb/twitterurlcorpus-pairclassification name: MTEB TwitterURLCorpus config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 88.44258159661582 - type: cos_sim_ap value: 84.91926704880888 - type: cos_sim_f1 value: 77.07651086632926 - type: cos_sim_precision value: 74.5894554883319 - type: cos_sim_recall value: 79.73514012935017 - type: dot_accuracy value: 85.88116583226608 - type: dot_ap value: 78.9753854779923 - type: dot_f1 value: 72.17757637979255 - type: dot_precision value: 66.80647486729143 - type: dot_recall value: 78.48783492454572 - type: euclidean_accuracy value: 88.5299025885823 - type: euclidean_ap value: 85.08006075642194 - type: euclidean_f1 value: 77.29637336504163 - type: euclidean_precision value: 74.69836253950014 - type: euclidean_recall value: 80.08161379735141 - type: manhattan_accuracy value: 88.55124771995187 - type: manhattan_ap value: 85.00941529932851 - type: manhattan_f1 value: 77.33100233100232 - type: manhattan_precision value: 73.37572573956317 - type: manhattan_recall value: 81.73698798891284 - type: max_accuracy value: 88.55124771995187 - type: max_ap value: 85.08006075642194 - type: max_f1 value: 77.33100233100232 language: - en license: mit
---

# gte-small

General Text Embeddings (GTE) model. Paper: [Towards General Text Embeddings with Multi-stage Contrastive Learning](https://arxiv.org/abs/2308.03281).

The GTE models are trained by Alibaba DAMO Academy.
They are mainly based on the BERT framework and currently come in three sizes: [GTE-large](https://huggingface.co/thenlper/gte-large), [GTE-base](https://huggingface.co/thenlper/gte-base), and [GTE-small](https://huggingface.co/thenlper/gte-small). The GTE models are trained on a large-scale corpus of relevance text pairs, covering a wide range of domains and scenarios. This enables them to be applied to various downstream text embedding tasks, including **information retrieval**, **semantic textual similarity**, and **text reranking**.

## Metrics

We compared the performance of the GTE models with other popular text embedding models on the MTEB benchmark. For more detailed comparison results, please refer to the [MTEB leaderboard](https://huggingface.co/spaces/mteb/leaderboard).

| Model Name | Model Size (GB) | Dimension | Sequence Length | Average (56) | Clustering (11) | Pair Classification (3) | Reranking (4) | Retrieval (15) | STS (10) | Summarization (1) | Classification (12) |
|:----:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| [**gte-large**](https://huggingface.co/thenlper/gte-large) | 0.67 | 1024 | 512 | **63.13** | 46.84 | 85.00 | 59.13 | 52.22 | 83.35 | 31.66 | 73.33 |
| [**gte-base**](https://huggingface.co/thenlper/gte-base) | 0.22 | 768 | 512 | **62.39** | 46.2 | 84.57 | 58.61 | 51.14 | 82.3 | 31.17 | 73.01 |
| [e5-large-v2](https://huggingface.co/intfloat/e5-large-v2) | 1.34 | 1024 | 512 | 62.25 | 44.49 | 86.03 | 56.61 | 50.56 | 82.05 | 30.19 | 75.24 |
| [e5-base-v2](https://huggingface.co/intfloat/e5-base-v2) | 0.44 | 768 | 512 | 61.5 | 43.80 | 85.73 | 55.91 | 50.29 | 81.05 | 30.28 | 73.84 |
| [**gte-small**](https://huggingface.co/thenlper/gte-small) | 0.07 | 384 | 512 | **61.36** | 44.89 | 83.54 | 57.7 | 49.46 | 82.07 | 30.42 | 72.31 |
| [text-embedding-ada-002](https://platform.openai.com/docs/guides/embeddings) | - | 1536 | 8192 | 60.99 | 45.9 | 84.89 | 56.32 | 49.25 | 80.97 | 30.8 | 70.93 |
| [e5-small-v2](https://huggingface.co/intfloat/e5-small-v2) | 0.13 | 384 | 512 | 59.93 | 39.92 | 84.67 | 54.32 | 49.04 | 80.39 | 31.16 | 72.94 |
| [sentence-t5-xxl](https://huggingface.co/sentence-transformers/sentence-t5-xxl) | 9.73 | 768 | 512 | 59.51 | 43.72 | 85.06 | 56.42 | 42.24 | 82.63 | 30.08 | 73.42 |
| [all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2) | 0.44 | 768 | 514 | 57.78 | 43.69 | 83.04 | 59.36 | 43.81 | 80.28 | 27.49 | 65.07 |
| [sgpt-bloom-7b1-msmarco](https://huggingface.co/bigscience/sgpt-bloom-7b1-msmarco) | 28.27 | 4096 | 2048 | 57.59 | 38.93 | 81.9 | 55.65 | 48.22 | 77.74 | 33.6 | 66.19 |
| [all-MiniLM-L12-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L12-v2) | 0.13 | 384 | 512 | 56.53 | 41.81 | 82.41 | 58.44 | 42.69 | 79.8 | 27.9 | 63.21 |
| [all-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2) | 0.09 | 384 | 512 | 56.26 | 42.35 | 82.37 | 58.04 | 41.95 | 78.9 | 30.81 | 63.05 |
| [contriever-base-msmarco](https://huggingface.co/nthakur/contriever-base-msmarco) | 0.44 | 768 | 512 | 56.00 | 41.1 | 82.54 | 53.14 | 41.88 | 76.51 | 30.36 | 66.68 |
| [sentence-t5-base](https://huggingface.co/sentence-transformers/sentence-t5-base) | 0.22 | 768 | 512 | 55.27 | 40.21 | 85.18 | 53.09 | 33.63 | 81.14 | 31.39 | 69.81 |

## Usage

Code example:

```python
import torch.nn.functional as F
from torch import Tensor
from transformers import AutoTokenizer, AutoModel


def average_pool(last_hidden_states: Tensor, attention_mask: Tensor) -> Tensor:
    last_hidden = last_hidden_states.masked_fill(~attention_mask[..., None].bool(), 0.0)
    return last_hidden.sum(dim=1) / attention_mask.sum(dim=1)[..., None]


input_texts = [
    "what is the capital of China?",
    "how to implement quick sort in python?",
    "Beijing",
    "sorting algorithms"
]

tokenizer = AutoTokenizer.from_pretrained("thenlper/gte-small")
model = AutoModel.from_pretrained("thenlper/gte-small")

# Tokenize the input texts
batch_dict = tokenizer(input_texts, max_length=512, padding=True, truncation=True, return_tensors='pt')

outputs = model(**batch_dict)
embeddings = average_pool(outputs.last_hidden_state, batch_dict['attention_mask'])

# (Optionally) normalize embeddings
embeddings = F.normalize(embeddings, p=2, dim=1)
scores = (embeddings[:1] @ embeddings[1:].T) * 100
print(scores.tolist())
```

Use with sentence-transformers:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.util import cos_sim

sentences = ['That is a happy person', 'That is a very happy person']

model = SentenceTransformer('thenlper/gte-small')
embeddings = model.encode(sentences)
print(cos_sim(embeddings[0], embeddings[1]))
```

### Limitation

This model supports English texts only, and any input longer than 512 tokens will be truncated to that maximum length.

### Citation

If you find our paper or models helpful, please consider citing them as follows:

```
@misc{li2023general,
  title={Towards General Text Embeddings with Multi-stage Contrastive Learning},
  author={Zehan Li and Xin Zhang and Yanzhao Zhang and Dingkun Long and Pengjun Xie and Meishan Zhang},
  year={2023},
  eprint={2308.03281},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
```
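As a supplement to the usage example, the masked average pooling it relies on can be illustrated with plain NumPy on toy values: positions where the attention mask is 0 (padding) are zeroed out before summing, and the sum is divided by the number of real tokens, so padding never dilutes the sentence embedding. This is an explanatory sketch with made-up numbers, not part of the model's official API.

```python
import numpy as np

# Toy batch: 2 sequences, max length 4, hidden size 3.
# Sequence 0 has 2 real tokens (then padding); sequence 1 has 4 real tokens.
hidden = np.arange(24, dtype=np.float64).reshape(2, 4, 3)
mask = np.array([[1, 1, 0, 0],
                 [1, 1, 1, 1]])

# Zero out padded positions, then divide by the count of real tokens.
masked = hidden * mask[..., None]
embeddings = masked.sum(axis=1) / mask.sum(axis=1)[..., None]

# Sequence 0 averages only its first two token vectors.
print(embeddings[0])  # -> [1.5 2.5 3.5]
```

A naive `hidden.mean(axis=1)` would instead average the zero-padded positions into the result, which is why the `attention_mask` is threaded through `average_pool` in the usage code above.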
-0.006839752197265625, 0.03936767578125, 0.014434814453125, -0.021453857421875, 0.040679931640625, 0.0081329345703125, 0.08111572265625, -0.0242919921875, 0.01381683349609375, -0.016387939453125, 0.0092620849609375, -0.0149078369140625, -0.071044921875, 0.009429931640625, -0.00409698486328125, -0.01531219482421875, -0.012786865234375, 0.041351318359375, -0.0304412841796875, -0.0283050537109375, 0.033416748046875, 0.031768798828125, -0.0024204254150390625, 0.0084381103515625, -0.08660888671875, -0.00574493408203125, 0.011474609375, -0.0577392578125, 0.019012451171875, 0.03192138671875, 0.016326904296875, 0.039642333984375, 0.026397705078125, -0.014068603515625, 0.019866943359375, -0.004795074462890625, 0.0560302734375, -0.068603515625, -0.037689208984375, -0.0704345703125, 0.059661865234375, -0.0278472900390625, -0.0355224609375, 0.05645751953125, 0.048736572265625, 0.046356201171875, -0.0078277587890625, 0.045623779296875, -0.0278167724609375, 0.034942626953125, -0.035186767578125, 0.057342529296875, -0.056884765625, -0.01068878173828125, -0.0231475830078125, -0.062103271484375, -0.021636962890625, 0.05792236328125, -0.0261077880859375, 0.01529693603515625, 0.06243896484375, 0.0528564453125, 0.0004391670227050781, -0.014251708984375, 0.0034503936767578125, 0.037200927734375, 0.0232391357421875, 0.06439208984375, 0.037139892578125, -0.0703125, 0.049957275390625, -0.034393310546875, -0.00984954833984375, -0.02410888671875, -0.059417724609375, -0.0814208984375, -0.055145263671875, -0.031402587890625, -0.02618408203125, -0.0153045654296875, 0.0672607421875, 0.048614501953125, -0.058837890625, 0.0014200210571289062, -0.0001475811004638672, 0.01300048828125, -0.00670623779296875, -0.025390625, 0.047393798828125, -0.006572723388671875, -0.0745849609375, 0.0045928955078125, 0.001995086669921875, 0.0199127197265625, 0.0014791488647460938, -0.017059326171875, -0.03619384765625, 0.00513458251953125, 0.047454833984375, 0.01410675048828125, -0.048431396484375, 
-0.039886474609375, 0.005405426025390625, -0.0355224609375, 0.01270294189453125, 0.0253753662109375, -0.0309600830078125, 0.00028634071350097656, 0.050323486328125, 0.02508544921875, 0.0516357421875, -0.013885498046875, 0.00693511962890625, -0.055419921875, 0.02398681640625, 0.002201080322265625, 0.0439453125, 0.0190887451171875, -0.01702880859375, 0.047393798828125, 0.025054931640625, -0.044097900390625, -0.05474853515625, -0.00986480712890625, -0.0819091796875, -0.01097869873046875, 0.0721435546875, -0.020416259765625, -0.02410888671875, 0.02056884765625, -0.0214996337890625, 0.024200439453125, -0.029510498046875, 0.04498291015625, 0.06463623046875, 0.0001220703125, -0.02716064453125, -0.0467529296875, 0.035003662109375, 0.0357666015625, -0.056671142578125, -0.028106689453125, 0.006633758544921875, 0.0228729248046875, 0.028228759765625, 0.038818359375, -0.00948333740234375, 0.00547027587890625, 0.0009312629699707031, 0.008514404296875, 0.01021575927734375, -0.005710601806640625, -0.01178741455078125, 0.010406494140625, -0.0156707763671875, -0.0182647705078125 ] ]
BAAI/bge-large-en-v1.5
2023-10-12T03:37:51.000Z
[ "sentence-transformers", "pytorch", "bert", "feature-extraction", "sentence-similarity", "transformers", "mteb", "en", "arxiv:2310.07554", "arxiv:2309.07597", "license:mit", "model-index", "endpoints_compatible", "has_space", "region:us" ]
feature-extraction
BAAI
null
null
BAAI/bge-large-en-v1.5
118
154,492
sentence-transformers
2023-09-12T05:20:08
--- tags: - sentence-transformers - feature-extraction - sentence-similarity - transformers - mteb model-index: - name: bge-large-en-v1.5 results: - task: type: Classification dataset: type: mteb/amazon_counterfactual name: MTEB AmazonCounterfactualClassification (en) config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 75.8507462686567 - type: ap value: 38.566457320228245 - type: f1 value: 69.69386648043475 - task: type: Classification dataset: type: mteb/amazon_polarity name: MTEB AmazonPolarityClassification config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 92.416675 - type: ap value: 89.1928861155922 - type: f1 value: 92.39477019574215 - task: type: Classification dataset: type: mteb/amazon_reviews_multi name: MTEB AmazonReviewsClassification (en) config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 48.175999999999995 - type: f1 value: 47.80712792870253 - task: type: Retrieval dataset: type: arguana name: MTEB ArguAna config: default split: test revision: None metrics: - type: map_at_1 value: 40.184999999999995 - type: map_at_10 value: 55.654 - type: map_at_100 value: 56.25 - type: map_at_1000 value: 56.255 - type: map_at_3 value: 51.742999999999995 - type: map_at_5 value: 54.129000000000005 - type: mrr_at_1 value: 40.967 - type: mrr_at_10 value: 55.96 - type: mrr_at_100 value: 56.54900000000001 - type: mrr_at_1000 value: 56.554 - type: mrr_at_3 value: 51.980000000000004 - type: mrr_at_5 value: 54.44 - type: ndcg_at_1 value: 40.184999999999995 - type: ndcg_at_10 value: 63.542 - type: ndcg_at_100 value: 65.96499999999999 - type: ndcg_at_1000 value: 66.08699999999999 - type: ndcg_at_3 value: 55.582 - type: ndcg_at_5 value: 59.855000000000004 - type: precision_at_1 value: 40.184999999999995 - type: precision_at_10 value: 8.841000000000001 - type: precision_at_100 value: 0.987 - type: 
precision_at_1000 value: 0.1 - type: precision_at_3 value: 22.238 - type: precision_at_5 value: 15.405 - type: recall_at_1 value: 40.184999999999995 - type: recall_at_10 value: 88.407 - type: recall_at_100 value: 98.72 - type: recall_at_1000 value: 99.644 - type: recall_at_3 value: 66.714 - type: recall_at_5 value: 77.027 - task: type: Clustering dataset: type: mteb/arxiv-clustering-p2p name: MTEB ArxivClusteringP2P config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: v_measure value: 48.567077926750066 - task: type: Clustering dataset: type: mteb/arxiv-clustering-s2s name: MTEB ArxivClusteringS2S config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: v_measure value: 43.19453389182364 - task: type: Reranking dataset: type: mteb/askubuntudupquestions-reranking name: MTEB AskUbuntuDupQuestions config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: map value: 64.46555939623092 - type: mrr value: 77.82361605768807 - task: type: STS dataset: type: mteb/biosses-sts name: MTEB BIOSSES config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cos_sim_pearson value: 84.9554128814735 - type: cos_sim_spearman value: 84.65373612172036 - type: euclidean_pearson value: 83.2905059954138 - type: euclidean_spearman value: 84.52240782811128 - type: manhattan_pearson value: 82.99533802997436 - type: manhattan_spearman value: 84.20673798475734 - task: type: Classification dataset: type: mteb/banking77 name: MTEB Banking77Classification config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 87.78896103896103 - type: f1 value: 87.77189310964883 - task: type: Clustering dataset: type: mteb/biorxiv-clustering-p2p name: MTEB BiorxivClusteringP2P config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: v_measure value: 
39.714538337650495 - task: type: Clustering dataset: type: mteb/biorxiv-clustering-s2s name: MTEB BiorxivClusteringS2S config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: v_measure value: 36.90108349284447 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackAndroidRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 32.795 - type: map_at_10 value: 43.669000000000004 - type: map_at_100 value: 45.151 - type: map_at_1000 value: 45.278 - type: map_at_3 value: 40.006 - type: map_at_5 value: 42.059999999999995 - type: mrr_at_1 value: 39.771 - type: mrr_at_10 value: 49.826 - type: mrr_at_100 value: 50.504000000000005 - type: mrr_at_1000 value: 50.549 - type: mrr_at_3 value: 47.115 - type: mrr_at_5 value: 48.832 - type: ndcg_at_1 value: 39.771 - type: ndcg_at_10 value: 50.217999999999996 - type: ndcg_at_100 value: 55.454 - type: ndcg_at_1000 value: 57.37 - type: ndcg_at_3 value: 44.885000000000005 - type: ndcg_at_5 value: 47.419 - type: precision_at_1 value: 39.771 - type: precision_at_10 value: 9.642000000000001 - type: precision_at_100 value: 1.538 - type: precision_at_1000 value: 0.198 - type: precision_at_3 value: 21.268 - type: precision_at_5 value: 15.536 - type: recall_at_1 value: 32.795 - type: recall_at_10 value: 62.580999999999996 - type: recall_at_100 value: 84.438 - type: recall_at_1000 value: 96.492 - type: recall_at_3 value: 47.071000000000005 - type: recall_at_5 value: 54.079 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackEnglishRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 32.671 - type: map_at_10 value: 43.334 - type: map_at_100 value: 44.566 - type: map_at_1000 value: 44.702999999999996 - type: map_at_3 value: 40.343 - type: map_at_5 value: 41.983 - type: mrr_at_1 value: 40.764 - type: mrr_at_10 value: 49.382 - type: mrr_at_100 value: 49.988 - type: mrr_at_1000 value: 
50.03300000000001 - type: mrr_at_3 value: 47.293 - type: mrr_at_5 value: 48.51 - type: ndcg_at_1 value: 40.764 - type: ndcg_at_10 value: 49.039 - type: ndcg_at_100 value: 53.259 - type: ndcg_at_1000 value: 55.253 - type: ndcg_at_3 value: 45.091 - type: ndcg_at_5 value: 46.839999999999996 - type: precision_at_1 value: 40.764 - type: precision_at_10 value: 9.191 - type: precision_at_100 value: 1.476 - type: precision_at_1000 value: 0.19499999999999998 - type: precision_at_3 value: 21.72 - type: precision_at_5 value: 15.299 - type: recall_at_1 value: 32.671 - type: recall_at_10 value: 58.816 - type: recall_at_100 value: 76.654 - type: recall_at_1000 value: 89.05999999999999 - type: recall_at_3 value: 46.743 - type: recall_at_5 value: 51.783 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackGamingRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 40.328 - type: map_at_10 value: 53.32599999999999 - type: map_at_100 value: 54.37499999999999 - type: map_at_1000 value: 54.429 - type: map_at_3 value: 49.902 - type: map_at_5 value: 52.002 - type: mrr_at_1 value: 46.332 - type: mrr_at_10 value: 56.858 - type: mrr_at_100 value: 57.522 - type: mrr_at_1000 value: 57.54899999999999 - type: mrr_at_3 value: 54.472 - type: mrr_at_5 value: 55.996 - type: ndcg_at_1 value: 46.332 - type: ndcg_at_10 value: 59.313 - type: ndcg_at_100 value: 63.266999999999996 - type: ndcg_at_1000 value: 64.36 - type: ndcg_at_3 value: 53.815000000000005 - type: ndcg_at_5 value: 56.814 - type: precision_at_1 value: 46.332 - type: precision_at_10 value: 9.53 - type: precision_at_100 value: 1.238 - type: precision_at_1000 value: 0.13699999999999998 - type: precision_at_3 value: 24.054000000000002 - type: precision_at_5 value: 16.589000000000002 - type: recall_at_1 value: 40.328 - type: recall_at_10 value: 73.421 - type: recall_at_100 value: 90.059 - type: recall_at_1000 value: 97.81 - type: recall_at_3 value: 59.009 - type: recall_at_5 value: 
66.352 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackGisRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 27.424 - type: map_at_10 value: 36.332 - type: map_at_100 value: 37.347 - type: map_at_1000 value: 37.422 - type: map_at_3 value: 33.743 - type: map_at_5 value: 35.176 - type: mrr_at_1 value: 29.153000000000002 - type: mrr_at_10 value: 38.233 - type: mrr_at_100 value: 39.109 - type: mrr_at_1000 value: 39.164 - type: mrr_at_3 value: 35.876000000000005 - type: mrr_at_5 value: 37.169000000000004 - type: ndcg_at_1 value: 29.153000000000002 - type: ndcg_at_10 value: 41.439 - type: ndcg_at_100 value: 46.42 - type: ndcg_at_1000 value: 48.242000000000004 - type: ndcg_at_3 value: 36.362 - type: ndcg_at_5 value: 38.743 - type: precision_at_1 value: 29.153000000000002 - type: precision_at_10 value: 6.315999999999999 - type: precision_at_100 value: 0.927 - type: precision_at_1000 value: 0.11199999999999999 - type: precision_at_3 value: 15.443000000000001 - type: precision_at_5 value: 10.644 - type: recall_at_1 value: 27.424 - type: recall_at_10 value: 55.364000000000004 - type: recall_at_100 value: 78.211 - type: recall_at_1000 value: 91.74600000000001 - type: recall_at_3 value: 41.379 - type: recall_at_5 value: 47.14 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackMathematicaRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 19.601 - type: map_at_10 value: 27.826 - type: map_at_100 value: 29.017 - type: map_at_1000 value: 29.137 - type: map_at_3 value: 25.125999999999998 - type: map_at_5 value: 26.765 - type: mrr_at_1 value: 24.005000000000003 - type: mrr_at_10 value: 32.716 - type: mrr_at_100 value: 33.631 - type: mrr_at_1000 value: 33.694 - type: mrr_at_3 value: 29.934 - type: mrr_at_5 value: 31.630999999999997 - type: ndcg_at_1 value: 24.005000000000003 - type: ndcg_at_10 value: 33.158 - type: ndcg_at_100 value: 38.739000000000004 - type: 
ndcg_at_1000 value: 41.495 - type: ndcg_at_3 value: 28.185 - type: ndcg_at_5 value: 30.796 - type: precision_at_1 value: 24.005000000000003 - type: precision_at_10 value: 5.908 - type: precision_at_100 value: 1.005 - type: precision_at_1000 value: 0.13899999999999998 - type: precision_at_3 value: 13.391 - type: precision_at_5 value: 9.876 - type: recall_at_1 value: 19.601 - type: recall_at_10 value: 44.746 - type: recall_at_100 value: 68.82300000000001 - type: recall_at_1000 value: 88.215 - type: recall_at_3 value: 31.239 - type: recall_at_5 value: 37.695 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackPhysicsRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 30.130000000000003 - type: map_at_10 value: 40.96 - type: map_at_100 value: 42.282 - type: map_at_1000 value: 42.392 - type: map_at_3 value: 37.889 - type: map_at_5 value: 39.661 - type: mrr_at_1 value: 36.958999999999996 - type: mrr_at_10 value: 46.835 - type: mrr_at_100 value: 47.644 - type: mrr_at_1000 value: 47.688 - type: mrr_at_3 value: 44.562000000000005 - type: mrr_at_5 value: 45.938 - type: ndcg_at_1 value: 36.958999999999996 - type: ndcg_at_10 value: 47.06 - type: ndcg_at_100 value: 52.345 - type: ndcg_at_1000 value: 54.35 - type: ndcg_at_3 value: 42.301 - type: ndcg_at_5 value: 44.635999999999996 - type: precision_at_1 value: 36.958999999999996 - type: precision_at_10 value: 8.479000000000001 - type: precision_at_100 value: 1.284 - type: precision_at_1000 value: 0.163 - type: precision_at_3 value: 20.244 - type: precision_at_5 value: 14.224999999999998 - type: recall_at_1 value: 30.130000000000003 - type: recall_at_10 value: 59.27 - type: recall_at_100 value: 81.195 - type: recall_at_1000 value: 94.21199999999999 - type: recall_at_3 value: 45.885 - type: recall_at_5 value: 52.016 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackProgrammersRetrieval config: default split: test revision: None metrics: - type: 
map_at_1 value: 26.169999999999998 - type: map_at_10 value: 36.451 - type: map_at_100 value: 37.791000000000004 - type: map_at_1000 value: 37.897 - type: map_at_3 value: 33.109 - type: map_at_5 value: 34.937000000000005 - type: mrr_at_1 value: 32.877 - type: mrr_at_10 value: 42.368 - type: mrr_at_100 value: 43.201 - type: mrr_at_1000 value: 43.259 - type: mrr_at_3 value: 39.763999999999996 - type: mrr_at_5 value: 41.260000000000005 - type: ndcg_at_1 value: 32.877 - type: ndcg_at_10 value: 42.659000000000006 - type: ndcg_at_100 value: 48.161 - type: ndcg_at_1000 value: 50.345 - type: ndcg_at_3 value: 37.302 - type: ndcg_at_5 value: 39.722 - type: precision_at_1 value: 32.877 - type: precision_at_10 value: 7.9 - type: precision_at_100 value: 1.236 - type: precision_at_1000 value: 0.158 - type: precision_at_3 value: 17.846 - type: precision_at_5 value: 12.9 - type: recall_at_1 value: 26.169999999999998 - type: recall_at_10 value: 55.35 - type: recall_at_100 value: 78.755 - type: recall_at_1000 value: 93.518 - type: recall_at_3 value: 40.176 - type: recall_at_5 value: 46.589000000000006 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 27.15516666666667 - type: map_at_10 value: 36.65741666666667 - type: map_at_100 value: 37.84991666666666 - type: map_at_1000 value: 37.96316666666667 - type: map_at_3 value: 33.74974999999999 - type: map_at_5 value: 35.3765 - type: mrr_at_1 value: 32.08233333333334 - type: mrr_at_10 value: 41.033833333333334 - type: mrr_at_100 value: 41.84524999999999 - type: mrr_at_1000 value: 41.89983333333333 - type: mrr_at_3 value: 38.62008333333333 - type: mrr_at_5 value: 40.03441666666666 - type: ndcg_at_1 value: 32.08233333333334 - type: ndcg_at_10 value: 42.229 - type: ndcg_at_100 value: 47.26716666666667 - type: ndcg_at_1000 value: 49.43466666666667 - type: ndcg_at_3 value: 37.36408333333333 - type: ndcg_at_5 value: 39.6715 - type: 
precision_at_1 value: 32.08233333333334 - type: precision_at_10 value: 7.382583333333334 - type: precision_at_100 value: 1.16625 - type: precision_at_1000 value: 0.15408333333333332 - type: precision_at_3 value: 17.218 - type: precision_at_5 value: 12.21875 - type: recall_at_1 value: 27.15516666666667 - type: recall_at_10 value: 54.36683333333333 - type: recall_at_100 value: 76.37183333333333 - type: recall_at_1000 value: 91.26183333333333 - type: recall_at_3 value: 40.769916666666674 - type: recall_at_5 value: 46.702333333333335 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackStatsRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 25.749 - type: map_at_10 value: 33.001999999999995 - type: map_at_100 value: 33.891 - type: map_at_1000 value: 33.993 - type: map_at_3 value: 30.703999999999997 - type: map_at_5 value: 31.959 - type: mrr_at_1 value: 28.834 - type: mrr_at_10 value: 35.955 - type: mrr_at_100 value: 36.709 - type: mrr_at_1000 value: 36.779 - type: mrr_at_3 value: 33.947 - type: mrr_at_5 value: 35.089 - type: ndcg_at_1 value: 28.834 - type: ndcg_at_10 value: 37.329 - type: ndcg_at_100 value: 41.79 - type: ndcg_at_1000 value: 44.169000000000004 - type: ndcg_at_3 value: 33.184999999999995 - type: ndcg_at_5 value: 35.107 - type: precision_at_1 value: 28.834 - type: precision_at_10 value: 5.7669999999999995 - type: precision_at_100 value: 0.876 - type: precision_at_1000 value: 0.11399999999999999 - type: precision_at_3 value: 14.213000000000001 - type: precision_at_5 value: 9.754999999999999 - type: recall_at_1 value: 25.749 - type: recall_at_10 value: 47.791 - type: recall_at_100 value: 68.255 - type: recall_at_1000 value: 85.749 - type: recall_at_3 value: 36.199 - type: recall_at_5 value: 41.071999999999996 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackTexRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 17.777 - type: map_at_10 
value: 25.201 - type: map_at_100 value: 26.423999999999996 - type: map_at_1000 value: 26.544 - type: map_at_3 value: 22.869 - type: map_at_5 value: 24.023 - type: mrr_at_1 value: 21.473 - type: mrr_at_10 value: 29.12 - type: mrr_at_100 value: 30.144 - type: mrr_at_1000 value: 30.215999999999998 - type: mrr_at_3 value: 26.933 - type: mrr_at_5 value: 28.051 - type: ndcg_at_1 value: 21.473 - type: ndcg_at_10 value: 30.003 - type: ndcg_at_100 value: 35.766 - type: ndcg_at_1000 value: 38.501000000000005 - type: ndcg_at_3 value: 25.773000000000003 - type: ndcg_at_5 value: 27.462999999999997 - type: precision_at_1 value: 21.473 - type: precision_at_10 value: 5.482 - type: precision_at_100 value: 0.975 - type: precision_at_1000 value: 0.13799999999999998 - type: precision_at_3 value: 12.205 - type: precision_at_5 value: 8.692 - type: recall_at_1 value: 17.777 - type: recall_at_10 value: 40.582 - type: recall_at_100 value: 66.305 - type: recall_at_1000 value: 85.636 - type: recall_at_3 value: 28.687 - type: recall_at_5 value: 33.089 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackUnixRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 26.677 - type: map_at_10 value: 36.309000000000005 - type: map_at_100 value: 37.403999999999996 - type: map_at_1000 value: 37.496 - type: map_at_3 value: 33.382 - type: map_at_5 value: 34.98 - type: mrr_at_1 value: 31.343 - type: mrr_at_10 value: 40.549 - type: mrr_at_100 value: 41.342 - type: mrr_at_1000 value: 41.397 - type: mrr_at_3 value: 38.029 - type: mrr_at_5 value: 39.451 - type: ndcg_at_1 value: 31.343 - type: ndcg_at_10 value: 42.1 - type: ndcg_at_100 value: 47.089999999999996 - type: ndcg_at_1000 value: 49.222 - type: ndcg_at_3 value: 36.836999999999996 - type: ndcg_at_5 value: 39.21 - type: precision_at_1 value: 31.343 - type: precision_at_10 value: 7.164 - type: precision_at_100 value: 1.0959999999999999 - type: precision_at_1000 value: 0.13899999999999998 - type: 
precision_at_3 value: 16.915 - type: precision_at_5 value: 11.940000000000001 - type: recall_at_1 value: 26.677 - type: recall_at_10 value: 55.54599999999999 - type: recall_at_100 value: 77.094 - type: recall_at_1000 value: 92.01 - type: recall_at_3 value: 41.191 - type: recall_at_5 value: 47.006 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackWebmastersRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 24.501 - type: map_at_10 value: 33.102 - type: map_at_100 value: 34.676 - type: map_at_1000 value: 34.888000000000005 - type: map_at_3 value: 29.944 - type: map_at_5 value: 31.613999999999997 - type: mrr_at_1 value: 29.447000000000003 - type: mrr_at_10 value: 37.996 - type: mrr_at_100 value: 38.946 - type: mrr_at_1000 value: 38.995000000000005 - type: mrr_at_3 value: 35.079 - type: mrr_at_5 value: 36.69 - type: ndcg_at_1 value: 29.447000000000003 - type: ndcg_at_10 value: 39.232 - type: ndcg_at_100 value: 45.247 - type: ndcg_at_1000 value: 47.613 - type: ndcg_at_3 value: 33.922999999999995 - type: ndcg_at_5 value: 36.284 - type: precision_at_1 value: 29.447000000000003 - type: precision_at_10 value: 7.648000000000001 - type: precision_at_100 value: 1.516 - type: precision_at_1000 value: 0.23900000000000002 - type: precision_at_3 value: 16.008 - type: precision_at_5 value: 11.779 - type: recall_at_1 value: 24.501 - type: recall_at_10 value: 51.18899999999999 - type: recall_at_100 value: 78.437 - type: recall_at_1000 value: 92.842 - type: recall_at_3 value: 35.808 - type: recall_at_5 value: 42.197 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackWordpressRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 22.039 - type: map_at_10 value: 30.377 - type: map_at_100 value: 31.275 - type: map_at_1000 value: 31.379 - type: map_at_3 value: 27.98 - type: map_at_5 value: 29.358 - type: mrr_at_1 value: 24.03 - type: mrr_at_10 value: 32.568000000000005 
- type: mrr_at_100 value: 33.403 - type: mrr_at_1000 value: 33.475 - type: mrr_at_3 value: 30.436999999999998 - type: mrr_at_5 value: 31.796000000000003 - type: ndcg_at_1 value: 24.03 - type: ndcg_at_10 value: 35.198 - type: ndcg_at_100 value: 39.668 - type: ndcg_at_1000 value: 42.296 - type: ndcg_at_3 value: 30.709999999999997 - type: ndcg_at_5 value: 33.024 - type: precision_at_1 value: 24.03 - type: precision_at_10 value: 5.564 - type: precision_at_100 value: 0.828 - type: precision_at_1000 value: 0.117 - type: precision_at_3 value: 13.309000000000001 - type: precision_at_5 value: 9.39 - type: recall_at_1 value: 22.039 - type: recall_at_10 value: 47.746 - type: recall_at_100 value: 68.23599999999999 - type: recall_at_1000 value: 87.852 - type: recall_at_3 value: 35.852000000000004 - type: recall_at_5 value: 41.410000000000004 - task: type: Retrieval dataset: type: climate-fever name: MTEB ClimateFEVER config: default split: test revision: None metrics: - type: map_at_1 value: 15.692999999999998 - type: map_at_10 value: 26.903 - type: map_at_100 value: 28.987000000000002 - type: map_at_1000 value: 29.176999999999996 - type: map_at_3 value: 22.137 - type: map_at_5 value: 24.758 - type: mrr_at_1 value: 35.57 - type: mrr_at_10 value: 47.821999999999996 - type: mrr_at_100 value: 48.608000000000004 - type: mrr_at_1000 value: 48.638999999999996 - type: mrr_at_3 value: 44.452000000000005 - type: mrr_at_5 value: 46.546 - type: ndcg_at_1 value: 35.57 - type: ndcg_at_10 value: 36.567 - type: ndcg_at_100 value: 44.085 - type: ndcg_at_1000 value: 47.24 - type: ndcg_at_3 value: 29.964000000000002 - type: ndcg_at_5 value: 32.511 - type: precision_at_1 value: 35.57 - type: precision_at_10 value: 11.485 - type: precision_at_100 value: 1.9619999999999997 - type: precision_at_1000 value: 0.256 - type: precision_at_3 value: 22.237000000000002 - type: precision_at_5 value: 17.471999999999998 - type: recall_at_1 value: 15.692999999999998 - type: recall_at_10 value: 43.056 - type: 
recall_at_100 value: 68.628 - type: recall_at_1000 value: 86.075 - type: recall_at_3 value: 26.918999999999997 - type: recall_at_5 value: 34.14 - task: type: Retrieval dataset: type: dbpedia-entity name: MTEB DBPedia config: default split: test revision: None metrics: - type: map_at_1 value: 9.53 - type: map_at_10 value: 20.951 - type: map_at_100 value: 30.136000000000003 - type: map_at_1000 value: 31.801000000000002 - type: map_at_3 value: 15.021 - type: map_at_5 value: 17.471999999999998 - type: mrr_at_1 value: 71.0 - type: mrr_at_10 value: 79.176 - type: mrr_at_100 value: 79.418 - type: mrr_at_1000 value: 79.426 - type: mrr_at_3 value: 78.125 - type: mrr_at_5 value: 78.61200000000001 - type: ndcg_at_1 value: 58.5 - type: ndcg_at_10 value: 44.106 - type: ndcg_at_100 value: 49.268 - type: ndcg_at_1000 value: 56.711999999999996 - type: ndcg_at_3 value: 48.934 - type: ndcg_at_5 value: 45.826 - type: precision_at_1 value: 71.0 - type: precision_at_10 value: 35.0 - type: precision_at_100 value: 11.360000000000001 - type: precision_at_1000 value: 2.046 - type: precision_at_3 value: 52.833 - type: precision_at_5 value: 44.15 - type: recall_at_1 value: 9.53 - type: recall_at_10 value: 26.811 - type: recall_at_100 value: 55.916999999999994 - type: recall_at_1000 value: 79.973 - type: recall_at_3 value: 16.413 - type: recall_at_5 value: 19.980999999999998 - task: type: Classification dataset: type: mteb/emotion name: MTEB EmotionClassification config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 51.519999999999996 - type: f1 value: 46.36601294761231 - task: type: Retrieval dataset: type: fever name: MTEB FEVER config: default split: test revision: None metrics: - type: map_at_1 value: 74.413 - type: map_at_10 value: 83.414 - type: map_at_100 value: 83.621 - type: map_at_1000 value: 83.635 - type: map_at_3 value: 82.337 - type: map_at_5 value: 83.039 - type: mrr_at_1 value: 80.19800000000001 - type: mrr_at_10 value: 
87.715 - type: mrr_at_100 value: 87.778 - type: mrr_at_1000 value: 87.779 - type: mrr_at_3 value: 87.106 - type: mrr_at_5 value: 87.555 - type: ndcg_at_1 value: 80.19800000000001 - type: ndcg_at_10 value: 87.182 - type: ndcg_at_100 value: 87.90299999999999 - type: ndcg_at_1000 value: 88.143 - type: ndcg_at_3 value: 85.60600000000001 - type: ndcg_at_5 value: 86.541 - type: precision_at_1 value: 80.19800000000001 - type: precision_at_10 value: 10.531 - type: precision_at_100 value: 1.113 - type: precision_at_1000 value: 0.11499999999999999 - type: precision_at_3 value: 32.933 - type: precision_at_5 value: 20.429 - type: recall_at_1 value: 74.413 - type: recall_at_10 value: 94.363 - type: recall_at_100 value: 97.165 - type: recall_at_1000 value: 98.668 - type: recall_at_3 value: 90.108 - type: recall_at_5 value: 92.52 - task: type: Retrieval dataset: type: fiqa name: MTEB FiQA2018 config: default split: test revision: None metrics: - type: map_at_1 value: 22.701 - type: map_at_10 value: 37.122 - type: map_at_100 value: 39.178000000000004 - type: map_at_1000 value: 39.326 - type: map_at_3 value: 32.971000000000004 - type: map_at_5 value: 35.332 - type: mrr_at_1 value: 44.753 - type: mrr_at_10 value: 53.452 - type: mrr_at_100 value: 54.198 - type: mrr_at_1000 value: 54.225 - type: mrr_at_3 value: 50.952 - type: mrr_at_5 value: 52.464 - type: ndcg_at_1 value: 44.753 - type: ndcg_at_10 value: 45.021 - type: ndcg_at_100 value: 52.028 - type: ndcg_at_1000 value: 54.596000000000004 - type: ndcg_at_3 value: 41.622 - type: ndcg_at_5 value: 42.736000000000004 - type: precision_at_1 value: 44.753 - type: precision_at_10 value: 12.284 - type: precision_at_100 value: 1.955 - type: precision_at_1000 value: 0.243 - type: precision_at_3 value: 27.828999999999997 - type: precision_at_5 value: 20.061999999999998 - type: recall_at_1 value: 22.701 - type: recall_at_10 value: 51.432 - type: recall_at_100 value: 77.009 - type: recall_at_1000 value: 92.511 - type: recall_at_3 value: 
37.919000000000004 - type: recall_at_5 value: 44.131 - task: type: Retrieval dataset: type: hotpotqa name: MTEB HotpotQA config: default split: test revision: None metrics: - type: map_at_1 value: 40.189 - type: map_at_10 value: 66.24600000000001 - type: map_at_100 value: 67.098 - type: map_at_1000 value: 67.149 - type: map_at_3 value: 62.684 - type: map_at_5 value: 64.974 - type: mrr_at_1 value: 80.378 - type: mrr_at_10 value: 86.127 - type: mrr_at_100 value: 86.29299999999999 - type: mrr_at_1000 value: 86.297 - type: mrr_at_3 value: 85.31400000000001 - type: mrr_at_5 value: 85.858 - type: ndcg_at_1 value: 80.378 - type: ndcg_at_10 value: 74.101 - type: ndcg_at_100 value: 76.993 - type: ndcg_at_1000 value: 77.948 - type: ndcg_at_3 value: 69.232 - type: ndcg_at_5 value: 72.04599999999999 - type: precision_at_1 value: 80.378 - type: precision_at_10 value: 15.595999999999998 - type: precision_at_100 value: 1.7840000000000003 - type: precision_at_1000 value: 0.191 - type: precision_at_3 value: 44.884 - type: precision_at_5 value: 29.145 - type: recall_at_1 value: 40.189 - type: recall_at_10 value: 77.981 - type: recall_at_100 value: 89.21 - type: recall_at_1000 value: 95.48299999999999 - type: recall_at_3 value: 67.326 - type: recall_at_5 value: 72.863 - task: type: Classification dataset: type: mteb/imdb name: MTEB ImdbClassification config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 92.84599999999999 - type: ap value: 89.4710787567357 - type: f1 value: 92.83752676932258 - task: type: Retrieval dataset: type: msmarco name: MTEB MSMARCO config: default split: dev revision: None metrics: - type: map_at_1 value: 23.132 - type: map_at_10 value: 35.543 - type: map_at_100 value: 36.702 - type: map_at_1000 value: 36.748999999999995 - type: map_at_3 value: 31.737 - type: map_at_5 value: 33.927 - type: mrr_at_1 value: 23.782 - type: mrr_at_10 value: 36.204 - type: mrr_at_100 value: 37.29 - type: mrr_at_1000 value: 
37.330999999999996 - type: mrr_at_3 value: 32.458999999999996 - type: mrr_at_5 value: 34.631 - type: ndcg_at_1 value: 23.782 - type: ndcg_at_10 value: 42.492999999999995 - type: ndcg_at_100 value: 47.985 - type: ndcg_at_1000 value: 49.141 - type: ndcg_at_3 value: 34.748000000000005 - type: ndcg_at_5 value: 38.651 - type: precision_at_1 value: 23.782 - type: precision_at_10 value: 6.665 - type: precision_at_100 value: 0.941 - type: precision_at_1000 value: 0.104 - type: precision_at_3 value: 14.776 - type: precision_at_5 value: 10.84 - type: recall_at_1 value: 23.132 - type: recall_at_10 value: 63.794 - type: recall_at_100 value: 89.027 - type: recall_at_1000 value: 97.807 - type: recall_at_3 value: 42.765 - type: recall_at_5 value: 52.11 - task: type: Classification dataset: type: mteb/mtop_domain name: MTEB MTOPDomainClassification (en) config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 94.59188326493388 - type: f1 value: 94.3842594786827 - task: type: Classification dataset: type: mteb/mtop_intent name: MTEB MTOPIntentClassification (en) config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 79.49384404924761 - type: f1 value: 59.7580539534629 - task: type: Classification dataset: type: mteb/amazon_massive_intent name: MTEB MassiveIntentClassification (en) config: en split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 77.56220578345663 - type: f1 value: 75.27228165561478 - task: type: Classification dataset: type: mteb/amazon_massive_scenario name: MTEB MassiveScenarioClassification (en) config: en split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 80.53463349024884 - type: f1 value: 80.4893958236536 - task: type: Clustering dataset: type: mteb/medrxiv-clustering-p2p name: MTEB MedrxivClusteringP2P config: default split: test revision: 
e7a26af6f3ae46b30dde8737f02c07b1505bcc73 metrics: - type: v_measure value: 32.56100273484962 - task: type: Clustering dataset: type: mteb/medrxiv-clustering-s2s name: MTEB MedrxivClusteringS2S config: default split: test revision: 35191c8c0dca72d8ff3efcd72aa802307d469663 metrics: - type: v_measure value: 31.470380028839607 - task: type: Reranking dataset: type: mteb/mind_small name: MTEB MindSmallReranking config: default split: test revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69 metrics: - type: map value: 32.06102792457849 - type: mrr value: 33.30709199672238 - task: type: Retrieval dataset: type: nfcorpus name: MTEB NFCorpus config: default split: test revision: None metrics: - type: map_at_1 value: 6.776999999999999 - type: map_at_10 value: 14.924000000000001 - type: map_at_100 value: 18.955 - type: map_at_1000 value: 20.538999999999998 - type: map_at_3 value: 10.982 - type: map_at_5 value: 12.679000000000002 - type: mrr_at_1 value: 47.988 - type: mrr_at_10 value: 57.232000000000006 - type: mrr_at_100 value: 57.818999999999996 - type: mrr_at_1000 value: 57.847 - type: mrr_at_3 value: 54.901999999999994 - type: mrr_at_5 value: 56.481 - type: ndcg_at_1 value: 46.594 - type: ndcg_at_10 value: 38.129000000000005 - type: ndcg_at_100 value: 35.54 - type: ndcg_at_1000 value: 44.172 - type: ndcg_at_3 value: 43.025999999999996 - type: ndcg_at_5 value: 41.052 - type: precision_at_1 value: 47.988 - type: precision_at_10 value: 28.111000000000004 - type: precision_at_100 value: 8.929 - type: precision_at_1000 value: 2.185 - type: precision_at_3 value: 40.144000000000005 - type: precision_at_5 value: 35.232 - type: recall_at_1 value: 6.776999999999999 - type: recall_at_10 value: 19.289 - type: recall_at_100 value: 36.359 - type: recall_at_1000 value: 67.54 - type: recall_at_3 value: 11.869 - type: recall_at_5 value: 14.999 - task: type: Retrieval dataset: type: nq name: MTEB NQ config: default split: test revision: None metrics: - type: map_at_1 value: 
31.108000000000004 - type: map_at_10 value: 47.126000000000005 - type: map_at_100 value: 48.171 - type: map_at_1000 value: 48.199 - type: map_at_3 value: 42.734 - type: map_at_5 value: 45.362 - type: mrr_at_1 value: 34.936 - type: mrr_at_10 value: 49.571 - type: mrr_at_100 value: 50.345 - type: mrr_at_1000 value: 50.363 - type: mrr_at_3 value: 45.959 - type: mrr_at_5 value: 48.165 - type: ndcg_at_1 value: 34.936 - type: ndcg_at_10 value: 55.028999999999996 - type: ndcg_at_100 value: 59.244 - type: ndcg_at_1000 value: 59.861 - type: ndcg_at_3 value: 46.872 - type: ndcg_at_5 value: 51.217999999999996 - type: precision_at_1 value: 34.936 - type: precision_at_10 value: 9.099 - type: precision_at_100 value: 1.145 - type: precision_at_1000 value: 0.12 - type: precision_at_3 value: 21.456 - type: precision_at_5 value: 15.411 - type: recall_at_1 value: 31.108000000000004 - type: recall_at_10 value: 76.53999999999999 - type: recall_at_100 value: 94.39 - type: recall_at_1000 value: 98.947 - type: recall_at_3 value: 55.572 - type: recall_at_5 value: 65.525 - task: type: Retrieval dataset: type: quora name: MTEB QuoraRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 71.56400000000001 - type: map_at_10 value: 85.482 - type: map_at_100 value: 86.114 - type: map_at_1000 value: 86.13 - type: map_at_3 value: 82.607 - type: map_at_5 value: 84.405 - type: mrr_at_1 value: 82.42 - type: mrr_at_10 value: 88.304 - type: mrr_at_100 value: 88.399 - type: mrr_at_1000 value: 88.399 - type: mrr_at_3 value: 87.37 - type: mrr_at_5 value: 88.024 - type: ndcg_at_1 value: 82.45 - type: ndcg_at_10 value: 89.06500000000001 - type: ndcg_at_100 value: 90.232 - type: ndcg_at_1000 value: 90.305 - type: ndcg_at_3 value: 86.375 - type: ndcg_at_5 value: 87.85300000000001 - type: precision_at_1 value: 82.45 - type: precision_at_10 value: 13.486999999999998 - type: precision_at_100 value: 1.534 - type: precision_at_1000 value: 0.157 - type: precision_at_3 value: 37.813 - 
type: precision_at_5 value: 24.773999999999997 - type: recall_at_1 value: 71.56400000000001 - type: recall_at_10 value: 95.812 - type: recall_at_100 value: 99.7 - type: recall_at_1000 value: 99.979 - type: recall_at_3 value: 87.966 - type: recall_at_5 value: 92.268 - task: type: Clustering dataset: type: mteb/reddit-clustering name: MTEB RedditClustering config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: v_measure value: 57.241876648614145 - task: type: Clustering dataset: type: mteb/reddit-clustering-p2p name: MTEB RedditClusteringP2P config: default split: test revision: 282350215ef01743dc01b456c7f5241fa8937f16 metrics: - type: v_measure value: 64.66212576446223 - task: type: Retrieval dataset: type: scidocs name: MTEB SCIDOCS config: default split: test revision: None metrics: - type: map_at_1 value: 5.308 - type: map_at_10 value: 13.803 - type: map_at_100 value: 16.176 - type: map_at_1000 value: 16.561 - type: map_at_3 value: 9.761000000000001 - type: map_at_5 value: 11.802 - type: mrr_at_1 value: 26.200000000000003 - type: mrr_at_10 value: 37.621 - type: mrr_at_100 value: 38.767 - type: mrr_at_1000 value: 38.815 - type: mrr_at_3 value: 34.117 - type: mrr_at_5 value: 36.107 - type: ndcg_at_1 value: 26.200000000000003 - type: ndcg_at_10 value: 22.64 - type: ndcg_at_100 value: 31.567 - type: ndcg_at_1000 value: 37.623 - type: ndcg_at_3 value: 21.435000000000002 - type: ndcg_at_5 value: 18.87 - type: precision_at_1 value: 26.200000000000003 - type: precision_at_10 value: 11.74 - type: precision_at_100 value: 2.465 - type: precision_at_1000 value: 0.391 - type: precision_at_3 value: 20.033 - type: precision_at_5 value: 16.64 - type: recall_at_1 value: 5.308 - type: recall_at_10 value: 23.794999999999998 - type: recall_at_100 value: 50.015 - type: recall_at_1000 value: 79.283 - type: recall_at_3 value: 12.178 - type: recall_at_5 value: 16.882 - task: type: STS dataset: type: mteb/sickr-sts name: MTEB SICK-R config: 
default split: test revision: a6ea5a8cab320b040a23452cc28066d9beae2cee metrics: - type: cos_sim_pearson value: 84.93231134675553 - type: cos_sim_spearman value: 81.68319292603205 - type: euclidean_pearson value: 81.8396814380367 - type: euclidean_spearman value: 81.24641903349945 - type: manhattan_pearson value: 81.84698799204274 - type: manhattan_spearman value: 81.24269997904105 - task: type: STS dataset: type: mteb/sts12-sts name: MTEB STS12 config: default split: test revision: a0d554a64d88156834ff5ae9920b964011b16384 metrics: - type: cos_sim_pearson value: 86.73241671587446 - type: cos_sim_spearman value: 79.05091082971826 - type: euclidean_pearson value: 83.91146869578044 - type: euclidean_spearman value: 79.87978465370936 - type: manhattan_pearson value: 83.90888338917678 - type: manhattan_spearman value: 79.87482848584241 - task: type: STS dataset: type: mteb/sts13-sts name: MTEB STS13 config: default split: test revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca metrics: - type: cos_sim_pearson value: 85.14970731146177 - type: cos_sim_spearman value: 86.37363490084627 - type: euclidean_pearson value: 83.02154218530433 - type: euclidean_spearman value: 83.80258761957367 - type: manhattan_pearson value: 83.01664495119347 - type: manhattan_spearman value: 83.77567458007952 - task: type: STS dataset: type: mteb/sts14-sts name: MTEB STS14 config: default split: test revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 metrics: - type: cos_sim_pearson value: 83.40474139886784 - type: cos_sim_spearman value: 82.77768789165984 - type: euclidean_pearson value: 80.7065877443695 - type: euclidean_spearman value: 81.375940662505 - type: manhattan_pearson value: 80.6507552270278 - type: manhattan_spearman value: 81.32782179098741 - task: type: STS dataset: type: mteb/sts15-sts name: MTEB STS15 config: default split: test revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 metrics: - type: cos_sim_pearson value: 87.08585968722274 - type: cos_sim_spearman value: 
88.03110031451399 - type: euclidean_pearson value: 85.74012019602384 - type: euclidean_spearman value: 86.13592849438209 - type: manhattan_pearson value: 85.74404842369206 - type: manhattan_spearman value: 86.14492318960154 - task: type: STS dataset: type: mteb/sts16-sts name: MTEB STS16 config: default split: test revision: 4d8694f8f0e0100860b497b999b3dbed754a0513 metrics: - type: cos_sim_pearson value: 84.95069052788875 - type: cos_sim_spearman value: 86.4867991595147 - type: euclidean_pearson value: 84.31013325754635 - type: euclidean_spearman value: 85.01529258006482 - type: manhattan_pearson value: 84.26995570085374 - type: manhattan_spearman value: 84.96982104986162 - task: type: STS dataset: type: mteb/sts17-crosslingual-sts name: MTEB STS17 (en-en) config: en-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 87.54617647971897 - type: cos_sim_spearman value: 87.49834181751034 - type: euclidean_pearson value: 86.01015322577122 - type: euclidean_spearman value: 84.63362652063199 - type: manhattan_pearson value: 86.13807574475706 - type: manhattan_spearman value: 84.7772370721132 - task: type: STS dataset: type: mteb/sts22-crosslingual-sts name: MTEB STS22 (en) config: en split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 67.20047755786615 - type: cos_sim_spearman value: 67.05324077987636 - type: euclidean_pearson value: 66.91930642976601 - type: euclidean_spearman value: 65.21491856099105 - type: manhattan_pearson value: 66.78756851976624 - type: manhattan_spearman value: 65.12356257740728 - task: type: STS dataset: type: mteb/stsbenchmark-sts name: MTEB STSBenchmark config: default split: test revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 metrics: - type: cos_sim_pearson value: 86.19852871539686 - type: cos_sim_spearman value: 87.5161895296395 - type: euclidean_pearson value: 84.59848645207485 - type: euclidean_spearman value: 85.26427328757919 - 
type: manhattan_pearson value: 84.59747366996524 - type: manhattan_spearman value: 85.24045855146915 - task: type: Reranking dataset: type: mteb/scidocs-reranking name: MTEB SciDocsRR config: default split: test revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab metrics: - type: map value: 87.63320317811032 - type: mrr value: 96.26242947321379 - task: type: Retrieval dataset: type: scifact name: MTEB SciFact config: default split: test revision: None metrics: - type: map_at_1 value: 60.928000000000004 - type: map_at_10 value: 70.112 - type: map_at_100 value: 70.59299999999999 - type: map_at_1000 value: 70.623 - type: map_at_3 value: 66.846 - type: map_at_5 value: 68.447 - type: mrr_at_1 value: 64.0 - type: mrr_at_10 value: 71.212 - type: mrr_at_100 value: 71.616 - type: mrr_at_1000 value: 71.64500000000001 - type: mrr_at_3 value: 68.77799999999999 - type: mrr_at_5 value: 70.094 - type: ndcg_at_1 value: 64.0 - type: ndcg_at_10 value: 74.607 - type: ndcg_at_100 value: 76.416 - type: ndcg_at_1000 value: 77.102 - type: ndcg_at_3 value: 69.126 - type: ndcg_at_5 value: 71.41300000000001 - type: precision_at_1 value: 64.0 - type: precision_at_10 value: 9.933 - type: precision_at_100 value: 1.077 - type: precision_at_1000 value: 0.11299999999999999 - type: precision_at_3 value: 26.556 - type: precision_at_5 value: 17.467 - type: recall_at_1 value: 60.928000000000004 - type: recall_at_10 value: 87.322 - type: recall_at_100 value: 94.833 - type: recall_at_1000 value: 100.0 - type: recall_at_3 value: 72.628 - type: recall_at_5 value: 78.428 - task: type: PairClassification dataset: type: mteb/sprintduplicatequestions-pairclassification name: MTEB SprintDuplicateQuestions config: default split: test revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 metrics: - type: cos_sim_accuracy value: 99.86237623762376 - type: cos_sim_ap value: 96.72586477206649 - type: cos_sim_f1 value: 93.01858362631845 - type: cos_sim_precision value: 93.4409687184662 - type: cos_sim_recall value: 
92.60000000000001 - type: dot_accuracy value: 99.78019801980199 - type: dot_ap value: 93.72748205246228 - type: dot_f1 value: 89.04109589041096 - type: dot_precision value: 87.16475095785441 - type: dot_recall value: 91.0 - type: euclidean_accuracy value: 99.85445544554456 - type: euclidean_ap value: 96.6661459876145 - type: euclidean_f1 value: 92.58337481333997 - type: euclidean_precision value: 92.17046580773042 - type: euclidean_recall value: 93.0 - type: manhattan_accuracy value: 99.85445544554456 - type: manhattan_ap value: 96.6883549244056 - type: manhattan_f1 value: 92.57598405580468 - type: manhattan_precision value: 92.25422045680239 - type: manhattan_recall value: 92.9 - type: max_accuracy value: 99.86237623762376 - type: max_ap value: 96.72586477206649 - type: max_f1 value: 93.01858362631845 - task: type: Clustering dataset: type: mteb/stackexchange-clustering name: MTEB StackExchangeClustering config: default split: test revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 metrics: - type: v_measure value: 66.39930057069995 - task: type: Clustering dataset: type: mteb/stackexchange-clustering-p2p name: MTEB StackExchangeClusteringP2P config: default split: test revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: v_measure value: 34.96398659903402 - task: type: Reranking dataset: type: mteb/stackoverflowdupquestions-reranking name: MTEB StackOverflowDupQuestions config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - type: map value: 55.946944700355395 - type: mrr value: 56.97151398438164 - task: type: Summarization dataset: type: mteb/summeval name: MTEB SummEval config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cos_sim_pearson value: 31.541657650692905 - type: cos_sim_spearman value: 31.605804192286303 - type: dot_pearson value: 28.26905996736398 - type: dot_spearman value: 27.864801765851187 - task: type: Retrieval dataset: type: trec-covid name: MTEB 
TRECCOVID config: default split: test revision: None metrics: - type: map_at_1 value: 0.22599999999999998 - type: map_at_10 value: 1.8870000000000002 - type: map_at_100 value: 9.78 - type: map_at_1000 value: 22.514 - type: map_at_3 value: 0.6669999999999999 - type: map_at_5 value: 1.077 - type: mrr_at_1 value: 82.0 - type: mrr_at_10 value: 89.86699999999999 - type: mrr_at_100 value: 89.86699999999999 - type: mrr_at_1000 value: 89.86699999999999 - type: mrr_at_3 value: 89.667 - type: mrr_at_5 value: 89.667 - type: ndcg_at_1 value: 79.0 - type: ndcg_at_10 value: 74.818 - type: ndcg_at_100 value: 53.715999999999994 - type: ndcg_at_1000 value: 47.082 - type: ndcg_at_3 value: 82.134 - type: ndcg_at_5 value: 79.81899999999999 - type: precision_at_1 value: 82.0 - type: precision_at_10 value: 78.0 - type: precision_at_100 value: 54.48 - type: precision_at_1000 value: 20.518 - type: precision_at_3 value: 87.333 - type: precision_at_5 value: 85.2 - type: recall_at_1 value: 0.22599999999999998 - type: recall_at_10 value: 2.072 - type: recall_at_100 value: 13.013 - type: recall_at_1000 value: 43.462 - type: recall_at_3 value: 0.695 - type: recall_at_5 value: 1.139 - task: type: Retrieval dataset: type: webis-touche2020 name: MTEB Touche2020 config: default split: test revision: None metrics: - type: map_at_1 value: 2.328 - type: map_at_10 value: 9.795 - type: map_at_100 value: 15.801000000000002 - type: map_at_1000 value: 17.23 - type: map_at_3 value: 4.734 - type: map_at_5 value: 6.644 - type: mrr_at_1 value: 30.612000000000002 - type: mrr_at_10 value: 46.902 - type: mrr_at_100 value: 47.495 - type: mrr_at_1000 value: 47.495 - type: mrr_at_3 value: 41.156 - type: mrr_at_5 value: 44.218 - type: ndcg_at_1 value: 28.571 - type: ndcg_at_10 value: 24.806 - type: ndcg_at_100 value: 36.419000000000004 - type: ndcg_at_1000 value: 47.272999999999996 - type: ndcg_at_3 value: 25.666 - type: ndcg_at_5 value: 25.448999999999998 - type: precision_at_1 value: 30.612000000000002 - type: 
precision_at_10 value: 23.061 - type: precision_at_100 value: 7.714 - type: precision_at_1000 value: 1.484 - type: precision_at_3 value: 26.531 - type: precision_at_5 value: 26.122 - type: recall_at_1 value: 2.328 - type: recall_at_10 value: 16.524 - type: recall_at_100 value: 47.179 - type: recall_at_1000 value: 81.22200000000001 - type: recall_at_3 value: 5.745 - type: recall_at_5 value: 9.339 - task: type: Classification dataset: type: mteb/toxic_conversations_50k name: MTEB ToxicConversationsClassification config: default split: test revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c metrics: - type: accuracy value: 70.9142 - type: ap value: 14.335574772555415 - type: f1 value: 54.62839595194111 - task: type: Classification dataset: type: mteb/tweet_sentiment_extraction name: MTEB TweetSentimentExtractionClassification config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 59.94340690435768 - type: f1 value: 60.286487936731916 - task: type: Clustering dataset: type: mteb/twentynewsgroups-clustering name: MTEB TwentyNewsgroupsClustering config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: v_measure value: 51.26597708987974 - task: type: PairClassification dataset: type: mteb/twittersemeval2015-pairclassification name: MTEB TwitterSemEval2015 config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 87.48882398521786 - type: cos_sim_ap value: 79.04326607602204 - type: cos_sim_f1 value: 71.64566826860633 - type: cos_sim_precision value: 70.55512918905092 - type: cos_sim_recall value: 72.77044854881267 - type: dot_accuracy value: 84.19264469213805 - type: dot_ap value: 67.96360043562528 - type: dot_f1 value: 64.06418393006827 - type: dot_precision value: 58.64941898706424 - type: dot_recall value: 70.58047493403694 - type: euclidean_accuracy value: 87.45902127913214 - type: euclidean_ap value: 
78.9742237648272 - type: euclidean_f1 value: 71.5553235908142 - type: euclidean_precision value: 70.77955601445535 - type: euclidean_recall value: 72.34828496042216 - type: manhattan_accuracy value: 87.41729749061214 - type: manhattan_ap value: 78.90073137580596 - type: manhattan_f1 value: 71.3942611553533 - type: manhattan_precision value: 68.52705653967483 - type: manhattan_recall value: 74.51187335092348 - type: max_accuracy value: 87.48882398521786 - type: max_ap value: 79.04326607602204 - type: max_f1 value: 71.64566826860633 - task: type: PairClassification dataset: type: mteb/twitterurlcorpus-pairclassification name: MTEB TwitterURLCorpus config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 88.68125897465751 - type: cos_sim_ap value: 85.6003454431979 - type: cos_sim_f1 value: 77.6957163958641 - type: cos_sim_precision value: 73.0110366307807 - type: cos_sim_recall value: 83.02279026793964 - type: dot_accuracy value: 87.7672992587418 - type: dot_ap value: 82.4971301112899 - type: dot_f1 value: 75.90528233151184 - type: dot_precision value: 72.0370626469368 - type: dot_recall value: 80.21250384970742 - type: euclidean_accuracy value: 88.4503434625684 - type: euclidean_ap value: 84.91949884748384 - type: euclidean_f1 value: 76.92365018444684 - type: euclidean_precision value: 74.53245721712759 - type: euclidean_recall value: 79.47336002463813 - type: manhattan_accuracy value: 88.47556952691427 - type: manhattan_ap value: 84.8963689101517 - type: manhattan_f1 value: 76.85901249256395 - type: manhattan_precision value: 74.31693989071039 - type: manhattan_recall value: 79.58115183246073 - type: max_accuracy value: 88.68125897465751 - type: max_ap value: 85.6003454431979 - type: max_f1 value: 77.6957163958641 license: mit language: - en --- <h1 align="center">FlagEmbedding</h1> <h4 align="center"> <p> <a href=#model-list>Model List</a> | <a href=#frequently-asked-questions>FAQ</a> | <a 
href=#usage>Usage</a> |
<a href="#evaluation">Evaluation</a> |
<a href="#train">Train</a> |
<a href="#contact">Contact</a> |
<a href="#citation">Citation</a> |
<a href="#license">License</a>
<p>
</h4>

For more details, please refer to our GitHub repository: [FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding).

[English](README.md) | [中文](https://github.com/FlagOpen/FlagEmbedding/blob/master/README_zh.md)

FlagEmbedding can map any text to a low-dimensional dense vector, which can be used for tasks like retrieval, classification, clustering, and semantic search. It can also be used in vector databases for LLMs.

************* 🌟**Updates**🌟 *************
- 10/12/2023: Release [LLM-Embedder](./FlagEmbedding/llm_embedder/README.md), a unified embedding model to support diverse retrieval augmentation needs for LLMs. [Paper](https://arxiv.org/pdf/2310.07554.pdf) :fire:
- 09/15/2023: The [technical report](https://arxiv.org/pdf/2309.07597.pdf) of BGE has been released
- 09/15/2023: The [massive training data](https://data.baai.ac.cn/details/BAAI-MTP) of BGE has been released
- 09/12/2023: New models:
  - **New reranker models**: release the cross-encoder models `BAAI/bge-reranker-base` and `BAAI/bge-reranker-large`, which are more powerful than the embedding models. We recommend using/fine-tuning them to re-rank the top-k documents returned by embedding models.
  - **Updated embedding models**: release the `bge-*-v1.5` embedding models to alleviate the issue of the similarity distribution and enhance retrieval ability without instruction.

<details>
<summary>More</summary>
<!-- ### More -->

- 09/07/2023: Update [fine-tune code](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md): add a script to mine hard negatives, and support adding an instruction during fine-tuning.
- 08/09/2023: BGE models are integrated into **Langchain**; you can use them like [this](#using-langchain). The C-MTEB **leaderboard** is [available](https://huggingface.co/spaces/mteb/leaderboard).
- 08/05/2023: Release base-scale and small-scale models, **best performance among models of the same size 🤗**
- 08/02/2023: Release `bge-large-*` (short for BAAI General Embedding) models, **rank 1st on the MTEB and C-MTEB benchmarks!** :tada: :tada:
- 08/01/2023: We release the [Chinese Massive Text Embedding Benchmark](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB) (**C-MTEB**), consisting of 31 test datasets.

</details>

## Model List

`bge` is short for `BAAI general embedding`.

| Model | Language | | Description | query instruction for retrieval [1] |
|:-------------------------------|:--------:| :--------:| :--------:|:--------:|
| [BAAI/llm-embedder](https://huggingface.co/BAAI/llm-embedder) | English | [Inference](./FlagEmbedding/llm_embedder/README.md) [Fine-tune](./FlagEmbedding/llm_embedder/README.md) | a unified embedding model to support diverse retrieval augmentation needs for LLMs | See [README](./FlagEmbedding/llm_embedder/README.md) |
| [BAAI/bge-reranker-large](https://huggingface.co/BAAI/bge-reranker-large) | Chinese and English | [Inference](#usage-for-reranker) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker) | a cross-encoder model, which is more accurate but less efficient [2] | |
| [BAAI/bge-reranker-base](https://huggingface.co/BAAI/bge-reranker-base) | Chinese and English | [Inference](#usage-for-reranker) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker) | a cross-encoder model, which is more accurate but less efficient [2] | |
| [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with a more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with a more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with a more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-large-zh-v1.5](https://huggingface.co/BAAI/bge-large-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with a more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-base-zh-v1.5](https://huggingface.co/BAAI/bge-base-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with a more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-small-zh-v1.5](https://huggingface.co/BAAI/bge-small-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with a more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-large-en](https://huggingface.co/BAAI/bge-large-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | :trophy: rank **1st** on the [MTEB](https://huggingface.co/spaces/mteb/leaderboard) leaderboard | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-base-en](https://huggingface.co/BAAI/bge-base-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a base-scale model with ability similar to `bge-large-en` | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-small-en](https://huggingface.co/BAAI/bge-small-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a small-scale model with competitive performance | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-large-zh](https://huggingface.co/BAAI/bge-large-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | :trophy: rank **1st** on the [C-MTEB](https://github.com/FlagOpen/FlagEmbedding/tree/master/C_MTEB) benchmark | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-base-zh](https://huggingface.co/BAAI/bge-base-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a base-scale model with ability similar to `bge-large-zh` | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-small-zh](https://huggingface.co/BAAI/bge-small-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a small-scale model with competitive performance | `为这个句子生成表示以用于检索相关文章:` |

[1\]: If you need to search for passages relevant to a query, we suggest adding the instruction to the query; in other cases, no instruction is needed: just use the original query directly. In all cases, **no instruction** needs to be added to passages.

[2\]: Unlike an embedding model, a reranker takes a question and a document as input and directly outputs a similarity score instead of an embedding. To balance accuracy and time cost, a cross-encoder is widely used to re-rank the top-k documents retrieved by other, simpler models.
For example, use the bge embedding model to retrieve the top 100 relevant documents, and then use the bge reranker to re-rank those 100 documents to get the final top-3 results.

All models have been uploaded to the Huggingface Hub, and you can see them at https://huggingface.co/BAAI. If you cannot open the Huggingface Hub, you can also download the models at https://model.baai.ac.cn/models .

## Frequently asked questions

<details>
<summary>1. How to fine-tune bge embedding model?</summary>
<!-- ### How to fine-tune bge embedding model? -->

Follow this [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) to prepare data and fine-tune your model. Some suggestions:
- Mine hard negatives following this [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune#hard-negatives), which can improve retrieval performance.
- If you pre-train bge on your own data, the pre-trained model cannot be used directly to calculate similarity; it must be fine-tuned with contrastive learning before computing similarity.
- If the accuracy of the fine-tuned model is still not high enough, it is recommended to use/fine-tune the cross-encoder model (bge-reranker) to re-rank the top-k results. Hard negatives are also needed to fine-tune the reranker.

</details>

<details>
<summary>2. The similarity score between two dissimilar sentences is higher than 0.5</summary>
<!-- ### The similarity score between two dissimilar sentences is higher than 0.5 -->

**We suggest using bge v1.5, which alleviates the issue of the similarity distribution.**

Since we fine-tune the models by contrastive learning with a temperature of 0.01, the similarity scores of the current BGE models fall roughly in the interval \[0.6, 1\]. So a similarity score greater than 0.5 does not indicate that the two sentences are similar.
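To make this concrete, here is a small illustration with made-up scores (the numbers below are hypothetical, not model outputs): when every score falls in \[0.6, 1\], a fixed 0.5 threshold labels all pairs "similar", while a threshold chosen from the score distribution on your own data still separates the two groups.

```python
# Hypothetical cosine scores from a model whose outputs cluster in [0.6, 1].
dissimilar = [0.62, 0.68, 0.71, 0.66]  # scores of unrelated sentence pairs
similar = [0.88, 0.91, 0.86, 0.93]     # scores of related sentence pairs

# A naive fixed threshold of 0.5 calls every pair "similar":
assert all(score > 0.5 for score in dissimilar + similar)

# A threshold picked from the observed distribution still separates the groups:
threshold = (max(dissimilar) + min(similar)) / 2  # midpoint, here 0.785
assert all(score > threshold for score in similar)
assert all(score <= threshold for score in dissimilar)
```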
For downstream tasks such as passage retrieval or semantic similarity, **what matters is the relative order of the scores, not their absolute values.**
If you need to filter similar sentences based on a similarity threshold, please select an appropriate threshold based on the similarity distribution on your own data (such as 0.8, 0.85, or even 0.9).

</details>

<details>
<summary>3. When does the query instruction need to be used</summary>

<!-- ### When does the query instruction need to be used -->

For `bge-*-v1.5`, we improved its retrieval ability when no instruction is used; omitting the instruction causes only a slight degradation in retrieval performance compared with using it. So, for convenience, you can generate embeddings without an instruction in all cases.

For a retrieval task that uses short queries to find long related documents, it is recommended to add instructions to these short queries. **The best way to decide whether to add instructions to queries is to choose the setting that achieves better performance on your task.** In all cases, no instruction needs to be added to the documents/passages.

</details>

## Usage

### Usage for Embedding Model

Here are some examples of using `bge` models with [FlagEmbedding](#using-flagembedding), [Sentence-Transformers](#using-sentence-transformers), [Langchain](#using-langchain), or [Huggingface Transformers](#using-huggingface-transformers).

#### Using FlagEmbedding
```
pip install -U FlagEmbedding
```
If this doesn't work for you, see [FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md) for other ways to install FlagEmbedding.
```python
from FlagEmbedding import FlagModel

sentences_1 = ["样例数据-1", "样例数据-2"]
sentences_2 = ["样例数据-3", "样例数据-4"]

model = FlagModel('BAAI/bge-large-zh-v1.5',
                  query_instruction_for_retrieval="为这个句子生成表示以用于检索相关文章:",
                  use_fp16=True)  # setting use_fp16 to True speeds up computation with a slight performance degradation

embeddings_1 = model.encode(sentences_1)
embeddings_2 = model.encode(sentences_2)
similarity = embeddings_1 @ embeddings_2.T
print(similarity)

# For an s2p (short query to long passage) retrieval task, use encode_queries(),
# which automatically adds the instruction to each query.
# The corpus in a retrieval task can still use encode() or encode_corpus(),
# since passages don't need the instruction.
queries = ['query_1', 'query_2']
passages = ["样例文档-1", "样例文档-2"]
q_embeddings = model.encode_queries(queries)
p_embeddings = model.encode(passages)
scores = q_embeddings @ p_embeddings.T
```
For the value of the argument `query_instruction_for_retrieval`, see [Model List](https://github.com/FlagOpen/FlagEmbedding/tree/master#model-list).

By default, FlagModel uses all available GPUs when encoding. Set `os.environ["CUDA_VISIBLE_DEVICES"]` to select specific GPUs, or set `os.environ["CUDA_VISIBLE_DEVICES"]=""` to make all GPUs unavailable.
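For instance, pinning encoding to a single GPU is a matter of setting the environment variable before any CUDA-using library initializes the device; a minimal sketch (pure environment setup, no GPU needed to run it):

```python
import os

# Restrict CUDA to GPU 0 only. This must be set *before* importing or
# initializing any CUDA-using library (FlagEmbedding, torch, ...).
os.environ["CUDA_VISIBLE_DEVICES"] = "0"

# An empty string hides all GPUs, forcing CPU-only encoding:
# os.environ["CUDA_VISIBLE_DEVICES"] = ""

print(os.environ["CUDA_VISIBLE_DEVICES"])
```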
#### Using Sentence-Transformers

You can also use the `bge` models with [sentence-transformers](https://www.SBERT.net):
```
pip install -U sentence-transformers
```
```python
from sentence_transformers import SentenceTransformer

sentences_1 = ["样例数据-1", "样例数据-2"]
sentences_2 = ["样例数据-3", "样例数据-4"]

model = SentenceTransformer('BAAI/bge-large-zh-v1.5')
embeddings_1 = model.encode(sentences_1, normalize_embeddings=True)
embeddings_2 = model.encode(sentences_2, normalize_embeddings=True)
similarity = embeddings_1 @ embeddings_2.T
print(similarity)
```
For an s2p (short query to long passage) retrieval task, each short query should start with an instruction (for the instructions, see [Model List](https://github.com/FlagOpen/FlagEmbedding/tree/master#model-list)). The instruction is not needed for passages.
```python
from sentence_transformers import SentenceTransformer

queries = ['query_1', 'query_2']
passages = ["样例文档-1", "样例文档-2"]
instruction = "为这个句子生成表示以用于检索相关文章:"

model = SentenceTransformer('BAAI/bge-large-zh-v1.5')
q_embeddings = model.encode([instruction + q for q in queries], normalize_embeddings=True)
p_embeddings = model.encode(passages, normalize_embeddings=True)
scores = q_embeddings @ p_embeddings.T
```

#### Using Langchain

You can use `bge` in langchain like this:
```python
from langchain.embeddings import HuggingFaceBgeEmbeddings

model_name = "BAAI/bge-large-en-v1.5"
model_kwargs = {'device': 'cuda'}
encode_kwargs = {'normalize_embeddings': True}  # set True to compute cosine similarity
model = HuggingFaceBgeEmbeddings(
    model_name=model_name,
    model_kwargs=model_kwargs,
    encode_kwargs=encode_kwargs,
    query_instruction="为这个句子生成表示以用于检索相关文章:"
)
model.query_instruction = "为这个句子生成表示以用于检索相关文章:"
```

#### Using HuggingFace Transformers

With the transformers package, you can use the model like this: first, pass your input through the transformer model, then select the last hidden state of the first token (i.e., [CLS]) as the sentence embedding.
```python
from transformers import AutoTokenizer, AutoModel
import torch

# Sentences we want sentence embeddings for
sentences = ["样例数据-1", "样例数据-2"]

# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-large-zh-v1.5')
model = AutoModel.from_pretrained('BAAI/bge-large-zh-v1.5')
model.eval()

# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# For an s2p (short query to long passage) retrieval task, add an instruction
# to each query (but not to the passages):
# encoded_input = tokenizer([instruction + q for q in queries], padding=True, truncation=True, return_tensors='pt')

# Compute token embeddings
with torch.no_grad():
    model_output = model(**encoded_input)
    # Perform pooling. In this case, cls pooling.
    sentence_embeddings = model_output[0][:, 0]
# Normalize embeddings
sentence_embeddings = torch.nn.functional.normalize(sentence_embeddings, p=2, dim=1)
print("Sentence embeddings:", sentence_embeddings)
```

### Usage for Reranker

Different from the embedding model, the reranker takes a question and a document as input and directly outputs a similarity score instead of an embedding. You can get a relevance score by feeding a query and a passage to the reranker. The reranker is optimized with cross-entropy loss, so the relevance score is not bounded to a specific range.
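If you need a bounded score, a common trick (our suggestion here, not part of the library itself) is to pass the raw logit through a sigmoid; the relative order of the scores is preserved since the sigmoid is monotonic:

```python
import math

def sigmoid(x: float) -> float:
    """Map an unbounded reranker logit to the (0, 1) interval."""
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical raw logits from a cross-encoder reranker:
# a clearly irrelevant pair and a clearly relevant one.
raw_scores = [-5.6, 5.2]
normalized = [sigmoid(s) for s in raw_scores]
print(normalized)  # low logit maps near 0, high logit maps near 1
```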
#### Using FlagEmbedding
```
pip install -U FlagEmbedding
```

Get relevance scores (higher scores indicate more relevance):
```python
from FlagEmbedding import FlagReranker

reranker = FlagReranker('BAAI/bge-reranker-large', use_fp16=True)  # setting use_fp16 to True speeds up computation with a slight performance degradation

score = reranker.compute_score(['query', 'passage'])
print(score)

scores = reranker.compute_score([['what is panda?', 'hi'], ['what is panda?', 'The giant panda (Ailuropoda melanoleuca), sometimes called a panda bear or simply panda, is a bear species endemic to China.']])
print(scores)
```

#### Using Huggingface transformers
```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-reranker-large')
model = AutoModelForSequenceClassification.from_pretrained('BAAI/bge-reranker-large')
model.eval()

pairs = [['what is panda?', 'hi'], ['what is panda?', 'The giant panda (Ailuropoda melanoleuca), sometimes called a panda bear or simply panda, is a bear species endemic to China.']]
with torch.no_grad():
    inputs = tokenizer(pairs, padding=True, truncation=True, return_tensors='pt', max_length=512)
    scores = model(**inputs, return_dict=True).logits.view(-1, ).float()
    print(scores)
```

## Evaluation

`baai-general-embedding` models achieve **state-of-the-art performance on both the MTEB and C-MTEB leaderboards!**
For more details and evaluation tools, see our [scripts](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/README.md).
- **MTEB**:

| Model Name | Dimension | Sequence Length | Average (56) | Retrieval (15) | Clustering (11) | Pair Classification (3) | Reranking (4) | STS (10) | Summarization (1) | Classification (12) |
|:----:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | 1024 | 512 | **64.23** | **54.29** | 46.08 | 87.12 | 60.03 | 83.11 | 31.61 | 75.97 |
| [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | 768 | 512 | 63.55 | 53.25 | 45.77 | 86.55 | 58.86 | 82.4 | 31.07 | 75.53 |
| [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) | 384 | 512 | 62.17 | 51.68 | 43.82 | 84.92 | 58.36 | 81.59 | 30.12 | 74.14 |
| [bge-large-en](https://huggingface.co/BAAI/bge-large-en) | 1024 | 512 | 63.98 | 53.9 | 46.98 | 85.8 | 59.48 | 81.56 | 32.06 | 76.21 |
| [bge-base-en](https://huggingface.co/BAAI/bge-base-en) | 768 | 512 | 63.36 | 53.0 | 46.32 | 85.86 | 58.7 | 81.84 | 29.27 | 75.27 |
| [gte-large](https://huggingface.co/thenlper/gte-large) | 1024 | 512 | 63.13 | 52.22 | 46.84 | 85.00 | 59.13 | 83.35 | 31.66 | 73.33 |
| [gte-base](https://huggingface.co/thenlper/gte-base) | 768 | 512 | 62.39 | 51.14 | 46.2 | 84.57 | 58.61 | 82.3 | 31.17 | 73.01 |
| [e5-large-v2](https://huggingface.co/intfloat/e5-large-v2) | 1024 | 512 | 62.25 | 50.56 | 44.49 | 86.03 | 56.61 | 82.05 | 30.19 | 75.24 |
| [bge-small-en](https://huggingface.co/BAAI/bge-small-en) | 384 | 512 | 62.11 | 51.82 | 44.31 | 83.78 | 57.97 | 80.72 | 30.53 | 74.37 |
| [instructor-xl](https://huggingface.co/hkunlp/instructor-xl) | 768 | 512 | 61.79 | 49.26 | 44.74 | 86.62 | 57.29 | 83.06 | 32.32 | 61.79 |
| [e5-base-v2](https://huggingface.co/intfloat/e5-base-v2) | 768 | 512 | 61.5 | 50.29 | 43.80 | 85.73 | 55.91 | 81.05 | 30.28 | 73.84 |
| [gte-small](https://huggingface.co/thenlper/gte-small) | 384 | 512 | 61.36 | 49.46 | 44.89 | 83.54 | 57.7 | 82.07 | 30.42 | 72.31 |
| [text-embedding-ada-002](https://platform.openai.com/docs/guides/embeddings) | 1536 | 8192 | 60.99 | 49.25 | 45.9 | 84.89 | 56.32 | 80.97 | 30.8 | 70.93 |
| [e5-small-v2](https://huggingface.co/intfloat/e5-base-v2) | 384 | 512 | 59.93 | 49.04 | 39.92 | 84.67 | 54.32 | 80.39 | 31.16 | 72.94 |
| [sentence-t5-xxl](https://huggingface.co/sentence-transformers/sentence-t5-xxl) | 768 | 512 | 59.51 | 42.24 | 43.72 | 85.06 | 56.42 | 82.63 | 30.08 | 73.42 |
| [all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2) | 768 | 514 | 57.78 | 43.81 | 43.69 | 83.04 | 59.36 | 80.28 | 27.49 | 65.07 |
| [sgpt-bloom-7b1-msmarco](https://huggingface.co/bigscience/sgpt-bloom-7b1-msmarco) | 4096 | 2048 | 57.59 | 48.22 | 38.93 | 81.9 | 55.65 | 77.74 | 33.6 | 66.19 |

- **C-MTEB**:
We created the benchmark C-MTEB for Chinese text embedding, which consists of 31 datasets across 6 tasks. Please refer to [C_MTEB](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/README.md) for a detailed introduction.
| Model | Embedding dimension | Avg | Retrieval | STS | PairClassification | Classification | Reranking | Clustering |
|:-------------------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|
| [**BAAI/bge-large-zh-v1.5**](https://huggingface.co/BAAI/bge-large-zh-v1.5) | 1024 | **64.53** | 70.46 | 56.25 | 81.6 | 69.13 | 65.84 | 48.99 |
| [BAAI/bge-base-zh-v1.5](https://huggingface.co/BAAI/bge-base-zh-v1.5) | 768 | 63.13 | 69.49 | 53.72 | 79.75 | 68.07 | 65.39 | 47.53 |
| [BAAI/bge-small-zh-v1.5](https://huggingface.co/BAAI/bge-small-zh-v1.5) | 512 | 57.82 | 61.77 | 49.11 | 70.41 | 63.96 | 60.92 | 44.18 |
| [BAAI/bge-large-zh](https://huggingface.co/BAAI/bge-large-zh) | 1024 | 64.20 | 71.53 | 54.98 | 78.94 | 68.32 | 65.11 | 48.39 |
| [bge-large-zh-noinstruct](https://huggingface.co/BAAI/bge-large-zh-noinstruct) | 1024 | 63.53 | 70.55 | 53 | 76.77 | 68.58 | 64.91 | 50.01 |
| [BAAI/bge-base-zh](https://huggingface.co/BAAI/bge-base-zh) | 768 | 62.96 | 69.53 | 54.12 | 77.5 | 67.07 | 64.91 | 47.63 |
| [multilingual-e5-large](https://huggingface.co/intfloat/multilingual-e5-large) | 1024 | 58.79 | 63.66 | 48.44 | 69.89 | 67.34 | 56.00 | 48.23 |
| [BAAI/bge-small-zh](https://huggingface.co/BAAI/bge-small-zh) | 512 | 58.27 | 63.07 | 49.45 | 70.35 | 63.64 | 61.48 | 45.09 |
| [m3e-base](https://huggingface.co/moka-ai/m3e-base) | 768 | 57.10 | 56.91 | 50.47 | 63.99 | 67.52 | 59.34 | 47.68 |
| [m3e-large](https://huggingface.co/moka-ai/m3e-large) | 1024 | 57.05 | 54.75 | 50.42 | 64.3 | 68.2 | 59.66 | 48.88 |
| [multilingual-e5-base](https://huggingface.co/intfloat/multilingual-e5-base) | 768 | 55.48 | 61.63 | 46.49 | 67.07 | 65.35 | 54.35 | 40.68 |
| [multilingual-e5-small](https://huggingface.co/intfloat/multilingual-e5-small) | 384 | 55.38 | 59.95 | 45.27 | 66.45 | 65.85 | 53.86 | 45.26 |
| [text-embedding-ada-002(OpenAI)](https://platform.openai.com/docs/guides/embeddings/what-are-embeddings) | 1536 | 53.02 | 52.0 | 43.35 | 69.56 | 64.31 | 54.28 | 45.68 |
| [luotuo](https://huggingface.co/silk-road/luotuo-bert-medium) | 1024 | 49.37 | 44.4 | 42.78 | 66.62 | 61 | 49.25 | 44.39 |
| [text2vec-base](https://huggingface.co/shibing624/text2vec-base-chinese) | 768 | 47.63 | 38.79 | 43.41 | 67.41 | 62.19 | 49.45 | 37.66 |
| [text2vec-large](https://huggingface.co/GanymedeNil/text2vec-large-chinese) | 1024 | 47.36 | 41.94 | 44.97 | 70.86 | 60.66 | 49.16 | 30.02 |

- **Reranking**:
See [C_MTEB](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/) for the evaluation script.

| Model | T2Reranking | T2RerankingZh2En\* | T2RerankingEn2Zh\* | MMarcoReranking | CMedQAv1 | CMedQAv2 | Avg |
|:-------------------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|
| text2vec-base-multilingual | 64.66 | 62.94 | 62.51 | 14.37 | 48.46 | 48.6 | 50.26 |
| multilingual-e5-small | 65.62 | 60.94 | 56.41 | 29.91 | 67.26 | 66.54 | 57.78 |
| multilingual-e5-large | 64.55 | 61.61 | 54.28 | 28.6 | 67.42 | 67.92 | 57.4 |
| multilingual-e5-base | 64.21 | 62.13 | 54.68 | 29.5 | 66.23 | 66.98 | 57.29 |
| m3e-base | 66.03 | 62.74 | 56.07 | 17.51 | 77.05 | 76.76 | 59.36 |
| m3e-large | 66.13 | 62.72 | 56.1 | 16.46 | 77.76 | 78.27 | 59.57 |
| bge-base-zh-v1.5 | 66.49 | 63.25 | 57.02 | 29.74 | 80.47 | 84.88 | 63.64 |
| bge-large-zh-v1.5 | 65.74 | 63.39 | 57.03 | 28.74 | 83.45 | 85.44 | 63.97 |
| [BAAI/bge-reranker-base](https://huggingface.co/BAAI/bge-reranker-base) | 67.28 | 63.95 | 60.45 | 35.46 | 81.26 | 84.1 | 65.42 |
| [BAAI/bge-reranker-large](https://huggingface.co/BAAI/bge-reranker-large) | 67.6 | 64.03 | 61.44 | 37.16 | 82.15 | 84.18 | 66.09 |

\* : T2RerankingZh2En and T2RerankingEn2Zh are cross-language retrieval tasks

## Train

### BAAI Embedding

We pre-train the models using [retromae](https://github.com/staoxiao/RetroMAE) and then train them on large-scale pair data using contrastive learning.
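As a rough illustration of that contrastive objective (using the 0.01 temperature mentioned in the FAQ above; the actual loss lives in the FlagEmbedding training code, so this is only a sketch), an in-batch InfoNCE loss in plain NumPy might look like:

```python
import numpy as np

def info_nce_loss(q: np.ndarray, p: np.ndarray, temperature: float = 0.01) -> float:
    """In-batch contrastive loss: q[i] should match p[i] against all other p[j].

    q, p: L2-normalized query/passage embeddings, shape (batch, dim).
    """
    logits = (q @ p.T) / temperature             # cosine similarities, scaled
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # cross-entropy with the i-th passage (the diagonal) as the positive target
    return float(-np.mean(np.diag(log_probs)))

rng = np.random.default_rng(0)
emb = rng.normal(size=(4, 8))
emb /= np.linalg.norm(emb, axis=1, keepdims=True)
# Perfect positives: each query equals its own passage, so at this low
# temperature the positive dominates and the loss is near zero.
loss_easy = info_nce_loss(emb, emb)
print(loss_easy)
```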
**You can fine-tune the embedding model on your data following our [examples](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune).**
We also provide a [pre-train example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/pretrain).
Note that the goal of pre-training is to reconstruct the text; the pre-trained model cannot be used for similarity calculation directly and needs to be fine-tuned first.
For more training details of bge, see [baai_general_embedding](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md).

### BGE Reranker

The cross-encoder performs full attention over the input pair, which is more accurate than the embedding model (i.e., a bi-encoder) but more time-consuming. Therefore, it can be used to re-rank the top-k documents returned by the embedding model.
We train the cross-encoder on multilingual pair data. The data format is the same as for the embedding model, so you can fine-tune it easily following our [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker).
For more details, please refer to [./FlagEmbedding/reranker/README.md](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/reranker)

## Contact

If you have any questions or suggestions related to this project, feel free to open an issue or pull request.
You can also email Shitao Xiao (stxiao@baai.ac.cn) and Zheng Liu (liuzheng@baai.ac.cn).

## Citation

If you find this repository useful, please consider giving it a star :star: and a citation

```
@misc{bge_embedding,
      title={C-Pack: Packaged Resources To Advance General Chinese Embedding},
      author={Shitao Xiao and Zheng Liu and Peitian Zhang and Niklas Muennighoff},
      year={2023},
      eprint={2309.07597},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```

## License

FlagEmbedding is licensed under the [MIT License](https://github.com/FlagOpen/FlagEmbedding/blob/master/LICENSE).
The released models can be used for commercial purposes free of charge.
beomi/kcbert-base
2023-03-30T08:52:15.000Z
[ "transformers", "pytorch", "jax", "safetensors", "bert", "fill-mask", "korean", "ko", "arxiv:1810.04805", "doi:10.57967/hf/0016", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
fill-mask
beomi
null
null
beomi/kcbert-base
7
154,044
transformers
2022-03-02T23:29:05
---
language: ko
license: apache-2.0
tags:
- korean
---

# KcBERT: Korean comments BERT

** Updates on 2021.04.07 **

- KcELECTRA has been released! 🤗
- Trained on an even larger dataset with a larger general vocabulary, KcELECTRA outperforms KcBERT **on every task**.
- Try it yourself via the GitHub link below!
- https://github.com/Beomi/KcELECTRA

** Updates on 2021.03.14 **

- Added the KcBERT paper citation (bibtex).
- Added KcBERT finetuning performance scores to this card.

** Updates on 2020.12.04 **

With the Huggingface Transformers v4.0.0 update, parts of the tutorial code have changed.

Updated KcBERT-Large NSMC finetuning Colab: <a href="https://colab.research.google.com/drive/1dFC0FL-521m7CL_PSd8RLKq67jgTJVhL?usp=sharing"> <img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/> </a>

** Updates on 2020.09.11 **

A tutorial for pretraining KcBERT on a TPU in Google Colab is now available! Click the button below.

Pretrain KcBERT on a TPU in Colab: <a href="https://colab.research.google.com/drive/1lYBYtaXqt9S733OXdXvrvC09ysKFN30W"> <img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/> </a>

The tutorial trains on a subset (144MB) of the full 12GB corpus, and uses the [Korpora](https://github.com/ko-nlp/Korpora) package, which makes Korean datasets/corpora easier to work with.

** Updates on 2020.09.08 **

The training data has been uploaded via GitHub Releases. Due to the 2GB-per-file limit it is split into several compressed archives; download them via the link below (no sign-up required; split archives).

If you prefer a single file, or want to explore the data on Kaggle, use the Kaggle dataset below.

- GitHub release: https://github.com/Beomi/KcBERT/releases/tag/TrainData_v1

** Updates on 2020.08.22 **

Pretraining dataset released

- Kaggle: https://www.kaggle.com/junbumlee/kcbert-pretraining-corpus-korean-news-comments (available as a single file)

The dataset cleaned for training (via the `clean` step described below) has been published on Kaggle! Download it and try training on a variety of tasks :)

---

Most publicly released Korean BERT models are trained on well-curated data such as Korean Wikipedia, news articles, and books. In contrast, comment-style datasets such as NSMC are uncurated: they are colloquial, full of neologisms, and contain typos and other expressions that rarely appear in formal writing.

KcBERT is a pretrained BERT model whose tokenizer and weights were trained from scratch on comments and replies collected from Naver News, precisely so that it can be applied to datasets with these characteristics.
KcBERT can be loaded directly through Huggingface's Transformers library (no separate file download required).

## KcBERT Performance

- Finetuning code is available at https://github.com/Beomi/KcBERT-finetune

|                       | Size | **NSMC**<br/>(acc) | **Naver NER**<br/>(F1) | **PAWS**<br/>(acc) | **KorNLI**<br/>(acc) | **KorSTS**<br/>(spearman) | **Question Pair**<br/>(acc) | **KorQuaD (Dev)**<br/>(EM/F1) |
| :-------------------- | :---: | :----------------: | :--------------------: | :----------------: | :------------------: | :-----------------------: | :-------------------------: | :---------------------------: |
| KcBERT-Base           | 417M  | 89.62 | 84.34 | 66.95 | 74.85 | 75.57 | 93.93 | 60.25 / 84.39 |
| KcBERT-Large          | 1.2G  | **90.68** | 85.53 | 70.15 | 76.99 | 77.49 | 94.06 | 62.16 / 86.64 |
| KoBERT                | 351M  | 89.63 | 86.11 | 80.65 | 79.00 | 79.64 | 93.93 | 52.81 / 80.27 |
| XLM-Roberta-Base      | 1.03G | 89.49 | 86.26 | 82.95 | 79.92 | 79.09 | 93.53 | 64.70 / 88.94 |
| HanBERT               | 614M  | 90.16 | **87.31** | 82.40 | **80.89** | 83.33 | 94.19 | 78.74 / 92.02 |
| KoELECTRA-Base        | 423M  | **90.21** | 86.87 | 81.90 | 80.85 | 83.21 | 94.20 | 61.10 / 89.59 |
| KoELECTRA-Base-v2     | 423M  | 89.70 | 87.02 | **83.90** | 80.61 | **84.30** | **94.72** | **84.34 / 92.58** |
| DistilKoBERT          | 108M  | 88.41 | 84.13 | 62.55 | 70.55 | 73.21 | 92.48 | 54.12 / 77.80 |

\*HanBERT's size includes both the BERT model and the tokenizer DB.

\***Scores were obtained with the default config settings as-is; additional hyperparameter tuning may yield better results.**

## How to use

### Requirements

- `pytorch <= 1.8.0`
- `transformers ~= 3.0.1`
  - `transformers ~= 4.0.0` is also compatible.
- `emoji ~= 0.6.0`
- `soynlp ~= 0.0.493`

```python
from transformers import AutoTokenizer, AutoModelWithLMHead

# Base Model (108M)
tokenizer = AutoTokenizer.from_pretrained("beomi/kcbert-base")
model = AutoModelWithLMHead.from_pretrained("beomi/kcbert-base")

# Large Model (334M)
tokenizer = AutoTokenizer.from_pretrained("beomi/kcbert-large")
model = AutoModelWithLMHead.from_pretrained("beomi/kcbert-large")
```

### Pretrain & Finetune Colab links

#### Pretrain Data

- [Dataset download (Kaggle, single file, login required)](https://www.kaggle.com/junbumlee/kcbert-pretraining-corpus-korean-news-comments)
- [Dataset download (Github, multiple compressed files, no login required)](https://github.com/Beomi/KcBERT/releases/tag/TrainData_v1)

#### Pretrain Code

Pretrain KcBERT on a TPU in Colab: <a href="https://colab.research.google.com/drive/1lYBYtaXqt9S733OXdXvrvC09ysKFN30W"> <img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/> </a>

#### Finetune Samples

**KcBERT-Base** NSMC finetuning with PyTorch-Lightning (Colab) <a href="https://colab.research.google.com/drive/1fn4sVJ82BrrInjq6y5655CYPP-1UKCLb?usp=sharing"> <img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/> </a>

**KcBERT-Large** NSMC finetuning with PyTorch-Lightning (Colab) <a href="https://colab.research.google.com/drive/1dFC0FL-521m7CL_PSd8RLKq67jgTJVhL?usp=sharing"> <img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/> </a>

> The two notebooks above differ only in the pretrained model (base vs. large) and the batch size; the rest of the code is identical.

## Train Data & Preprocessing

### Raw Data

The training data consists of all **comments and replies** collected from **heavily commented news articles** written between 2019.01.01 and 2020.06.15. After text-only extraction it amounts to **about 15.4GB, more than 110 million sentences**.

### Preprocessing

The following preprocessing steps were applied for pretraining:

1. Korean, English, special characters, and even emoji (🥳)!

   A regular expression keeps Korean, English, special characters, and emoji as training input. The Korean range was set to `ㄱ-ㅎ가-힣`, excluding the Hanja that would otherwise fall inside `ㄱ-힣`.

2. Collapsing repeated characters in comments

   Runs of repeated characters such as `ㅋㅋㅋㅋㅋ` were collapsed to `ㅋㅋ`.

3.
Cased model

   KcBERT is a cased model: it preserves upper/lower case for English text.

4. Removing texts of 10 characters or fewer

   Texts shorter than 10 characters were removed, since they usually consist of a single word.

5. Deduplication

   Duplicate comments were merged into one to remove repeated text.

The final training data built this way amounts to **12.5GB, about 89 million sentences**.

Install the packages below with pip and run the `clean` function over your text; this improves downstream task performance (fewer `[UNK]` tokens).

```bash
pip install soynlp emoji
```

Apply the `clean` function below to your text data.

```python
import re
import emoji
from soynlp.normalizer import repeat_normalize

emojis = list({y for x in emoji.UNICODE_EMOJI.values() for y in x.keys()})
emojis = ''.join(emojis)
pattern = re.compile(f'[^ .,?!/@$%~%·∼()\x00-\x7Fㄱ-ㅣ가-힣{emojis}]+')
url_pattern = re.compile(
    r'https?:\/\/(www\.)?[-a-zA-Z0-9@:%._\+~#=]{1,256}\.[a-zA-Z0-9()]{1,6}\b([-a-zA-Z0-9()@:%_\+.~#?&//=]*)')

def clean(x):
    x = pattern.sub(' ', x)
    x = url_pattern.sub('', x)
    x = x.strip()
    x = repeat_normalize(x, num_repeats=2)
    return x
```

### Cleaned Data (Released on Kaggle)

The roughly 12GB of txt files produced by cleaning the raw data with the `clean` function above can be downloaded from the Kaggle dataset below :)

https://www.kaggle.com/junbumlee/kcbert-pretraining-corpus-korean-news-comments

## Tokenizer Train

The tokenizer was trained with Huggingface's [Tokenizers](https://github.com/huggingface/tokenizers) library, using its `BertWordPieceTokenizer` with a vocab size of `30000`.

Tokenizer training used a `1/10` sample of the data, stratified by date so that the sample is spread evenly across the collection period.
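If `emoji` and `soynlp` are unavailable, the same cleaning idea can be approximated with the standard library alone. This is a rough, hypothetical sketch rather than the exact pretraining pipeline: the emoji whitelist is dropped, and soynlp's `repeat_normalize` is replaced by a simple character-level regex.

```python
import re

# Keep Hangul, ASCII, and common punctuation; everything else becomes a space.
# (Simplified: the original pattern also whitelists emoji via the `emoji` package.)
keep_pattern = re.compile(r'[^ .,?!/@$%~·∼()\x00-\x7Fㄱ-ㅣ가-힣]+')
url_pattern = re.compile(
    r'https?://(www\.)?[-a-zA-Z0-9@:%._+~#=]{1,256}\.[a-zA-Z0-9()]{1,6}\b'
    r'([-a-zA-Z0-9()@:%_+.~#?&/=]*)')
# Collapse runs of 3+ identical characters down to 2,
# approximating soynlp's repeat_normalize(num_repeats=2).
repeat_pattern = re.compile(r'(.)\1{2,}')

def clean(x: str) -> str:
    x = keep_pattern.sub(' ', x)
    x = url_pattern.sub('', x)
    x = repeat_pattern.sub(r'\1\1', x)
    return x.strip()

print(clean('ㅋㅋㅋㅋㅋ 최고!!! https://example.com'))  # → 'ㅋㅋ 최고!!'
```

Because the character-level collapse is cruder than soynlp's normalizer, treat this only as a stand-in for quick experiments, not for reproducing the released corpus.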
## BERT Model Pretrain

- KcBERT Base config

```json
{
  "max_position_embeddings": 300,
  "hidden_dropout_prob": 0.1,
  "hidden_act": "gelu",
  "initializer_range": 0.02,
  "num_hidden_layers": 12,
  "type_vocab_size": 2,
  "vocab_size": 30000,
  "hidden_size": 768,
  "attention_probs_dropout_prob": 0.1,
  "directionality": "bidi",
  "num_attention_heads": 12,
  "intermediate_size": 3072,
  "architectures": [
    "BertForMaskedLM"
  ],
  "model_type": "bert"
}
```

- KcBERT Large config

```json
{
  "type_vocab_size": 2,
  "initializer_range": 0.02,
  "max_position_embeddings": 300,
  "vocab_size": 30000,
  "hidden_size": 1024,
  "hidden_dropout_prob": 0.1,
  "model_type": "bert",
  "directionality": "bidi",
  "pad_token_id": 0,
  "layer_norm_eps": 1e-12,
  "hidden_act": "gelu",
  "num_hidden_layers": 24,
  "num_attention_heads": 16,
  "attention_probs_dropout_prob": 0.1,
  "intermediate_size": 4096,
  "architectures": [
    "BertForMaskedLM"
  ]
}
```

The BERT model configs use the standard Base and Large default settings (15% MLM masking, etc.).

Training ran on a TPU `v3-8` for about 3 days (Base) and longer for Large (still in progress at the time of writing); the checkpoints currently published on Huggingface were trained for 1M (one million) steps.

Looking at the training loss over steps, the loss drops fastest during the first 200k steps and then decreases gradually after 400k steps.

- Base Model Loss

![KcBERT-Base Pretraining Loss](https://raw.githubusercontent.com/Beomi/KcBERT/master/img/image-20200719183852243.38b124.png)

- Large Model Loss

![KcBERT-Large Pretraining Loss](https://raw.githubusercontent.com/Beomi/KcBERT/master/img/image-20200806160746694.d56fa1.png)

Training was done on GCP TPU v3-8; the Base model took about 2.5 days, and the Large model ran for about 5 days, after which the checkpoint with the lowest loss was selected.

## Example

### HuggingFace MASK LM

You can try the model directly on the [HuggingFace kcbert-base model page](https://huggingface.co/beomi/kcbert-base?text=오늘은+날씨가+[MASK]) as shown below.

![오늘은 날씨가 "좋네요", KcBERT-Base](https://raw.githubusercontent.com/Beomi/KcBERT/master/img/image-20200719205919389.5670d6.png)

You can of course also test the [kcbert-large model](https://huggingface.co/beomi/kcbert-large?text=오늘은+날씨가+[MASK]).
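The KcBERT-Base config above maps directly onto Transformers' `BertConfig`. As a minimal sketch (assuming the `transformers` package is installed), the architecture can be instantiated randomly initialized, without downloading any weights:

```python
from transformers import BertConfig

# Build a config matching the KcBERT-Base JSON above.
base_config = BertConfig(
    vocab_size=30000,
    hidden_size=768,
    num_hidden_layers=12,
    num_attention_heads=12,
    intermediate_size=3072,
    max_position_embeddings=300,  # shorter than vanilla BERT's 512
    type_vocab_size=2,
    hidden_act="gelu",
    hidden_dropout_prob=0.1,
    attention_probs_dropout_prob=0.1,
    initializer_range=0.02,
)

print(base_config.max_position_embeddings)  # 300
```

Note that `max_position_embeddings` is 300 rather than BERT's usual 512, so inputs longer than 300 tokens must be truncated before being fed to KcBERT.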
![image-20200806160624340](https://raw.githubusercontent.com/Beomi/KcBERT/master/img/image-20200806160624340.58f9be.png)

### NSMC Binary Classification

As a quick performance check, we finetuned the model on the [Naver movie review corpus (NSMC)](https://github.com/e9t/nsmc).

You can run the Base model finetuning code yourself at <a href="https://colab.research.google.com/drive/1fn4sVJ82BrrInjq6y5655CYPP-1UKCLb?usp=sharing"> <img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/> </a>

and the Large model finetuning code at <a href="https://colab.research.google.com/drive/1dFC0FL-521m7CL_PSd8RLKq67jgTJVhL?usp=sharing"> <img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/> </a>

- On a single P100 GPU, one epoch takes 2-3 hours; on a TPU, under 1 hour per epoch.
- On 4x RTX Titan GPUs, about 30 minutes per epoch.
- The example code was developed with [pytorch-lightning](https://github.com/PyTorchLightning/pytorch-lightning).

#### Results

- KcBERT-Base: Val acc `.8905`

![KcBERT Base finetune on NSMC](https://raw.githubusercontent.com/Beomi/KcBERT/master/img/image-20200719201102895.ddbdfc.png)

- KcBERT-Large: Val acc `.9089`

![image-20200806190242834](https://raw.githubusercontent.com/Beomi/KcBERT/master/img/image-20200806190242834.56d6ee.png)

> Tests on a wider range of downstream tasks are planned and will be published.

## Citation

Please cite KcBERT using the entry below.

```
@inproceedings{lee2020kcbert,
  title={KcBERT: Korean Comments BERT},
  author={Lee, Junbum},
  booktitle={Proceedings of the 32nd Annual Conference on Human and Cognitive Language Technology},
  pages={437--440},
  year={2020}
}
```

- Proceedings download link: http://hclt.kr/dwn/?v=bG5iOmNvbmZlcmVuY2U7aWR4OjMy (or http://hclt.kr/symp/?lnb=conference )

## Acknowledgement

The GCP/TPU environment used to train KcBERT was supported by the [TFRC](https://www.tensorflow.org/tfrc?hl=ko) program.
Thanks to [Monologg](https://github.com/monologg/) for the many pieces of advice during model training :)

## Reference

### Github Repos

- [BERT by Google](https://github.com/google-research/bert)
- [KoBERT by SKT](https://github.com/SKTBrain/KoBERT)
- [KoELECTRA by Monologg](https://github.com/monologg/KoELECTRA/)
- [Transformers by Huggingface](https://github.com/huggingface/transformers)
- [Tokenizers by Huggingface](https://github.com/huggingface/tokenizers)

### Papers

- [BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding](https://arxiv.org/abs/1810.04805)

### Blogs

- [Monologg's KoELECTRA training notes](https://monologg.kr/categories/NLP/ELECTRA/)
- [Pretraining BERT from scratch on a Colab TPU - Tensorflow/Google ver.](https://beomi.github.io/2020/02/26/Train-BERT-from-scratch-on-colab-TPU-Tensorflow-ver/)
12,791
flair/ner-german
2023-04-05T09:42:58.000Z
[ "flair", "pytorch", "token-classification", "sequence-tagger-model", "de", "dataset:conll2003", "region:us" ]
token-classification
flair
null
null
flair/ner-german
11
153,311
flair
2022-03-02T23:29:05
---
tags:
- flair
- token-classification
- sequence-tagger-model
language: de
datasets:
- conll2003
widget:
- text: "George Washington ging nach Washington"
---

## German NER in Flair (default model)

This is the standard 4-class NER model for German that ships with [Flair](https://github.com/flairNLP/flair/).

F1-Score: **87,94** (CoNLL-03 German revised)

Predicts 4 tags:

| **tag** | **meaning** |
|---------|-------------|
| PER | person name |
| LOC | location name |
| ORG | organization name |
| MISC | other name |

Based on [Flair embeddings](https://www.aclweb.org/anthology/C18-1139/) and LSTM-CRF.

---

### Demo: How to use in Flair

Requires: **[Flair](https://github.com/flairNLP/flair/)** (`pip install flair`)

```python
from flair.data import Sentence
from flair.models import SequenceTagger

# load tagger
tagger = SequenceTagger.load("flair/ner-german")

# make example sentence
sentence = Sentence("George Washington ging nach Washington")

# predict NER tags
tagger.predict(sentence)

# print sentence
print(sentence)

# print predicted NER spans
print('The following NER tags are found:')
# iterate over entities and print
for entity in sentence.get_spans('ner'):
    print(entity)
```

This yields the following output:

```
Span [1,2]: "George Washington" [− Labels: PER (0.9977)]
Span [5]: "Washington" [− Labels: LOC (0.9895)]
```

So, the entities "*George Washington*" (labeled as a **person**) and "*Washington*" (labeled as a **location**) are found in the sentence "*George Washington ging nach Washington*".

---

### Training: Script to train this model

The following Flair script was used to train this model:

```python
from flair.data import Corpus
from flair.datasets import CONLL_03_GERMAN
from flair.embeddings import WordEmbeddings, StackedEmbeddings, FlairEmbeddings

# 1. get the corpus
corpus: Corpus = CONLL_03_GERMAN()

# 2. what tag do we want to predict?
tag_type = 'ner'

# 3.
# make the tag dictionary from the corpus
tag_dictionary = corpus.make_tag_dictionary(tag_type=tag_type)

# 4. initialize each embedding we use
embedding_types = [

    # GloVe embeddings
    WordEmbeddings('de'),

    # contextual string embeddings, forward
    FlairEmbeddings('de-forward'),

    # contextual string embeddings, backward
    FlairEmbeddings('de-backward'),
]

# embedding stack consists of Flair and GloVe embeddings
embeddings = StackedEmbeddings(embeddings=embedding_types)

# 5. initialize sequence tagger
from flair.models import SequenceTagger

tagger = SequenceTagger(hidden_size=256,
                        embeddings=embeddings,
                        tag_dictionary=tag_dictionary,
                        tag_type=tag_type)

# 6. initialize trainer
from flair.trainers import ModelTrainer

trainer = ModelTrainer(tagger, corpus)

# 7. run training
trainer.train('resources/taggers/ner-german',
              train_with_dev=True,
              max_epochs=150)
```

---

### Cite

Please cite the following paper when using this model.

```
@inproceedings{akbik2018coling,
  title={Contextual String Embeddings for Sequence Labeling},
  author={Akbik, Alan and Blythe, Duncan and Vollgraf, Roland},
  booktitle = {{COLING} 2018, 27th International Conference on Computational Linguistics},
  pages     = {1638--1649},
  year      = {2018}
}
```

---

### Issues?

The Flair issue tracker is available [here](https://github.com/flairNLP/flair/issues/).
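The demo in this card prints spans like `Span [1,2]: "George Washington" [− Labels: PER]`; under the hood, such spans come from decoding per-token BIO tags. As a minimal illustration (not Flair's actual implementation), a BIO-to-span decoder could look like this:

```python
def bio_to_spans(tokens, tags):
    """Decode per-token BIO tags (e.g. 'B-PER', 'I-PER', 'O') into
    (entity_text, label) spans, mirroring the Span output shown above."""
    spans, current, label = [], [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith('B-'):
            if current:
                spans.append((' '.join(current), label))
            current, label = [token], tag[2:]
        elif tag.startswith('I-') and current and tag[2:] == label:
            current.append(token)
        else:  # 'O' (or an inconsistent I- tag) closes any open span
            if current:
                spans.append((' '.join(current), label))
            current, label = [], None
    if current:
        spans.append((' '.join(current), label))
    return spans

tokens = ['George', 'Washington', 'ging', 'nach', 'Washington']
tags   = ['B-PER',  'I-PER',      'O',    'O',    'B-LOC']
print(bio_to_spans(tokens, tags))
# → [('George Washington', 'PER'), ('Washington', 'LOC')]
```

In practice you would use `sentence.get_spans('ner')` as shown in the demo above; this sketch only clarifies what the sequence tagger's tag-level output represents.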
3,500
0.0203857421875, -0.03985595703125, 0.00672149658203125, 0.0170440673828125, -0.014984130859375, 0.061492919921875, 0.06634521484375, -0.03338623046875, -0.006267547607421875, -0.031951904296875, -0.02471923828125, -0.0291748046875, 0.04388427734375, -0.03271484375, -0.06085205078125, 0.039794921875, 0.01024627685546875, 0.002597808837890625, 0.066650390625, 0.04180908203125, -0.0019893646240234375, 0.08233642578125, 0.053436279296875, -0.020965576171875, 0.036529541015625, -0.035186767578125, 0.00536346435546875, -0.05145263671875, -0.011566162109375, -0.03369140625, -0.01166534423828125, -0.05645751953125, -0.01519775390625, 0.01508331298828125, 0.029296875, -0.052886962890625, 0.041473388671875, -0.034942626953125, 0.01557159423828125, 0.0531005859375, -0.0102081298828125, -0.007129669189453125, -0.0020885467529296875, -0.0193939208984375, -0.0107421875, -0.05792236328125, -0.043609619140625, 0.06524658203125, 0.03271484375, 0.0494384765625, 0.0022106170654296875, 0.0703125, -0.0038604736328125, 0.0251617431640625, -0.0672607421875, 0.031280517578125, -0.005542755126953125, -0.068359375, -0.0069427490234375, -0.018890380859375, -0.0614013671875, 0.014312744140625, -0.02313232421875, -0.06597900390625, 0.0198974609375, 0.00787353515625, -0.031982421875, 0.0245513916015625, -0.030609130859375, 0.07470703125, -0.00616455078125, -0.014801025390625, 0.023345947265625, -0.06829833984375, 0.0180206298828125, 0.00272369384765625, 0.02423095703125, -0.0119781494140625, 0.00638580322265625, 0.07904052734375, -0.01276397705078125, 0.078857421875, -0.00510406494140625, 0.01605224609375, 0.00982666015625, 0.000629425048828125, 0.031829833984375, 0.003726959228515625, -0.017669677734375, 0.01318359375, -0.0035152435302734375, -0.0103607177734375, -0.0038928985595703125, 0.0450439453125, -0.057464599609375, -0.0305328369140625, -0.0709228515625, -0.0189666748046875, -0.00884246826171875, 0.0303192138671875, 0.055084228515625, 0.03167724609375, -0.02764892578125, 
-0.00699615478515625, 0.03240966796875, -0.01490020751953125, 0.0535888671875, 0.04071044921875, -0.023834228515625, -0.04852294921875, 0.06170654296875, 0.0008983612060546875, -0.003986358642578125, 0.0209197998046875, 0.01006317138671875, -0.0272216796875, -0.00881195068359375, -0.0297088623046875, 0.03515625, -0.055084228515625, -0.03643798828125, -0.04852294921875, -0.02362060546875, -0.055023193359375, -0.01137542724609375, -0.01155853271484375, -0.0300750732421875, -0.0609130859375, -0.0033435821533203125, 0.0265045166015625, 0.054779052734375, -0.0273590087890625, 0.03228759765625, -0.05072021484375, -0.01184844970703125, -0.005596160888671875, 0.0112762451171875, -0.00585174560546875, -0.06817626953125, -0.0215911865234375, -0.006824493408203125, -0.032928466796875, -0.0849609375, 0.07183837890625, 0.02313232421875, 0.040130615234375, 0.0280609130859375, -0.008453369140625, 0.0345458984375, -0.038665771484375, 0.06317138671875, 0.0132293701171875, -0.0672607421875, 0.03515625, -0.025390625, 0.016510009765625, 0.0195465087890625, 0.0538330078125, -0.041839599609375, -0.0036144256591796875, -0.07196044921875, -0.06719970703125, 0.060791015625, -0.0100860595703125, 0.0114593505859375, -0.0304107666015625, 0.01039886474609375, -0.0091094970703125, -0.0013513565063476562, -0.07708740234375, -0.04620361328125, -0.01776123046875, -0.00997161865234375, -0.032958984375, -0.0153350830078125, 0.0084075927734375, -0.040374755859375, 0.094482421875, -0.0054473876953125, 0.03265380859375, 0.024200439453125, 0.0021495819091796875, 0.003925323486328125, 0.01904296875, 0.04510498046875, 0.019073486328125, -0.0323486328125, -0.00859832763671875, 0.0215606689453125, -0.005573272705078125, -0.00736236572265625, 0.0207366943359375, -0.011962890625, 0.0240631103515625, 0.036834716796875, 0.0745849609375, 0.01486968994140625, -0.01169586181640625, 0.0404052734375, 0.00897979736328125, -0.0207061767578125, -0.04034423828125, -0.01490020751953125, 0.0145721435546875, 
0.01215362548828125, 0.01453399658203125, 0.01430511474609375, -0.0009131431579589844, -0.030670166015625, 0.0172271728515625, 0.03131103515625, -0.038360595703125, -0.038665771484375, 0.06451416015625, 0.0018682479858398438, -0.01114654541015625, 0.036041259765625, -0.03765869140625, -0.0653076171875, 0.0478515625, 0.0543212890625, 0.059417724609375, -0.01513671875, 0.01019287109375, 0.06829833984375, 0.005039215087890625, -0.0173797607421875, 0.045623779296875, 0.03167724609375, -0.0728759765625, -0.018951416015625, -0.06768798828125, 0.00907135009765625, 0.0157318115234375, -0.0501708984375, 0.03369140625, -0.0242156982421875, -0.0423583984375, 0.0249176025390625, 0.0057373046875, -0.065185546875, 0.01861572265625, 0.035614013671875, 0.0826416015625, -0.070068359375, 0.077880859375, 0.0718994140625, -0.053497314453125, -0.07171630859375, -0.009674072265625, 0.0069732666015625, -0.050323486328125, 0.0604248046875, 0.02685546875, 0.0269317626953125, 0.01898193359375, -0.050140380859375, -0.0970458984375, 0.07281494140625, -0.01491546630859375, -0.03204345703125, -0.01385498046875, -0.02532958984375, 0.033935546875, -0.0297088623046875, 0.0238189697265625, 0.04205322265625, 0.040496826171875, 0.0019426345825195312, -0.0706787109375, -0.00045228004455566406, -0.017852783203125, -0.0056915283203125, 0.00766754150390625, -0.040252685546875, 0.07867431640625, -0.01428985595703125, -0.015869140625, 0.020111083984375, 0.0635986328125, 0.0059356689453125, 0.0037860870361328125, 0.01910400390625, 0.06597900390625, 0.057586669921875, -0.0163116455078125, 0.07470703125, -0.0251922607421875, 0.04693603515625, 0.094482421875, -0.00917816162109375, 0.06793212890625, 0.0237274169921875, -0.006206512451171875, 0.049072265625, 0.061126708984375, -0.026031494140625, 0.0472412109375, 0.01702880859375, -0.00445556640625, -0.02557373046875, -0.00418853759765625, -0.040863037109375, 0.046051025390625, 0.024810791015625, -0.037384033203125, -0.008331298828125, -0.009979248046875, 
0.036651611328125, -0.0143585205078125, -0.01548004150390625, 0.06494140625, 0.0004968643188476562, -0.047698974609375, 0.054595947265625, 0.01500701904296875, 0.07080078125, -0.032958984375, 0.006984710693359375, -0.004856109619140625, 0.01934814453125, -0.0137176513671875, -0.04534912109375, 0.011474609375, -0.01245880126953125, -0.0208282470703125, -0.0024738311767578125, 0.049407958984375, -0.031402587890625, -0.037353515625, 0.0225830078125, 0.04193115234375, 0.01378631591796875, 0.0023860931396484375, -0.05401611328125, -0.018890380859375, 0.010101318359375, -0.0328369140625, 0.0111083984375, 0.0081329345703125, -0.0009975433349609375, 0.031646728515625, 0.035797119140625, -0.004150390625, 0.002895355224609375, -0.011322021484375, 0.06207275390625, -0.0614013671875, -0.0300750732421875, -0.0718994140625, 0.0419921875, -0.0012426376342773438, -0.035614013671875, 0.0546875, 0.054290771484375, 0.0693359375, -0.0174560546875, 0.06475830078125, -0.03863525390625, 0.048828125, -0.01012420654296875, 0.054229736328125, -0.05059814453125, -0.0067291259765625, -0.0167236328125, -0.054718017578125, -0.034027099609375, 0.057159423828125, -0.0290679931640625, 0.002788543701171875, 0.04046630859375, 0.056060791015625, 0.00620269775390625, -0.0019893646240234375, 0.00931549072265625, 0.02301025390625, 0.00833892822265625, 0.033416748046875, 0.0428466796875, -0.03717041015625, 0.00860595703125, -0.044677734375, -0.0157623291015625, -0.024383544921875, -0.068359375, -0.07257080078125, -0.0548095703125, -0.0275726318359375, -0.054168701171875, -0.0204315185546875, 0.07928466796875, 0.04193115234375, -0.07281494140625, -0.0174407958984375, 0.008056640625, -0.00823974609375, -0.0027294158935546875, -0.020172119140625, 0.0350341796875, -0.0244598388671875, -0.051361083984375, 0.0267333984375, -0.0251922607421875, 0.026275634765625, 0.0202789306640625, -0.00812530517578125, -0.047698974609375, -0.009246826171875, 0.026153564453125, 0.0284271240234375, -0.06842041015625, 
-0.0183563232421875, 0.01041412353515625, -0.02447509765625, 0.012420654296875, 0.006565093994140625, -0.05438232421875, 0.016632080078125, 0.04510498046875, 0.0221099853515625, 0.033416748046875, -0.0026035308837890625, 0.035552978515625, -0.051300048828125, -0.0018720626831054688, 0.02960205078125, 0.04443359375, 0.0247955322265625, -0.01114654541015625, 0.041229248046875, 0.0272369384765625, -0.04888916015625, -0.04302978515625, -0.0025005340576171875, -0.07769775390625, -0.0251312255859375, 0.09283447265625, -0.0046234130859375, -0.0244140625, 0.01116180419921875, -0.0014781951904296875, 0.0458984375, -0.037994384765625, 0.01442718505859375, 0.048614501953125, -0.007476806640625, 0.0161895751953125, -0.03314208984375, 0.058441162109375, 0.0255126953125, -0.0310211181640625, -0.0211029052734375, 0.031463623046875, 0.047088623046875, 0.0189971923828125, 0.038848876953125, 0.00902557373046875, 0.01268768310546875, -0.0041656494140625, 0.0380859375, 0.01226806640625, -0.007213592529296875, -0.04815673828125, -0.0176239013671875, -0.0116424560546875, -0.0095672607421875 ] ]
sentence-transformers/bert-base-nli-mean-tokens
2022-06-09T12:34:28.000Z
[ "sentence-transformers", "pytorch", "tf", "jax", "rust", "bert", "feature-extraction", "sentence-similarity", "transformers", "arxiv:1908.10084", "license:apache-2.0", "endpoints_compatible", "region:us" ]
sentence-similarity
sentence-transformers
null
null
sentence-transformers/bert-base-nli-mean-tokens
20
152,458
sentence-transformers
2022-03-02T23:29:05
--- pipeline_tag: sentence-similarity tags: - sentence-transformers - feature-extraction - sentence-similarity - transformers license: apache-2.0 --- **⚠️ This model is deprecated. Please don't use it as it produces sentence embeddings of low quality. You can find recommended sentence embedding models here: [SBERT.net - Pretrained Models](https://www.sbert.net/docs/pretrained_models.html)** # sentence-transformers/bert-base-nli-mean-tokens This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for tasks like clustering or semantic search. ## Usage (Sentence-Transformers) Using this model is easy when you have [sentence-transformers](https://www.SBERT.net) installed: ``` pip install -U sentence-transformers ``` Then you can use the model like this: ```python from sentence_transformers import SentenceTransformer sentences = ["This is an example sentence", "Each sentence is converted"] model = SentenceTransformer('sentence-transformers/bert-base-nli-mean-tokens') embeddings = model.encode(sentences) print(embeddings) ``` ## Usage (HuggingFace Transformers) Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first, you pass your input through the transformer model, then you apply the right pooling operation on top of the contextualized word embeddings. 
```python from transformers import AutoTokenizer, AutoModel import torch #Mean Pooling - Take attention mask into account for correct averaging def mean_pooling(model_output, attention_mask): token_embeddings = model_output[0] #First element of model_output contains all token embeddings input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float() return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9) # Sentences we want sentence embeddings for sentences = ['This is an example sentence', 'Each sentence is converted'] # Load model from HuggingFace Hub tokenizer = AutoTokenizer.from_pretrained('sentence-transformers/bert-base-nli-mean-tokens') model = AutoModel.from_pretrained('sentence-transformers/bert-base-nli-mean-tokens') # Tokenize sentences encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt') # Compute token embeddings with torch.no_grad(): model_output = model(**encoded_input) # Perform pooling. In this case, mean pooling. sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask']) print("Sentence embeddings:") print(sentence_embeddings) ``` ## Evaluation Results For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=sentence-transformers/bert-base-nli-mean-tokens) ## Full Model Architecture ``` SentenceTransformer( (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False}) ) ``` ## Citing & Authors This model was trained by [sentence-transformers](https://www.sbert.net/). 
If you find this model helpful, feel free to cite our publication [Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks](https://arxiv.org/abs/1908.10084): ```bibtex @inproceedings{reimers-2019-sentence-bert, title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks", author = "Reimers, Nils and Gurevych, Iryna", booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing", month = "11", year = "2019", publisher = "Association for Computational Linguistics", url = "http://arxiv.org/abs/1908.10084", } ```
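The mean-pooling step in the card above can be checked on toy values with a pure-Python re-implementation; no model download is needed, and the token vectors below are made up purely for illustration:

```python
def mean_pool(token_embeddings, attention_mask):
    # Average token vectors, ignoring positions where the mask is 0,
    # mirroring the torch mean_pooling helper in the card's snippet.
    dim = len(token_embeddings[0])
    sums = [0.0] * dim
    count = max(sum(attention_mask), 1e-9)  # clamp like torch.clamp(min=1e-9)
    for vec, m in zip(token_embeddings, attention_mask):
        if m:
            for i, v in enumerate(vec):
                sums[i] += v
    return [s / count for s in sums]

# One "sentence" with 3 real tokens and 1 padding token (dim = 2)
tokens = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0], [9.0, 9.0]]
mask = [1, 1, 1, 0]
print(mean_pool(tokens, mask))  # [3.0, 4.0] -- the padding row is ignored
```

The `min=1e-9` clamp exists only to avoid division by zero for an all-padding row; with any real token present it has no effect.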
3,947
[ [ -0.0178985595703125, -0.057281494140625, 0.016693115234375, 0.0298614501953125, -0.0335693359375, -0.034332275390625, -0.0230560302734375, -0.007396697998046875, 0.01837158203125, 0.02740478515625, -0.040802001953125, -0.0298004150390625, -0.053497314453125, 0.00481414794921875, -0.0308074951171875, 0.06427001953125, -0.00998687744140625, 0.00498199462890625, -0.018707275390625, -0.01094818115234375, -0.02606201171875, -0.03668212890625, -0.0271759033203125, -0.021331787109375, 0.0157012939453125, 0.004589080810546875, 0.03607177734375, 0.0268402099609375, 0.021087646484375, 0.0322265625, -0.00574493408203125, 0.01067352294921875, -0.027801513671875, -0.00861358642578125, 0.0079803466796875, -0.021820068359375, -0.00720977783203125, 0.026092529296875, 0.04351806640625, 0.038177490234375, -0.010406494140625, 0.0036754608154296875, 0.0006055831909179688, 0.0221405029296875, -0.03594970703125, 0.032501220703125, -0.043487548828125, 0.0135040283203125, 0.00981903076171875, 0.005573272705078125, -0.04766845703125, -0.01226806640625, 0.02667236328125, -0.032623291015625, 0.006534576416015625, 0.01349639892578125, 0.0863037109375, 0.0275115966796875, -0.0185089111328125, -0.026885986328125, -0.020355224609375, 0.067626953125, -0.07354736328125, 0.021728515625, 0.0206451416015625, -0.00415802001953125, -0.00177001953125, -0.07666015625, -0.05670166015625, -0.009918212890625, -0.032135009765625, 0.01552581787109375, -0.03106689453125, 0.00289154052734375, 0.0070648193359375, 0.01824951171875, -0.048980712890625, -0.00687408447265625, -0.0300140380859375, -0.00940704345703125, 0.03851318359375, -0.0027065277099609375, 0.0273284912109375, -0.04522705078125, -0.033111572265625, -0.0243682861328125, -0.01495361328125, -0.009490966796875, 0.01082611083984375, 0.01544952392578125, -0.0216522216796875, 0.056915283203125, 0.005954742431640625, 0.041107177734375, -0.001617431640625, 0.022613525390625, 0.0521240234375, -0.027008056640625, -0.027313232421875, -0.00618743896484375, 
0.08056640625, 0.033447265625, 0.03265380859375, -0.00927734375, -0.013916015625, 0.0015583038330078125, 0.0214691162109375, -0.058135986328125, -0.0280303955078125, 0.01163482666015625, -0.031951904296875, -0.024078369140625, 0.01177215576171875, -0.046295166015625, 0.0006480216979980469, 0.00580596923828125, 0.055755615234375, -0.048187255859375, 0.0007505416870117188, 0.0212554931640625, -0.020416259765625, 0.015899658203125, -0.020538330078125, -0.05584716796875, 0.0157928466796875, 0.019500732421875, 0.0693359375, 0.00762939453125, -0.0350341796875, -0.01528167724609375, -0.01214599609375, -0.0014524459838867188, 0.044586181640625, -0.022613525390625, -0.01074981689453125, 0.01398468017578125, 0.0176849365234375, -0.041656494140625, -0.025634765625, 0.044097900390625, -0.027740478515625, 0.0533447265625, 0.00750732421875, -0.0654296875, -0.01507568359375, 0.010955810546875, -0.039031982421875, 0.07958984375, 0.0132293701171875, -0.0714111328125, 0.01053619384765625, -0.061798095703125, -0.0235443115234375, -0.01259613037109375, 0.00881195068359375, -0.052520751953125, 0.0121002197265625, 0.036590576171875, 0.0521240234375, 0.0138702392578125, 0.037200927734375, -0.0177154541015625, -0.037139892578125, 0.0299530029296875, -0.03192138671875, 0.0902099609375, 0.009521484375, -0.0247650146484375, 0.01140594482421875, -0.03790283203125, -0.0063934326171875, 0.0222930908203125, -0.012359619140625, -0.0160980224609375, 0.0040283203125, 0.0254974365234375, 0.0177001953125, 0.0160369873046875, -0.056915283203125, 0.00780487060546875, -0.04913330078125, 0.0721435546875, 0.047271728515625, 0.00164031982421875, 0.039306640625, -0.0225982666015625, 0.005420684814453125, 0.024658203125, 0.00034689903259277344, -0.0157012939453125, -0.033843994140625, -0.07635498046875, -0.0241851806640625, 0.028045654296875, 0.040496826171875, -0.0533447265625, 0.08599853515625, -0.037841796875, -0.0322265625, -0.05560302734375, -0.004913330078125, 0.00528717041015625, 0.0269927978515625, 
0.049468994140625, -0.0081329345703125, -0.051177978515625, -0.070556640625, 0.000004351139068603516, -0.00505828857421875, 0.0039043426513671875, 0.0197906494140625, 0.055694580078125, -0.036163330078125, 0.08026123046875, -0.045257568359375, -0.03228759765625, -0.037139892578125, 0.0237274169921875, 0.0192718505859375, 0.05084228515625, 0.04119873046875, -0.04669189453125, -0.021636962890625, -0.051727294921875, -0.0535888671875, 0.0026149749755859375, -0.0179595947265625, -0.012786865234375, 0.01776123046875, 0.035247802734375, -0.0628662109375, 0.02880859375, 0.0457763671875, -0.03936767578125, 0.0236053466796875, -0.021392822265625, -0.018402099609375, -0.102294921875, 0.00324249267578125, 0.005512237548828125, -0.0164794921875, -0.032623291015625, 0.0034923553466796875, 0.0087890625, -0.010955810546875, -0.036102294921875, 0.036346435546875, -0.0257720947265625, 0.01210784912109375, -0.0002467632293701172, 0.0283203125, 0.0008931159973144531, 0.057861328125, -0.004970550537109375, 0.052459716796875, 0.0343017578125, -0.041717529296875, 0.0214996337890625, 0.047027587890625, -0.040618896484375, 0.004974365234375, -0.0655517578125, -0.0013713836669921875, -0.0024127960205078125, 0.03515625, -0.08209228515625, -0.00004935264587402344, 0.0266571044921875, -0.04522705078125, 0.01470184326171875, 0.028350830078125, -0.052764892578125, -0.044281005859375, -0.0313720703125, 0.01154327392578125, 0.04498291015625, -0.0447998046875, 0.044464111328125, 0.0196990966796875, -0.0009655952453613281, -0.0433349609375, -0.08929443359375, 0.0019292831420898438, -0.0084381103515625, -0.050445556640625, 0.041656494140625, -0.00431060791015625, 0.0158538818359375, 0.02691650390625, 0.0214996337890625, -0.0012998580932617188, 0.0005555152893066406, 0.0015201568603515625, 0.018035888671875, -0.005222320556640625, 0.02105712890625, 0.01215362548828125, -0.0085601806640625, 0.005382537841796875, -0.01580810546875, 0.0535888671875, -0.0126953125, -0.00885772705078125, 
-0.035003662109375, 0.01446533203125, 0.0289154052734375, -0.02099609375, 0.08343505859375, 0.0767822265625, -0.035614013671875, -0.004924774169921875, -0.0428466796875, -0.02294921875, -0.034576416015625, 0.05029296875, -0.00930023193359375, -0.07489013671875, 0.0267333984375, 0.0167388916015625, 0.00548553466796875, 0.04803466796875, 0.039947509765625, -0.01265716552734375, 0.05914306640625, 0.04498291015625, -0.01715087890625, 0.0406494140625, -0.047454833984375, 0.02740478515625, -0.07147216796875, -0.0009965896606445312, -0.0134735107421875, -0.0231170654296875, -0.05316162109375, -0.0330810546875, 0.0091705322265625, -0.007076263427734375, -0.0251617431640625, 0.0404052734375, -0.041351318359375, 0.01100921630859375, 0.049468994140625, 0.01519775390625, -0.01242828369140625, 0.0032520294189453125, -0.0304107666015625, -0.0059356689453125, -0.04986572265625, -0.041534423828125, 0.06451416015625, 0.0380859375, 0.0340576171875, -0.00910186767578125, 0.05218505859375, 0.006359100341796875, 0.00411224365234375, -0.0533447265625, 0.044586181640625, -0.0304107666015625, -0.038177490234375, -0.02569580078125, -0.024505615234375, -0.06536865234375, 0.027923583984375, -0.0157928466796875, -0.0572509765625, 0.01039886474609375, -0.0178070068359375, -0.0208282470703125, 0.0229644775390625, -0.064453125, 0.0787353515625, 0.005794525146484375, 0.0003204345703125, -0.0105743408203125, -0.051422119140625, 0.01136016845703125, 0.0179901123046875, 0.0025653839111328125, -0.0013980865478515625, -0.00113677978515625, 0.068603515625, -0.0218048095703125, 0.08123779296875, -0.01837158203125, 0.02130126953125, 0.032196044921875, -0.029571533203125, 0.01947021484375, -0.0072021484375, -0.004627227783203125, 0.0106658935546875, -0.01617431640625, -0.0275421142578125, -0.03875732421875, 0.050750732421875, -0.0770263671875, -0.027008056640625, -0.034881591796875, -0.042510986328125, -0.00347137451171875, 0.0132598876953125, 0.028961181640625, 0.033477783203125, -0.01739501953125, 
0.034515380859375, 0.03594970703125, -0.0288238525390625, 0.06097412109375, 0.0063018798828125, 0.0013952255249023438, -0.04217529296875, 0.04925537109375, 0.007015228271484375, -0.0031414031982421875, 0.0323486328125, 0.01398468017578125, -0.033721923828125, -0.016387939453125, -0.02667236328125, 0.032501220703125, -0.044097900390625, -0.01366424560546875, -0.07818603515625, -0.0423583984375, -0.048065185546875, -0.004032135009765625, -0.0162200927734375, -0.034210205078125, -0.0438232421875, -0.0240020751953125, 0.0252227783203125, 0.0341796875, -0.0029544830322265625, 0.032012939453125, -0.052886962890625, 0.006816864013671875, 0.01519775390625, 0.01470184326171875, -0.00026154518127441406, -0.05291748046875, -0.0281219482421875, 0.0005784034729003906, -0.0290069580078125, -0.06329345703125, 0.050811767578125, 0.018951416015625, 0.046844482421875, 0.01153564453125, 0.01104736328125, 0.04315185546875, -0.043792724609375, 0.072998046875, 0.0053558349609375, -0.08013916015625, 0.034576416015625, -0.0024261474609375, 0.0303802490234375, 0.034759521484375, 0.0225677490234375, -0.032257080078125, -0.03204345703125, -0.05352783203125, -0.07916259765625, 0.04766845703125, 0.032958984375, 0.04931640625, -0.0325927734375, 0.0222320556640625, -0.0214691162109375, 0.01473236083984375, -0.09161376953125, -0.0269927978515625, -0.03485107421875, -0.04669189453125, -0.0245819091796875, -0.028076171875, 0.0181121826171875, -0.028045654296875, 0.06011962890625, 0.006927490234375, 0.06243896484375, 0.0279388427734375, -0.0421142578125, 0.01201629638671875, 0.0174407958984375, 0.038177490234375, 0.01502227783203125, -0.0157928466796875, 0.00951385498046875, 0.0212860107421875, -0.02655029296875, -0.002349853515625, 0.037811279296875, -0.00955963134765625, 0.01861572265625, 0.032379150390625, 0.07708740234375, 0.042236328125, -0.0360107421875, 0.060577392578125, -0.0050048828125, -0.0212860107421875, -0.032958984375, -0.010833740234375, 0.019073486328125, 0.0184326171875, 
0.0231170654296875, -0.0016469955444335938, -0.00048804283142089844, -0.024322509765625, 0.02642822265625, 0.018798828125, -0.034637451171875, -0.00435638427734375, 0.04864501953125, 0.0122833251953125, -0.01146697998046875, 0.07855224609375, -0.02142333984375, -0.05413818359375, 0.0286102294921875, 0.049896240234375, 0.0771484375, 0.004154205322265625, 0.0211029052734375, 0.0404052734375, 0.0296630859375, 0.0011730194091796875, -0.0019989013671875, 0.0111846923828125, -0.07257080078125, -0.024658203125, -0.04412841796875, 0.006580352783203125, 0.0030364990234375, -0.04083251953125, 0.0160675048828125, -0.009613037109375, -0.01259613037109375, -0.01617431640625, 0.0005412101745605469, -0.046234130859375, 0.009002685546875, 0.007503509521484375, 0.064453125, -0.07672119140625, 0.058807373046875, 0.049896240234375, -0.050628662109375, -0.051055908203125, -0.003376007080078125, -0.0294342041015625, -0.058868408203125, 0.0408935546875, 0.040069580078125, 0.01727294921875, 0.0178070068359375, -0.0457763671875, -0.058868408203125, 0.09698486328125, 0.0157623291015625, -0.0255889892578125, -0.0192108154296875, 0.00431060791015625, 0.03668212890625, -0.03912353515625, 0.0269927978515625, 0.0245361328125, 0.0232391357421875, -0.006290435791015625, -0.048248291015625, 0.0159912109375, -0.02471923828125, 0.019287109375, -0.01357269287109375, -0.03704833984375, 0.07086181640625, -0.00592041015625, -0.0162200927734375, 0.01468658447265625, 0.0682373046875, 0.0205078125, -0.0069122314453125, 0.03765869140625, 0.065673828125, 0.041717529296875, -0.00975799560546875, 0.07025146484375, -0.0217132568359375, 0.05126953125, 0.07373046875, 0.00821685791015625, 0.08514404296875, 0.03314208984375, -0.003490447998046875, 0.06280517578125, 0.04248046875, -0.027374267578125, 0.05291748046875, 0.018035888671875, 0.0057220458984375, 0.00043201446533203125, 0.009307861328125, -0.015045166015625, 0.03594970703125, 0.01468658447265625, -0.056884765625, -0.0035190582275390625, 0.01251220703125, 
0.00408935546875, -0.00296783447265625, 0.00934600830078125, 0.0447998046875, 0.00982666015625, -0.032135009765625, 0.0299530029296875, 0.01531219482421875, 0.07952880859375, -0.0284271240234375, 0.01114654541015625, -0.0015163421630859375, 0.022186279296875, 0.00536346435546875, -0.042755126953125, 0.0272369384765625, -0.00904083251953125, -0.003192901611328125, -0.0174560546875, 0.04644775390625, -0.045166015625, -0.046112060546875, 0.02801513671875, 0.039886474609375, 0.00234222412109375, 0.006885528564453125, -0.07763671875, -0.0006642341613769531, -0.0019741058349609375, -0.038970947265625, 0.01151275634765625, 0.02166748046875, 0.0291900634765625, 0.041656494140625, 0.028106689453125, -0.01497650146484375, 0.0089569091796875, 0.014007568359375, 0.064697265625, -0.046234130859375, -0.0440673828125, -0.070068359375, 0.055206298828125, -0.0142974853515625, -0.023712158203125, 0.04498291015625, 0.0384521484375, 0.06536865234375, -0.021728515625, 0.041107177734375, -0.01031494140625, 0.0178070068359375, -0.04010009765625, 0.06573486328125, -0.03424072265625, -0.006168365478515625, -0.0177459716796875, -0.06915283203125, -0.0260162353515625, 0.08685302734375, -0.0257568359375, 0.015045166015625, 0.0665283203125, 0.05621337890625, -0.00319671630859375, -0.0018901824951171875, 0.0102996826171875, 0.031494140625, 0.01708984375, 0.035491943359375, 0.034088134765625, -0.06243896484375, 0.047027587890625, -0.036834716796875, -0.00496673583984375, -0.011627197265625, -0.06439208984375, -0.076171875, -0.059722900390625, -0.032257080078125, -0.0176849365234375, -0.0003826618194580078, 0.08245849609375, 0.04931640625, -0.055389404296875, -0.00860595703125, -0.021209716796875, -0.01702880859375, -0.00933074951171875, -0.0238189697265625, 0.0404052734375, -0.04510498046875, -0.060302734375, 0.01001739501953125, -0.01158905029296875, 0.01171875, -0.0292510986328125, 0.010406494140625, -0.052520751953125, 0.01436614990234375, 0.044921875, -0.023468017578125, -0.0631103515625, 
-0.0249481201171875, 0.0036258697509765625, -0.0251922607421875, -0.00843048095703125, 0.0244903564453125, -0.052886962890625, 0.021270751953125, 0.0251007080078125, 0.046539306640625, 0.046630859375, -0.018035888671875, 0.03790283203125, -0.0640869140625, 0.0169677734375, 0.009521484375, 0.05194091796875, 0.0338134765625, -0.017852783203125, 0.042510986328125, 0.01617431640625, -0.036102294921875, -0.04949951171875, -0.0162506103515625, -0.07684326171875, -0.0248870849609375, 0.07989501953125, -0.032684326171875, -0.0260009765625, 0.01520538330078125, -0.01198577880859375, 0.03961181640625, -0.0253448486328125, 0.052703857421875, 0.0679931640625, 0.00481414794921875, -0.023834228515625, -0.0240936279296875, 0.01194000244140625, 0.03289794921875, -0.039306640625, -0.01198577880859375, 0.0194549560546875, 0.0178680419921875, 0.0232696533203125, 0.0312347412109375, -0.007450103759765625, -0.0013256072998046875, 0.003719329833984375, 0.01313018798828125, -0.01454925537109375, 0.004085540771484375, -0.0253448486328125, 0.0024318695068359375, -0.029205322265625, -0.033172607421875 ] ]
naver-clova-ix/donut-base-finetuned-cord-v2
2022-08-13T08:28:13.000Z
[ "transformers", "pytorch", "vision-encoder-decoder", "donut", "image-to-text", "vision", "arxiv:2111.15664", "license:mit", "endpoints_compatible", "has_space", "region:us" ]
image-to-text
naver-clova-ix
null
null
naver-clova-ix/donut-base-finetuned-cord-v2
47
151,863
transformers
2022-07-19T01:53:24
--- license: mit tags: - donut - image-to-text - vision --- # Donut (base-sized model, fine-tuned on CORD) Donut model fine-tuned on CORD. It was introduced in the paper [OCR-free Document Understanding Transformer](https://arxiv.org/abs/2111.15664) by Geewook Kim et al. and first released in [this repository](https://github.com/clovaai/donut). Disclaimer: The team releasing Donut did not write a model card for this model so this model card has been written by the Hugging Face team. ## Model description Donut consists of a vision encoder (Swin Transformer) and a text decoder (BART). Given an image, the encoder first encodes the image into a tensor of embeddings (of shape batch_size, seq_len, hidden_size), after which the decoder autoregressively generates text, conditioned on the encoding of the encoder. ![model image](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/donut_architecture.jpg) ## Intended uses & limitations This model is fine-tuned on CORD, a document parsing dataset. We refer to the [documentation](https://huggingface.co/docs/transformers/main/en/model_doc/donut) which includes code examples. ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2111-15664, author = {Geewook Kim and Teakgyu Hong and Moonbin Yim and Jinyoung Park and Jinyeong Yim and Wonseok Hwang and Sangdoo Yun and Dongyoon Han and Seunghyun Park}, title = {Donut: Document Understanding Transformer without {OCR}}, journal = {CoRR}, volume = {abs/2111.15664}, year = {2021}, url = {https://arxiv.org/abs/2111.15664}, eprinttype = {arXiv}, eprint = {2111.15664}, timestamp = {Thu, 02 Dec 2021 10:50:44 +0100}, biburl = {https://dblp.org/rec/journals/corr/abs-2111-15664.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ```
2,006
[ [ -0.025115966796875, -0.045196533203125, 0.0195159912109375, -0.0189666748046875, -0.00383758544921875, -0.0013761520385742188, -0.007221221923828125, -0.033172607421875, 0.017730712890625, 0.039794921875, -0.03778076171875, -0.0170440673828125, -0.043060302734375, 0.0021572113037109375, -0.0290679931640625, 0.0828857421875, -0.0166168212890625, 0.00019431114196777344, -0.01434326171875, -0.004146575927734375, -0.00921630859375, -0.0406494140625, -0.021484375, -0.033203125, 0.01448822021484375, 0.0267486572265625, 0.062744140625, 0.046234130859375, 0.042938232421875, 0.0208587646484375, -0.02679443359375, -0.006229400634765625, -0.0361328125, -0.0242462158203125, 0.00591278076171875, -0.057952880859375, -0.06787109375, 0.00428009033203125, 0.019195556640625, 0.042877197265625, 0.0066070556640625, 0.01004791259765625, -0.0008563995361328125, 0.04180908203125, -0.026947021484375, 0.00121307373046875, -0.0377197265625, -0.002437591552734375, -0.0088348388671875, 0.01751708984375, -0.027618408203125, -0.024627685546875, -0.005466461181640625, -0.046600341796875, 0.0543212890625, 0.0175933837890625, 0.1175537109375, 0.00879669189453125, -0.01119232177734375, -0.01605224609375, -0.058837890625, 0.060577392578125, -0.039093017578125, 0.04541015625, 0.024749755859375, 0.0298614501953125, 0.0091705322265625, -0.0643310546875, -0.058807373046875, -0.0198211669921875, -0.03936767578125, 0.012298583984375, -0.037261962890625, -0.0163116455078125, 0.040618896484375, 0.051116943359375, -0.037994384765625, -0.01251220703125, -0.0474853515625, 0.004703521728515625, 0.0462646484375, -0.0007090568542480469, 0.03485107421875, -0.03790283203125, -0.037322998046875, -0.0186614990234375, -0.035247802734375, 0.0096282958984375, 0.0274200439453125, -0.0080108642578125, -0.04736328125, 0.042877197265625, 0.01149749755859375, 0.0269622802734375, 0.0304412841796875, 0.0027332305908203125, 0.0428466796875, -0.0275726318359375, -0.0269622802734375, 0.0033473968505859375, 0.07452392578125, 
0.03704833984375, 0.027618408203125, -0.019927978515625, -0.0242767333984375, 0.006954193115234375, 0.049346923828125, -0.0587158203125, -0.033233642578125, -0.0024776458740234375, -0.037994384765625, -0.02435302734375, 0.01045989990234375, -0.05731201171875, 0.00750732421875, -0.0225982666015625, 0.00998687744140625, -0.03277587890625, -0.041839599609375, -0.0037975311279296875, 0.003398895263671875, 0.0220184326171875, 0.025604248046875, -0.06829833984375, 0.03955078125, 0.0343017578125, 0.059600830078125, -0.0003731250762939453, -0.000606536865234375, -0.01207733154296875, -0.00562286376953125, -0.0254974365234375, 0.0513916015625, -0.029541015625, -0.04034423828125, -0.014678955078125, 0.03717041015625, -0.0112152099609375, -0.04248046875, 0.08111572265625, -0.034942626953125, 0.0013170242309570312, -0.0268402099609375, -0.020751953125, -0.0174102783203125, 0.0214385986328125, -0.06512451171875, 0.08099365234375, 0.023223876953125, -0.06829833984375, 0.0240936279296875, -0.047119140625, -0.01546478271484375, -0.004241943359375, -0.005634307861328125, -0.035858154296875, 0.0240478515625, 0.0291290283203125, 0.015533447265625, -0.017974853515625, 0.00675201416015625, -0.016815185546875, -0.006900787353515625, 0.019195556640625, -0.00701141357421875, 0.061492919921875, 0.011077880859375, -0.006336212158203125, 0.01090240478515625, -0.047943115234375, -0.0244598388671875, 0.0587158203125, 0.0146942138671875, -0.01396942138671875, -0.024139404296875, 0.027435302734375, 0.00400543212890625, 0.0188140869140625, -0.05816650390625, 0.02862548828125, -0.03375244140625, 0.039642333984375, 0.0240478515625, -0.0309600830078125, 0.05316162109375, -0.03851318359375, 0.036407470703125, 0.0038280487060546875, 0.0224151611328125, -0.0280914306640625, -0.033294677734375, -0.06787109375, -0.00896453857421875, 0.035797119140625, 0.044769287109375, -0.0230255126953125, 0.04248046875, -0.03662109375, -0.053314208984375, -0.04693603515625, -0.012908935546875, 0.01287841796875, 
0.053619384765625, 0.034576416015625, -0.018890380859375, -0.0280609130859375, -0.071044921875, -0.0013437271118164062, 0.0019702911376953125, -0.0017843246459960938, 0.02862548828125, 0.034027099609375, -0.005359649658203125, 0.07928466796875, -0.04876708984375, -0.0254058837890625, -0.032379150390625, -0.00734710693359375, 0.034088134765625, 0.040618896484375, 0.0628662109375, -0.0550537109375, -0.04779052734375, 0.00292205810546875, -0.0443115234375, -0.0118865966796875, 0.0033206939697265625, -0.0115814208984375, 0.0196075439453125, 0.031890869140625, -0.059539794921875, 0.05938720703125, 0.030426025390625, -0.0118865966796875, 0.04150390625, -0.01494598388671875, 0.01410675048828125, -0.07818603515625, 0.0204010009765625, 0.0030975341796875, -0.0251922607421875, -0.0472412109375, -0.004322052001953125, 0.02728271484375, -0.0045928955078125, -0.0260162353515625, 0.053314208984375, -0.031341552734375, 0.017608642578125, -0.0162200927734375, 0.023193359375, 0.023468017578125, 0.03546142578125, -0.0029506683349609375, 0.034576416015625, 0.032867431640625, -0.0289154052734375, 0.0179595947265625, 0.033111572265625, -0.011474609375, 0.061279296875, -0.0679931640625, 0.0089874267578125, -0.002483367919921875, 0.01554107666015625, -0.0833740234375, -0.017822265625, 0.034210205078125, -0.037017822265625, 0.03826904296875, -0.0258636474609375, -0.06573486328125, -0.048370361328125, -0.006168365478515625, 0.028778076171875, 0.0592041015625, -0.049163818359375, 0.051025390625, 0.015045166015625, 0.0132598876953125, -0.0210418701171875, -0.0635986328125, -0.025665283203125, -0.007396697998046875, -0.06756591796875, 0.0606689453125, -0.0212249755859375, 0.007083892822265625, 0.0182647705078125, -0.02508544921875, -0.0081939697265625, -0.01678466796875, 0.0291290283203125, 0.0291595458984375, -0.013702392578125, -0.006427764892578125, 0.014068603515625, -0.0276947021484375, -0.0033588409423828125, 0.023040771484375, 0.047760009765625, -0.00835418701171875, 
-0.0168914794921875, -0.052978515625, 0.0064544677734375, 0.035797119140625, -0.00833892822265625, 0.0352783203125, 0.060028076171875, -0.04296875, 0.01078033447265625, -0.042449951171875, -0.0140838623046875, -0.037353515625, -0.005748748779296875, -0.043182373046875, -0.036865234375, 0.05462646484375, -0.004413604736328125, -0.0007529258728027344, 0.06494140625, 0.0298614501953125, 0.003143310546875, 0.05706787109375, 0.048553466796875, 0.0110015869140625, 0.032562255859375, -0.032440185546875, 0.03131103515625, -0.07940673828125, -0.0298004150390625, -0.039398193359375, -0.036529541015625, -0.031463623046875, -0.0198211669921875, 0.01343536376953125, 0.054718017578125, -0.0100860595703125, 0.05401611328125, -0.05755615234375, 0.02862548828125, 0.033721923828125, -0.006740570068359375, 0.021484375, -0.0004868507385253906, -0.036041259765625, -0.0164947509765625, -0.0305938720703125, -0.04974365234375, 0.0648193359375, 0.03216552734375, 0.0628662109375, 0.004566192626953125, 0.04083251953125, -0.007427215576171875, 0.00910186767578125, -0.06146240234375, 0.030303955078125, -0.012481689453125, -0.048095703125, 0.0247039794921875, -0.0194854736328125, -0.08099365234375, -0.01165771484375, -0.01029205322265625, -0.07635498046875, 0.0091552734375, 0.020355224609375, -0.01519012451171875, 0.045166015625, -0.0679931640625, 0.06475830078125, -0.030120849609375, -0.0024662017822265625, 0.006343841552734375, -0.033416748046875, 0.0202178955078125, 0.0103302001953125, 0.006328582763671875, -0.0022640228271484375, 0.01493072509765625, 0.05230712890625, -0.030914306640625, 0.058380126953125, -0.007671356201171875, 0.0015735626220703125, 0.01500701904296875, 0.01467132568359375, 0.033050537109375, 0.01114654541015625, 0.0015516281127929688, 0.05908203125, 0.0216064453125, -0.014312744140625, -0.0367431640625, 0.059295654296875, -0.070556640625, -0.03155517578125, -0.032440185546875, -0.02679443359375, 0.01360321044921875, 0.0343017578125, 0.04461669921875, 0.01491546630859375, 
-0.0197601318359375, 0.003330230712890625, 0.034210205078125, -0.01556396484375, 0.04388427734375, 0.0083770751953125, -0.034393310546875, -0.034088134765625, 0.04193115234375, 0.0120697021484375, 0.00958251953125, 0.032745361328125, 0.02362060546875, -0.019622802734375, -0.00756072998046875, -0.052520751953125, 0.048187255859375, -0.039337158203125, -0.0264434814453125, -0.06927490234375, -0.044952392578125, -0.0362548828125, -0.0298309326171875, -0.045989990234375, -0.023895263671875, -0.047210693359375, 0.0092926025390625, 0.037322998046875, 0.0640869140625, 0.004657745361328125, 0.05389404296875, -0.058990478515625, 0.03192138671875, -0.00782012939453125, 0.043304443359375, 0.0185089111328125, -0.04400634765625, -0.01201629638671875, -0.004913330078125, -0.04962158203125, -0.05914306640625, 0.037109375, -0.020721435546875, 0.049560546875, 0.022613525390625, 0.0113677978515625, 0.037994384765625, -0.05450439453125, 0.06988525390625, 0.03375244140625, -0.07513427734375, 0.0304412841796875, -0.00913238525390625, 0.0244140625, 0.01139068603515625, 0.0287933349609375, -0.038665771484375, 0.011810302734375, -0.0701904296875, -0.0587158203125, 0.08306884765625, 0.0228729248046875, 0.0262298583984375, 0.0185089111328125, 0.0178985595703125, 0.028656005859375, 0.005157470703125, -0.04681396484375, -0.034576416015625, -0.039306640625, -0.020965576171875, 0.018218994140625, -0.0264434814453125, -0.01499176025390625, -0.0269012451171875, 0.0258941650390625, 0.016448974609375, 0.04827880859375, 0.0207061767578125, -0.0161590576171875, -0.0210418701171875, 0.004688262939453125, 0.047210693359375, 0.0275726318359375, -0.0311279296875, -0.01311492919921875, -0.0055389404296875, -0.04852294921875, -0.0264892578125, 0.00966644287109375, -0.00815582275390625, 0.00482177734375, 0.0206298828125, 0.07818603515625, 0.004253387451171875, -0.023895263671875, 0.046905517578125, -0.013397216796875, -0.033599853515625, -0.04168701171875, 0.01495361328125, 0.007289886474609375, 
0.01555633544921875, 0.0263671875, 0.0220489501953125, -0.0108642578125, 0.007030487060546875, 0.013214111328125, 0.01165771484375, -0.033294677734375, -0.0572509765625, 0.060882568359375, -0.001316070556640625, -0.039642333984375, 0.037353515625, -0.027679443359375, -0.033172607421875, 0.0386962890625, 0.04779052734375, 0.062225341796875, -0.01462554931640625, -0.003406524658203125, 0.05181884765625, 0.03851318359375, 0.0026397705078125, 0.0172576904296875, 0.005748748779296875, -0.0498046875, 0.00146484375, -0.051116943359375, -0.0121307373046875, 0.031494140625, -0.0518798828125, 0.04876708984375, -0.044403076171875, -0.0178375244140625, 0.004932403564453125, -0.005138397216796875, -0.07916259765625, 0.0160675048828125, 0.004177093505859375, 0.0673828125, -0.04730224609375, 0.0528564453125, 0.04998779296875, -0.032623291015625, -0.04833984375, 0.0162811279296875, 0.0006580352783203125, -0.058319091796875, 0.0548095703125, 0.019683837890625, 0.015899658203125, -0.01197052001953125, -0.041656494140625, -0.0767822265625, 0.07818603515625, 0.0245819091796875, -0.055908203125, -0.01067352294921875, -0.00832366943359375, 0.0273895263671875, -0.029083251953125, 0.045166015625, 0.016448974609375, 0.022918701171875, 0.0230865478515625, -0.0877685546875, 0.01276397705078125, -0.0224609375, -0.004512786865234375, 0.0175628662109375, -0.058135986328125, 0.0660400390625, -0.01050567626953125, -0.01535797119140625, 0.004505157470703125, 0.03704833984375, -0.00556182861328125, 0.02197265625, 0.0400390625, 0.0655517578125, 0.0439453125, -0.01971435546875, 0.07794189453125, -0.018463134765625, 0.045196533203125, 0.07171630859375, 0.0028018951416015625, 0.046234130859375, 0.01513671875, -0.021240234375, 0.039093017578125, 0.043792724609375, -0.0240020751953125, 0.03643798828125, 0.0020465850830078125, 0.0268402099609375, -0.0049285888671875, -0.0045013427734375, -0.03521728515625, 0.027984619140625, 0.039306640625, -0.048492431640625, -0.0206451416015625, -0.01125335693359375, 
0.019683837890625, -0.005523681640625, -0.0131072998046875, 0.034515380859375, 0.0136260986328125, -0.0099029541015625, 0.057159423828125, -0.001880645751953125, 0.04815673828125, -0.0229949951171875, 0.0080108642578125, -0.007190704345703125, 0.0019445419311523438, -0.0269012451171875, -0.0469970703125, 0.041107177734375, 0.0064239501953125, -0.0298614501953125, -0.006191253662109375, 0.043670654296875, -0.0035152435302734375, -0.05950927734375, 0.0287628173828125, 0.0213623046875, 0.01349639892578125, 0.02166748046875, -0.0672607421875, 0.0301055908203125, -0.00921630859375, -0.02667236328125, 0.0057830810546875, 0.029998779296875, 0.0034809112548828125, 0.0162353515625, 0.04449462890625, -0.007053375244140625, 0.0020008087158203125, 0.004573822021484375, 0.07159423828125, -0.050048828125, -0.029541015625, -0.0498046875, 0.0599365234375, -0.018341064453125, -0.0209808349609375, 0.034393310546875, 0.037628173828125, 0.0867919921875, -0.011474609375, 0.05059814453125, -0.022186279296875, 0.041168212890625, -0.0219573974609375, 0.0670166015625, -0.0703125, -0.01006317138671875, -0.0298919677734375, -0.0675048828125, -0.032684326171875, 0.054168701171875, -0.04095458984375, 0.0195159912109375, 0.0592041015625, 0.05596923828125, -0.050689697265625, 0.003597259521484375, 0.0247650146484375, 0.0107879638671875, 0.0205535888671875, 0.00428009033203125, 0.0264892578125, -0.060272216796875, 0.044921875, -0.03399658203125, -0.0127410888671875, -0.042205810546875, -0.055694580078125, -0.08544921875, -0.04608154296875, -0.032318115234375, -0.031341552734375, -0.04461669921875, 0.04931640625, 0.06622314453125, -0.060272216796875, -0.0002830028533935547, -0.0032787322998046875, 0.00494384765625, -0.010986328125, -0.02020263671875, 0.043365478515625, -0.0085601806640625, -0.07550048828125, -0.0025157928466796875, 0.005199432373046875, 0.00864410400390625, 0.0020198822021484375, -0.0010366439819335938, 0.0087432861328125, 0.0000890493392944336, 0.022735595703125, 
0.0012292861938476562, -0.06597900390625, -0.00830841064453125, 0.0277252197265625, -0.0095977783203125, 0.039825439453125, 0.051422119140625, -0.035675048828125, 0.034820556640625, 0.037109375, 0.033050537109375, 0.05126953125, -0.00630950927734375, 0.00998687744140625, -0.047515869140625, 0.008697509765625, 0.001628875732421875, 0.034332275390625, 0.034515380859375, -0.036895751953125, 0.0284576416015625, 0.0350341796875, -0.043609619140625, -0.062225341796875, 0.0274810791015625, -0.1092529296875, 0.0028133392333984375, 0.07598876953125, -0.007411956787109375, -0.039459228515625, 0.015655517578125, -0.0249176025390625, 0.037689208984375, -0.049346923828125, 0.03662109375, 0.041534423828125, -0.00685882568359375, -0.04296875, -0.045745849609375, 0.0202484130859375, 0.007472991943359375, -0.037628173828125, -0.0038776397705078125, 0.0033626556396484375, 0.0266571044921875, 0.051849365234375, 0.0275421142578125, -0.0187225341796875, -0.00949859619140625, 0.006847381591796875, 0.0212860107421875, -0.0022678375244140625, -0.01016998291015625, -0.00572967529296875, -0.0019779205322265625, -0.011077880859375, -0.021392822265625 ] ]
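The CORD-v2 card above defers to the Transformers documentation for code. A minimal sketch of that usage, assuming the current `transformers` `DonutProcessor`/`VisionEncoderDecoderModel` API; the `clean_sequence` helper and its default tokens are illustrative choices, not something the card prescribes:

```python
import re


def clean_sequence(sequence: str, eos_token: str = "</s>", pad_token: str = "<pad>") -> str:
    """Strip special tokens and the leading task tag from a generated sequence."""
    sequence = sequence.replace(eos_token, "").replace(pad_token, "")
    # Remove only the first tag (e.g. "<s_cord-v2>"); the field tags must survive
    # so they can be parsed into JSON afterwards.
    return re.sub(r"<.*?>", "", sequence, count=1).strip()


def parse_receipt(image):
    """Run the CORD-v2 checkpoint on a PIL image and return structured JSON.

    Heavy imports live inside the function so the pure helper above stays
    dependency-free.
    """
    from transformers import DonutProcessor, VisionEncoderDecoderModel

    checkpoint = "naver-clova-ix/donut-base-finetuned-cord-v2"
    processor = DonutProcessor.from_pretrained(checkpoint)
    model = VisionEncoderDecoderModel.from_pretrained(checkpoint)

    pixel_values = processor(image, return_tensors="pt").pixel_values
    # The decoder is primed with the task-specific start tag.
    decoder_input_ids = processor.tokenizer(
        "<s_cord-v2>", add_special_tokens=False, return_tensors="pt"
    ).input_ids

    outputs = model.generate(
        pixel_values,
        decoder_input_ids=decoder_input_ids,
        max_length=model.decoder.config.max_position_embeddings,
        pad_token_id=processor.tokenizer.pad_token_id,
        eos_token_id=processor.tokenizer.eos_token_id,
        bad_words_ids=[[processor.tokenizer.unk_token_id]],
        return_dict_in_generate=True,
    )
    sequence = processor.batch_decode(outputs.sequences)[0]
    return processor.token2json(clean_sequence(sequence))
```

The generated sequence is a flat tag soup like `<s_menu><s_nm>latte</s_nm>...`, which `token2json` turns into nested key/value pairs.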
microsoft/codebert-base
2022-02-11T19:59:44.000Z
[ "transformers", "pytorch", "tf", "jax", "rust", "roberta", "feature-extraction", "arxiv:2002.08155", "endpoints_compatible", "has_space", "region:us" ]
feature-extraction
microsoft
null
null
microsoft/codebert-base
136
149,783
transformers
2022-03-02T23:29:05
## CodeBERT-base Pretrained weights for [CodeBERT: A Pre-Trained Model for Programming and Natural Languages](https://arxiv.org/abs/2002.08155). ### Training Data The model is trained on bi-modal data (documents & code) of [CodeSearchNet](https://github.com/github/CodeSearchNet) ### Training Objective This model is initialized with Roberta-base and trained with MLM+RTD objective (cf. the paper). ### Usage Please see [the official repository](https://github.com/microsoft/CodeBERT) for scripts that support "code search" and "code-to-document generation". ### Reference 1. [CodeBERT trained with Masked LM objective](https://huggingface.co/microsoft/codebert-base-mlm) (suitable for code completion) 2. 🤗 [Hugging Face's CodeBERTa](https://huggingface.co/huggingface/CodeBERTa-small-v1) (small size, 6 layers) ### Citation ```bibtex @misc{feng2020codebert, title={CodeBERT: A Pre-Trained Model for Programming and Natural Languages}, author={Zhangyin Feng and Daya Guo and Duyu Tang and Nan Duan and Xiaocheng Feng and Ming Gong and Linjun Shou and Bing Qin and Ting Liu and Daxin Jiang and Ming Zhou}, year={2020}, eprint={2002.08155}, archivePrefix={arXiv}, primaryClass={cs.CL} } ```
1,219
[ [ -0.0109100341796875, -0.01380157470703125, 0.012786865234375, 0.0293731689453125, 0.00206756591796875, 0.00870513916015625, -0.0280914306640625, -0.00586700439453125, 0.011474609375, 0.03802490234375, -0.03570556640625, -0.055877685546875, -0.043426513671875, -0.007297515869140625, -0.0233612060546875, 0.0980224609375, -0.0160675048828125, 0.033599853515625, -0.0194244384765625, -0.0166168212890625, -0.0130767822265625, -0.05877685546875, -0.0189971923828125, -0.02716064453125, 0.0286102294921875, 0.01523590087890625, 0.03839111328125, 0.01445770263671875, 0.02593994140625, 0.01512908935546875, 0.01068878173828125, -0.002925872802734375, -0.049774169921875, -0.0253753662109375, 0.0166778564453125, -0.05108642578125, -0.053955078125, 0.02301025390625, 0.0270538330078125, 0.04083251953125, 0.00283050537109375, 0.03558349609375, 0.01261138916015625, 0.0687255859375, -0.04449462890625, 0.017822265625, -0.039642333984375, 0.0091400146484375, 0.0007758140563964844, -0.0258331298828125, -0.03680419921875, -0.030059814453125, 0.0069122314453125, -0.0262298583984375, 0.0254974365234375, -0.003437042236328125, 0.090576171875, 0.0140228271484375, -0.0252838134765625, -0.0303497314453125, -0.033233642578125, 0.06817626953125, -0.041595458984375, 0.0158538818359375, 0.0372314453125, -0.00730133056640625, 0.0192718505859375, -0.07568359375, -0.004405975341796875, -0.03411865234375, -0.00872802734375, -0.00769805908203125, -0.0199737548828125, 0.004528045654296875, 0.04058837890625, 0.0101165771484375, -0.07501220703125, -0.03228759765625, -0.06097412109375, -0.0207977294921875, 0.033355712890625, -0.007572174072265625, 0.015655517578125, 0.0024700164794921875, -0.0450439453125, -0.00118255615234375, -0.05047607421875, 0.0166473388671875, 0.03289794921875, 0.0031185150146484375, -0.0428466796875, 0.033172607421875, -0.0014562606811523438, 0.07220458984375, 0.0023822784423828125, 0.014190673828125, 0.06591796875, -0.037353515625, -0.021209716796875, -0.0012369155883789062, 
0.052032470703125, 0.004611968994140625, 0.053558349609375, -0.0198211669921875, -0.0178985595703125, -0.008209228515625, 0.0328369140625, -0.07696533203125, -0.049468994140625, 0.018646240234375, -0.053131103515625, -0.038299560546875, 0.0275115966796875, -0.01540374755859375, 0.01080322265625, -0.03851318359375, 0.035125732421875, -0.0168609619140625, 0.00013947486877441406, 0.0142364501953125, 0.002094268798828125, 0.0168914794921875, 0.00659942626953125, -0.036956787109375, 0.0128173828125, 0.0164031982421875, 0.058258056640625, -0.004215240478515625, -0.0196533203125, -0.04449462890625, -0.0222015380859375, -0.01380157470703125, 0.026153564453125, -0.023895263671875, -0.005008697509765625, 0.01023101806640625, 0.0091094970703125, -0.01247406005859375, -0.03131103515625, 0.0443115234375, -0.05902099609375, 0.02447509765625, 0.00498199462890625, -0.0214080810546875, -0.0176544189453125, 0.024444580078125, -0.04058837890625, 0.0770263671875, 0.0295867919921875, -0.0595703125, 0.0184326171875, -0.043060302734375, -0.0180206298828125, -0.005283355712890625, -0.021087646484375, -0.03759765625, -0.01617431640625, -0.006805419921875, 0.034515380859375, -0.00563812255859375, 0.0230712890625, -0.0167083740234375, -0.02386474609375, 0.0193023681640625, 0.007732391357421875, 0.080078125, 0.0300140380859375, -0.0264892578125, 0.022918701171875, -0.070068359375, 0.03643798828125, 0.0099334716796875, -0.0390625, -0.002323150634765625, -0.01021575927734375, 0.0294342041015625, 0.0191192626953125, 0.053680419921875, -0.01751708984375, 0.0274505615234375, -0.0190887451171875, 0.0243988037109375, 0.03228759765625, -0.02899169921875, 0.0418701171875, -0.0224761962890625, 0.062408447265625, -0.0087432861328125, 0.0085296630859375, 0.00817108154296875, -0.016815185546875, -0.054718017578125, -0.0184478759765625, 0.0404052734375, 0.0450439453125, -0.052093505859375, 0.05413818359375, -0.018798828125, -0.04559326171875, -0.02056884765625, 0.028472900390625, 0.04571533203125, 
0.017242431640625, 0.0310211181640625, -0.0226898193359375, -0.07000732421875, -0.062255859375, -0.01290130615234375, 0.0067901611328125, -0.008209228515625, 0.0103607177734375, 0.03955078125, -0.0308990478515625, 0.07489013671875, -0.047637939453125, -0.0208587646484375, -0.0233917236328125, 0.0302734375, 0.0289459228515625, 0.044891357421875, 0.06011962890625, -0.06793212890625, -0.042205810546875, -0.038299560546875, -0.048583984375, -0.0132904052734375, -0.003757476806640625, -0.0197601318359375, 0.01287841796875, 0.031402587890625, -0.01702880859375, 0.040863037109375, 0.057525634765625, -0.01385498046875, 0.041015625, -0.0082855224609375, 0.0036907196044921875, -0.07763671875, 0.016204833984375, -0.0014009475708007812, -0.023406982421875, -0.0443115234375, -0.00519561767578125, 0.0256805419921875, -0.00970458984375, -0.0245513916015625, 0.01751708984375, -0.04986572265625, 0.0008640289306640625, -0.01885986328125, 0.0170745849609375, 0.01181793212890625, 0.060302734375, 0.0098419189453125, 0.05963134765625, 0.05316162109375, -0.04302978515625, 0.013824462890625, 0.0117340087890625, -0.020294189453125, -0.0007200241088867188, -0.07373046875, 0.01122283935546875, 0.00780487060546875, 0.0202484130859375, -0.062042236328125, 0.0208740234375, 0.0283203125, -0.054046630859375, 0.019378662109375, -0.0210418701171875, -0.03302001953125, -0.016632080078125, -0.031951904296875, 0.04876708984375, 0.0347900390625, -0.02789306640625, 0.0284423828125, 0.007320404052734375, 0.01352691650390625, -0.04388427734375, -0.055450439453125, -0.00965118408203125, -0.00012004375457763672, -0.0504150390625, 0.033233642578125, -0.0345458984375, 0.007419586181640625, -0.00830841064453125, -0.00032448768615722656, -0.019439697265625, -0.0086212158203125, 0.02117919921875, 0.035125732421875, -0.01302337646484375, 0.0129852294921875, -0.046539306640625, -0.00608062744140625, 0.0103302001953125, -0.0272979736328125, 0.05731201171875, -0.0345458984375, -0.0109710693359375, 
0.0007042884826660156, -0.00911712646484375, 0.00655364990234375, -0.030853271484375, 0.061248779296875, 0.06683349609375, -0.0267333984375, -0.016204833984375, -0.0289154052734375, 0.00395965576171875, -0.0310211181640625, 0.02294921875, -0.0111236572265625, -0.0517578125, 0.0239715576171875, 0.01318359375, -0.0126190185546875, 0.02587890625, 0.049346923828125, 0.019744873046875, 0.05206298828125, 0.04327392578125, -0.0186767578125, 0.055572509765625, -0.060028076171875, 0.01250457763671875, -0.054229736328125, -0.013427734375, -0.06298828125, -0.0223541259765625, -0.02783203125, -0.038055419921875, 0.0131072998046875, 0.030120849609375, -0.037933349609375, 0.07208251953125, -0.0577392578125, -0.0036792755126953125, 0.0567626953125, 0.01959228515625, 0.004871368408203125, 0.01006317138671875, -0.00812530517578125, -0.0027179718017578125, -0.0458984375, -0.0418701171875, 0.09478759765625, 0.01473236083984375, 0.08575439453125, -0.0023670196533203125, 0.07159423828125, 0.0289306640625, 0.0163726806640625, -0.040679931640625, 0.0206451416015625, 0.0173187255859375, -0.05902099609375, 0.008026123046875, -0.04583740234375, -0.0753173828125, -0.00927734375, -0.0125732421875, -0.055145263671875, 0.01454925537109375, 0.016815185546875, -0.00250244140625, -0.0012788772583007812, -0.0526123046875, 0.07183837890625, -0.04449462890625, -0.02020263671875, 0.0015630722045898438, -0.047698974609375, 0.020355224609375, 0.00475311279296875, 0.01114654541015625, 0.01366424560546875, -0.007358551025390625, 0.06414794921875, -0.02978515625, 0.05645751953125, -0.0157318115234375, -0.005146026611328125, 0.0217437744140625, -0.006824493408203125, 0.044189453125, 0.0034198760986328125, -0.003665924072265625, 0.02557373046875, 0.01096343994140625, -0.040802001953125, -0.0277862548828125, 0.033905029296875, -0.06658935546875, 0.00026679039001464844, -0.044769287109375, -0.0297088623046875, 0.006305694580078125, 0.016815185546875, 0.040924072265625, 0.048004150390625, -0.006946563720703125, 
0.041778564453125, 0.050018310546875, -0.0188140869140625, 0.01334381103515625, 0.04034423828125, -0.0216064453125, -0.0291748046875, 0.059967041015625, 0.0017147064208984375, 0.01503753662109375, 0.04046630859375, -0.02020263671875, 0.0023956298828125, -0.036346435546875, -0.0110931396484375, 0.002277374267578125, -0.048553466796875, -0.026702880859375, -0.053131103515625, -0.042633056640625, -0.037139892578125, -0.018707275390625, -0.028228759765625, -0.01190948486328125, -0.036956787109375, 0.011962890625, 0.0167388916015625, 0.026397705078125, 0.004413604736328125, 0.007068634033203125, -0.061370849609375, 0.0284271240234375, -0.01409149169921875, 0.0286102294921875, -0.00940704345703125, -0.040191650390625, -0.061614990234375, 0.0112457275390625, -0.0274505615234375, -0.044952392578125, 0.0263671875, 0.004146575927734375, 0.044647216796875, 0.0240020751953125, 0.00867462158203125, 0.01049041748046875, -0.018707275390625, 0.04656982421875, 0.0283050537109375, -0.054473876953125, 0.035369873046875, -0.01080322265625, 0.021240234375, 0.05810546875, 0.044708251953125, -0.01216888427734375, 0.00554656982421875, -0.04736328125, -0.06805419921875, 0.055267333984375, 0.032958984375, 0.006664276123046875, 0.023590087890625, -0.0028400421142578125, -0.01500701904296875, 0.036407470703125, -0.10589599609375, -0.038482666015625, -0.0160369873046875, -0.0290985107421875, 0.0034885406494140625, -0.0172882080078125, -0.0296478271484375, -0.023895263671875, 0.05718994140625, -0.0101470947265625, 0.0347900390625, 0.019989013671875, -0.04522705078125, 0.0057830810546875, -0.0017919540405273438, 0.06427001953125, 0.05963134765625, -0.03826904296875, -0.007747650146484375, -0.01004791259765625, -0.036346435546875, -0.00991058349609375, 0.0180206298828125, -0.001148223876953125, 0.0016889572143554688, 0.0386962890625, 0.0576171875, 0.0247802734375, -0.05572509765625, 0.056671142578125, -0.00420379638671875, -0.042388916015625, -0.0513916015625, 0.0126800537109375, 
0.0053253173828125, 0.015777587890625, 0.0301513671875, 0.0280914306640625, 0.006175994873046875, -0.0264434814453125, 0.0198516845703125, 0.0308837890625, -0.03741455078125, -0.0079345703125, 0.044891357421875, 0.01019287109375, -0.0220794677734375, 0.01464080810546875, -0.035675048828125, -0.052886962890625, 0.053466796875, 0.01421356201171875, 0.07110595703125, 0.037933349609375, 0.00004082918167114258, 0.041717529296875, 0.0264434814453125, 0.01299285888671875, 0.030731201171875, -0.0173492431640625, -0.055450439453125, 0.0003657341003417969, -0.04388427734375, 0.0002543926239013672, 0.00006604194641113281, -0.047607421875, 0.0283966064453125, -0.02685546875, -0.014801025390625, -0.032501220703125, 0.02960205078125, -0.07940673828125, 0.006450653076171875, 0.004467010498046875, 0.07958984375, -0.0270233154296875, 0.07159423828125, 0.04833984375, -0.06329345703125, -0.07720947265625, 0.00244140625, -0.01690673828125, -0.0673828125, 0.08697509765625, 0.00844573974609375, -0.005596160888671875, 0.0024967193603515625, -0.06536865234375, -0.058868408203125, 0.0855712890625, 0.0130157470703125, -0.031280517578125, -0.02362060546875, -0.015167236328125, 0.03460693359375, -0.049041748046875, 0.0277252197265625, 0.0012226104736328125, 0.0256500244140625, -0.003726959228515625, -0.044769287109375, -0.01291656494140625, -0.036712646484375, 0.00400543212890625, -0.010345458984375, -0.041015625, 0.083984375, -0.00981903076171875, 0.007656097412109375, 0.01493072509765625, 0.038299560546875, 0.0189666748046875, 0.0193023681640625, 0.0295257568359375, 0.0384521484375, 0.033599853515625, -0.0038509368896484375, 0.05743408203125, -0.06671142578125, 0.05963134765625, 0.089111328125, -0.006961822509765625, 0.0340576171875, 0.009521484375, -0.027557373046875, 0.053375244140625, 0.050201416015625, -0.0390625, 0.032806396484375, 0.040191650390625, 0.004726409912109375, 0.00005704164505004883, 0.03082275390625, -0.06298828125, 0.016326904296875, 0.0009918212890625, 
-0.057891845703125, -0.003437042236328125, -0.0098876953125, 0.01076507568359375, 0.0035839080810546875, -0.00994873046875, 0.0250396728515625, 0.0007796287536621094, -0.0400390625, 0.07781982421875, 0.00659942626953125, 0.07550048828125, -0.0631103515625, 0.0058135986328125, 0.0020427703857421875, 0.0118865966796875, -0.03179931640625, -0.02099609375, -0.01425933837890625, 0.0126800537109375, -0.0296478271484375, -0.0167999267578125, 0.054229736328125, -0.0482177734375, -0.045257568359375, 0.04180908203125, 0.025238037109375, 0.017120361328125, 0.0016736984252929688, -0.074462890625, 0.0111236572265625, 0.0185089111328125, -0.038909912109375, 0.01560211181640625, 0.0123138427734375, 0.0305938720703125, 0.047698974609375, 0.0273590087890625, 0.004467010498046875, 0.006984710693359375, 0.00765228271484375, 0.06475830078125, -0.034576416015625, -0.0136260986328125, -0.036895751953125, 0.038848876953125, 0.0010023117065429688, -0.0250244140625, 0.05450439453125, 0.0665283203125, 0.07781982421875, -0.027252197265625, 0.053955078125, -0.02850341796875, 0.0233154296875, -0.0498046875, 0.05413818359375, -0.0516357421875, 0.005107879638671875, -0.028106689453125, -0.0697021484375, -0.0003864765167236328, 0.05279541015625, -0.0210418701171875, 0.05413818359375, 0.055694580078125, 0.0848388671875, 0.003204345703125, -0.03656005859375, 0.0163726806640625, -0.00029730796813964844, 0.00769805908203125, 0.046844482421875, 0.021453857421875, -0.052703857421875, 0.059417724609375, -0.008880615234375, -0.016693115234375, -0.034637451171875, -0.060394287109375, -0.08099365234375, -0.04736328125, -0.0266876220703125, -0.052032470703125, -0.0153961181640625, 0.09210205078125, 0.07720947265625, -0.061187744140625, -0.0217742919921875, -0.01146697998046875, 0.00019609928131103516, 0.00421142578125, -0.0205841064453125, 0.02423095703125, -0.040618896484375, -0.050628662109375, 0.0090179443359375, -0.01039886474609375, -0.009185791015625, -0.016204833984375, -0.036468505859375, 
-0.034912109375, -0.0167083740234375, 0.029052734375, 0.0230712890625, -0.0262298583984375, 0.0002942085266113281, -0.004390716552734375, -0.043182373046875, 0.016845703125, 0.074462890625, -0.0589599609375, 0.0204315185546875, 0.0391845703125, 0.0307159423828125, 0.037109375, -0.00008881092071533203, 0.040130615234375, -0.0657958984375, 0.01788330078125, 0.0095062255859375, 0.034149169921875, 0.004558563232421875, -0.0162811279296875, 0.054656982421875, 0.01910400390625, -0.048004150390625, -0.048004150390625, 0.004810333251953125, -0.087890625, -0.005519866943359375, 0.07781982421875, -0.0177001953125, 0.01023101806640625, 0.0142974853515625, -0.036712646484375, 0.0289459228515625, -0.02020263671875, 0.033721923828125, 0.0469970703125, 0.007030487060546875, -0.0095672607421875, -0.04803466796875, 0.0445556640625, 0.013214111328125, -0.04449462890625, -0.0214385986328125, 0.0213165283203125, 0.019195556640625, 0.021331787109375, 0.054046630859375, -0.00405120849609375, 0.0211944580078125, -0.01342010498046875, 0.057037353515625, -0.0186920166015625, -0.02117919921875, -0.0121917724609375, 0.006275177001953125, 0.01324462890625, -0.0196533203125 ] ]
SG161222/Realistic_Vision_V4.0_noVAE
2023-07-10T16:29:30.000Z
[ "diffusers", "license:creativeml-openrail-m", "endpoints_compatible", "diffusers:StableDiffusionPipeline", "region:us" ]
null
SG161222
null
null
SG161222/Realistic_Vision_V4.0_noVAE
63
149,091
diffusers
2023-07-09T08:05:09
--- license: creativeml-openrail-m --- <b>The recommended negative prompt:</b><br> (deformed iris, deformed pupils, semi-realistic, cgi, 3d, render, sketch, cartoon, drawing, anime:1.4), text, close up, cropped, out of frame, worst quality, low quality, jpeg artifacts, ugly, duplicate, morbid, mutilated, extra fingers, mutated hands, poorly drawn hands, poorly drawn face, mutation, deformed, blurry, dehydrated, bad anatomy, bad proportions, extra limbs, cloned face, disfigured, gross proportions, malformed limbs, missing arms, missing legs, extra arms, extra legs, fused fingers, too many fingers, long neck<br> <b>OR</b><br> (deformed iris, deformed pupils, semi-realistic, cgi, 3d, render, sketch, cartoon, drawing, anime, mutated hands and fingers:1.4), (deformed, distorted, disfigured:1.3), poorly drawn, bad anatomy, wrong anatomy, extra limb, missing limb, floating limbs, disconnected limbs, mutation, mutated, ugly, disgusting, amputation <b>Recommended parameters for generation:</b><br> Euler A or DPM++ SDE Karras<br> CFG Scale 3.5 - 15<br> Hires. fix with 4x-UltraSharp upscaler<br> 0 Hires steps and Denoising strength 0.25-0.7<br> Upscale by 1.1-2.0
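The recommended ranges above can be encoded as a small settings check. The sketch below is illustrative and stdlib-only: the `RVSettings` name and its fields are hypothetical (not part of diffusers or any UI), and the bounds simply mirror the card's recommendations.

```python
from dataclasses import dataclass


@dataclass
class RVSettings:
    # Hypothetical container mirroring the card's recommended generation settings.
    sampler: str = "DPM++ SDE Karras"  # or "Euler a"
    cfg_scale: float = 7.0             # recommended range: 3.5 - 15
    denoising_strength: float = 0.4    # recommended with Hires. fix: 0.25 - 0.7
    upscale_by: float = 1.5            # recommended range: 1.1 - 2.0

    def out_of_range(self) -> list:
        """Return the names of settings outside the card's recommended ranges."""
        problems = []
        if not 3.5 <= self.cfg_scale <= 15:
            problems.append("cfg_scale")
        if not 0.25 <= self.denoising_strength <= 0.7:
            problems.append("denoising_strength")
        if not 1.1 <= self.upscale_by <= 2.0:
            problems.append("upscale_by")
        return problems
```

For example, `RVSettings(cfg_scale=20).out_of_range()` flags `cfg_scale`, while the defaults pass cleanly.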
1,175
[ [ -0.06549072265625, -0.056060791015625, 0.0433349609375, 0.0243072509765625, -0.043701171875, -0.01332855224609375, 0.0270233154296875, -0.024932861328125, 0.036834716796875, 0.037689208984375, -0.0792236328125, -0.0298614501953125, -0.037261962890625, 0.031890869140625, -0.036285400390625, 0.07403564453125, 0.004436492919921875, 0.01041412353515625, 0.01273345947265625, 0.0263824462890625, -0.056365966796875, -0.0053558349609375, -0.044219970703125, -0.0144805908203125, 0.030792236328125, 0.06292724609375, 0.0614013671875, 0.054107666015625, 0.007106781005859375, 0.0175323486328125, 0.01178741455078125, -0.003124237060546875, -0.0399169921875, 0.005023956298828125, 0.0010986328125, -0.0103912353515625, -0.0328369140625, 0.0007219314575195312, 0.035064697265625, 0.03778076171875, 0.0154571533203125, 0.0292816162109375, 0.0015506744384765625, 0.051116943359375, -0.0305328369140625, -0.016021728515625, -0.00026226043701171875, 0.03326416015625, -0.018310546875, -0.023681640625, -0.019866943359375, -0.032379150390625, -0.0311737060546875, -0.06390380859375, 0.04852294921875, 0.007465362548828125, 0.08245849609375, 0.002407073974609375, -0.006092071533203125, 0.0059051513671875, -0.04693603515625, 0.04022216796875, -0.04217529296875, -0.0023250579833984375, 0.025146484375, 0.019775390625, -0.0064544677734375, -0.044708251953125, -0.057373046875, -0.0048828125, 0.00336456298828125, 0.0139007568359375, -0.02276611328125, 0.01428985595703125, 0.0552978515625, 0.039276123046875, -0.046905517578125, 0.0036334991455078125, -0.042327880859375, -0.0345458984375, 0.056793212890625, 0.045684814453125, 0.04296875, -0.013336181640625, -0.03704833984375, -0.03619384765625, -0.0509033203125, -0.007564544677734375, 0.0595703125, 0.01021575927734375, -0.0231781005859375, 0.055389404296875, -0.01171112060546875, 0.05560302734375, 0.049346923828125, -0.00576019287109375, 0.0205841064453125, -0.036376953125, 0.005748748779296875, -0.00527191162109375, 0.048309326171875, 0.0693359375, 
0.0093994140625, 0.020172119140625, 0.0006303787231445312, 0.0178680419921875, 0.02191162109375, -0.07958984375, -0.0218658447265625, 0.0224151611328125, -0.038055419921875, -0.0185089111328125, -0.03424072265625, -0.1141357421875, -0.03179931640625, -0.0224609375, 0.042877197265625, -0.030364990234375, 0.00048279762268066406, -0.001007080078125, -0.038848876953125, 0.033905029296875, 0.0243988037109375, -0.07476806640625, -0.006786346435546875, -0.005519866943359375, 0.03363037109375, 0.0186004638671875, -0.004619598388671875, -0.003719329833984375, 0.007213592529296875, -0.03363037109375, 0.0657958984375, -0.0310516357421875, -0.048095703125, -0.0191802978515625, 0.02349853515625, 0.014556884765625, -0.0219573974609375, 0.0684814453125, -0.005184173583984375, 0.0251007080078125, -0.02532958984375, -0.018280029296875, -0.023101806640625, -0.0134124755859375, -0.050201416015625, 0.03717041015625, 0.036834716796875, -0.0516357421875, 0.045623779296875, -0.0322265625, -0.0290069580078125, 0.0206756591796875, -0.00870513916015625, -0.0404052734375, 0.003513336181640625, 0.024566650390625, 0.040557861328125, -0.01409149169921875, 0.0016012191772460938, -0.028076171875, -0.034698486328125, -0.0212554931640625, -0.043121337890625, 0.0625, 0.0215606689453125, -0.039886474609375, 0.009002685546875, -0.085693359375, 0.001163482666015625, 0.02655029296875, -0.0031681060791015625, 0.019439697265625, -0.01080322265625, 0.006237030029296875, 0.0174560546875, 0.0198822021484375, -0.053314208984375, 0.0031070709228515625, -0.0171051025390625, 0.00797271728515625, 0.0576171875, 0.030914306640625, 0.0086669921875, -0.04241943359375, 0.061187744140625, 0.0123138427734375, 0.01739501953125, -0.01873779296875, -0.043914794921875, -0.0833740234375, -0.0158233642578125, -0.022003173828125, 0.034576416015625, -0.06982421875, 0.0295257568359375, 0.029632568359375, -0.04571533203125, -0.01525115966796875, 0.00019681453704833984, 0.0389404296875, 0.04290771484375, 0.03515625, 
-0.020416259765625, -0.051544189453125, -0.07373046875, 0.01244354248046875, -0.00501251220703125, -0.009735107421875, 0.01007080078125, 0.0220794677734375, 0.002002716064453125, 0.036712646484375, -0.05291748046875, -0.0177001953125, -0.007251739501953125, -0.00685882568359375, 0.0304718017578125, 0.03961181640625, 0.041473388671875, -0.034759521484375, -0.01493072509765625, -0.024627685546875, -0.037506103515625, -0.00971221923828125, -0.0247955322265625, -0.0153656005859375, -0.00986480712890625, 0.0110931396484375, -0.046905517578125, 0.03662109375, 0.030059814453125, -0.060211181640625, 0.0748291015625, -0.0271453857421875, 0.0179290771484375, -0.07879638671875, 0.00875091552734375, 0.0088958740234375, -0.00047135353088378906, -0.046173095703125, 0.022003173828125, -0.007366180419921875, -0.03546142578125, -0.038543701171875, 0.037109375, -0.038970947265625, 0.01493072509765625, 0.00739288330078125, 0.0087127685546875, 0.0178070068359375, 0.03363037109375, -0.0031337738037109375, 0.06365966796875, 0.0489501953125, -0.062744140625, 0.050048828125, 0.01837158203125, -0.005977630615234375, 0.060546875, -0.058837890625, 0.004467010498046875, -0.0203094482421875, -0.0012149810791015625, -0.084228515625, -0.047943115234375, 0.04486083984375, -0.04345703125, 0.0271453857421875, 0.03314208984375, -0.0113067626953125, -0.055511474609375, -0.043182373046875, 0.0295257568359375, 0.0531005859375, -0.037384033203125, 0.028594970703125, -0.00862884521484375, -0.004100799560546875, -0.00811767578125, -0.047515869140625, 0.0054779052734375, -0.030059814453125, -0.0347900390625, 0.0287628173828125, -0.0169525146484375, 0.011383056640625, -0.01171112060546875, 0.0177154541015625, -0.00273895263671875, -0.019989013671875, 0.012847900390625, 0.017791748046875, -0.033203125, -0.03485107421875, 0.0232086181640625, -0.013427734375, 0.0033435821533203125, 0.0296630859375, 0.0421142578125, 0.01520538330078125, -0.031585693359375, -0.04522705078125, 0.033416748046875, 0.0552978515625, 
0.01751708984375, 0.0059356689453125, 0.063720703125, -0.0295257568359375, 0.010498046875, -0.0255584716796875, -0.01247406005859375, -0.035797119140625, 0.015289306640625, -0.01062774658203125, -0.03167724609375, 0.055816650390625, 0.01312255859375, -0.011077880859375, 0.06817626953125, 0.038299560546875, -0.01155853271484375, 0.10498046875, 0.039794921875, 0.02435302734375, 0.0279693603515625, -0.028961181640625, 0.0011911392211914062, -0.06817626953125, -0.0301666259765625, -0.0308990478515625, -0.035797119140625, -0.04010009765625, -0.020416259765625, 0.024932861328125, 0.01776123046875, -0.0430908203125, 0.040740966796875, -0.05194091796875, 0.033416748046875, 0.0438232421875, 0.0266265869140625, -0.00814056396484375, 0.01377105712890625, -0.01436614990234375, -0.00226593017578125, -0.02978515625, -0.04412841796875, 0.045928955078125, 0.01436614990234375, 0.054290771484375, 0.0070037841796875, 0.047943115234375, 0.00493621826171875, 0.0028362274169921875, -0.039337158203125, 0.059906005859375, -0.034820556640625, -0.072021484375, -0.02655029296875, -0.005496978759765625, -0.07647705078125, -0.005168914794921875, -0.0384521484375, -0.0751953125, 0.047821044921875, 0.0263519287109375, -0.056365966796875, 0.016845703125, -0.0526123046875, 0.0498046875, -0.00791168212890625, -0.0389404296875, -0.006572723388671875, -0.0430908203125, 0.038482666015625, -0.0012369155883789062, -0.014617919921875, 0.006832122802734375, -0.0011510848999023438, 0.043975830078125, -0.0283355712890625, 0.06304931640625, -0.0328369140625, 0.006389617919921875, 0.04638671875, 0.00485992431640625, 0.0180816650390625, 0.01145172119140625, -0.0035953521728515625, -0.005870819091796875, 0.00859832763671875, -0.0272369384765625, -0.02105712890625, 0.035491943359375, -0.06109619140625, -0.05615234375, -0.02801513671875, -0.0137176513671875, -0.00569915771484375, 0.0202178955078125, 0.061798095703125, 0.050445556640625, 0.0015707015991210938, 0.00030684471130371094, 0.035003662109375, 
-0.043243408203125, 0.053619384765625, 0.0222625732421875, -0.023284912109375, -0.050384521484375, 0.07366943359375, 0.001934051513671875, 0.0279083251953125, -0.01116180419921875, 0.0107269287109375, -0.019256591796875, -0.0240478515625, -0.0848388671875, 0.0140380859375, -0.038970947265625, -0.03717041015625, -0.0221405029296875, -0.008819580078125, -0.0157470703125, -0.0304107666015625, -0.004673004150390625, -0.0167083740234375, -0.0748291015625, -0.004604339599609375, 0.04180908203125, 0.036773681640625, -0.0115203857421875, 0.01485443115234375, -0.0667724609375, 0.0577392578125, 0.0146331787109375, 0.023193359375, 0.01177215576171875, -0.0360107421875, 0.00605010986328125, 0.00481414794921875, -0.06536865234375, -0.07421875, 0.02685546875, -0.0012502670288085938, 0.042205810546875, 0.055206298828125, 0.00742340087890625, 0.08984375, -0.03045654296875, 0.103515625, 0.04901123046875, -0.043304443359375, 0.0406494140625, -0.044891357421875, 0.025177001953125, 0.036834716796875, 0.0222930908203125, -0.0225067138671875, -0.029205322265625, -0.065673828125, -0.06463623046875, 0.053619384765625, 0.004123687744140625, 0.048583984375, -0.0206451416015625, 0.02923583984375, 0.01151275634765625, -0.0033931732177734375, -0.056060791015625, -0.0355224609375, -0.038360595703125, 0.01104736328125, 0.0025768280029296875, -0.035064697265625, 0.00836944580078125, -0.04193115234375, 0.05718994140625, 0.00632476806640625, 0.039886474609375, 0.0184173583984375, 0.041900634765625, -0.04412841796875, 0.0019969940185546875, 0.065673828125, 0.035064697265625, -0.01227569580078125, -0.0084686279296875, -0.0028057098388671875, -0.044921875, 0.0367431640625, -0.0024929046630859375, -0.0335693359375, 0.0122528076171875, 0.032928466796875, 0.0679931640625, -0.017791748046875, -0.02435302734375, 0.029754638671875, -0.0003323554992675781, -0.01910400390625, -0.0175933837890625, 0.034088134765625, -0.0159454345703125, 0.0024433135986328125, 0.032928466796875, 0.0208892822265625, 
0.0181121826171875, -0.029998779296875, 0.00928497314453125, 0.0028667449951171875, -0.0192718505859375, -0.02166748046875, 0.0506591796875, 0.0229034423828125, -0.027191162109375, 0.04571533203125, -0.03424072265625, -0.022674560546875, 0.058441162109375, 0.06182861328125, 0.05621337890625, -0.0263519287109375, 0.04296875, 0.079833984375, 0.01480865478515625, 0.0019464492797851562, 0.04681396484375, 0.00885009765625, -0.036468505859375, -0.018463134765625, -0.045013427734375, -0.016021728515625, 0.04168701171875, -0.050201416015625, 0.049407958984375, -0.05615234375, -0.012054443359375, -0.0072021484375, -0.01137542724609375, -0.03424072265625, 0.056060791015625, 0.01178741455078125, 0.048004150390625, -0.0679931640625, 0.0154571533203125, 0.045684814453125, -0.064453125, -0.07720947265625, -0.00856781005859375, 0.002040863037109375, -0.057525634765625, 0.0152740478515625, 0.00811767578125, -0.001987457275390625, 0.0195770263671875, -0.038299560546875, -0.059326171875, 0.061279296875, 0.039947509765625, -0.03741455078125, 0.0022144317626953125, -0.00936126708984375, 0.0516357421875, -0.005390167236328125, 0.0204010009765625, 0.0223236083984375, 0.039215087890625, 0.01019287109375, -0.031646728515625, 0.022308349609375, -0.0309600830078125, 0.01392364501953125, 0.0206451416015625, -0.050445556640625, 0.06341552734375, -0.043243408203125, -0.040557861328125, 0.03155517578125, 0.03924560546875, 0.0276641845703125, 0.0290985107421875, 0.0343017578125, 0.04486083984375, 0.01445770263671875, -0.0009307861328125, 0.07476806640625, -0.032470703125, 0.006305694580078125, 0.04986572265625, 0.017852783203125, 0.02252197265625, 0.031005859375, -0.0242767333984375, 0.04046630859375, 0.0865478515625, -0.032989501953125, 0.03729248046875, 0.0216217041015625, -0.0077667236328125, -0.01288604736328125, -0.0017957687377929688, -0.048919677734375, 0.0258331298828125, 0.0252838134765625, -0.040130615234375, -0.0259857177734375, 0.0198974609375, -0.00020825862884521484, 
0.00885772705078125, -0.034942626953125, 0.042144775390625, -0.01421356201171875, -0.053619384765625, 0.053619384765625, -0.0060577392578125, 0.030059814453125, -0.0278472900390625, -0.04083251953125, -0.040283203125, -0.00356292724609375, -0.0299072265625, -0.06939697265625, 0.0179443359375, -0.00382232666015625, -0.030792236328125, -0.0154266357421875, 0.043182373046875, -0.0019893646240234375, -0.05999755859375, 0.0036830902099609375, 0.0007810592651367188, 0.0283355712890625, 0.032684326171875, -0.052154541015625, -0.0089569091796875, 0.019622802734375, -0.02398681640625, 0.01230621337890625, 0.020751953125, 0.01416015625, 0.0279083251953125, 0.033233642578125, 0.0128936767578125, -0.01500701904296875, 0.016754150390625, 0.06622314453125, -0.041900634765625, -0.021484375, -0.05792236328125, 0.05487060546875, 0.006771087646484375, -0.044708251953125, 0.054351806640625, 0.024139404296875, 0.0506591796875, -0.03973388671875, 0.0201416015625, -0.0222015380859375, 0.0197906494140625, -0.04638671875, 0.041290283203125, -0.0391845703125, -0.0122528076171875, -0.053192138671875, -0.08935546875, -0.0193939208984375, 0.06500244140625, 0.001190185546875, 0.032623291015625, 0.053619384765625, 0.06317138671875, 0.009765625, -0.0011777877807617188, 0.032562255859375, -0.0005626678466796875, -0.005397796630859375, 0.043914794921875, 0.053070068359375, -0.023345947265625, 0.01806640625, -0.034759521484375, -0.0163421630859375, -0.0058441162109375, -0.059722900390625, -0.039520263671875, -0.042633056640625, -0.035675048828125, -0.03485107421875, -0.01479339599609375, 0.049041748046875, 0.048614501953125, -0.05914306640625, 0.00238037109375, -0.014251708984375, -0.01110076904296875, -0.00823211669921875, -0.0238189697265625, 0.00919342041015625, 0.0170745849609375, -0.04498291015625, 0.0010957717895507812, 0.01270294189453125, 0.03094482421875, -0.035614013671875, 0.0030059814453125, -0.0031375885009765625, -0.002140045166015625, 0.03399658203125, 0.02471923828125, 
-0.049530029296875, -0.045745849609375, -0.02630615234375, 0.0089569091796875, -0.0021572113037109375, 0.022308349609375, -0.04022216796875, 0.04669189453125, 0.031341552734375, -0.0023651123046875, 0.0232086181640625, 0.0159759521484375, 0.033355712890625, -0.03985595703125, 0.00585174560546875, 0.019744873046875, 0.0257110595703125, 0.021636962890625, -0.06988525390625, 0.019195556640625, 0.02032470703125, -0.02764892578125, -0.05413818359375, 0.0213775634765625, -0.060516357421875, -0.00351715087890625, 0.07891845703125, 0.01273345947265625, -0.0014772415161132812, -0.0029582977294921875, -0.05560302734375, 0.03021240234375, -0.023345947265625, 0.039947509765625, 0.048370361328125, -0.039398193359375, -0.0212554931640625, -0.023468017578125, 0.03326416015625, 0.005870819091796875, -0.0479736328125, -0.0135040283203125, 0.060821533203125, 0.00669097900390625, 0.03729248046875, 0.037933349609375, -0.0241546630859375, 0.036163330078125, 0.040191650390625, 0.01947021484375, -0.01058197021484375, -0.014617919921875, -0.0295257568359375, 0.0006537437438964844, 0.002758026123046875, -0.0234527587890625 ] ]
openai/whisper-tiny.en
2023-09-11T13:24:06.000Z
[ "transformers", "pytorch", "tf", "jax", "safetensors", "whisper", "automatic-speech-recognition", "audio", "hf-asr-leaderboard", "en", "arxiv:2212.04356", "license:apache-2.0", "model-index", "endpoints_compatible", "has_space", "region:us" ]
automatic-speech-recognition
openai
null
null
openai/whisper-tiny.en
60
146,456
transformers
2022-09-26T06:57:49
--- language: - en tags: - audio - automatic-speech-recognition - hf-asr-leaderboard widget: - example_title: Librispeech sample 1 src: https://cdn-media.huggingface.co/speech_samples/sample1.flac - example_title: Librispeech sample 2 src: https://cdn-media.huggingface.co/speech_samples/sample2.flac model-index: - name: whisper-tiny.en results: - task: name: Automatic Speech Recognition type: automatic-speech-recognition dataset: name: LibriSpeech (clean) type: librispeech_asr config: clean split: test args: language: en metrics: - name: Test WER type: wer value: 8.4372112320138 - task: name: Automatic Speech Recognition type: automatic-speech-recognition dataset: name: LibriSpeech (other) type: librispeech_asr config: other split: test args: language: en metrics: - name: Test WER type: wer value: 14.857607503498355 pipeline_tag: automatic-speech-recognition license: apache-2.0 --- # Whisper Whisper is a pre-trained model for automatic speech recognition (ASR) and speech translation. Trained on 680k hours of labelled data, Whisper models demonstrate a strong ability to generalise to many datasets and domains **without** the need for fine-tuning. Whisper was proposed in the paper [Robust Speech Recognition via Large-Scale Weak Supervision](https://arxiv.org/abs/2212.04356) by Alec Radford et al. from OpenAI. The original code repository can be found [here](https://github.com/openai/whisper). **Disclaimer**: Content for this model card has partly been written by the Hugging Face team, and parts of it were copied and pasted from the original model card. ## Model details Whisper is a Transformer based encoder-decoder model, also referred to as a _sequence-to-sequence_ model. It was trained on 680k hours of labelled speech data annotated using large-scale weak supervision. The models were trained on either English-only data or multilingual data. The English-only models were trained on the task of speech recognition. 
The multilingual models were trained on both speech recognition and speech translation. For speech recognition, the model predicts transcriptions in the *same* language as the audio. For speech translation, the model predicts transcriptions to a *different* language to the audio. Whisper checkpoints come in five configurations of varying model sizes. The smallest four are trained on either English-only or multilingual data. The largest checkpoints are multilingual only. All ten of the pre-trained checkpoints are available on the [Hugging Face Hub](https://huggingface.co/models?search=openai/whisper). The checkpoints are summarised in the following table with links to the models on the Hub: | Size | Parameters | English-only | Multilingual | |----------|------------|------------------------------------------------------|-----------------------------------------------------| | tiny | 39 M | [✓](https://huggingface.co/openai/whisper-tiny.en) | [✓](https://huggingface.co/openai/whisper-tiny) | | base | 74 M | [✓](https://huggingface.co/openai/whisper-base.en) | [✓](https://huggingface.co/openai/whisper-base) | | small | 244 M | [✓](https://huggingface.co/openai/whisper-small.en) | [✓](https://huggingface.co/openai/whisper-small) | | medium | 769 M | [✓](https://huggingface.co/openai/whisper-medium.en) | [✓](https://huggingface.co/openai/whisper-medium) | | large | 1550 M | x | [✓](https://huggingface.co/openai/whisper-large) | | large-v2 | 1550 M | x | [✓](https://huggingface.co/openai/whisper-large-v2) | # Usage This checkpoint is an *English-only* model, meaning it can be used for English speech recognition. Multilingual speech recognition or speech translation is possible through use of a multilingual checkpoint. To transcribe audio samples, the model has to be used alongside a [`WhisperProcessor`](https://huggingface.co/docs/transformers/model_doc/whisper#transformers.WhisperProcessor). The `WhisperProcessor` is used to: 1. 
Pre-process the audio inputs (converting them to log-Mel spectrograms for the model) 2. Post-process the model outputs (converting them from tokens to text) ## Transcription ```python >>> from transformers import WhisperProcessor, WhisperForConditionalGeneration >>> from datasets import load_dataset >>> # load model and processor >>> processor = WhisperProcessor.from_pretrained("openai/whisper-tiny.en") >>> model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-tiny.en") >>> # load dummy dataset and read audio files >>> ds = load_dataset("hf-internal-testing/librispeech_asr_dummy", "clean", split="validation") >>> sample = ds[0]["audio"] >>> input_features = processor(sample["array"], sampling_rate=sample["sampling_rate"], return_tensors="pt").input_features >>> # generate token ids >>> predicted_ids = model.generate(input_features) >>> # decode token ids to text >>> transcription = processor.batch_decode(predicted_ids, skip_special_tokens=False) ['<|startoftranscript|><|notimestamps|> Mr. Quilter is the apostle of the middle classes, and we are glad to welcome his gospel.<|endoftext|>'] >>> transcription = processor.batch_decode(predicted_ids, skip_special_tokens=True) [' Mr. Quilter is the apostle of the middle classes and we are glad to welcome his gospel.'] ``` The context tokens can be removed from the start of the transcription by setting `skip_special_tokens=True`. 
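Transcriptions like the one above are usually scored with word error rate (WER): the word-level edit distance between hypothesis and reference, divided by the number of reference words. A minimal stdlib sketch of the metric (the evaluation below uses the `evaluate` library's implementation instead):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # One-row dynamic-programming Levenshtein distance over words.
    d = list(range(len(hyp) + 1))
    for i in range(1, len(ref) + 1):
        prev, d[0] = d[0], i
        for j in range(1, len(hyp) + 1):
            cur = d[j]
            if ref[i - 1] == hyp[j - 1]:
                d[j] = prev  # match: no extra edit
            else:
                d[j] = 1 + min(prev, d[j], d[j - 1])  # substitute/delete/insert
            prev = cur
    return d[-1] / len(ref)


print(wer("mr quilter is the apostle", "mr quilter was the apostle"))  # 0.2
```

One substitution over five reference words gives a WER of 0.2 (often reported as 20%).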
## Evaluation This code snippet shows how to evaluate Whisper tiny.en on [LibriSpeech test-clean](https://huggingface.co/datasets/librispeech_asr): ```python >>> from datasets import load_dataset >>> from transformers import WhisperForConditionalGeneration, WhisperProcessor >>> import torch >>> from evaluate import load >>> librispeech_test_clean = load_dataset("librispeech_asr", "clean", split="test") >>> processor = WhisperProcessor.from_pretrained("openai/whisper-tiny.en") >>> model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-tiny.en").to("cuda") >>> def map_to_pred(batch): >>> audio = batch["audio"] >>> input_features = processor(audio["array"], sampling_rate=audio["sampling_rate"], return_tensors="pt").input_features >>> batch["reference"] = processor.tokenizer._normalize(batch['text']) >>> >>> with torch.no_grad(): >>> predicted_ids = model.generate(input_features.to("cuda"))[0] >>> transcription = processor.decode(predicted_ids) >>> batch["prediction"] = processor.tokenizer._normalize(transcription) >>> return batch >>> result = librispeech_test_clean.map(map_to_pred) >>> wer = load("wer") >>> print(100 * wer.compute(references=result["reference"], predictions=result["prediction"])) 5.655609406528749 ``` ## Long-Form Transcription The Whisper model is intrinsically designed to work on audio samples of up to 30s in duration. However, by using a chunking algorithm, it can be used to transcribe audio samples of up to arbitrary length. This is possible through Transformers [`pipeline`](https://huggingface.co/docs/transformers/main_classes/pipelines#transformers.AutomaticSpeechRecognitionPipeline) method. Chunking is enabled by setting `chunk_length_s=30` when instantiating the pipeline. With chunking enabled, the pipeline can be run with batched inference. 
It can also be extended to predict sequence level timestamps by passing `return_timestamps=True`: ```python >>> import torch >>> from transformers import pipeline >>> from datasets import load_dataset >>> device = "cuda:0" if torch.cuda.is_available() else "cpu" >>> pipe = pipeline( >>> "automatic-speech-recognition", >>> model="openai/whisper-tiny.en", >>> chunk_length_s=30, >>> device=device, >>> ) >>> ds = load_dataset("hf-internal-testing/librispeech_asr_dummy", "clean", split="validation") >>> sample = ds[0]["audio"] >>> prediction = pipe(sample.copy(), batch_size=8)["text"] " Mr. Quilter is the apostle of the middle classes, and we are glad to welcome his gospel." >>> # we can also return timestamps for the predictions >>> prediction = pipe(sample.copy(), batch_size=8, return_timestamps=True)["chunks"] [{'text': ' Mr. Quilter is the apostle of the middle classes and we are glad to welcome his gospel.', 'timestamp': (0.0, 5.44)}] ``` Refer to the blog post [ASR Chunking](https://huggingface.co/blog/asr-chunking) for more details on the chunking algorithm. ## Fine-Tuning The pre-trained Whisper model demonstrates a strong ability to generalise to different datasets and domains. However, its predictive capabilities can be improved further for certain languages and tasks through *fine-tuning*. The blog post [Fine-Tune Whisper with 🤗 Transformers](https://huggingface.co/blog/fine-tune-whisper) provides a step-by-step guide to fine-tuning the Whisper model with as little as 5 hours of labelled data. ### Evaluated Use The primary intended users of these models are AI researchers studying robustness, generalization, capabilities, biases, and constraints of the current model. However, Whisper is also potentially quite useful as an ASR solution for developers, especially for English speech recognition. 
We recognize that once models are released, it is impossible to restrict access to only “intended” uses or to draw reasonable guidelines around what is or is not research. The models are primarily trained and evaluated on ASR and speech translation to English tasks. They show strong ASR results in ~10 languages. They may exhibit additional capabilities, particularly if fine-tuned on certain tasks like voice activity detection, speaker classification, or speaker diarization, but have not been robustly evaluated in these areas. We strongly recommend that users perform robust evaluations of the models in a particular context and domain before deploying them. In particular, we caution against using Whisper models to transcribe recordings of individuals taken without their consent or purporting to use these models for any kind of subjective classification. We recommend against use in high-risk domains like decision-making contexts, where flaws in accuracy can lead to pronounced flaws in outcomes. The models are intended to transcribe and translate speech; using the model for classification is not only unevaluated but also inappropriate, particularly for inferring human attributes. ## Training Data The models are trained on 680,000 hours of audio and the corresponding transcripts collected from the internet. 65% of this data (or 438,000 hours) represents English-language audio and matched English transcripts, roughly 18% (or 126,000 hours) represents non-English audio and English transcripts, while the final 17% (or 117,000 hours) represents non-English audio and the corresponding transcript. This non-English data represents 98 different languages. As discussed in [the accompanying paper](https://cdn.openai.com/papers/whisper.pdf), we see that performance on transcription in a given language is directly correlated with the amount of training data we employ in that language.
## Performance and Limitations Our studies show that, over many existing ASR systems, the models exhibit improved robustness to accents, background noise, technical language, as well as zero-shot translation from multiple languages into English; and that accuracy on speech recognition and translation is near the state-of-the-art level. However, because the models are trained in a weakly supervised manner using large-scale noisy data, the predictions may include texts that are not actually spoken in the audio input (i.e. hallucination). We hypothesize that this happens because, given their general knowledge of language, the models combine trying to predict the next word in audio with trying to transcribe the audio itself. Our models perform unevenly across languages, and we observe lower accuracy on low-resource and/or low-discoverability languages or languages where we have less training data. The models also exhibit disparate performance on different accents and dialects of particular languages, which may include higher word error rate across speakers of different genders, races, ages, or other demographic criteria. Our full evaluation results are presented in [the paper accompanying this release](https://cdn.openai.com/papers/whisper.pdf). In addition, the sequence-to-sequence architecture of the model makes it prone to generating repetitive texts, which can be mitigated to some degree by beam search and temperature scheduling but not perfectly. Further analysis of these limitations is provided in [the paper](https://cdn.openai.com/papers/whisper.pdf). It is likely that this behavior and hallucinations may be worse on lower-resource and/or lower-discoverability languages. ## Broader Implications We anticipate that Whisper models’ transcription capabilities may be used for improving accessibility tools.
While Whisper models cannot be used for real-time transcription out of the box, their speed and size suggest that others may be able to build applications on top of them that allow for near-real-time speech recognition and translation. The real value of beneficial applications built on top of Whisper models suggests that the disparate performance of these models may have real economic implications. There are also potential dual use concerns that come with releasing Whisper. While we hope the technology will be used primarily for beneficial purposes, making ASR technology more accessible could enable more actors to build capable surveillance technologies or scale up existing surveillance efforts, as the speed and accuracy allow for affordable automatic transcription and translation of large volumes of audio communication. Moreover, these models may have some capabilities to recognize specific individuals out of the box, which in turn presents safety concerns related both to dual use and disparate performance. In practice, we expect that the cost of transcription is not the limiting factor of scaling up surveillance projects. ### BibTeX entry and citation info ```bibtex @misc{radford2022whisper, doi = {10.48550/ARXIV.2212.04356}, url = {https://arxiv.org/abs/2212.04356}, author = {Radford, Alec and Kim, Jong Wook and Xu, Tao and Brockman, Greg and McLeavey, Christine and Sutskever, Ilya}, title = {Robust Speech Recognition via Large-Scale Weak Supervision}, publisher = {arXiv}, year = {2022}, copyright = {arXiv.org perpetual, non-exclusive license} } ```
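The chunking described under Long-Form Transcription above can be pictured as overlapping fixed-length windows over the raw audio samples. The sketch below is a simplification for intuition only: the actual pipeline uses strided chunks with token-level merging, and the `stride_s` value here is an assumption, not the pipeline's default.

```python
def chunk_indices(n_samples: int, sr: int = 16000,
                  chunk_s: float = 30.0, stride_s: float = 5.0):
    """Yield (start, end) sample ranges: chunk_s-second windows that
    overlap their neighbours by stride_s seconds on each side."""
    chunk = int(chunk_s * sr)
    step = chunk - 2 * int(stride_s * sr)  # advance less than a full window
    start = 0
    while start < n_samples:
        yield start, min(start + chunk, n_samples)
        if start + chunk >= n_samples:
            break
        start += step


# 60 seconds of 16 kHz audio -> three overlapping 30-second windows
print(list(chunk_indices(60 * 16000)))
```

Each window is short enough for the model's 30-second receptive field, and the overlap gives the merging step context to reconcile transcriptions at the seams.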
14,830
GroNLP/bert-base-dutch-cased
2023-09-11T08:57:51.000Z
[ "transformers", "pytorch", "tf", "jax", "safetensors", "bert", "fill-mask", "BERTje", "nl", "arxiv:1912.09582", "doi:10.57967/hf/0149", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
fill-mask
GroNLP
null
null
GroNLP/bert-base-dutch-cased
17
145,251
transformers
2022-03-02T23:29:04
---
language: nl
thumbnail: "https://raw.githubusercontent.com/wietsedv/bertje/master/bertje.png"
tags:
- BERTje
---

# BERTje: A Dutch BERT model

[Wietse de Vries](https://www.semanticscholar.org/author/Wietse-de-Vries/144611157) • [Andreas van Cranenburgh](https://www.semanticscholar.org/author/Andreas-van-Cranenburgh/2791585) • [Arianna Bisazza](https://www.semanticscholar.org/author/Arianna-Bisazza/3242253) • [Tommaso Caselli](https://www.semanticscholar.org/author/Tommaso-Caselli/1864635) • [Gertjan van Noord](https://www.semanticscholar.org/author/Gertjan-van-Noord/143715131) • [Malvina Nissim](https://www.semanticscholar.org/author/M.-Nissim/2742475)

## Model description

BERTje is a Dutch pre-trained BERT model developed at the University of Groningen.

<img src="https://raw.githubusercontent.com/wietsedv/bertje/master/bertje.png" height="250">

For details, check out our paper on [arXiv](https://arxiv.org/abs/1912.09582), the code on [Github](https://github.com/wietsedv/bertje) and related work on [Semantic Scholar](https://www.semanticscholar.org/paper/BERTje%3A-A-Dutch-BERT-Model-Vries-Cranenburgh/a4d5e425cac0bf84c86c0c9f720b6339d6288ffa). The paper and Github page mention fine-tuned models that are available [here](https://huggingface.co/wietsedv).

## How to use

```python
from transformers import AutoTokenizer, AutoModel, TFAutoModel

tokenizer = AutoTokenizer.from_pretrained("GroNLP/bert-base-dutch-cased")
model = AutoModel.from_pretrained("GroNLP/bert-base-dutch-cased")  # PyTorch
model = TFAutoModel.from_pretrained("GroNLP/bert-base-dutch-cased")  # TensorFlow
```

**WARNING:** The vocabulary size of BERTje changed in 2021. If you use an older fine-tuned model and experience problems with the `GroNLP/bert-base-dutch-cased` tokenizer, use the following tokenizer:

```python
tokenizer = AutoTokenizer.from_pretrained("GroNLP/bert-base-dutch-cased", revision="v1")  # v1 is the old vocabulary
```

## Benchmarks

The arXiv paper lists benchmarks.
Here are a couple of comparisons between BERTje, multilingual BERT, BERT-NL and RobBERT that were done after writing the paper. Unlike some other comparisons, the fine-tuning procedures for these benchmarks are identical for each pre-trained model. You may be able to achieve higher scores for individual models by optimizing fine-tuning procedures.

More experimental results will be added to this page when they are finished. Technical details about how we fine-tuned these models will be published later, along with downloadable fine-tuned checkpoints.

All of the tested models are *base* sized (12 layers) with cased tokenization. Headers in the tables below link to original data sources. Scores link to the model pages that correspond to each specific fine-tuned model. These tables will be updated when more simple fine-tuned models are made available.

### Named Entity Recognition

| Model | [CoNLL-2002](https://www.clips.uantwerpen.be/conll2002/ner/) | [SoNaR-1](https://ivdnt.org/downloads/taalmaterialen/tstc-sonar-corpus) | spaCy UD LassySmall |
| --- | --- | --- | --- |
| **BERTje** | [**90.24**](https://huggingface.co/wietsedv/bert-base-dutch-cased-finetuned-conll2002-ner) | [**84.93**](https://huggingface.co/wietsedv/bert-base-dutch-cased-finetuned-sonar-ner) | [86.10](https://huggingface.co/wietsedv/bert-base-dutch-cased-finetuned-udlassy-ner) |
| [mBERT](https://github.com/google-research/bert/blob/master/multilingual.md) | [88.61](https://huggingface.co/wietsedv/bert-base-multilingual-cased-finetuned-conll2002-ner) | [84.19](https://huggingface.co/wietsedv/bert-base-multilingual-cased-finetuned-sonar-ner) | [**86.77**](https://huggingface.co/wietsedv/bert-base-multilingual-cased-finetuned-udlassy-ner) |
| [BERT-NL](http://textdata.nl) | 85.05 | 80.45 | 81.62 |
| [RobBERT](https://github.com/iPieter/RobBERT) | 84.72 | 81.98 | 79.84 |

### Part-of-speech tagging

| Model | [UDv2.5 LassySmall](https://universaldependencies.org/treebanks/nl_lassysmall/index.html) |
| --- | --- |
| **BERTje** | **96.48** |
| [mBERT](https://github.com/google-research/bert/blob/master/multilingual.md) | 96.20 |
| [BERT-NL](http://textdata.nl) | 96.10 |
| [RobBERT](https://github.com/iPieter/RobBERT) | 95.91 |

### BibTeX entry and citation info

```bibtex
@misc{devries2019bertje,
  title = {{BERTje}: {A} {Dutch} {BERT} {Model}},
  shorttitle = {{BERTje}},
  author = {de Vries, Wietse and van Cranenburgh, Andreas and Bisazza, Arianna and Caselli, Tommaso and Noord, Gertjan van and Nissim, Malvina},
  year = {2019},
  month = dec,
  howpublished = {arXiv:1912.09582},
  url = {http://arxiv.org/abs/1912.09582},
}
```
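The NER numbers above are entity-level F1 scores. As a rough illustration of what that metric measures, here is a minimal sketch that treats entities as exact `(type, start, end)` spans; it is an illustrative implementation, not the official CoNLL evaluation script.

```python
# Minimal sketch of entity-level precision/recall/F1, the metric family behind
# CoNLL-style NER scores. Illustrative only: entities are compared as exact
# (entity_type, start, end) tuples, so a boundary or type error counts as both
# a false positive and a false negative.
def entity_f1(gold, pred):
    """gold/pred: iterables of (entity_type, start, end) tuples."""
    gold_set, pred_set = set(gold), set(pred)
    tp = len(gold_set & pred_set)  # exact span-and-type matches
    precision = tp / len(pred_set) if pred_set else 0.0
    recall = tp / len(gold_set) if gold_set else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

gold = [("PER", 0, 2), ("LOC", 5, 6), ("ORG", 8, 10)]
pred = [("PER", 0, 2), ("LOC", 5, 7), ("ORG", 8, 10)]  # one boundary error
print(round(entity_f1(gold, pred), 4))  # -> 0.6667
```

This strictness is why entity-level F1 is typically lower than token-level accuracy on the same predictions: a single boundary mistake wipes out credit for the whole entity.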
6,589
laion/CLIP-convnext_large_d_320.laion2B-s29B-b131K-ft-soup
2023-04-18T19:28:53.000Z
[ "open_clip", "tensorboard", "zero-shot-image-classification", "clip", "arxiv:2201.03545", "arxiv:2210.08402", "arxiv:1910.04867", "license:mit", "has_space", "region:us" ]
zero-shot-image-classification
laion
null
null
laion/CLIP-convnext_large_d_320.laion2B-s29B-b131K-ft-soup
3
143,022
open_clip
2023-02-11T01:35:52
---
tags:
- zero-shot-image-classification
- clip
license: mit
library_name: open_clip
pipeline_tag: zero-shot-image-classification
---
# Model card for CLIP-convnext_large_d_320.laion2B-s29B-b131K-ft-soup

# Table of Contents

1. [Model Details](#model-details)
2. [Uses](#uses)
3. [Training Details](#training-details)
4. [Evaluation](#evaluation)
5. [Acknowledgements](#acknowledgements)
6. [Citation](#citation)

# Model Details

## Model Description

A series of CLIP [ConvNeXt-Large](https://arxiv.org/abs/2201.03545) (w/ extra text depth, vision MLP head) models trained on the LAION-2B (English) subset of [LAION-5B](https://arxiv.org/abs/2210.08402) using [OpenCLIP](https://github.com/mlfoundations/open_clip).

The models utilize:
* the [timm](https://github.com/rwightman/pytorch-image-models) ConvNeXt-Large model (`convnext_large`) as the image tower
* an MLP (`fc - gelu - drop - fc`) head in the vision tower instead of the single projection of other CLIP models
* a text tower with the same width as the ViT-L / RN50x16 models but 4 more layers of depth (depth 16, embed dim 768).

This 320x320-resolution model is a soup (weight average) of 3 fine-tunes of [CLIP-convnext_large_d.laion2B-s26B-b102K-augreg](https://huggingface.co/laion/CLIP-convnext_large_d.laion2B-s26B-b102K-augreg) at a higher resolution. Each fine-tune started from the final checkpoint of the original 256x256 training run, saw an additional ~2-3B samples, and used a lower learning rate. The fine-tunes differed in learning rate (1e-4, 6e-5, 5e-5) and number of samples (3.2B, 2B, 2.5B respectively).

At 320x320, the ConvNeXt-Large-D is significantly more efficient than the L/14 model at 336x336 that OpenAI fine-tuned: the L/14-336 model has 2.5x the GMACs, 2.8x the activations, and 1.22x the parameters.
| Model | Dataset | Resolution | AugReg | Top-1 ImageNet Zero-Shot (%) |
| ----- | ------- | ---------- | ------------ | --------- |
| [convnext_large_d.laion2b_s26b_b102k-augreg](https://huggingface.co/laion/CLIP-convnext_large_d.laion2B-s26B-b102K-augreg) | LAION-2B | 256x256 | RRC (0.33, 1.0), RE (0.35), SD (0.1), D(0.1) | 75.9 |
| [convnext_large_d_320.laion2b_s29b_b131k-ft](https://huggingface.co/laion/CLIP-convnext_large_d_320.laion2B-s29B-b131K-ft) | LAION-2B | 320x320 | RRC (0.5, 1.0), RE (0.4), SD (0.1), D(0.0) | 76.6 |
| [convnext_large_d_320.laion2b_s29b_b131k-ft-soup](https://huggingface.co/laion/CLIP-convnext_large_d_320.laion2B-s29B-b131K-ft-soup) | LAION-2B | 320x320 | RRC (0.5, 1.0), RE (0.4), SD (0.1), D(0.0) | 76.9 |

RRC = Random Resize Crop (crop pcts), RE = Random Erasing (prob), SD = Stochastic Depth (prob) -- image tower only, D = Dropout (prob) -- image tower head only

LAION-A = LAION Aesthetic, an ~900M-sample subset of LAION-2B with pHash dedupe and aesthetic score filtering.

Model training done by Ross Wightman on the [stability.ai](https://stability.ai/) cluster.

# Uses

As per the original [OpenAI CLIP model card](https://github.com/openai/CLIP/blob/d50d76daa670286dd6cacf3bcd80b5e4823fc8e1/model-card.md), this model is intended as a research output for research communities. We hope that this model will enable researchers to better understand and explore zero-shot, arbitrary image classification. We also hope it can be used for interdisciplinary studies of the potential impact of such models.

The OpenAI CLIP paper includes a discussion of potential downstream impacts to provide an example for this sort of analysis. Additionally, the LAION-5B blog (https://laion.ai/blog/laion-5b/) and upcoming paper include additional discussion as it relates specifically to the training dataset.

## Direct Use

Zero-shot image classification, image and text retrieval, among others.
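For orientation, zero-shot classification with a CLIP model like this one reduces to a softmax over scaled cosine similarities between one image embedding and one text embedding per candidate class prompt. The snippet below is a minimal sketch of just that scoring step, with random unit vectors standing in for the features that OpenCLIP's `encode_image`/`encode_text` would produce; the feature dimension and logit scale here are illustrative placeholders, not values read off this checkpoint.

```python
import numpy as np

# Sketch of CLIP zero-shot scoring (an illustration, not the full pipeline):
# in real use, `image_feat` would come from the image tower and `text_feats`
# from encoding one prompt per class (e.g. "a photo of a dog").
rng = np.random.default_rng(0)
dim = 768  # illustrative embedding width for the shared image/text space

def l2_normalize(x):
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

# Three stand-in class prompt embeddings, plus an image embedding that is a
# noisy copy of class 1's prompt embedding (so class 1 should win).
text_feats = l2_normalize(rng.normal(size=(3, dim)))
image_feat = l2_normalize(text_feats[1] + 0.05 * rng.normal(size=dim))

# CLIP scales cosine similarities by a learned logit scale (~100 after
# training) and takes a softmax over the candidate class prompts.
logits = 100.0 * image_feat @ text_feats.T
probs = np.exp(logits - logits.max())
probs /= probs.sum()

predicted = int(probs.argmax())  # index of the best-matching class prompt
```

In practice, the embeddings would come from the model loaded through `open_clip`, with images passed through the preprocessing transforms bundled with the checkpoint.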
## Downstream Use

Image classification and other image-task fine-tuning, linear-probe image classification, image generation guiding and conditioning, among others.

## Out-of-Scope Use

As per the OpenAI models,

**Any** deployed use case of the model - whether commercial or not - is currently out of scope. Non-deployed use cases, such as image search in a constrained environment, are also not recommended unless there is thorough in-domain testing of the model with a specific, fixed class taxonomy. This is because our safety assessment demonstrated a high need for task-specific testing, especially given the variability of CLIP's performance with different class taxonomies. This makes untested and unconstrained deployment of the model in any use case currently potentially harmful.

Certain use cases which would fall under the domain of surveillance and facial recognition are always out-of-scope regardless of the performance of the model. This is because the use of artificial intelligence for such tasks can be premature currently, given the lack of testing norms and checks to ensure its fair use.

Since the model has not been purposefully trained in or evaluated on any languages other than English, its use should be limited to English language use cases.

Beyond the above notice, the LAION-5B dataset used in training these models has additional considerations; see below.

# Training Details

## Training Data

This model was trained with LAION-2B -- a 2 billion sample English subset of LAION-5B (https://laion.ai/blog/laion-5b/).

**IMPORTANT NOTE:** The motivation behind dataset creation is to democratize research and experimentation around large-scale multi-modal model training and the handling of uncurated, large-scale datasets crawled from the publicly available internet. Our recommendation is therefore to use the dataset for research purposes. Be aware that this large-scale dataset is uncurated.
Keep in mind that the uncurated nature of the dataset means that collected links may lead to strongly discomforting and disturbing content for a human viewer. Therefore, please use the demo links with caution and at your own risk. It is possible to extract a "safe" subset by filtering out samples based on the safety tags (using a customized trained NSFW classifier that we built). While this strongly reduces the chance of encountering potentially harmful content when viewing, we cannot entirely exclude the possibility of harmful content still being present in safe mode, so the warning also holds there. We think that providing the dataset openly to broad research and other interested communities will allow for transparent investigation of the benefits that come along with training large-scale models, as well as of pitfalls and dangers that may stay unreported or unnoticed when working with closed large datasets that remain restricted to a small community. However, although we provide our dataset openly, we do not recommend using it for creating ready-to-go industrial products, as the basic research about the general properties and safety of such large-scale models, which we would like to encourage with this release, is still in progress.

## Training Procedure

All 320x320 model fine-tunes were trained with a global batch size of 131072 for 10-16 checkpoint intervals of 203.7M samples each, for a total of ~2-3B samples seen per fine-tune.

For the 320x320 models, the slurm script (w/ srun) below was used on 64 8-GPU (A100 40GB) nodes (Stability).
```
/opt/slurm/sbin/srun --cpu_bind=v --accel-bind=gn python -m training.main \
    --save-frequency 1 \
    --name "convnext_large_320" \
    --pretrained "/runs/convnext_large_256/epoch_128.pt" \
    --resume 'latest' \
    --train-data="pipe:aws s3 cp s3://mybucket/path/laion{00000..xxxxx}.tar -" \
    --train-num-samples 203666042 \
    --dataset-type webdataset \
    --precision amp_bfloat16 \
    --beta2 0.98 \
    --warmup 2000 \
    --batch-size=256 \
    --epochs=12 \
    --dataset-resampled \
    --aug-cfg use_timm=True scale='(0.5, 1.0)' re_prob=0.4 \
    --clip-grad-norm 5.0 \
    --lr 5e-5 \
    --workers=6 \
    --model "convnext_large_d_320" \
    --seed 0 \
    --ddp-static-graph \
    --local-loss \
    --gather-with-grad \
    --grad-checkpointing
```

# Evaluation

Evaluation done with code in the [LAION CLIP Benchmark suite](https://github.com/LAION-AI/CLIP_benchmark).

## Testing Data, Factors & Metrics

### Testing Data

The testing is performed with VTAB+ (a combination of VTAB (https://arxiv.org/abs/1910.04867) w/ additional robustness datasets) for classification, and with COCO and Flickr for retrieval.

## Results

The models achieve between 75.9 and 76.9 top-1 zero-shot accuracy on ImageNet-1k.

Zero-shot curve of the original from-scratch 256x256 training:

![](convnext_large_zero_shot.png)

An initial round of benchmarks has been performed on a wider range of datasets, viewable at https://github.com/LAION-AI/CLIP_benchmark/blob/main/benchmark/results.ipynb

# Acknowledgements

Acknowledging [stability.ai](https://stability.ai/) for compute used to train this model.
# Citation **BibTeX:** LAION-5B ```bibtex @inproceedings{schuhmann2022laionb, title={{LAION}-5B: An open large-scale dataset for training next generation image-text models}, author={Christoph Schuhmann and Romain Beaumont and Richard Vencu and Cade W Gordon and Ross Wightman and Mehdi Cherti and Theo Coombes and Aarush Katta and Clayton Mullis and Mitchell Wortsman and Patrick Schramowski and Srivatsa R Kundurthy and Katherine Crowson and Ludwig Schmidt and Robert Kaczmarczyk and Jenia Jitsev}, booktitle={Thirty-sixth Conference on Neural Information Processing Systems Datasets and Benchmarks Track}, year={2022}, url={https://openreview.net/forum?id=M3Y74vmsMcY} } ``` OpenCLIP software ```bibtex @software{ilharco_gabriel_2021_5143773, author = {Ilharco, Gabriel and Wortsman, Mitchell and Wightman, Ross and Gordon, Cade and Carlini, Nicholas and Taori, Rohan and Dave, Achal and Shankar, Vaishaal and Namkoong, Hongseok and Miller, John and Hajishirzi, Hannaneh and Farhadi, Ali and Schmidt, Ludwig}, title = {OpenCLIP}, month = jul, year = 2021, note = {If you use this software, please cite it as below.}, publisher = {Zenodo}, version = {0.1}, doi = {10.5281/zenodo.5143773}, url = {https://doi.org/10.5281/zenodo.5143773} } ``` ``` @InProceedings{pmlr-v162-wortsman22a, title = {Model soups: averaging weights of multiple fine-tuned models improves accuracy without increasing inference time}, author = {Wortsman, Mitchell and Ilharco, Gabriel and Gadre, Samir Ya and Roelofs, Rebecca and Gontijo-Lopes, Raphael and Morcos, Ari S and Namkoong, Hongseok and Farhadi, Ali and Carmon, Yair and Kornblith, Simon and Schmidt, Ludwig}, booktitle = {Proceedings of the 39th International Conference on Machine Learning}, pages = {23965--23998}, year = {2022}, editor = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan}, volume = {162}, series = {Proceedings of Machine Learning Research}, month = {17--23 Jul}, publisher = {PMLR}, 
pdf = {https://proceedings.mlr.press/v162/wortsman22a/wortsman22a.pdf}, url = {https://proceedings.mlr.press/v162/wortsman22a.html} } ``` OpenAI CLIP paper ```bibtex @inproceedings{Radford2021LearningTV, title={Learning Transferable Visual Models From Natural Language Supervision}, author={Alec Radford and Jong Wook Kim and Chris Hallacy and A. Ramesh and Gabriel Goh and Sandhini Agarwal and Girish Sastry and Amanda Askell and Pamela Mishkin and Jack Clark and Gretchen Krueger and Ilya Sutskever}, booktitle={ICML}, year={2021} } ``` ```bibtex @Article{liu2022convnet, author = {Zhuang Liu and Hanzi Mao and Chao-Yuan Wu and Christoph Feichtenhofer and Trevor Darrell and Saining Xie}, title = {A ConvNet for the 2020s}, journal = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)}, year = {2022}, } ``` ```bibtex @misc{rw2019timm, author = {Ross Wightman}, title = {PyTorch Image Models}, year = {2019}, publisher = {GitHub}, journal = {GitHub repository}, doi = {10.5281/zenodo.4414861}, howpublished = {\url{https://github.com/rwightman/pytorch-image-models}} } ```
12,498
[ [ -0.03326416015625, -0.033660888671875, 0.01248931884765625, 0.002346038818359375, -0.030426025390625, -0.031463623046875, -0.0153045654296875, -0.04364013671875, 0.0174102783203125, 0.0272369384765625, -0.036407470703125, -0.0321044921875, -0.04534912109375, -0.004138946533203125, -0.024505615234375, 0.0712890625, -0.00897979736328125, 0.0027599334716796875, 0.004070281982421875, -0.024932861328125, -0.0355224609375, -0.033203125, -0.058349609375, 0.0002334117889404297, 0.0196990966796875, 0.02777099609375, 0.045654296875, 0.06591796875, 0.051971435546875, 0.0181427001953125, -0.01244354248046875, -0.00867462158203125, -0.04913330078125, -0.037933349609375, 0.01053619384765625, -0.0247344970703125, -0.0450439453125, 0.01372528076171875, 0.050506591796875, 0.0248870849609375, -0.00832366943359375, 0.02313232421875, 0.00905609130859375, 0.043243408203125, -0.056182861328125, 0.003429412841796875, -0.043426513671875, 0.00750732421875, -0.0217132568359375, 0.007686614990234375, -0.0145263671875, -0.003383636474609375, 0.0155487060546875, -0.0540771484375, 0.02850341796875, -0.0085601806640625, 0.10003662109375, 0.0156707763671875, -0.0156402587890625, 0.01554107666015625, -0.044921875, 0.06231689453125, -0.0596923828125, 0.0262451171875, 0.023956298828125, 0.0272216796875, 0.01462554931640625, -0.062164306640625, -0.0294952392578125, -0.00882720947265625, 0.01300048828125, 0.02679443359375, -0.016845703125, -0.0025577545166015625, 0.038970947265625, 0.0328369140625, -0.0401611328125, 0.00878143310546875, -0.05108642578125, 0.00019729137420654297, 0.0614013671875, 0.004383087158203125, 0.0124969482421875, -0.022613525390625, -0.051910400390625, -0.0279083251953125, -0.048797607421875, 0.0260467529296875, 0.0197296142578125, -0.005359649658203125, -0.033111572265625, 0.031463623046875, -0.009521484375, 0.033416748046875, 0.003269195556640625, -0.022064208984375, 0.034759521484375, -0.023651123046875, -0.03558349609375, -0.00901031494140625, 0.08038330078125, 
0.049591064453125, 0.0040283203125, 0.0167236328125, -0.01200103759765625, -0.01454925537109375, 0.0130462646484375, -0.08221435546875, -0.01373291015625, 0.0159454345703125, -0.047637939453125, -0.0265045166015625, 0.026092529296875, -0.049072265625, 0.0030612945556640625, -0.00782012939453125, 0.04437255859375, -0.04644775390625, -0.01422119140625, 0.007648468017578125, -0.0181121826171875, 0.0169525146484375, 0.0256805419921875, -0.040618896484375, 0.01367950439453125, 0.0281829833984375, 0.08447265625, -0.01082611083984375, -0.03009033203125, -0.0139923095703125, 0.01062774658203125, -0.0245361328125, 0.0399169921875, -0.0003554821014404297, -0.025909423828125, -0.0196990966796875, 0.033416748046875, -0.006267547607421875, -0.04364013671875, 0.045684814453125, -0.0157928466796875, -0.002155303955078125, -0.01332855224609375, -0.026092529296875, -0.036468505859375, 0.0094146728515625, -0.0570068359375, 0.06884765625, 0.01062774658203125, -0.06634521484375, 0.027496337890625, -0.0404052734375, -0.00986480712890625, -0.007808685302734375, 0.00424957275390625, -0.04217529296875, -0.0170745849609375, 0.0275726318359375, 0.047454833984375, -0.02813720703125, 0.0221405029296875, -0.03985595703125, -0.03662109375, 0.020172119140625, -0.034210205078125, 0.06988525390625, 0.00394439697265625, -0.030426025390625, 0.009368896484375, -0.04998779296875, -0.00934600830078125, 0.0271759033203125, -0.0015411376953125, -0.0067291259765625, -0.023956298828125, -0.0042266845703125, 0.01849365234375, 0.007389068603515625, -0.0419921875, -0.00334930419921875, -0.01422882080078125, 0.04644775390625, 0.058868408203125, 0.0102996826171875, 0.02532958984375, -0.04150390625, 0.034088134765625, 0.005290985107421875, 0.044677734375, -0.0185089111328125, -0.037109375, -0.05657958984375, -0.0450439453125, 0.0282440185546875, 0.034698486328125, -0.0311126708984375, 0.034332275390625, -0.0196380615234375, -0.03887939453125, -0.034698486328125, -0.011138916015625, 0.0291748046875, 
0.039642333984375, 0.03155517578125, -0.029510498046875, -0.040802001953125, -0.079345703125, 0.0260162353515625, 0.0128021240234375, -0.0164642333984375, 0.05047607421875, 0.057861328125, -0.006687164306640625, 0.056732177734375, -0.05572509765625, -0.03448486328125, -0.01448822021484375, -0.0008034706115722656, 0.02020263671875, 0.03717041015625, 0.060546875, -0.055999755859375, -0.044097900390625, -0.01172637939453125, -0.07244873046875, 0.01715087890625, -0.001552581787109375, -0.0133514404296875, 0.009307861328125, 0.032073974609375, -0.046600341796875, 0.050018310546875, 0.030517578125, 0.01018524169921875, 0.046844482421875, -0.011962890625, 0.007080078125, -0.07940673828125, 0.0272674560546875, 0.01357269287109375, -0.01186370849609375, -0.038177490234375, 0.00223541259765625, 0.00492095947265625, -0.0253143310546875, -0.0643310546875, 0.04119873046875, -0.0280914306640625, 0.0032024383544921875, -0.008880615234375, 0.004375457763671875, 0.006343841552734375, 0.052703857421875, 0.01053619384765625, 0.070068359375, 0.04705810546875, -0.0472412109375, 0.0099029541015625, 0.0166015625, -0.0308837890625, 0.0316162109375, -0.07794189453125, 0.0009527206420898438, -0.0008754730224609375, 0.0276947021484375, -0.0384521484375, -0.03460693359375, 0.0268402099609375, -0.033905029296875, 0.0250091552734375, -0.0244293212890625, -0.01355743408203125, -0.04296875, -0.057098388671875, 0.042449951171875, 0.04901123046875, -0.0380859375, 0.013153076171875, 0.0283966064453125, 0.01837158203125, -0.05096435546875, -0.05126953125, -0.0233154296875, -0.0244598388671875, -0.05792236328125, 0.036651611328125, 0.003963470458984375, 0.005863189697265625, -0.000050067901611328125, -0.000027835369110107422, -0.00470733642578125, -0.00586700439453125, 0.043243408203125, 0.03192138671875, -0.00936126708984375, -0.0182342529296875, -0.0153350830078125, 0.0012035369873046875, -0.003948211669921875, -0.0162811279296875, 0.035400390625, -0.01212310791015625, -0.010772705078125, 
-0.0562744140625, 0.011627197265625, 0.04150390625, -0.01776123046875, 0.05889892578125, 0.061920166015625, -0.03131103515625, 0.00653839111328125, -0.030487060546875, -0.01284027099609375, -0.0357666015625, 0.0367431640625, -0.0109100341796875, -0.04791259765625, 0.0477294921875, 0.01302337646484375, -0.00730133056640625, 0.055938720703125, 0.0166778564453125, -0.01418304443359375, 0.06951904296875, 0.03857421875, -0.0068817138671875, 0.042510986328125, -0.080810546875, -0.002044677734375, -0.0919189453125, -0.022216796875, -0.013275146484375, -0.02191162109375, -0.037933349609375, -0.0382080078125, 0.04986572265625, 0.02545166015625, -0.0277557373046875, 0.02972412109375, -0.0256500244140625, 0.01788330078125, 0.04473876953125, 0.035552978515625, -0.01320648193359375, -0.00685882568359375, 0.002223968505859375, -0.0032444000244140625, -0.05755615234375, -0.01538848876953125, 0.09130859375, 0.04913330078125, 0.048065185546875, -0.01328277587890625, 0.0310821533203125, 0.01239776611328125, -0.0011243820190429688, -0.053131103515625, 0.0450439453125, -0.0295562744140625, -0.047332763671875, -0.0255584716796875, -0.0307464599609375, -0.06622314453125, 0.00843048095703125, -0.01509857177734375, -0.05572509765625, 0.016387939453125, 0.0040283203125, -0.03338623046875, 0.045562744140625, -0.042205810546875, 0.0728759765625, -0.0220184326171875, -0.032318115234375, -0.004550933837890625, -0.05633544921875, 0.042388916015625, 0.008392333984375, 0.004505157470703125, -0.015045166015625, 0.017425537109375, 0.08831787109375, -0.058868408203125, 0.059051513671875, -0.016204833984375, 0.018890380859375, 0.048828125, -0.014129638671875, 0.0289306640625, 0.01291656494140625, 0.005519866943359375, 0.052886962890625, 0.004138946533203125, -0.0185394287109375, -0.028228759765625, 0.040802001953125, -0.07635498046875, -0.0302734375, -0.035858154296875, -0.042144775390625, 0.0177001953125, 0.0182037353515625, 0.05963134765625, 0.0506591796875, -0.00782012939453125, 
0.0305938720703125, 0.04205322265625, -0.0185546875, 0.034820556640625, 0.00830078125, -0.005428314208984375, -0.054290771484375, 0.075439453125, 0.0196533203125, 0.022918701171875, 0.004505157470703125, 0.01629638671875, -0.007648468017578125, -0.033660888671875, -0.042144775390625, 0.0292510986328125, -0.03887939453125, -0.035186767578125, -0.033843994140625, -0.042633056640625, -0.037109375, -0.011962890625, -0.039459228515625, -0.02142333984375, -0.048736572265625, 0.004299163818359375, 0.025726318359375, 0.04193115234375, -0.01448822021484375, 0.03509521484375, -0.0640869140625, 0.015838623046875, 0.016021728515625, 0.0264739990234375, 0.00879669189453125, -0.062744140625, -0.0223846435546875, 0.0258331298828125, -0.03826904296875, -0.054779052734375, 0.02996826171875, 0.020751953125, 0.0303192138671875, 0.058349609375, -0.005847930908203125, 0.05487060546875, -0.0277557373046875, 0.07763671875, 0.030548095703125, -0.053131103515625, 0.0433349609375, -0.040771484375, 0.021820068359375, 0.038177490234375, 0.052093505859375, -0.022552490234375, 0.0007038116455078125, -0.062225341796875, -0.068359375, 0.0797119140625, 0.011077880859375, -0.01041412353515625, 0.01352691650390625, 0.0404052734375, 0.00001239776611328125, 0.007396697998046875, -0.060089111328125, -0.01328277587890625, -0.03277587890625, 0.00778961181640625, 0.000013947486877441406, -0.0279998779296875, -0.01262664794921875, -0.0377197265625, 0.0609130859375, -0.0074005126953125, 0.042816162109375, 0.01715087890625, -0.0019989013671875, -0.005229949951171875, -0.01322174072265625, 0.039520263671875, 0.035858154296875, -0.043914794921875, -0.0175933837890625, 0.0178680419921875, -0.0531005859375, -0.005767822265625, 0.003627777099609375, -0.044830322265625, -0.01483917236328125, 0.0299072265625, 0.08917236328125, 0.0129547119140625, -0.038330078125, 0.06622314453125, -0.006381988525390625, -0.036529541015625, -0.0255126953125, 0.0113525390625, -0.019256591796875, 0.01265716552734375, 
0.0185089111328125, 0.0141143798828125, 0.01371002197265625, -0.035247802734375, 0.0135955810546875, 0.037994384765625, -0.037109375, -0.033905029296875, 0.05474853515625, -0.00048041343688964844, -0.0048370361328125, 0.053192138671875, -0.01338958740234375, -0.04315185546875, 0.05999755859375, 0.042449951171875, 0.067626953125, -0.009185791015625, 0.025146484375, 0.06744384765625, 0.01450347900390625, -0.0228424072265625, 0.00830078125, 0.01367950439453125, -0.03997802734375, -0.004535675048828125, -0.03955078125, -0.00836181640625, 0.042938232421875, -0.06671142578125, 0.0361328125, -0.049774169921875, -0.027435302734375, -0.01348114013671875, -0.00858306884765625, -0.046173095703125, 0.02386474609375, 0.006072998046875, 0.078369140625, -0.06573486328125, 0.05426025390625, 0.046844482421875, -0.045806884765625, -0.07427978515625, 0.00273895263671875, -0.01141357421875, -0.044830322265625, 0.0292510986328125, 0.03753662109375, 0.005695343017578125, -0.0204315185546875, -0.06298828125, -0.08001708984375, 0.10888671875, 0.028167724609375, -0.028228759765625, 0.0016326904296875, 0.0019235610961914062, 0.0257110595703125, -0.02301025390625, 0.03302001953125, 0.0258636474609375, 0.01055908203125, 0.020355224609375, -0.0704345703125, 0.0029163360595703125, -0.0115814208984375, 0.01555633544921875, 0.010101318359375, -0.09130859375, 0.087890625, -0.0203704833984375, -0.0212860107421875, 0.0012159347534179688, 0.04986572265625, 0.01067352294921875, 0.01470184326171875, 0.035064697265625, 0.0565185546875, 0.044097900390625, 0.001926422119140625, 0.07830810546875, -0.0159454345703125, 0.023895263671875, 0.07098388671875, 0.0002168416976928711, 0.06671142578125, 0.0174713134765625, -0.0157928466796875, 0.018585205078125, 0.049835205078125, -0.0283355712890625, 0.048736572265625, -0.01158905029296875, 0.00580596923828125, -0.0051116943359375, -0.0250091552734375, -0.038330078125, 0.037628173828125, 0.008209228515625, -0.0227813720703125, 0.004795074462890625, 
0.0181121826171875, 0.005474090576171875, -0.031463623046875, -0.0262298583984375, 0.034698486328125, 0.0078125, -0.039337158203125, 0.059326171875, 0.0052337646484375, 0.059906005859375, -0.045318603515625, 0.00650787353515625, -0.008636474609375, 0.0164794921875, -0.013763427734375, -0.06500244140625, 0.02154541015625, -0.00949859619140625, -0.0025157928466796875, 0.0014009475708007812, 0.044464111328125, -0.00616455078125, -0.034912109375, 0.0284576416015625, -0.005825042724609375, 0.0247344970703125, -0.0121612548828125, -0.059539794921875, 0.01458740234375, 0.005504608154296875, -0.007671356201171875, 0.0279693603515625, 0.01416015625, -0.002590179443359375, 0.042388916015625, 0.04248046875, -0.0016622543334960938, 0.0025844573974609375, -0.00970458984375, 0.07244873046875, -0.032501220703125, -0.031494140625, -0.044403076171875, 0.036529541015625, -0.01352691650390625, -0.03863525390625, 0.053466796875, 0.05230712890625, 0.0762939453125, -0.00868988037109375, 0.048004150390625, -0.0138092041015625, 0.0213623046875, -0.04742431640625, 0.04705810546875, -0.062164306640625, 0.001537322998046875, -0.023345947265625, -0.055145263671875, -0.00598907470703125, 0.048065185546875, -0.0181732177734375, 0.007648468017578125, 0.0496826171875, 0.050079345703125, -0.0230255126953125, 0.0002359151840209961, 0.0125579833984375, 0.01300811767578125, 0.01387786865234375, 0.048095703125, 0.04010009765625, -0.068115234375, 0.043792724609375, -0.060302734375, -0.026153564453125, -0.023773193359375, -0.0504150390625, -0.0738525390625, -0.048736572265625, -0.0274200439453125, -0.0241851806640625, -0.0019702911376953125, 0.0496826171875, 0.07366943359375, -0.052825927734375, -0.027069091796875, 0.01383209228515625, -0.0196380615234375, -0.0212554931640625, -0.01605224609375, 0.04010009765625, 0.01372528076171875, -0.04193115234375, 0.010955810546875, 0.01100921630859375, 0.0179901123046875, -0.00704193115234375, -0.007038116455078125, -0.0262451171875, -0.01219940185546875, 
0.0377197265625, 0.0288848876953125, -0.04119873046875, -0.024444580078125, 0.01418304443359375, 0.0025005340576171875, 0.0198211669921875, 0.04248046875, -0.03753662109375, 0.0118865966796875, 0.03106689453125, 0.0202178955078125, 0.06768798828125, 0.0053558349609375, 0.015716552734375, -0.045867919921875, 0.032623291015625, 0.00464630126953125, 0.032196044921875, 0.0206756591796875, -0.02923583984375, 0.04937744140625, 0.031402587890625, -0.0460205078125, -0.064208984375, -0.00628662109375, -0.08782958984375, -0.016387939453125, 0.09478759765625, -0.0251922607421875, -0.045074462890625, 0.033203125, -0.0169525146484375, 0.0264434814453125, -0.028289794921875, 0.03564453125, 0.02362060546875, -0.001117706298828125, -0.0213470458984375, -0.053497314453125, 0.035186767578125, 0.0112152099609375, -0.04827880859375, -0.0160064697265625, 0.0276641845703125, 0.040008544921875, 0.015899658203125, 0.0460205078125, -0.0159149169921875, 0.0287628173828125, 0.0039520263671875, 0.00952911376953125, -0.0181427001953125, -0.040557861328125, -0.0328369140625, 0.00478363037109375, -0.0158843994140625, -0.036468505859375 ] ]
tiiuae/falcon-7b
2023-09-29T14:32:19.000Z
[ "transformers", "pytorch", "falcon", "text-generation", "custom_code", "en", "dataset:tiiuae/falcon-refinedweb", "arxiv:2205.14135", "arxiv:1911.02150", "arxiv:2101.00027", "arxiv:2005.14165", "arxiv:2104.09864", "arxiv:2306.01116", "license:apache-2.0", "has_space", "text-generation-inference", "region:us" ]
text-generation
tiiuae
null
null
tiiuae/falcon-7b
912
142,524
transformers
2023-04-24T16:36:24
---
datasets:
- tiiuae/falcon-refinedweb
language:
- en
inference: false
license: apache-2.0
---

# 🚀 Falcon-7B

**Falcon-7B is a 7B-parameter causal decoder-only model built by [TII](https://www.tii.ae) and trained on 1,500B tokens of [RefinedWeb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb) enhanced with curated corpora. It is made available under the Apache 2.0 license.**

*Paper coming soon* 😊.

🤗 To get started with Falcon (inference, finetuning, quantization, etc.), we recommend reading [this great blogpost from HF](https://huggingface.co/blog/falcon)!

## Why use Falcon-7B?

* **It outperforms comparable open-source models** (e.g., [MPT-7B](https://huggingface.co/mosaicml/mpt-7b), [StableLM](https://github.com/Stability-AI/StableLM), [RedPajama](https://huggingface.co/togethercomputer/RedPajama-INCITE-Base-7B-v0.1), etc.), thanks to being trained on 1,500B tokens of [RefinedWeb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb) enhanced with curated corpora. See the [OpenLLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
* **It features an architecture optimized for inference**, with FlashAttention ([Dao et al., 2022](https://arxiv.org/abs/2205.14135)) and multiquery attention ([Shazeer et al., 2019](https://arxiv.org/abs/1911.02150)).
* **It is made available under a permissive Apache 2.0 license allowing for commercial use**, without any royalties or restrictions.

⚠️ **This is a raw, pretrained model, which should be further finetuned for most use cases.** If you are looking for a version better suited to taking generic instructions in a chat format, we recommend taking a look at [Falcon-7B-Instruct](https://huggingface.co/tiiuae/falcon-7b-instruct).

🔥 **Looking for an even more powerful model?** [Falcon-40B](https://huggingface.co/tiiuae/falcon-40b) is Falcon-7B's big brother!
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import transformers
import torch

model = "tiiuae/falcon-7b"

tokenizer = AutoTokenizer.from_pretrained(model)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
    device_map="auto",
)
sequences = pipeline(
    "Girafatron is obsessed with giraffes, the most glorious animal on the face of this Earth. Girafatron believes all other animals are irrelevant when compared to the glorious majesty of the giraffe.\nDaniel: Hello, Girafatron!\nGirafatron:",
    max_length=200,
    do_sample=True,
    top_k=10,
    num_return_sequences=1,
    eos_token_id=tokenizer.eos_token_id,
)
for seq in sequences:
    print(f"Result: {seq['generated_text']}")
```

💥 **Falcon LLMs require PyTorch 2.0 for use with `transformers`!**

For fast inference with Falcon, check out [Text Generation Inference](https://github.com/huggingface/text-generation-inference)! Read more in this [blogpost](https://huggingface.co/blog/falcon).

You will need **at least 16GB of memory** to swiftly run inference with Falcon-7B.

# Model Card for Falcon-7B

## Model Details

### Model Description

- **Developed by:** [https://www.tii.ae](https://www.tii.ae);
- **Model type:** Causal decoder-only;
- **Language(s) (NLP):** English, German, Spanish, French (and limited capabilities in Italian, Portuguese, Polish, Dutch, Romanian, Czech, Swedish);
- **License:** Apache 2.0.

### Model Source

- **Paper:** *coming soon*.

## Uses

### Direct Use

Research on large language models; as a foundation for further specialization and finetuning for specific use cases (e.g., summarization, text generation, chatbot, etc.)

### Out-of-Scope Use

Production use without adequate assessment of risks and mitigation; any use cases which may be considered irresponsible or harmful.

## Bias, Risks, and Limitations

Falcon-7B is trained on English and French data only, and will not generalize appropriately to other languages.
Furthermore, as it is trained on large-scale corpora representative of the web, it will carry the stereotypes and biases commonly encountered online.

### Recommendations

We recommend users of Falcon-7B to consider finetuning it for the specific set of tasks of interest, and for guardrails and appropriate precautions to be taken for any production use.

## How to Get Started with the Model

```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import transformers
import torch

model = "tiiuae/falcon-7b"

tokenizer = AutoTokenizer.from_pretrained(model)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
    device_map="auto",
)
sequences = pipeline(
    "Girafatron is obsessed with giraffes, the most glorious animal on the face of this Earth. Girafatron believes all other animals are irrelevant when compared to the glorious majesty of the giraffe.\nDaniel: Hello, Girafatron!\nGirafatron:",
    max_length=200,
    do_sample=True,
    top_k=10,
    num_return_sequences=1,
    eos_token_id=tokenizer.eos_token_id,
)
for seq in sequences:
    print(f"Result: {seq['generated_text']}")
```

## Training Details

### Training Data

Falcon-7B was trained on 1,500B tokens of [RefinedWeb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb), a high-quality filtered and deduplicated web dataset which we enhanced with curated corpora. Significant components from our curated corpora were inspired by The Pile ([Gao et al., 2020](https://arxiv.org/abs/2101.00027)).
| **Data source**    | **Fraction** | **Tokens** | **Sources**                       |
|--------------------|--------------|------------|-----------------------------------|
| [RefinedWeb-English](https://huggingface.co/datasets/tiiuae/falcon-refinedweb) | 79% | 1,185B | massive web crawl |
| Books              | 7%           | 110B       |                                   |
| Conversations      | 6%           | 85B        | Reddit, StackOverflow, HackerNews |
| Code               | 3%           | 45B        |                                   |
| RefinedWeb-French  | 3%           | 45B        | massive web crawl                 |
| Technical          | 2%           | 30B        | arXiv, PubMed, USPTO, etc.        |

The data was tokenized with the Falcon-[7B](https://huggingface.co/tiiuae/falcon-7b)/[40B](https://huggingface.co/tiiuae/falcon-40b) tokenizer.

### Training Procedure

Falcon-7B was trained on 384 A100 40GB GPUs, using a 2D parallelism strategy (PP=2, DP=192) combined with ZeRO.

#### Training Hyperparameters

| **Hyperparameter** | **Value**  | **Comment**                               |
|--------------------|------------|-------------------------------------------|
| Precision          | `bfloat16` |                                           |
| Optimizer          | AdamW      |                                           |
| Learning rate      | 6e-4       | 4B tokens warm-up, cosine decay to 1.2e-5 |
| Weight decay       | 1e-1       |                                           |
| Z-loss             | 1e-4       |                                           |
| Batch size         | 2304       | 30B tokens ramp-up                        |

#### Speeds, Sizes, Times

Training happened in early March 2023 and took about two weeks.

## Evaluation

*Paper coming soon*.

See the [OpenLLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard) for early results.

## Technical Specifications

### Model Architecture and Objective

Falcon-7B is a causal decoder-only model trained on a causal language modeling task (i.e., predict the next token).
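The causal language modeling objective mentioned above — predict each token from the tokens before it — boils down to a shifted cross-entropy loss. The following is a minimal, dependency-free sketch of that objective (an illustration only, not the actual training code), where the logits at position `t` score the token at position `t+1`:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def causal_lm_loss(logits, token_ids):
    """Average negative log-likelihood of each token given its prefix.

    logits:    one score vector per position; logits[t] predicts token t+1
    token_ids: the token sequence the model was fed
    """
    # Pair logits[:-1] with token_ids[1:]: the last position has nothing
    # left to predict, and the first token has no preceding context.
    nlls = []
    for scores, target in zip(logits[:-1], token_ids[1:]):
        probs = softmax(scores)
        nlls.append(-math.log(probs[target]))
    return sum(nlls) / len(nlls)

# Toy example: vocabulary of 4 tokens, sequence of length 3.
logits = [
    [2.0, 0.1, 0.1, 0.1],  # predicts the token at position 1
    [0.1, 0.1, 3.0, 0.1],  # predicts the token at position 2
    [0.0, 0.0, 0.0, 0.0],  # unused: nothing follows the last token
]
token_ids = [3, 0, 2]
loss = causal_lm_loss(logits, token_ids)
print(round(loss, 4))
```

The loss is low here because each score vector already puts most of its mass on the correct next token; training drives this quantity down over the whole corpus.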
The architecture is broadly adapted from the GPT-3 paper ([Brown et al., 2020](https://arxiv.org/abs/2005.14165)), with the following differences:

* **Positional embeddings:** rotary ([Su et al., 2021](https://arxiv.org/abs/2104.09864));
* **Attention:** multiquery ([Shazeer et al., 2019](https://arxiv.org/abs/1911.02150)) and FlashAttention ([Dao et al., 2022](https://arxiv.org/abs/2205.14135));
* **Decoder-block:** parallel attention/MLP with a single layer norm.

| **Hyperparameter** | **Value** | **Comment**                            |
|--------------------|-----------|----------------------------------------|
| Layers             | 32        |                                        |
| `d_model`          | 4544      | Increased to compensate for multiquery |
| `head_dim`         | 64        | Reduced to optimise for FlashAttention |
| Vocabulary         | 65024     |                                        |
| Sequence length    | 2048      |                                        |

### Compute Infrastructure

#### Hardware

Falcon-7B was trained on AWS SageMaker, on 384 A100 40GB GPUs in P4d instances.

#### Software

Falcon-7B was trained on a custom distributed training codebase, Gigatron. It uses a 3D parallelism approach combined with ZeRO and high-performance Triton kernels (FlashAttention, etc.).

## Citation

*Paper coming soon* 😊. In the meantime, you can use the following information to cite:

```
@article{falcon40b,
  title={{Falcon-40B}: an open large language model with state-of-the-art performance},
  author={Almazrouei, Ebtesam and Alobeidli, Hamza and Alshamsi, Abdulaziz and Cappelli, Alessandro and Cojocaru, Ruxandra and Debbah, Merouane and Goffinet, Etienne and Heslow, Daniel and Launay, Julien and Malartic, Quentin and Noune, Badreddine and Pannier, Baptiste and Penedo, Guilherme},
  year={2023}
}
```

To learn more about the pretraining dataset, see the 📓 [RefinedWeb paper](https://arxiv.org/abs/2306.01116).
```
@article{refinedweb,
  title={The {R}efined{W}eb dataset for {F}alcon {LLM}: outperforming curated corpora with web data, and web data only},
  author={Guilherme Penedo and Quentin Malartic and Daniel Hesslow and Ruxandra Cojocaru and Alessandro Cappelli and Hamza Alobeidli and Baptiste Pannier and Ebtesam Almazrouei and Julien Launay},
  journal={arXiv preprint arXiv:2306.01116},
  eprint={2306.01116},
  eprinttype={arXiv},
  url={https://arxiv.org/abs/2306.01116},
  year={2023}
}
```

## License

Falcon-7B is made available under the Apache 2.0 license.

## Contact

falconllm@tii.ae
10,292
[ [ -0.038909912109375, -0.06591796875, 0.00778961181640625, 0.022430419921875, -0.01406097412109375, -0.0079345703125, -0.01258087158203125, -0.037567138671875, 0.016571044921875, 0.0271759033203125, -0.032562255859375, -0.04010009765625, -0.05889892578125, 0.01763916015625, -0.0255126953125, 0.0750732421875, 0.008819580078125, -0.005084991455078125, 0.0203399658203125, -0.0026416778564453125, -0.01318359375, -0.03656005859375, -0.0654296875, -0.0044708251953125, 0.0372314453125, 0.0124359130859375, 0.04315185546875, 0.07769775390625, 0.0513916015625, 0.0263214111328125, -0.0225830078125, 0.01959228515625, -0.04534912109375, -0.019317626953125, -0.01071929931640625, -0.0181732177734375, -0.0177001953125, -0.00208282470703125, 0.05645751953125, 0.044219970703125, 0.0020771026611328125, 0.018951416015625, -0.006519317626953125, 0.043914794921875, -0.04376220703125, 0.0394287109375, -0.039276123046875, -0.004070281982421875, -0.015960693359375, 0.01261138916015625, -0.03277587890625, 0.0031642913818359375, -0.025482177734375, -0.06591796875, 0.022918701171875, 0.0167999267578125, 0.08770751953125, 0.02154541015625, -0.031585693359375, -0.0223846435546875, -0.0268096923828125, 0.048492431640625, -0.05645751953125, 0.032196044921875, 0.01520538330078125, 0.0241546630859375, -0.030609130859375, -0.07763671875, -0.040252685546875, -0.016571044921875, 0.000270843505859375, 0.0260772705078125, -0.009613037109375, 0.002899169921875, 0.0379638671875, 0.015472412109375, -0.034515380859375, 0.00011849403381347656, -0.04327392578125, -0.01305389404296875, 0.04730224609375, -0.0029506683349609375, 0.0168304443359375, -0.0197296142578125, -0.0290069580078125, -0.0274810791015625, -0.0302734375, 0.0230865478515625, 0.031341552734375, 0.026824951171875, -0.01959228515625, 0.03375244140625, -0.02349853515625, 0.0379638671875, 0.036468505859375, -0.005710601806640625, 0.0294647216796875, -0.0246734619140625, -0.028656005859375, 0.004627227783203125, 0.081787109375, 
0.0126495361328125, 0.006603240966796875, -0.012298583984375, -0.00490570068359375, 0.00301361083984375, 0.01006317138671875, -0.07281494140625, 0.01111602783203125, 0.01340484619140625, -0.042724609375, -0.0204620361328125, 0.03497314453125, -0.05194091796875, -0.00568389892578125, 0.001453399658203125, 0.00988006591796875, -0.036346435546875, -0.02996826171875, 0.0184173583984375, -0.01331329345703125, 0.01751708984375, 0.0010614395141601562, -0.059326171875, 0.016326904296875, 0.04901123046875, 0.06597900390625, 0.002349853515625, -0.049224853515625, -0.0587158203125, 0.00446319580078125, -0.028076171875, 0.04052734375, -0.034759521484375, -0.031219482421875, -0.00760650634765625, 0.0272216796875, -0.0261993408203125, -0.01535797119140625, 0.06292724609375, -0.0244140625, 0.0147247314453125, -0.03253173828125, -0.040740966796875, -0.029449462890625, 0.0022735595703125, -0.049591064453125, 0.07330322265625, 0.00449371337890625, -0.0770263671875, 0.0198822021484375, -0.06195068359375, -0.0203094482421875, -0.016265869140625, -0.0028839111328125, -0.03173828125, -0.006702423095703125, 0.032470703125, 0.043914794921875, -0.027374267578125, 0.03759765625, -0.046783447265625, -0.0433349609375, -0.003284454345703125, -0.01031494140625, 0.0654296875, 0.04345703125, -0.046783447265625, 0.00782012939453125, -0.049591064453125, -0.0217132568359375, 0.0188751220703125, -0.00605010986328125, 0.01102447509765625, -0.00919342041015625, 0.0096282958984375, 0.024078369140625, 0.0033168792724609375, -0.046783447265625, 0.01708984375, -0.050018310546875, 0.045166015625, 0.03265380859375, -0.0055084228515625, 0.0257568359375, -0.036651611328125, 0.0307464599609375, 0.03662109375, 0.0243377685546875, -0.0192413330078125, -0.042510986328125, -0.06890869140625, -0.0226287841796875, 0.01122283935546875, 0.02398681640625, -0.0478515625, 0.03424072265625, -0.0125274658203125, -0.05029296875, -0.032379150390625, -0.01235198974609375, 0.03851318359375, 0.04937744140625, 0.036865234375, 
0.006378173828125, -0.04608154296875, -0.0592041015625, -0.004291534423828125, -0.02020263671875, 0.0247955322265625, 0.005435943603515625, 0.05322265625, -0.02496337890625, 0.04730224609375, -0.021636962890625, -0.01690673828125, -0.01432037353515625, 0.0024166107177734375, 0.0236663818359375, 0.039703369140625, 0.057159423828125, -0.04547119140625, -0.016510009765625, 0.0014781951904296875, -0.0699462890625, -0.0029087066650390625, -0.01087188720703125, -0.0212554931640625, 0.037841796875, 0.043060302734375, -0.054046630859375, 0.0252685546875, 0.0221405029296875, -0.0291290283203125, 0.02935791015625, -0.0015764236450195312, 0.01108551025390625, -0.0963134765625, 0.0251922607421875, 0.00841522216796875, 0.01265716552734375, -0.0300140380859375, 0.022796630859375, 0.00232696533203125, -0.004673004150390625, -0.05010986328125, 0.06329345703125, -0.045440673828125, 0.00042128562927246094, -0.005191802978515625, -0.00916290283203125, -0.00925445556640625, 0.044952392578125, 0.00582122802734375, 0.0587158203125, 0.039520263671875, -0.0259552001953125, -0.003910064697265625, 0.026336669921875, -0.0038909912109375, 0.004150390625, -0.057525634765625, -0.0016222000122070312, -0.01074981689453125, 0.03192138671875, -0.059112548828125, -0.02203369140625, 0.034942626953125, -0.047882080078125, 0.022674560546875, -0.01305389404296875, -0.0282745361328125, -0.04144287109375, -0.0214385986328125, 0.0041046142578125, 0.038482666015625, -0.043121337890625, 0.03704833984375, 0.010467529296875, 0.01068878173828125, -0.07098388671875, -0.0484619140625, 0.001873016357421875, -0.0197601318359375, -0.06060791015625, 0.021728515625, -0.004047393798828125, 0.00121307373046875, -0.005008697509765625, 0.0080108642578125, 0.01274871826171875, 0.0079803466796875, 0.0423583984375, 0.01245880126953125, -0.0218048095703125, -0.00788116455078125, 0.01107025146484375, -0.00749969482421875, 0.004131317138671875, -0.0283966064453125, 0.032958984375, -0.04534912109375, -0.0221405029296875, 
-0.0266265869140625, 0.025726318359375, 0.042022705078125, -0.016265869140625, 0.06365966796875, 0.08221435546875, -0.026824951171875, 0.007328033447265625, -0.048370361328125, -0.004924774169921875, -0.03656005859375, 0.0304718017578125, -0.040618896484375, -0.065673828125, 0.05303955078125, 0.01522064208984375, 0.0036334991455078125, 0.0721435546875, 0.037750244140625, 0.016510009765625, 0.0836181640625, 0.0255584716796875, -0.00921630859375, 0.0285186767578125, -0.047515869140625, -0.004901885986328125, -0.05682373046875, -0.02435302734375, -0.050445556640625, -0.0076141357421875, -0.053070068359375, -0.01354217529296875, 0.00055694580078125, 0.0294647216796875, -0.065673828125, 0.0154876708984375, -0.045257568359375, 0.017333984375, 0.045867919921875, 0.0007495880126953125, 0.001628875732421875, 0.0019254684448242188, -0.0210418701171875, 0.023681640625, -0.06304931640625, -0.040740966796875, 0.08148193359375, 0.0290069580078125, 0.04571533203125, -0.006114959716796875, 0.06781005859375, 0.0015888214111328125, 0.023193359375, -0.0372314453125, 0.0333251953125, -0.0143280029296875, -0.032867431640625, -0.005016326904296875, -0.037750244140625, -0.0767822265625, 0.00839996337890625, -0.012237548828125, -0.056915283203125, 0.012786865234375, -0.0011081695556640625, -0.005779266357421875, 0.0236968994140625, -0.0718994140625, 0.06951904296875, -0.0040740966796875, -0.031585693359375, 0.01415252685546875, -0.059326171875, 0.039520263671875, -0.0014057159423828125, 0.018798828125, 0.0025177001953125, -0.0009946823120117188, 0.07073974609375, -0.044464111328125, 0.05731201171875, -0.02874755859375, 0.0311126708984375, 0.030975341796875, -0.0191497802734375, 0.050506591796875, 0.00846099853515625, -0.0215301513671875, 0.033599853515625, 0.0214385986328125, -0.037689208984375, -0.0299072265625, 0.06365966796875, -0.09210205078125, -0.04486083984375, -0.048675537109375, -0.03546142578125, -0.004352569580078125, 0.0265655517578125, 0.03289794921875, 0.0233917236328125, 
-0.00025343894958496094, 0.023956298828125, 0.01345062255859375, -0.0218505859375, 0.0545654296875, 0.0291290283203125, -0.016937255859375, -0.043701171875, 0.059906005859375, 0.00823211669921875, -0.0013484954833984375, 0.0225372314453125, 0.0236968994140625, -0.051239013671875, -0.041900634765625, -0.047515869140625, 0.0380859375, -0.047119140625, -0.018707275390625, -0.07049560546875, -0.0303497314453125, -0.052398681640625, -0.01308441162109375, -0.0201873779296875, -0.0198822021484375, -0.045501708984375, -0.00467681884765625, 0.028533935546875, 0.046783447265625, -0.0009570121765136719, 0.033538818359375, -0.05999755859375, 0.0088653564453125, -0.01490020751953125, 0.00955963134765625, 0.006626129150390625, -0.04840087890625, -0.0196990966796875, 0.032012939453125, -0.02850341796875, -0.04840087890625, 0.037445068359375, 0.01776123046875, 0.050628662109375, 0.03594970703125, 0.0064239501953125, 0.055267333984375, -0.01062774658203125, 0.06207275390625, 0.0224609375, -0.061859130859375, 0.02374267578125, -0.04248046875, 0.0222320556640625, 0.02996826171875, 0.033599853515625, -0.0263214111328125, -0.033355712890625, -0.0673828125, -0.03826904296875, 0.06866455078125, 0.02410888671875, -0.01229095458984375, -0.020538330078125, 0.033416748046875, -0.003658294677734375, 0.001995086669921875, -0.03582763671875, -0.0117340087890625, -0.0548095703125, -0.02874755859375, -0.014373779296875, 0.0006856918334960938, 0.01406097412109375, -0.02142333984375, 0.06317138671875, -0.0160675048828125, 0.042266845703125, 0.01751708984375, -0.017486572265625, 0.00428009033203125, -0.005451202392578125, 0.05670166015625, 0.03936767578125, -0.0201263427734375, -0.00067901611328125, 0.0052337646484375, -0.05322265625, 0.005657196044921875, 0.03289794921875, -0.018951416015625, -0.0121917724609375, 0.033416748046875, 0.077392578125, 0.00586700439453125, -0.0321044921875, 0.0296173095703125, -0.00421142578125, -0.0260162353515625, -0.0044708251953125, 0.01837158203125, 
0.0180206298828125, 0.021820068359375, 0.02294921875, -0.01117706298828125, 0.01123809814453125, -0.026580810546875, 0.01085662841796875, 0.009521484375, -0.019378662109375, -0.0190277099609375, 0.0772705078125, 0.0121307373046875, -0.013092041015625, 0.035797119140625, -0.0271759033203125, -0.0290069580078125, 0.071044921875, 0.0504150390625, 0.0645751953125, 0.005092620849609375, 0.0188446044921875, 0.053741455078125, 0.02203369140625, -0.0199432373046875, 0.019073486328125, 0.0194244384765625, -0.037384033203125, -0.026092529296875, -0.065185546875, -0.01528167724609375, 0.011444091796875, -0.043243408203125, 0.027191162109375, -0.037017822265625, -0.01326751708984375, 0.0156097412109375, 0.02618408203125, -0.052825927734375, 0.0195465087890625, -0.0080718994140625, 0.0787353515625, -0.044677734375, 0.05999755859375, 0.06005859375, -0.070068359375, -0.080810546875, -0.018829345703125, -0.00786590576171875, -0.06964111328125, 0.058380126953125, 0.033050537109375, 0.0021762847900390625, 0.0186920166015625, -0.033660888671875, -0.06207275390625, 0.08050537109375, 0.03485107421875, -0.045501708984375, -0.007122039794921875, 0.016326904296875, 0.0361328125, -0.0300750732421875, 0.052947998046875, 0.0272369384765625, 0.038818359375, 0.0238494873046875, -0.052032470703125, 0.018341064453125, -0.043243408203125, 0.00760650634765625, 0.01166534423828125, -0.07757568359375, 0.0660400390625, -0.0196075439453125, -0.00904083251953125, -0.0038394927978515625, 0.0625, 0.022796630859375, 0.0226287841796875, 0.0294342041015625, 0.041107177734375, 0.048675537109375, -0.01056671142578125, 0.0753173828125, -0.04779052734375, 0.04302978515625, 0.064208984375, -0.00045180320739746094, 0.053436279296875, 0.0227508544921875, 0.00007218122482299805, 0.016265869140625, 0.0704345703125, -0.007587432861328125, 0.0160980224609375, -0.00455474853515625, 0.01055908203125, -0.0120391845703125, -0.004001617431640625, -0.051971435546875, 0.0301513671875, 0.013214111328125, -0.029541015625, 
-0.01326751708984375, 0.0038776397705078125, 0.0262298583984375, -0.0185089111328125, -0.01450347900390625, 0.042083740234375, 0.00506591796875, -0.062225341796875, 0.0745849609375, 0.01261138916015625, 0.054473876953125, -0.047607421875, 0.00945281982421875, -0.037567138671875, 0.01666259765625, -0.01490020751953125, -0.0423583984375, 0.028533935546875, -0.0015745162963867188, -0.00911712646484375, -0.0011997222900390625, 0.046875, -0.0200347900390625, -0.059326171875, 0.024383544921875, 0.0216522216796875, 0.019500732421875, -0.01666259765625, -0.0650634765625, 0.0265045166015625, -0.0134124755859375, -0.028656005859375, 0.0220489501953125, 0.0168304443359375, -0.0089111328125, 0.052764892578125, 0.061614990234375, -0.004611968994140625, 0.0156097412109375, 0.00005316734313964844, 0.059326171875, -0.05987548828125, -0.036834716796875, -0.047943115234375, 0.0278472900390625, -0.006900787353515625, -0.033782958984375, 0.050811767578125, 0.05419921875, 0.06591796875, -0.00403594970703125, 0.051910400390625, -0.0071868896484375, 0.0181121826171875, -0.038909912109375, 0.0572509765625, -0.043609619140625, 0.00029850006103515625, -0.0204620361328125, -0.05401611328125, -0.00893402099609375, 0.044464111328125, -0.01325225830078125, 0.01537322998046875, 0.06195068359375, 0.0736083984375, -0.0095367431640625, 0.0181732177734375, 0.01183319091796875, 0.02996826171875, 0.035430908203125, 0.053131103515625, 0.049072265625, -0.05438232421875, 0.051422119140625, -0.0213470458984375, -0.011444091796875, -0.0217437744140625, -0.058685302734375, -0.08343505859375, -0.050201416015625, -0.01629638671875, -0.034210205078125, 0.0196990966796875, 0.0684814453125, 0.050567626953125, -0.050384521484375, -0.021209716796875, -0.0183258056640625, -0.0028896331787109375, -0.0219573974609375, -0.01528167724609375, 0.045074462890625, -0.0372314453125, -0.055084228515625, 0.0123443603515625, 0.0018987655639648438, 0.0033626556396484375, -0.0097503662109375, -0.0201873779296875, 
-0.030517578125, -0.0007090568542480469, 0.042938232421875, 0.024871826171875, -0.06494140625, -0.0247955322265625, 0.0218353271484375, -0.014068603515625, 0.0005354881286621094, 0.0235595703125, -0.0396728515625, 0.0228271484375, 0.0305023193359375, 0.0560302734375, 0.062225341796875, -0.00325775146484375, 0.014984130859375, -0.0101470947265625, 0.025421142578125, -0.012237548828125, 0.0384521484375, 0.0113677978515625, -0.024139404296875, 0.050750732421875, 0.0404052734375, -0.035430908203125, -0.058013916015625, -0.0198516845703125, -0.10113525390625, -0.0133056640625, 0.1024169921875, -0.0191497802734375, -0.03857421875, 0.010589599609375, -0.0303802490234375, 0.043609619140625, -0.0465087890625, 0.046051025390625, 0.0408935546875, 0.005565643310546875, -0.0021457672119140625, -0.025146484375, 0.025909423828125, 0.005664825439453125, -0.07232666015625, -0.020965576171875, 0.03021240234375, 0.0243682861328125, -0.001979827880859375, 0.037841796875, 0.006214141845703125, 0.01316070556640625, 0.0184326171875, -0.004650115966796875, -0.05084228515625, -0.0174560546875, -0.006473541259765625, 0.022674560546875, -0.0195465087890625, -0.022308349609375 ] ]
Davlan/bert-base-multilingual-cased-ner-hrl
2023-08-14T19:34:17.000Z
[ "transformers", "pytorch", "tf", "bert", "token-classification", "license:afl-3.0", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
token-classification
Davlan
null
null
Davlan/bert-base-multilingual-cased-ner-hrl
50
141,809
transformers
2022-03-02T23:29:04
---
license: afl-3.0
language:
- ar
- de
- en
- es
- fr
- it
- lv
- nl
- pt
- zh
- multilingual
---

# bert-base-multilingual-cased-ner-hrl

## Model description

**bert-base-multilingual-cased-ner-hrl** is a **Named Entity Recognition** model for 10 high-resourced languages (Arabic, German, English, Spanish, French, Italian, Latvian, Dutch, Portuguese and Chinese) based on a fine-tuned mBERT base model. It has been trained to recognize three types of entities: location (LOC), organization (ORG), and person (PER). Specifically, this model is a *bert-base-multilingual-cased* model that was fine-tuned on an aggregation of datasets from 10 high-resourced languages.

## Intended uses & limitations

#### How to use

You can use this model with the Transformers *pipeline* for NER.

```python
from transformers import AutoTokenizer, AutoModelForTokenClassification
from transformers import pipeline

tokenizer = AutoTokenizer.from_pretrained("Davlan/bert-base-multilingual-cased-ner-hrl")
model = AutoModelForTokenClassification.from_pretrained("Davlan/bert-base-multilingual-cased-ner-hrl")
nlp = pipeline("ner", model=model, tokenizer=tokenizer)
example = "Nader Jokhadar had given Syria the lead with a well-struck header in the seventh minute."
ner_results = nlp(example)
print(ner_results)
```

#### Limitations and bias

This model is limited by its training dataset of entity-annotated news articles from a specific span of time. This may not generalize well for all use cases in different domains.
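The pipeline above returns one prediction per word piece, with B-/I- prefixed tags. As a minimal, dependency-free sketch of how such tagged tokens can be merged into whole entity spans (the transformers pipeline also offers this via its aggregation options; the helper below is purely illustrative):

```python
def group_entities(tokens, tags):
    """Merge BIO-style token tags (e.g. B-PER, I-PER, O) into entity spans."""
    entities = []
    current_words, current_type = [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-") or (tag.startswith("I-") and tag[2:] != current_type):
            # A B- tag (or an unexpected type change) starts a new entity.
            if current_words:
                entities.append((" ".join(current_words), current_type))
            current_words, current_type = [token], tag[2:]
        elif tag.startswith("I-"):
            # Continuation of the current entity.
            current_words.append(token)
        else:
            # An O tag closes any open entity.
            if current_words:
                entities.append((" ".join(current_words), current_type))
            current_words, current_type = [], None
    if current_words:
        entities.append((" ".join(current_words), current_type))
    return entities

# Tags shaped like the model's output for the example sentence above.
tokens = ["Nader", "Jokhadar", "had", "given", "Syria", "the", "lead"]
tags = ["B-PER", "I-PER", "O", "O", "B-LOC", "O", "O"]
print(group_entities(tokens, tags))
# → [('Nader Jokhadar', 'PER'), ('Syria', 'LOC')]
```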
## Training data

The training data for the 10 languages are from:

Language|Dataset
-|-
Arabic | [ANERcorp](https://camel.abudhabi.nyu.edu/anercorp/)
German | [conll 2003](https://www.clips.uantwerpen.be/conll2003/ner/)
English | [conll 2003](https://www.clips.uantwerpen.be/conll2003/ner/)
Spanish | [conll 2002](https://www.clips.uantwerpen.be/conll2002/ner/)
French | [Europeana Newspapers](https://github.com/EuropeanaNewspapers/ner-corpora/tree/master/enp_FR.bnf.bio)
Italian | [Italian I-CAB](https://ontotext.fbk.eu/icab.html)
Latvian | [Latvian NER](https://github.com/LUMII-AILab/FullStack/tree/master/NamedEntities)
Dutch | [conll 2002](https://www.clips.uantwerpen.be/conll2002/ner/)
Portuguese | [Paramopama + Second Harem](https://github.com/davidsbatista/NER-datasets/tree/master/Portuguese)
Chinese | [MSRA](https://huggingface.co/datasets/msra_ner)

The training dataset distinguishes between the beginning and continuation of an entity so that if there are back-to-back entities of the same type, the model can output where the second entity begins. As in the dataset, each token will be classified as one of the following classes:

Abbreviation|Description
-|-
O|Outside of a named entity
B-PER|Beginning of a person’s name right after another person’s name
I-PER|Person’s name
B-ORG|Beginning of an organisation right after another organisation
I-ORG|Organisation
B-LOC|Beginning of a location right after another location
I-LOC|Location

## Training procedure

This model was trained on an NVIDIA V100 GPU with the recommended hyperparameters from the HuggingFace code.
3,097
[ [ -0.0426025390625, -0.0469970703125, 0.007732391357421875, 0.0295257568359375, -0.0128173828125, 0.006931304931640625, -0.0231781005859375, -0.04339599609375, 0.0428466796875, 0.02459716796875, -0.036041259765625, -0.051971435546875, -0.060211181640625, 0.0266876220703125, -0.0158538818359375, 0.09429931640625, -0.01556396484375, 0.02178955078125, 0.007843017578125, -0.04248046875, -0.005390167236328125, -0.056427001953125, -0.06573486328125, -0.019378662109375, 0.030548095703125, 0.0141448974609375, 0.03466796875, 0.04071044921875, 0.020294189453125, 0.025421142578125, -0.0196685791015625, -0.005481719970703125, -0.0141754150390625, -0.017425537109375, -0.01146697998046875, -0.00951385498046875, -0.0234832763671875, -0.009552001953125, 0.05828857421875, 0.044647216796875, -0.01155853271484375, 0.0189971923828125, -0.004970550537109375, 0.054840087890625, -0.01410675048828125, 0.018524169921875, -0.029876708984375, -0.00391387939453125, -0.0195159912109375, 0.0212554931640625, -0.00913238525390625, -0.0176544189453125, 0.01983642578125, -0.0218353271484375, 0.013031005859375, -0.004627227783203125, 0.1097412109375, -0.0016660690307617188, -0.039764404296875, -0.019073486328125, -0.0230560302734375, 0.0433349609375, -0.039947509765625, 0.05889892578125, 0.0206756591796875, 0.012664794921875, 0.002460479736328125, -0.03448486328125, -0.05633544921875, 0.00004172325134277344, -0.011077880859375, 0.0011234283447265625, -0.0162811279296875, -0.017822265625, 0.02215576171875, 0.026397705078125, -0.04278564453125, -0.007053375244140625, -0.0290679931640625, -0.02325439453125, 0.041046142578125, -0.0207061767578125, 0.036346435546875, -0.0306396484375, -0.032958984375, -0.015594482421875, -0.036773681640625, 0.0140838623046875, 0.02935791015625, 0.035308837890625, -0.04052734375, 0.041534423828125, -0.014862060546875, 0.04949951171875, 0.0176849365234375, -0.0123291015625, 0.051055908203125, -0.0135498046875, -0.015716552734375, -0.00007605552673339844, 
0.061920166015625, 0.0249481201171875, 0.011383056640625, -0.0122222900390625, -0.020965576171875, 0.008514404296875, -0.01552581787109375, -0.0615234375, -0.008453369140625, 0.003787994384765625, -0.04144287109375, -0.004665374755859375, -0.0090484619140625, -0.042144775390625, 0.0039215087890625, -0.0278167724609375, 0.0230560302734375, -0.05242919921875, -0.0300750732421875, -0.0018177032470703125, -0.0035572052001953125, 0.03717041015625, 0.00922393798828125, -0.05584716796875, 0.017974853515625, 0.0301513671875, 0.06256103515625, -0.01042938232421875, -0.031982421875, -0.036163330078125, -0.004650115966796875, -0.002933502197265625, 0.05047607421875, -0.029876708984375, -0.01053619384765625, 0.0016679763793945312, 0.0294036865234375, -0.01149749755859375, -0.0330810546875, 0.03265380859375, -0.038116455078125, 0.03216552734375, -0.0305328369140625, -0.040191650390625, -0.02496337890625, 0.028900146484375, -0.05181884765625, 0.092041015625, 0.0345458984375, -0.06671142578125, 0.0419921875, -0.034576416015625, -0.040313720703125, 0.0025882720947265625, -0.0251312255859375, -0.040924072265625, -0.000457763671875, 0.0231781005859375, 0.032562255859375, 0.004856109619140625, 0.0285186767578125, 0.004199981689453125, 0.0093536376953125, -0.01258087158203125, 0.00323486328125, 0.0845947265625, -0.0018224716186523438, -0.027099609375, 0.004138946533203125, -0.06781005859375, -0.017669677734375, 0.0161285400390625, -0.03582763671875, -0.0252685546875, -0.0240936279296875, 0.03460693359375, 0.04034423828125, 0.0139312744140625, -0.052581787109375, 0.017486572265625, -0.029571533203125, 0.0199432373046875, 0.037872314453125, -0.0010547637939453125, 0.0294036865234375, -0.01507568359375, 0.0357666015625, 0.024444580078125, -0.011688232421875, -0.0031909942626953125, -0.043182373046875, -0.07098388671875, -0.01230621337890625, 0.038421630859375, 0.04931640625, -0.0760498046875, 0.026763916015625, -0.0212554931640625, -0.034088134765625, -0.03857421875, 
0.004276275634765625, 0.044525146484375, 0.0482177734375, 0.03118896484375, -0.042755126953125, -0.05810546875, -0.07135009765625, -0.0162200927734375, -0.0167083740234375, 0.0173187255859375, 0.026519775390625, 0.050445556640625, -0.0220794677734375, 0.051788330078125, 0.0007071495056152344, -0.031005859375, -0.019439697265625, 0.0014190673828125, 0.03729248046875, 0.040985107421875, 0.052825927734375, -0.06561279296875, -0.048675537109375, -0.0008306503295898438, -0.053131103515625, 0.016845703125, -0.0027828216552734375, -0.0244903564453125, 0.0478515625, 0.0298309326171875, -0.040924072265625, 0.03076171875, 0.048065185546875, -0.033294677734375, 0.0294036865234375, -0.0022983551025390625, -0.00620269775390625, -0.09429931640625, 0.00983428955078125, 0.01318359375, -0.00518035888671875, -0.0445556640625, 0.000009417533874511719, -0.0038852691650390625, -0.001708984375, -0.0443115234375, 0.0693359375, -0.055450439453125, -0.004367828369140625, -0.0010442733764648438, -0.0016803741455078125, -0.01287841796875, 0.040496826171875, 0.0445556640625, 0.043182373046875, 0.04473876953125, -0.0499267578125, 0.01406097412109375, 0.049530029296875, -0.0268707275390625, 0.058807373046875, -0.035858154296875, -0.003986358642578125, -0.019317626953125, 0.0182952880859375, -0.0396728515625, -0.0224761962890625, 0.0198516845703125, -0.045745849609375, 0.03326416015625, -0.0294189453125, -0.03863525390625, -0.0208892822265625, 0.00421142578125, 0.0237884521484375, 0.022674560546875, -0.044891357421875, 0.06085205078125, 0.03045654296875, -0.0062408447265625, -0.05389404296875, -0.060821533203125, 0.01904296875, -0.0204925537109375, -0.04364013671875, 0.02777099609375, 0.0036716461181640625, 0.006084442138671875, 0.00611114501953125, 0.0025234222412109375, -0.0155029296875, -0.007537841796875, 0.009521484375, 0.0272216796875, -0.0229034423828125, 0.005001068115234375, -0.01024627685546875, 0.0002741813659667969, -0.00923919677734375, -0.0160675048828125, 0.048431396484375, 
-0.02874755859375, -0.0035305023193359375, -0.0305633544921875, 0.0350341796875, 0.0225677490234375, -0.0240936279296875, 0.08294677734375, 0.06695556640625, -0.048004150390625, 0.0162353515625, -0.049713134765625, -0.004146575927734375, -0.030029296875, 0.0177459716796875, -0.043792724609375, -0.0650634765625, 0.0555419921875, 0.0129852294921875, 0.005245208740234375, 0.05157470703125, 0.05389404296875, 0.0216217041015625, 0.0552978515625, 0.06787109375, -0.033599853515625, 0.041229248046875, -0.0240325927734375, 0.00962066650390625, -0.0579833984375, -0.02996826171875, -0.03228759765625, -0.020050048828125, -0.0670166015625, -0.01377105712890625, 0.00989532470703125, 0.005828857421875, -0.0250091552734375, 0.05816650390625, -0.04449462890625, 0.0198822021484375, 0.041046142578125, -0.0052490234375, 0.007808685302734375, 0.0163726806640625, -0.0239410400390625, -0.0018892288208007812, -0.039886474609375, -0.04193115234375, 0.049072265625, 0.03582763671875, 0.028167724609375, 0.01519775390625, 0.0740966796875, -0.01346588134765625, 0.032745361328125, -0.04840087890625, 0.025634765625, -0.009796142578125, -0.056060791015625, -0.0108489990234375, -0.0338134765625, -0.06939697265625, 0.00484466552734375, -0.00885009765625, -0.061279296875, 0.03143310546875, -0.0106964111328125, -0.0203704833984375, 0.0292510986328125, -0.035919189453125, 0.06182861328125, -0.0218505859375, -0.0052032470703125, 0.0212249755859375, -0.06341552734375, 0.01395416259765625, 0.00577545166015625, 0.01690673828125, -0.015167236328125, -0.0010747909545898438, 0.06451416015625, -0.0178070068359375, 0.05255126953125, -0.01904296875, -0.00673675537109375, 0.00701141357421875, -0.0074005126953125, 0.023101806640625, 0.002117156982421875, -0.01092529296875, 0.047760009765625, -0.0006389617919921875, -0.0357666015625, -0.0159454345703125, 0.044647216796875, -0.0635986328125, -0.0168609619140625, -0.04608154296875, -0.0244598388671875, -0.00550079345703125, 0.033050537109375, 0.0401611328125, 
0.02044677734375, -0.012542724609375, 0.00232696533203125, 0.032501220703125, -0.0298309326171875, 0.035980224609375, 0.04656982421875, -0.023681640625, -0.04339599609375, 0.0643310546875, 0.00826263427734375, -0.001918792724609375, 0.02154541015625, -0.0022754669189453125, -0.01904296875, -0.0229949951171875, -0.06024169921875, 0.041259765625, -0.035980224609375, -0.021331787109375, -0.0692138671875, -0.03076171875, -0.040985107421875, -0.0004069805145263672, -0.021240234375, -0.0244598388671875, -0.0225677490234375, -0.007366180419921875, 0.0278472900390625, 0.04071044921875, -0.00946807861328125, 0.028167724609375, -0.055023193359375, 0.02655029296875, -0.0007958412170410156, 0.03497314453125, -0.0135650634765625, -0.045745849609375, -0.023162841796875, -0.002040863037109375, -0.00888824462890625, -0.07696533203125, 0.051483154296875, 0.0306243896484375, 0.04022216796875, 0.03240966796875, -0.016876220703125, 0.0465087890625, -0.048004150390625, 0.0478515625, 0.0162811279296875, -0.06597900390625, 0.042236328125, -0.0227508544921875, 0.007144927978515625, 0.046295166015625, 0.056243896484375, -0.0631103515625, -0.0118560791015625, -0.064453125, -0.06842041015625, 0.053314208984375, 0.0165557861328125, 0.0212249755859375, -0.0270843505859375, 0.027923583984375, 0.001068115234375, 0.020721435546875, -0.0709228515625, -0.03802490234375, -0.006450653076171875, -0.01776123046875, 0.00278472900390625, -0.01454925537109375, -0.002532958984375, -0.0280609130859375, 0.08087158203125, -0.0032806396484375, 0.0291748046875, 0.026214599609375, -0.0171356201171875, -0.0043487548828125, 0.0016870498657226562, 0.0304412841796875, 0.0330810546875, -0.00888824462890625, -0.0065155029296875, 0.0220489501953125, -0.03143310546875, 0.0007090568542480469, 0.0232696533203125, -0.0157623291015625, 0.03094482421875, 0.0228271484375, 0.073974609375, 0.001491546630859375, -0.033172607421875, 0.0531005859375, -0.01045989990234375, -0.0097198486328125, -0.0382080078125, -0.0258331298828125, 
0.006877899169921875, 0.023284912109375, 0.0288848876953125, -0.01366424560546875, 0.0022106170654296875, -0.04364013671875, 0.03460693359375, 0.033782958984375, -0.0271148681640625, -0.0287933349609375, 0.036468505859375, 0.005611419677734375, -0.02032470703125, 0.0562744140625, -0.028778076171875, -0.0640869140625, 0.04754638671875, 0.034698486328125, 0.06231689453125, -0.03155517578125, 0.0101776123046875, 0.058990478515625, 0.0268402099609375, 0.007129669189453125, 0.035736083984375, 0.005016326904296875, -0.06982421875, -0.027252197265625, -0.070068359375, -0.01325225830078125, 0.01483917236328125, -0.070556640625, 0.037078857421875, -0.034454345703125, -0.0191497802734375, 0.0113677978515625, 0.012054443359375, -0.07684326171875, 0.0198822021484375, 0.030181884765625, 0.079833984375, -0.078369140625, 0.06170654296875, 0.07647705078125, -0.049957275390625, -0.0635986328125, -0.0146636962890625, 0.004581451416015625, -0.07281494140625, 0.062103271484375, 0.027557373046875, 0.0218048095703125, -0.00679779052734375, -0.03265380859375, -0.0751953125, 0.055816650390625, 0.0195770263671875, -0.036529541015625, -0.017303466796875, -0.0103912353515625, 0.03564453125, -0.039794921875, 0.022796630859375, 0.0455322265625, 0.0301513671875, 0.00493621826171875, -0.07989501953125, 0.0004944801330566406, -0.035797119140625, 0.003772735595703125, 0.01885986328125, -0.0635986328125, 0.058929443359375, -0.01461029052734375, -0.02618408203125, -0.00011992454528808594, 0.0653076171875, 0.0176849365234375, 0.0224456787109375, 0.041839599609375, 0.06268310546875, 0.0426025390625, -0.0080718994140625, 0.0653076171875, -0.04290771484375, 0.030548095703125, 0.07806396484375, -0.0025959014892578125, 0.064208984375, 0.035919189453125, -0.01189422607421875, 0.06561279296875, 0.05755615234375, -0.0162200927734375, 0.0255889892578125, 0.0003974437713623047, -0.0155029296875, 0.005397796630859375, -0.0213470458984375, -0.0278167724609375, 0.045257568359375, 0.0174560546875, 
-0.033050537109375, -0.0182647705078125, 0.008544921875, 0.03515625, -0.01294708251953125, -0.0017995834350585938, 0.06573486328125, 0.005397796630859375, -0.042144775390625, 0.04364013671875, 0.009735107421875, 0.05511474609375, -0.029296875, -0.002193450927734375, -0.0177764892578125, -0.00391387939453125, -0.01959228515625, -0.042388916015625, 0.0249481201171875, 0.002197265625, -0.017303466796875, -0.01398468017578125, 0.033233642578125, -0.05242919921875, -0.0521240234375, 0.03143310546875, 0.04827880859375, 0.03466796875, 0.004558563232421875, -0.07110595703125, 0.0085906982421875, 0.0031795501708984375, -0.027740478515625, 0.02349853515625, 0.037506103515625, -0.006175994873046875, 0.03204345703125, 0.04833984375, 0.01885986328125, 0.00008445978164672852, 0.0126800537109375, 0.07025146484375, -0.058135986328125, -0.035736083984375, -0.055450439453125, 0.02813720703125, -0.0099334716796875, -0.035400390625, 0.058441162109375, 0.055694580078125, 0.08746337890625, 0.005649566650390625, 0.0482177734375, -0.022979736328125, 0.048675537109375, -0.024658203125, 0.054840087890625, -0.03887939453125, -0.01493072509765625, -0.0301971435546875, -0.0762939453125, -0.0194549560546875, 0.061676025390625, -0.00930023193359375, 0.01369476318359375, 0.03460693359375, 0.037322998046875, -0.00301361083984375, -0.0230560302734375, 0.001384735107421875, 0.0186920166015625, 0.0070648193359375, 0.045166015625, 0.035369873046875, -0.04400634765625, 0.0203094482421875, -0.03851318359375, -0.0152587890625, -0.003467559814453125, -0.07244873046875, -0.07257080078125, -0.053436279296875, -0.041229248046875, -0.04901123046875, -0.006603240966796875, 0.0716552734375, 0.05462646484375, -0.07830810546875, -0.0171661376953125, 0.007122039794921875, 0.0004134178161621094, -0.00452423095703125, -0.016845703125, 0.039581298828125, -0.01177978515625, -0.0599365234375, 0.006786346435546875, 0.006847381591796875, 0.01306915283203125, -0.0150604248046875, -0.008697509765625, -0.038909912109375, 
-0.007781982421875, 0.043731689453125, 0.03759765625, -0.057342529296875, -0.011749267578125, -0.0007848739624023438, -0.0169677734375, 0.010772705078125, 0.0328369140625, -0.0599365234375, 0.0234832763671875, 0.0190582275390625, 0.05169677734375, 0.04296875, 0.00016605854034423828, 0.0143280029296875, -0.060882568359375, 0.02032470703125, 0.01126861572265625, 0.0408935546875, 0.049530029296875, -0.032958984375, 0.048553466796875, 0.02410888671875, -0.0308685302734375, -0.05438232421875, -0.000015497207641601562, -0.07330322265625, 0.004985809326171875, 0.08599853515625, -0.01488494873046875, -0.031219482421875, 0.0006914138793945312, -0.00724029541015625, 0.037933349609375, -0.035430908203125, 0.03729248046875, 0.0631103515625, -0.00592041015625, -0.017547607421875, -0.040130615234375, 0.03662109375, 0.023712158203125, -0.048065185546875, -0.024658203125, 0.029144287109375, 0.041229248046875, 0.017608642578125, 0.055633544921875, -0.00728607177734375, 0.004436492919921875, -0.0157623291015625, 0.03179931640625, 0.017608642578125, -0.019287109375, -0.0343017578125, -0.0191650390625, -0.0225677490234375, -0.00771331787109375 ] ]
prithivida/parrot_fluency_model
2022-06-24T09:54:04.000Z
[ "transformers", "pytorch", "tf", "jax", "bert", "text-classification", "license:apache-2.0", "endpoints_compatible", "region:us" ]
text-classification
prithivida
null
null
prithivida/parrot_fluency_model
1
140,273
transformers
2022-05-27T02:04:04
---
license: apache-2.0
---

# Parrot

THIS IS AN ANCILLARY MODEL FOR PARROT PARAPHRASER

## 1. What is Parrot?

Parrot is a paraphrase-based utterance augmentation framework purpose-built to accelerate training NLU models. A paraphrase framework is more than just a paraphrasing model. Please refer to the GitHub page or the model card of prithivida/parrot_paraphraser_on_T5.
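The fluency model's role in the Parrot pipeline can be sketched as a filter over candidate paraphrases: score each candidate for fluency, keep only the ones above a threshold. A minimal illustration in plain Python — `fluency_score` here is a hypothetical heuristic stand-in, not this model; in practice the score would be the probability the BERT checkpoint above assigns to its "fluent" class:

```python
def fluency_score(utterance):
    # Hypothetical stand-in for the fluency classifier, purely for
    # illustration: penalize very short or highly repetitive utterances.
    # The real Parrot pipeline would obtain this score from the BERT model.
    words = utterance.split()
    if len(words) < 3:
        return 0.0
    return min(1.0, len(set(words)) / len(words))

def filter_fluent(candidates, threshold=0.8):
    # Keep only candidate paraphrases the scorer considers fluent enough.
    return [c for c in candidates if fluency_score(c) >= threshold]

candidates = [
    "book a table for two tonight",
    "table table table",  # repetitive, low fluency
    "reserve a dinner table for two this evening",
]
print(filter_fluent(candidates))
```

The threshold and scoring heuristic are assumptions for the sketch; only the filter-by-score structure reflects how an ancillary fluency model plugs into an augmentation framework.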
364
[ [ -0.00707244873046875, -0.07470703125, -0.00003933906555175781, 0.033843994140625, -0.0135498046875, -0.021514892578125, 0.017913818359375, -0.02459716796875, 0.00505828857421875, 0.0526123046875, -0.041778564453125, 0.01904296875, -0.0021228790283203125, 0.018890380859375, -0.04998779296875, 0.05426025390625, 0.0537109375, 0.007411956787109375, -0.02484130859375, -0.00797271728515625, -0.0161590576171875, -0.054534912109375, -0.05255126953125, -0.0286712646484375, 0.02880859375, 0.03216552734375, 0.08221435546875, 0.047607421875, 0.039337158203125, 0.027374267578125, -0.019134521484375, -0.0312042236328125, -0.01348876953125, 0.00228118896484375, -0.0218048095703125, -0.052001953125, -0.040771484375, -0.00554656982421875, 0.031585693359375, 0.051605224609375, -0.0192413330078125, 0.0079345703125, 0.00870513916015625, 0.0291595458984375, -0.048065185546875, 0.01218414306640625, -0.041748046875, 0.0189208984375, -0.0256500244140625, -0.030120849609375, -0.0130157470703125, -0.0170135498046875, 0.0198516845703125, -0.05865478515625, 0.005706787109375, -0.01000213623046875, 0.03985595703125, 0.01549530029296875, -0.0283203125, -0.03167724609375, -0.034698486328125, 0.058624267578125, -0.0782470703125, -0.0003402233123779297, 0.024627685546875, 0.041168212890625, -0.0003323554992675781, -0.0845947265625, -0.061248779296875, 0.0005598068237304688, -0.01003265380859375, 0.004222869873046875, 0.0035228729248046875, 0.025054931640625, 0.00701141357421875, 0.050750732421875, -0.01971435546875, -0.040771484375, -0.03326416015625, 0.002193450927734375, 0.0307464599609375, 0.00812530517578125, 0.045074462890625, -0.0168609619140625, -0.0335693359375, 0.0027217864990234375, -0.0261077880859375, 0.007740020751953125, 0.009521484375, 0.02203369140625, -0.0013074874877929688, 0.039581298828125, -0.006443023681640625, 0.07501220703125, 0.0029773712158203125, 0.0018062591552734375, 0.01483154296875, -0.0089569091796875, -0.031829833984375, 0.005771636962890625, 0.038360595703125, 
0.007526397705078125, 0.0165252685546875, -0.02081298828125, 0.0016565322875976562, -0.009002685546875, 0.0404052734375, -0.039825439453125, -0.030181884765625, 0.00879669189453125, -0.057708740234375, -0.059478759765625, 0.0009660720825195312, -0.04144287109375, -0.0089569091796875, -0.00506591796875, 0.025177001953125, -0.0440673828125, -0.026885986328125, -0.003551483154296875, -0.0496826171875, 0.037445068359375, 0.010223388671875, -0.054107666015625, 0.02264404296875, 0.05242919921875, 0.06072998046875, 0.0276947021484375, -0.059661865234375, -0.0311431884765625, 0.0159759521484375, -0.030120849609375, 0.04486083984375, -0.07818603515625, -0.0517578125, -0.00240325927734375, 0.022064208984375, -0.024993896484375, -0.0128173828125, 0.05938720703125, -0.02630615234375, 0.039703369140625, -0.033905029296875, -0.06878662109375, -0.05792236328125, 0.01464080810546875, -0.04669189453125, 0.0472412109375, 0.0129852294921875, -0.06561279296875, 0.0064697265625, -0.0726318359375, -0.03338623046875, 0.03533935546875, -0.00203704833984375, -0.0015621185302734375, 0.00994873046875, 0.0176239013671875, 0.0180206298828125, -0.0276641845703125, 0.00745391845703125, -0.038177490234375, -0.042724609375, 0.03021240234375, -0.033843994140625, 0.0770263671875, 0.0325927734375, 0.0006194114685058594, 0.01715087890625, -0.0667724609375, -0.003528594970703125, -0.01097869873046875, -0.0306854248046875, -0.01511383056640625, 0.01430511474609375, 0.03363037109375, 0.025970458984375, 0.0177459716796875, -0.041107177734375, 0.0009541511535644531, -0.025970458984375, 0.08154296875, 0.00981903076171875, 0.0270233154296875, 0.0211181640625, -0.06964111328125, 0.051971435546875, -0.01238250732421875, 0.023681640625, -0.015289306640625, -0.036102294921875, -0.034942626953125, -0.0178375244140625, 0.01172637939453125, 0.03997802734375, -0.046173095703125, 0.0243682861328125, -0.004558563232421875, -0.0276641845703125, -0.0298919677734375, 0.00940704345703125, 0.0223388671875, 0.0577392578125, 
0.031097412109375, 0.003326416015625, -0.036346435546875, -0.062286376953125, -0.01556396484375, -0.02886962890625, -0.0013761520385742188, -0.00698089599609375, 0.01486968994140625, 0.006000518798828125, 0.062042236328125, 0.0304107666015625, -0.0245513916015625, -0.0340576171875, -0.005817413330078125, 0.004116058349609375, 0.045562744140625, 0.07147216796875, -0.07086181640625, -0.016815185546875, -0.00812530517578125, -0.0799560546875, 0.00817108154296875, -0.0279388427734375, -0.0015392303466796875, -0.040802001953125, 0.049346923828125, -0.034423828125, 0.021514892578125, 0.017669677734375, -0.01313018798828125, 0.03363037109375, -0.0107269287109375, 0.006927490234375, -0.07470703125, 0.01268768310546875, -0.037017822265625, -0.01345062255859375, -0.045654296875, 0.046295166015625, 0.006465911865234375, -0.018096923828125, -0.0472412109375, 0.0271759033203125, -0.0193328857421875, -0.003856658935546875, -0.04302978515625, -0.006999969482421875, 0.0216217041015625, 0.02557373046875, 0.0165252685546875, 0.08172607421875, 0.07257080078125, -0.05096435546875, 0.0318603515625, 0.059539794921875, -0.005527496337890625, 0.045440673828125, -0.062408447265625, 0.003681182861328125, 0.01641845703125, 0.01380157470703125, -0.050140380859375, -0.0295257568359375, -0.0005717277526855469, -0.019378662109375, -0.01428985595703125, -0.00231170654296875, -0.01348876953125, -0.040191650390625, -0.0175933837890625, 0.03485107421875, 0.0193939208984375, -0.055633544921875, 0.0401611328125, 0.00319671630859375, -0.026153564453125, -0.0201873779296875, -0.08056640625, 0.035858154296875, -0.031951904296875, -0.03314208984375, 0.007167816162109375, 0.00402069091796875, -0.00508880615234375, 0.00753021240234375, 0.0257720947265625, -0.0175628662109375, 0.01308441162109375, 0.00847625732421875, -0.00989532470703125, -0.036590576171875, 0.0157623291015625, -0.00537109375, 0.0206298828125, 0.00826263427734375, -0.0161285400390625, 0.0238800048828125, -0.0270843505859375, 
-0.00902557373046875, -0.053009033203125, 0.040985107421875, 0.056976318359375, -0.0204315185546875, 0.038421630859375, 0.009124755859375, -0.0032100677490234375, 0.009765625, -0.0277099609375, 0.0023097991943359375, -0.038330078125, 0.015869140625, -0.058319091796875, -0.0110321044921875, 0.041778564453125, 0.0135650634765625, -0.006587982177734375, 0.024078369140625, 0.0154876708984375, -0.001628875732421875, 0.04315185546875, 0.0364990234375, -0.0157318115234375, 0.045989990234375, -0.006649017333984375, -0.02880859375, -0.0648193359375, 0.002223968505859375, -0.01611328125, -0.0160064697265625, -0.0095672607421875, -0.0244598388671875, -0.005619049072265625, 0.044403076171875, -0.0203094482421875, 0.0219573974609375, -0.0302734375, 0.0284576416015625, 0.06585693359375, 0.007373809814453125, 0.01021575927734375, -0.0093841552734375, -0.001140594482421875, 0.026641845703125, -0.0679931640625, -0.049591064453125, 0.0797119140625, 0.00843048095703125, 0.03729248046875, 0.01153564453125, 0.0279388427734375, 0.0284576416015625, -0.0199127197265625, -0.037353515625, 0.042755126953125, -0.00246429443359375, -0.052703857421875, -0.0261077880859375, 0.00992584228515625, -0.043121337890625, 0.023040771484375, -0.0129852294921875, -0.04547119140625, 0.040802001953125, 0.010406494140625, -0.01499176025390625, 0.0242919921875, -0.018402099609375, 0.06170654296875, -0.00598907470703125, -0.00800323486328125, -0.01166534423828125, -0.0350341796875, 0.0276641845703125, 0.0037021636962890625, 0.005008697509765625, 0.016021728515625, 0.00766754150390625, 0.05572509765625, -0.0859375, 0.044647216796875, -0.01080322265625, 0.0220489501953125, 0.04193115234375, 0.00344085693359375, 0.02789306640625, -0.005130767822265625, -0.056671142578125, 0.006168365478515625, 0.02679443359375, -0.06591796875, -0.041168212890625, 0.0290679931640625, -0.04345703125, -0.032470703125, -0.0123138427734375, -0.04144287109375, -0.01108551025390625, 0.0028533935546875, 0.033935546875, 0.038116455078125, 
-0.0229949951171875, 0.0556640625, -0.0173187255859375, -0.016204833984375, 0.048828125, 0.0098114013671875, -0.00798797607421875, -0.0253753662109375, 0.0587158203125, -0.0304718017578125, 0.018768310546875, 0.037811279296875, 0.0323486328125, -0.0256500244140625, -0.035675048828125, -0.0225372314453125, -0.004779815673828125, -0.05804443359375, 0.006134033203125, -0.044219970703125, -0.03546142578125, -0.0343017578125, -0.02911376953125, 0.00677490234375, -0.05621337890625, -0.0181884765625, -0.01187896728515625, 0.044708251953125, 0.05462646484375, -0.006282806396484375, 0.059539794921875, -0.043212890625, 0.0180511474609375, 0.0457763671875, -0.022918701171875, 0.0127105712890625, -0.031494140625, 0.0145111083984375, 0.021514892578125, -0.03326416015625, -0.067626953125, 0.033477783203125, 0.07415771484375, 0.050384521484375, 0.045166015625, 0.006931304931640625, 0.06793212890625, -0.021697998046875, 0.08837890625, 0.012542724609375, -0.083251953125, 0.049407958984375, -0.03173828125, 0.0170440673828125, 0.040740966796875, -0.01308441162109375, -0.0288543701171875, -0.054046630859375, -0.03936767578125, -0.05010986328125, 0.043914794921875, 0.030181884765625, 0.0175933837890625, -0.01433563232421875, 0.03643798828125, 0.0071258544921875, 0.03448486328125, -0.056976318359375, -0.007415771484375, -0.06536865234375, -0.009033203125, -0.0155487060546875, -0.0302734375, 0.001789093017578125, -0.0173187255859375, 0.0458984375, -0.006565093994140625, 0.0091705322265625, 0.01629638671875, -0.0277252197265625, -0.0071563720703125, 0.0197601318359375, 0.02899169921875, 0.0296478271484375, -0.01934814453125, -0.013763427734375, 0.007904052734375, -0.0267333984375, -0.0007166862487792969, -0.009490966796875, -0.019378662109375, 0.0010557174682617188, 0.06951904296875, 0.068603515625, 0.028045654296875, -0.027374267578125, 0.0438232421875, -0.0014400482177734375, -0.035491943359375, -0.0178375244140625, 0.016326904296875, 0.0009713172912597656, 0.0299072265625, 
0.04229736328125, 0.006443023681640625, 0.046051025390625, -0.05218505859375, 0.0281829833984375, 0.019561767578125, -0.0300750732421875, -0.011138916015625, 0.084716796875, 0.033050537109375, -0.03955078125, 0.036346435546875, 0.01401519775390625, -0.035247802734375, 0.04345703125, 0.045654296875, 0.060211181640625, -0.007129669189453125, 0.019927978515625, 0.00202178955078125, -0.0066986083984375, -0.0040740966796875, 0.032073974609375, -0.0401611328125, -0.0254669189453125, -0.00446319580078125, -0.0650634765625, -0.041107177734375, 0.01457977294921875, -0.08721923828125, 0.01910400390625, -0.02972412109375, 0.0038700103759765625, 0.0134429931640625, -0.0186309814453125, -0.0302276611328125, 0.040008544921875, -0.00704193115234375, 0.0792236328125, -0.034912109375, 0.08447265625, 0.062255859375, -0.0498046875, -0.06219482421875, 0.0174407958984375, -0.0050201416015625, -0.052642822265625, 0.050140380859375, 0.0004813671112060547, 0.00646209716796875, 0.00795745849609375, -0.007167816162109375, -0.044219970703125, 0.061126708984375, 0.017791748046875, -0.038360595703125, -0.038177490234375, 0.01462554931640625, 0.03753662109375, -0.0382080078125, 0.041717529296875, 0.014312744140625, 0.01209259033203125, 0.01537322998046875, -0.06182861328125, 0.00023043155670166016, -0.0423583984375, 0.0279998779296875, -0.0482177734375, -0.040008544921875, 0.07696533203125, 0.0163421630859375, 0.0158843994140625, 0.045806884765625, 0.041748046875, 0.0021610260009765625, -0.0081634521484375, 0.029144287109375, 0.007411956787109375, 0.0458984375, 0.0268402099609375, 0.08843994140625, -0.050750732421875, 0.0185394287109375, 0.10284423828125, -0.0035610198974609375, 0.07806396484375, 0.03668212890625, -0.013946533203125, 0.047210693359375, 0.047760009765625, 0.004009246826171875, 0.04559326171875, -0.011505126953125, -0.004730224609375, -0.0304107666015625, 0.031097412109375, -0.05841064453125, 0.04693603515625, 0.0302734375, -0.0241546630859375, -0.02642822265625, 
0.037750244140625, -0.05609130859375, 0.0121307373046875, -0.0161895751953125, 0.06976318359375, 0.0007996559143066406, -0.0233612060546875, 0.05462646484375, -0.0197906494140625, 0.03387451171875, -0.03350830078125, -0.0002105236053466797, -0.01079559326171875, 0.0187530517578125, -0.006031036376953125, -0.02740478515625, 0.03948974609375, -0.0241851806640625, -0.0024967193603515625, 0.0068817138671875, 0.03045654296875, -0.0313720703125, -0.0665283203125, 0.0252227783203125, 0.008636474609375, 0.01556396484375, -0.0228271484375, -0.09149169921875, -0.0180511474609375, -0.016632080078125, -0.0217742919921875, -0.0153350830078125, 0.0394287109375, 0.0014085769653320312, 0.060699462890625, 0.03314208984375, 0.0107269287109375, 0.0237579345703125, 0.003345489501953125, 0.038421630859375, -0.053924560546875, -0.079345703125, -0.057708740234375, 0.0133209228515625, -0.019775390625, -0.0189666748046875, 0.077392578125, 0.049468994140625, 0.0204010009765625, -0.03314208984375, 0.08392333984375, -0.004840850830078125, 0.043304443359375, -0.01934814453125, 0.054168701171875, -0.053375244140625, 0.0207977294921875, -0.02301025390625, -0.0626220703125, 0.0022907257080078125, 0.0711669921875, -0.005279541015625, -0.0225982666015625, 0.034912109375, 0.0462646484375, -0.0178375244140625, 0.043914794921875, 0.044219970703125, 0.0065765380859375, 0.008331298828125, 0.0170745849609375, 0.0714111328125, -0.056884765625, 0.03875732421875, -0.0190887451171875, 0.01348876953125, -0.0221099853515625, -0.047119140625, -0.0743408203125, -0.03582763671875, -0.0362548828125, -0.015106201171875, 0.033233642578125, 0.041717529296875, 0.06817626953125, -0.07574462890625, -0.024993896484375, -0.06915283203125, -0.0104217529296875, -0.041107177734375, -0.02069091796875, 0.0004267692565917969, -0.056365966796875, -0.048828125, 0.045654296875, 0.005290985107421875, 0.0008625984191894531, -0.005054473876953125, -0.01305389404296875, -0.03045654296875, 0.041046142578125, 0.0154571533203125, 
0.0394287109375, -0.07232666015625, -0.0242919921875, -0.0270233154296875, 0.00897979736328125, -0.0003452301025390625, 0.043182373046875, -0.04815673828125, 0.034423828125, 0.0181732177734375, 0.031158447265625, 0.027374267578125, 0.0204315185546875, 0.0804443359375, -0.05889892578125, 0.048248291015625, 0.00710296630859375, 0.0272674560546875, 0.0240325927734375, -0.0212860107421875, 0.024505615234375, 0.007083892822265625, -0.040740966796875, -0.05816650390625, 0.0025539398193359375, -0.06842041015625, -0.049072265625, 0.06988525390625, -0.021697998046875, -0.037567138671875, -0.0168609619140625, -0.02557373046875, 0.0064544677734375, -0.0008697509765625, 0.02606201171875, 0.041900634765625, 0.006191253662109375, -0.01552581787109375, -0.01788330078125, 0.045562744140625, 0.0177764892578125, -0.036865234375, -0.00360107421875, 0.005634307861328125, 0.0316162109375, 0.003597259521484375, 0.03863525390625, -0.0024394989013671875, 0.049346923828125, 0.040130615234375, 0.004150390625, -0.025421142578125, -0.001018524169921875, -0.01702880859375, 0.03472900390625, 0.00452423095703125, -0.0533447265625 ] ]
microsoft/dit-base-finetuned-rvlcdip
2023-02-27T17:57:24.000Z
[ "transformers", "pytorch", "beit", "image-classification", "dit", "vision", "dataset:rvl_cdip", "arxiv:2203.02378", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
image-classification
microsoft
null
null
microsoft/dit-base-finetuned-rvlcdip
20
140,134
transformers
2022-03-07T20:48:42
---
tags:
- dit
- vision
- image-classification
datasets:
- rvl_cdip
widget:
- src: https://huggingface.co/microsoft/dit-base-finetuned-rvlcdip/resolve/main/coca_cola_advertisement.png
  example_title: Advertisement
- src: https://huggingface.co/microsoft/dit-base-finetuned-rvlcdip/resolve/main/scientific_publication.png
  example_title: Scientific publication
---

# Document Image Transformer (base-sized model)

Document Image Transformer (DiT) model pre-trained on IIT-CDIP (Lewis et al., 2006), a dataset that includes 42 million document images, and fine-tuned on [RVL-CDIP](https://www.cs.cmu.edu/~aharley/rvl-cdip/), a dataset consisting of 400,000 grayscale images in 16 classes, with 25,000 images per class. It was introduced in the paper [DiT: Self-supervised Pre-training for Document Image Transformer](https://arxiv.org/abs/2203.02378) by Li et al. and first released in [this repository](https://github.com/microsoft/unilm/tree/master/dit). Note that DiT uses the same architecture as [BEiT](https://huggingface.co/docs/transformers/model_doc/beit).

Disclaimer: The team releasing DiT did not write a model card for this model, so this model card has been written by the Hugging Face team.

## Model description

The Document Image Transformer (DiT) is a transformer encoder model (BERT-like) pre-trained on a large collection of images in a self-supervised fashion. The pre-training objective is to predict visual tokens from the encoder of a discrete VAE (dVAE), based on masked patches.

Images are presented to the model as a sequence of fixed-size patches (resolution 16x16), which are linearly embedded. Absolute position embeddings are also added before feeding the sequence to the layers of the Transformer encoder.

By pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled document images, for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder.

## Intended uses & limitations

You can use the raw model for encoding document images into a vector space, but it's mostly meant to be fine-tuned on tasks like document image classification, table detection or document layout analysis. See the [model hub](https://huggingface.co/models?search=microsoft/dit) to look for fine-tuned versions on a task that interests you.

### How to use

Here is how to use this model in PyTorch:

```python
from transformers import AutoImageProcessor, AutoModelForImageClassification
import torch
from PIL import Image

image = Image.open('path_to_your_document_image').convert('RGB')

processor = AutoImageProcessor.from_pretrained("microsoft/dit-base-finetuned-rvlcdip")
model = AutoModelForImageClassification.from_pretrained("microsoft/dit-base-finetuned-rvlcdip")

inputs = processor(images=image, return_tensors="pt")
outputs = model(**inputs)
logits = outputs.logits

# model predicts one of the 16 RVL-CDIP classes
predicted_class_idx = logits.argmax(-1).item()
print("Predicted class:", model.config.id2label[predicted_class_idx])
```

### BibTeX entry and citation info

```bibtex
@article{Lewis2006BuildingAT,
  title={Building a test collection for complex document information processing},
  author={David D. Lewis and Gady Agam and Shlomo Engelson Argamon and Ophir Frieder and David A. Grossman and Jefferson Heard},
  journal={Proceedings of the 29th annual international ACM SIGIR conference on Research and development in information retrieval},
  year={2006}
}
```
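The card's description of patch tokenization is easy to make concrete. A small sketch in plain Python, assuming a 224x224 input resolution (a common default for BEiT-style encoders — check the model's image processor config for the actual value):

```python
def num_patches(image_size, patch_size=16):
    # The image is cut into non-overlapping patch_size x patch_size patches;
    # each patch is linearly embedded into one token of the Transformer's
    # input sequence.
    assert image_size % patch_size == 0, "image size must be divisible by patch size"
    per_side = image_size // patch_size
    return per_side * per_side

# At 224x224 with 16x16 patches the encoder sees a sequence of patch tokens
# (plus a special classification token, as in BEiT).
print(num_patches(224))
```

This is only meant to show where the sequence length comes from; the actual embedding and position-encoding logic lives inside the `beit` model class in transformers.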
3,583
[ [ -0.0462646484375, -0.03985595703125, 0.0166778564453125, -0.006443023681640625, -0.0143280029296875, -0.00905609130859375, 0.002498626708984375, -0.03033447265625, -0.0182952880859375, 0.0150604248046875, -0.03369140625, -0.0223541259765625, -0.06866455078125, 0.0004096031188964844, -0.047271728515625, 0.079345703125, -0.007312774658203125, -0.00972747802734375, -0.0105743408203125, 0.001922607421875, -0.03125, -0.02874755859375, -0.02593994140625, -0.0233001708984375, 0.0213470458984375, 0.01326751708984375, 0.042755126953125, 0.038055419921875, 0.05810546875, 0.035186767578125, 0.01383209228515625, -0.007068634033203125, -0.032196044921875, -0.0213623046875, 0.00007021427154541016, -0.038177490234375, -0.035308837890625, 0.02496337890625, 0.033294677734375, 0.0152130126953125, 0.0095367431640625, 0.0081634521484375, 0.00878143310546875, 0.046600341796875, -0.033477783203125, 0.01239776611328125, -0.03753662109375, 0.006824493408203125, -0.002933502197265625, -0.007747650146484375, -0.03594970703125, -0.0083465576171875, 0.02001953125, -0.03826904296875, 0.05621337890625, -0.01189422607421875, 0.09375, 0.020721435546875, -0.0168914794921875, -0.0240478515625, -0.051239013671875, 0.04437255859375, -0.03582763671875, 0.031707763671875, 0.01507568359375, 0.0178680419921875, -0.00039267539978027344, -0.08245849609375, -0.061187744140625, -0.013427734375, -0.0252532958984375, 0.0254974365234375, -0.0364990234375, 0.01512908935546875, 0.031463623046875, 0.04498291015625, -0.05328369140625, 0.00029158592224121094, -0.047698974609375, -0.00664520263671875, 0.03564453125, -0.01418304443359375, 0.0233917236328125, -0.01446533203125, -0.0379638671875, -0.038909912109375, -0.00888824462890625, -0.0186614990234375, 0.02392578125, -0.0171661376953125, -0.01306915283203125, 0.034942626953125, 0.01554107666015625, 0.057159423828125, 0.029571533203125, -0.011932373046875, 0.0299530029296875, -0.00699615478515625, -0.025482177734375, -0.012725830078125, 0.06341552734375, 
0.0228729248046875, 0.015838623046875, -0.0012159347534179688, -0.024993896484375, 0.0185394287109375, 0.0236968994140625, -0.08087158203125, -0.039337158203125, 0.00922393798828125, -0.046875, -0.02264404296875, 0.01056671142578125, -0.052978515625, -0.0020313262939453125, -0.033721923828125, 0.0478515625, -0.054107666015625, -0.020233154296875, -0.00848388671875, -0.0197906494140625, 0.0264892578125, 0.036956787109375, -0.049530029296875, 0.0118560791015625, 0.009857177734375, 0.066162109375, -0.00994873046875, -0.040679931640625, -0.0163421630859375, -0.01293182373046875, -0.009185791015625, 0.059967041015625, -0.0194091796875, -0.030914306640625, -0.003765106201171875, 0.01535797119140625, -0.006534576416015625, -0.02545166015625, 0.03082275390625, -0.041351318359375, 0.031982421875, -0.0007977485656738281, -0.03759765625, -0.00498199462890625, 0.0232696533203125, -0.043182373046875, 0.091064453125, 0.03485107421875, -0.07281494140625, 0.0249176025390625, -0.038604736328125, -0.01042938232421875, -0.0035724639892578125, -0.006011962890625, -0.050445556640625, 0.0034847259521484375, 0.0128021240234375, 0.04852294921875, -0.0190277099609375, 0.005779266357421875, -0.01116180419921875, -0.0256805419921875, 0.00386810302734375, -0.032073974609375, 0.049163818359375, 0.034759521484375, -0.02838134765625, -0.005641937255859375, -0.05859375, -0.01049041748046875, 0.03082275390625, -0.0172882080078125, -0.007537841796875, -0.033782958984375, 0.02496337890625, 0.0338134765625, 0.0298004150390625, -0.044769287109375, 0.03338623046875, -0.005748748779296875, 0.023712158203125, 0.05096435546875, -0.028228759765625, 0.0389404296875, -0.0202484130859375, 0.03631591796875, 0.00920867919921875, 0.020416259765625, -0.024200439453125, -0.01494598388671875, -0.05670166015625, -0.037628173828125, 0.0270538330078125, 0.0308990478515625, -0.04302978515625, 0.04034423828125, -0.02386474609375, -0.055084228515625, -0.0264129638671875, -0.00997161865234375, 0.0259552001953125, 
0.04412841796875, 0.0299072265625, -0.029632568359375, -0.0261688232421875, -0.064697265625, 0.0225067138671875, -0.004169464111328125, -0.0013818740844726562, -0.004070281982421875, 0.05126953125, -0.0275726318359375, 0.07037353515625, -0.0435791015625, -0.039093017578125, -0.00922393798828125, 0.03228759765625, 0.0210418701171875, 0.0394287109375, 0.0517578125, -0.07147216796875, -0.058380126953125, -0.0209808349609375, -0.050933837890625, 0.004215240478515625, -0.0195465087890625, -0.006191253662109375, 0.0240478515625, 0.0360107421875, -0.05621337890625, 0.05340576171875, 0.031494140625, -0.0168609619140625, 0.03289794921875, -0.0384521484375, 0.00847625732421875, -0.07879638671875, 0.0004887580871582031, 0.0178985595703125, -0.01145172119140625, -0.033447265625, -0.012115478515625, 0.02508544921875, -0.01678466796875, -0.0167999267578125, 0.0306854248046875, -0.06005859375, -0.009124755859375, -0.029632568359375, -0.00775909423828125, 0.01403045654296875, 0.037628173828125, 0.019927978515625, 0.04998779296875, 0.050262451171875, -0.0262603759765625, 0.0193634033203125, 0.03564453125, -0.0088043212890625, 0.060577392578125, -0.053192138671875, 0.0200958251953125, -0.01806640625, 0.02117919921875, -0.0828857421875, -0.0008578300476074219, -0.0057830810546875, -0.032257080078125, 0.07000732421875, -0.02056884765625, -0.051544189453125, -0.05462646484375, -0.01306915283203125, 0.015625, 0.050872802734375, -0.048095703125, 0.050506591796875, 0.0161590576171875, 0.0209503173828125, -0.045745849609375, -0.051422119140625, -0.006099700927734375, -0.003192901611328125, -0.053985595703125, 0.052734375, -0.01065826416015625, 0.0158843994140625, 0.015869140625, -0.003223419189453125, -0.002040863037109375, -0.011962890625, 0.027130126953125, 0.038360595703125, -0.0019550323486328125, 0.01044464111328125, -0.0255279541015625, -0.0186920166015625, -0.00905609130859375, -0.016448974609375, 0.04962158203125, -0.0289459228515625, -0.03173828125, -0.02392578125, 
0.0168609619140625, 0.04241943359375, -0.0203399658203125, 0.042236328125, 0.0699462890625, -0.023193359375, -0.00978851318359375, -0.049652099609375, -0.009429931640625, -0.038055419921875, 0.041412353515625, -0.03875732421875, -0.0278472900390625, 0.035858154296875, -0.007358551025390625, -0.00035691261291503906, 0.061370849609375, 0.044647216796875, -0.0167999267578125, 0.0548095703125, 0.060302734375, -0.0026226043701171875, 0.058349609375, -0.0631103515625, 0.0006127357482910156, -0.054107666015625, -0.008026123046875, -0.0258941650390625, -0.023956298828125, -0.0499267578125, -0.0293121337890625, 0.0175018310546875, 0.00684356689453125, -0.01453399658203125, 0.044952392578125, -0.089111328125, 0.0257568359375, 0.0576171875, 0.00952911376953125, -0.00074005126953125, 0.002410888671875, 0.00664520263671875, 0.00893402099609375, -0.040069580078125, -0.032684326171875, 0.0849609375, 0.034912109375, 0.0731201171875, -0.0217132568359375, 0.060302734375, 0.021026611328125, 0.02008056640625, -0.0498046875, 0.0242919921875, -0.0207061767578125, -0.04376220703125, -0.0104522705078125, -0.00152587890625, -0.08172607421875, 0.0037746429443359375, -0.01224517822265625, -0.04388427734375, 0.03631591796875, 0.00772857666015625, 0.0012454986572265625, 0.023468017578125, -0.0714111328125, 0.07586669921875, -0.0291900634765625, -0.020233154296875, 0.0034275054931640625, -0.051055908203125, -0.00014221668243408203, -0.007434844970703125, -0.0087890625, 0.009796142578125, 0.0296630859375, 0.06011962890625, -0.042755126953125, 0.06011962890625, -0.01372528076171875, 0.01079559326171875, 0.034271240234375, -0.005641937255859375, 0.040283203125, -0.031402587890625, -0.0020847320556640625, 0.02386474609375, 0.025299072265625, -0.0218505859375, -0.039581298828125, 0.038299560546875, -0.09197998046875, -0.033203125, -0.05706787109375, -0.039581298828125, 0.01392364501953125, 0.02801513671875, 0.055908203125, 0.036895751953125, -0.01189422607421875, 0.0087890625, 0.050872802734375, 
-0.01468658447265625, 0.03778076171875, 0.0244903564453125, -0.001789093017578125, -0.01317596435546875, 0.06170654296875, 0.0306243896484375, 0.0134429931640625, 0.028167724609375, 0.0088043212890625, -0.030670166015625, -0.046173095703125, -0.0177764892578125, 0.0223388671875, -0.07427978515625, -0.0263519287109375, -0.054901123046875, -0.04107666015625, -0.034515380859375, -0.0208587646484375, -0.0257110595703125, -0.006763458251953125, -0.0423583984375, -0.0016336441040039062, 0.03118896484375, 0.05816650390625, -0.0030002593994140625, 0.040863037109375, -0.04632568359375, 0.02490234375, 0.02728271484375, 0.0272064208984375, -0.01068115234375, -0.05731201171875, -0.0038280487060546875, -0.01385498046875, -0.048797607421875, -0.06414794921875, 0.042633056640625, 0.00971221923828125, 0.048797607421875, 0.006916046142578125, 0.0082550048828125, 0.0416259765625, -0.034027099609375, 0.0364990234375, 0.0267486572265625, -0.050628662109375, 0.048187255859375, -0.0180206298828125, 0.016937255859375, 0.02587890625, 0.039703369140625, -0.0159149169921875, -0.0073699951171875, -0.08203125, -0.04962158203125, 0.062347412109375, 0.0154266357421875, 0.01180267333984375, 0.006763458251953125, 0.0237884521484375, 0.0010862350463867188, 0.0014705657958984375, -0.06451416015625, -0.02191162109375, -0.0509033203125, -0.033203125, -0.002758026123046875, -0.0310211181640625, 0.0021076202392578125, -0.028900146484375, 0.0455322265625, -0.005184173583984375, 0.051544189453125, 0.037322998046875, -0.0275726318359375, 0.0006227493286132812, -0.005786895751953125, 0.0144500732421875, 0.030059814453125, -0.04022216796875, 0.0095062255859375, -0.004199981689453125, -0.058319091796875, 0.00763702392578125, 0.0340576171875, 0.0034465789794921875, 0.006927490234375, 0.02557373046875, 0.066162109375, -0.023223876953125, -0.012542724609375, 0.05340576171875, -0.01141357421875, -0.025604248046875, -0.04278564453125, -0.005916595458984375, -0.009552001953125, 0.01119232177734375, 
0.028533935546875, 0.00449371337890625, 0.004917144775390625, -0.037933349609375, 0.0289306640625, 0.0289764404296875, -0.030181884765625, -0.0223846435546875, 0.0467529296875, 0.005237579345703125, -0.0113983154296875, 0.061798095703125, -0.0079193115234375, -0.039093017578125, 0.06298828125, 0.047271728515625, 0.055572509765625, 0.001216888427734375, 0.01490020751953125, 0.04351806640625, 0.0282745361328125, 0.0023555755615234375, -0.004344940185546875, 0.0028095245361328125, -0.055145263671875, 0.00830078125, -0.054107666015625, -0.0033817291259765625, 0.016754150390625, -0.037078857421875, 0.039642333984375, -0.0271759033203125, -0.01861572265625, 0.0190582275390625, 0.0175628662109375, -0.088623046875, 0.039215087890625, 0.0216217041015625, 0.069091796875, -0.057647705078125, 0.08062744140625, 0.04351806640625, -0.0628662109375, -0.06317138671875, -0.00838470458984375, -0.01453399658203125, -0.0623779296875, 0.0712890625, 0.0286712646484375, 0.01085662841796875, 0.012481689453125, -0.0323486328125, -0.059417724609375, 0.09661865234375, 0.0206298828125, -0.045654296875, -0.01025390625, 0.00531768798828125, 0.038604736328125, -0.0218658447265625, 0.056427001953125, 0.021881103515625, 0.024749755859375, 0.0235137939453125, -0.0509033203125, 0.00916290283203125, -0.032318115234375, 0.008026123046875, 0.006984710693359375, -0.04736328125, 0.078369140625, -0.0014629364013671875, -0.01085662841796875, -0.00970458984375, 0.04156494140625, -0.0033931732177734375, 0.00618743896484375, 0.050140380859375, 0.053802490234375, 0.046051025390625, -0.01198577880859375, 0.09808349609375, -0.0167236328125, 0.040191650390625, 0.0762939453125, 0.0089874267578125, 0.0335693359375, 0.0229034423828125, -0.0260467529296875, 0.0304718017578125, 0.059967041015625, -0.044281005859375, 0.045745849609375, 0.00311279296875, 0.006969451904296875, -0.00286102294921875, 0.0139923095703125, -0.035858154296875, 0.034576416015625, 0.01186370849609375, -0.0484619140625, -0.0203857421875, 
0.00739288330078125, -0.01258087158203125, -0.01413726806640625, -0.001110076904296875, 0.049591064453125, -0.006282806396484375, -0.044677734375, 0.05322265625, -0.00632476806640625, 0.061370849609375, -0.0462646484375, -0.005084991455078125, -0.005542755126953125, 0.027496337890625, -0.017364501953125, -0.06915283203125, 0.0213165283203125, -0.0138092041015625, -0.0218505859375, -0.01531219482421875, 0.07196044921875, -0.017608642578125, -0.046173095703125, 0.0256805419921875, 0.01898193359375, 0.01392364501953125, -0.0083465576171875, -0.0718994140625, 0.0012760162353515625, -0.005771636962890625, -0.035125732421875, 0.041656494140625, 0.044158935546875, 0.0078277587890625, 0.0280914306640625, 0.04608154296875, -0.0166473388671875, 0.0109405517578125, -0.01219940185546875, 0.08123779296875, -0.0199737548828125, -0.015960693359375, -0.05340576171875, 0.052978515625, -0.01200103759765625, -0.0282135009765625, 0.048553466796875, 0.03826904296875, 0.073486328125, -0.0221405029296875, 0.062225341796875, -0.0272369384765625, 0.00323486328125, -0.0176544189453125, 0.054473876953125, -0.048583984375, -0.015777587890625, -0.034820556640625, -0.06414794921875, -0.005878448486328125, 0.062042236328125, -0.0364990234375, 0.01056671142578125, 0.047119140625, 0.05914306640625, -0.0255279541015625, -0.02227783203125, 0.02923583984375, 0.0084686279296875, 0.025390625, 0.0144500732421875, 0.04705810546875, -0.06134033203125, 0.04400634765625, -0.049224853515625, -0.022308349609375, -0.0066680908203125, -0.06610107421875, -0.074462890625, -0.0701904296875, -0.041961669921875, -0.0335693359375, -0.0199737548828125, 0.041656494140625, 0.07489013671875, -0.045013427734375, 0.007511138916015625, -0.01861572265625, -0.01393890380859375, 0.0033855438232421875, -0.017242431640625, 0.054901123046875, -0.0213775634765625, -0.06903076171875, -0.0216217041015625, -0.00392913818359375, 0.029327392578125, -0.01184844970703125, 0.0003647804260253906, -0.0257568359375, -0.0119781494140625, 
0.03997802734375, 0.0177459716796875, -0.03167724609375, -0.01483154296875, -0.0021076202392578125, -0.018524169921875, 0.0218505859375, 0.030853271484375, -0.048248291015625, 0.040618896484375, 0.038360595703125, 0.0278778076171875, 0.07513427734375, -0.01357269287109375, 0.018829345703125, -0.047393798828125, 0.03729248046875, 0.00852203369140625, 0.04766845703125, 0.0231475830078125, -0.039642333984375, 0.043060302734375, 0.025054931640625, -0.04833984375, -0.05560302734375, -0.0019083023071289062, -0.09088134765625, -0.0276641845703125, 0.0667724609375, -0.0229034423828125, -0.0264892578125, 0.02105712890625, -0.031158447265625, 0.040435791015625, -0.015380859375, 0.0543212890625, 0.036041259765625, 0.0064697265625, -0.03472900390625, -0.036346435546875, 0.0236053466796875, -0.0004799365997314453, -0.05316162109375, -0.0205078125, 0.031829833984375, 0.033294677734375, 0.037994384765625, 0.051422119140625, -0.0167388916015625, 0.00782012939453125, 0.016204833984375, 0.0416259765625, -0.015838623046875, -0.01557159423828125, -0.0077972412109375, 0.0036468505859375, -0.0030612945556640625, -0.035003662109375 ] ]
DeepPavlov/rubert-base-cased
2021-11-23T08:03:04.000Z
[ "transformers", "pytorch", "jax", "bert", "feature-extraction", "ru", "arxiv:1905.07213", "endpoints_compatible", "has_space", "region:us" ]
feature-extraction
DeepPavlov
null
null
DeepPavlov/rubert-base-cased
43
139,873
transformers
2022-03-02T23:29:04
--- language: - ru --- # rubert-base-cased RuBERT (Russian, cased, 12-layer, 768-hidden, 12-heads, 180M parameters) was trained on the Russian part of Wikipedia and on news data. We used this training data to build a vocabulary of Russian subtokens and took a multilingual version of BERT-base as an initialization for RuBERT [1]. 08.11.2021: uploaded model with MLM and NSP heads. [1]: Kuratov, Y., Arkhipov, M. (2019). Adaptation of Deep Bidirectional Multilingual Transformers for Russian Language. arXiv preprint [arXiv:1905.07213](https://arxiv.org/abs/1905.07213).
576
[ [ -0.02362060546875, -0.039794921875, 0.0011701583862304688, 0.01207733154296875, -0.03167724609375, 0.01202392578125, -0.05438232421875, -0.006336212158203125, 0.014556884765625, 0.021240234375, -0.049713134765625, -0.0308380126953125, -0.04656982421875, -0.00789642333984375, -0.01763916015625, 0.1253662109375, 0.0134735107421875, 0.04681396484375, -0.006591796875, -0.0028228759765625, -0.01311492919921875, -0.043914794921875, -0.0288238525390625, -0.0309906005859375, 0.049163818359375, 0.045257568359375, 0.05767822265625, 0.01947021484375, 0.043548583984375, 0.0240631103515625, 0.009490966796875, -0.02008056640625, -0.0341796875, 0.0070037841796875, -0.020172119140625, -0.03515625, -0.03277587890625, -0.007373809814453125, 0.06512451171875, 0.0576171875, -0.02313232421875, 0.0182952880859375, -0.02490234375, 0.058624267578125, -0.0131988525390625, -0.00925445556640625, -0.061859130859375, 0.01097869873046875, -0.0243682861328125, -0.006519317626953125, -0.038238525390625, 0.01204681396484375, 0.020355224609375, -0.04541015625, 0.0423583984375, -0.00815582275390625, 0.07843017578125, -0.01015472412109375, -0.0201263427734375, -0.0143280029296875, -0.039764404296875, 0.07763671875, -0.031005859375, 0.048309326171875, 0.01311492919921875, 0.03729248046875, -0.0149688720703125, -0.046356201171875, -0.037017822265625, -0.0203399658203125, -0.0232391357421875, 0.0007395744323730469, -0.004573822021484375, -0.0254058837890625, 0.00742340087890625, 0.0177001953125, -0.06048583984375, -0.018096923828125, -0.042877197265625, -0.01537322998046875, 0.03741455078125, -0.018829345703125, 0.0084075927734375, -0.0097808837890625, -0.031524658203125, -0.0204925537109375, -0.038238525390625, -0.018157958984375, 0.043365478515625, 0.031341552734375, -0.033843994140625, 0.04205322265625, -0.0279083251953125, 0.03924560546875, 0.01438140869140625, -0.0169219970703125, 0.0457763671875, -0.00826263427734375, -0.0194091796875, -0.0148468017578125, 0.052154541015625, 
-0.001720428466796875, 0.043182373046875, -0.0308380126953125, -0.0030803680419921875, -0.0231781005859375, 0.03253173828125, -0.03546142578125, -0.02880859375, 0.0062713623046875, -0.0162353515625, -0.016754150390625, 0.01250457763671875, -0.05364990234375, -0.017913818359375, -0.022186279296875, 0.03668212890625, -0.03253173828125, -0.022674560546875, 0.005512237548828125, -0.00244140625, 0.0335693359375, 0.014617919921875, -0.059295654296875, 0.0216827392578125, 0.03790283203125, 0.03570556640625, -0.0208282470703125, -0.03924560546875, -0.0209503173828125, -0.011444091796875, -0.0190277099609375, 0.058349609375, -0.0058746337890625, -0.003692626953125, 0.0088043212890625, 0.0166473388671875, 0.01303863525390625, -0.028717041015625, 0.02386474609375, -0.058502197265625, 0.050048828125, 0.000579833984375, 0.0009217262268066406, -0.01517486572265625, 0.031585693359375, -0.0287017822265625, 0.07550048828125, 0.0238037109375, -0.04541015625, 0.0377197265625, -0.0428466796875, -0.049072265625, 0.018829345703125, 0.004016876220703125, -0.03228759765625, 0.005077362060546875, 0.01409912109375, 0.045257568359375, -0.00827789306640625, 0.021270751953125, 0.013031005859375, 0.0179443359375, 0.031036376953125, -0.0080108642578125, 0.07586669921875, 0.011566162109375, -0.0189666748046875, 0.0003523826599121094, -0.06658935546875, 0.006504058837890625, 0.0037784576416015625, -0.0462646484375, -0.01201629638671875, -0.01537322998046875, 0.0247344970703125, 0.01837158203125, 0.0184326171875, -0.027008056640625, 0.01361846923828125, -0.0253753662109375, 0.0005469322204589844, 0.045684814453125, -0.0157012939453125, 0.050018310546875, -0.0178680419921875, 0.03643798828125, 0.02593994140625, 0.0094757080078125, -0.0034942626953125, -0.0305938720703125, -0.10076904296875, -0.0157928466796875, 0.04931640625, 0.041412353515625, -0.06561279296875, 0.0159912109375, -0.039947509765625, -0.04132080078125, -0.0285186767578125, 0.021087646484375, 0.03192138671875, 0.0179901123046875, 
0.0268402099609375, -0.021728515625, -0.045562744140625, -0.09478759765625, -0.011474609375, 0.00405120849609375, 0.00722503662109375, -0.00043392181396484375, 0.044769287109375, -0.01538848876953125, 0.03912353515625, -0.00994873046875, -0.013824462890625, -0.0269622802734375, 0.014007568359375, 0.041412353515625, 0.0517578125, 0.050567626953125, -0.06072998046875, -0.0810546875, 0.034820556640625, -0.03350830078125, 0.01245880126953125, -0.01971435546875, -0.03411865234375, 0.0234222412109375, 0.00748443603515625, -0.06732177734375, 0.00637054443359375, 0.038726806640625, -0.028717041015625, 0.03204345703125, -0.01155853271484375, -0.00328826904296875, -0.103759765625, 0.00711822509765625, -0.0035381317138671875, -0.0246429443359375, -0.06951904296875, 0.0164947509765625, 0.016082763671875, -0.0029811859130859375, -0.05145263671875, 0.02349853515625, -0.058502197265625, -0.01142120361328125, -0.002033233642578125, -0.004161834716796875, -0.0198822021484375, 0.05303955078125, 0.0033664703369140625, 0.048675537109375, 0.064453125, -0.03375244140625, 0.049102783203125, 0.01409912109375, -0.028839111328125, 0.05126953125, -0.0423583984375, 0.01200103759765625, -0.0194549560546875, -0.01184844970703125, -0.05987548828125, -0.006591796875, 0.0260772705078125, -0.0560302734375, 0.05145263671875, -0.0224456787109375, -0.0335693359375, -0.0048370361328125, -0.01242828369140625, 0.0215606689453125, 0.041839599609375, -0.029998779296875, 0.029632568359375, 0.034637451171875, -0.024169921875, -0.0662841796875, -0.060028076171875, 0.0252838134765625, -0.009063720703125, -0.0419921875, 0.0302276611328125, -0.0147705078125, -0.0128021240234375, -0.0102386474609375, 0.019989013671875, -0.01288604736328125, -0.0215301513671875, -0.00742340087890625, 0.0175018310546875, -0.0303497314453125, 0.00972747802734375, -0.0084686279296875, -0.0045318603515625, -0.010498046875, -0.0023097991943359375, 0.050811767578125, -0.0235748291015625, 0.007297515869140625, -0.00649261474609375, 
0.0256500244140625, 0.05328369140625, -0.0034885406494140625, 0.06097412109375, 0.061309814453125, -0.02349853515625, -0.0285491943359375, -0.040130615234375, -0.013885498046875, -0.0341796875, 0.040191650390625, -0.026947021484375, -0.045684814453125, 0.04180908203125, 0.0306549072265625, -0.0136260986328125, 0.040374755859375, 0.0792236328125, 0.0023403167724609375, 0.0426025390625, 0.046295166015625, -0.01186370849609375, 0.0261077880859375, -0.028350830078125, -0.0019130706787109375, -0.041748046875, -0.033905029296875, -0.0255889892578125, -0.02496337890625, -0.0455322265625, -0.034576416015625, 0.0181427001953125, -0.003726959228515625, -0.01270294189453125, 0.0474853515625, -0.024627685546875, 0.034881591796875, 0.05255126953125, 0.00788116455078125, 0.003940582275390625, 0.03521728515625, -0.0181427001953125, -0.0158233642578125, -0.058990478515625, -0.006839752197265625, 0.0748291015625, 0.028411865234375, 0.06341552734375, 0.0273590087890625, 0.01541900634765625, 0.0221710205078125, 0.0243377685546875, -0.0660400390625, 0.03656005859375, -0.00284576416015625, -0.08697509765625, -0.020538330078125, -0.010894775390625, -0.0672607421875, 0.03228759765625, -0.03411865234375, -0.03131103515625, 0.01183319091796875, -0.006015777587890625, 0.00951385498046875, -0.003482818603515625, -0.047821044921875, 0.05718994140625, -0.0245513916015625, -0.00896453857421875, -0.0198822021484375, -0.052154541015625, 0.008453369140625, -0.00040268898010253906, 0.0121307373046875, -0.0096588134765625, 0.00231170654296875, 0.05364990234375, -0.00756072998046875, 0.05859375, -0.007076263427734375, 0.005157470703125, 0.0017032623291015625, 0.0037250518798828125, 0.0204925537109375, 0.0179290771484375, 0.01544189453125, 0.026641845703125, -0.0246124267578125, -0.040191650390625, -0.0247344970703125, 0.058929443359375, -0.07080078125, -0.0367431640625, -0.050140380859375, -0.047637939453125, -0.0196380615234375, 0.03759765625, 0.0157012939453125, 0.020263671875, -0.015380859375, 
0.0474853515625, 0.03265380859375, -0.01007080078125, 0.031585693359375, 0.06488037109375, 0.0020809173583984375, -0.040771484375, 0.0494384765625, -0.005420684814453125, 0.0036563873291015625, 0.040008544921875, 0.01425933837890625, -0.0011625289916992188, -0.0304718017578125, -0.05145263671875, 0.04534912109375, -0.0631103515625, -0.02764892578125, -0.031768798828125, -0.022430419921875, -0.03369140625, 0.00010532140731811523, -0.03350830078125, -0.04437255859375, -0.027862548828125, -0.00710296630859375, 0.02764892578125, 0.041229248046875, -0.00946044921875, 0.035491943359375, -0.0391845703125, 0.025970458984375, 0.027313232421875, 0.02801513671875, -0.0225830078125, -0.061981201171875, -0.0655517578125, 0.01513671875, -0.01282501220703125, -0.049560546875, 0.038818359375, 0.031341552734375, 0.0670166015625, 0.01715087890625, -0.02020263671875, 0.03558349609375, -0.06549072265625, 0.059783935546875, 0.022308349609375, -0.05767822265625, 0.026824951171875, -0.03955078125, 0.020111083984375, 0.0271148681640625, 0.048095703125, -0.04852294921875, 0.002048492431640625, -0.038482666015625, -0.037750244140625, 0.06866455078125, 0.0085296630859375, 0.03607177734375, 0.0091094970703125, 0.0174407958984375, 0.0272979736328125, 0.019775390625, -0.08905029296875, -0.038299560546875, -0.0250701904296875, -0.01415252685546875, -0.02325439453125, -0.032745361328125, -0.023773193359375, -0.03790283203125, 0.06854248046875, 0.001262664794921875, 0.038726806640625, -0.000965118408203125, -0.03680419921875, -0.01425933837890625, 0.018890380859375, 0.07415771484375, 0.05462646484375, 0.0023097991943359375, 0.0189056396484375, 0.01409912109375, -0.05108642578125, 0.0010662078857421875, -0.0038890838623046875, -0.0001952648162841797, 0.020477294921875, 0.0221099853515625, 0.055206298828125, 0.005435943603515625, -0.034210205078125, 0.042877197265625, 0.0145263671875, -0.030426025390625, -0.043426513671875, -0.026458740234375, -0.0032196044921875, 0.0155029296875, 0.044219970703125, 
-0.003032684326171875, -0.01415252685546875, -0.034271240234375, 0.02178955078125, 0.048187255859375, -0.03485107421875, -0.046234130859375, 0.01467132568359375, 0.00732421875, -0.025421142578125, 0.0258331298828125, -0.0007262229919433594, -0.049224853515625, 0.019256591796875, 0.040496826171875, 0.06378173828125, -0.0157928466796875, 0.03302001953125, 0.0168304443359375, 0.0394287109375, 0.01027679443359375, 0.0233612060546875, 0.008880615234375, -0.07659912109375, -0.031707763671875, -0.048614501953125, -0.0164337158203125, 0.021728515625, -0.048614501953125, 0.02911376953125, -0.0260772705078125, -0.003482818603515625, 0.016204833984375, -0.0072021484375, -0.0496826171875, 0.021148681640625, 0.0149688720703125, 0.050994873046875, -0.046051025390625, 0.07855224609375, 0.0843505859375, -0.00435638427734375, -0.01192474365234375, -0.029754638671875, -0.0226593017578125, -0.07366943359375, 0.077392578125, -0.020721435546875, 0.02032470703125, -0.009124755859375, -0.03692626953125, -0.07843017578125, 0.056854248046875, 0.01983642578125, -0.0102691650390625, 0.017578125, 0.0272216796875, 0.050628662109375, -0.0364990234375, -0.00007516145706176758, 0.01019287109375, 0.0306396484375, -0.0173492431640625, -0.071044921875, -0.0207061767578125, -0.039947509765625, 0.00423431396484375, 0.017913818359375, -0.047149658203125, 0.08935546875, -0.00461578369140625, -0.008819580078125, -0.004638671875, 0.03515625, -0.0232696533203125, -0.01702880859375, 0.0249481201171875, 0.049102783203125, 0.01183319091796875, 0.000579833984375, 0.0496826171875, -0.02911376953125, 0.043701171875, 0.060791015625, 0.0152587890625, 0.057403564453125, 0.053680419921875, -0.031951904296875, 0.0704345703125, 0.0292816162109375, -0.008758544921875, 0.064697265625, 0.0091400146484375, -0.01137542724609375, -0.0090789794921875, 0.0155181884765625, -0.028717041015625, 0.03814697265625, 0.031463623046875, -0.0289306640625, -0.0156402587890625, 0.00826263427734375, 0.01079559326171875, 
-0.0134124755859375, -0.00968170166015625, 0.048004150390625, -0.01358795166015625, -0.045684814453125, 0.04541015625, 0.0186004638671875, 0.050628662109375, -0.04486083984375, 0.010284423828125, -0.0185394287109375, 0.0186004638671875, -0.0025997161865234375, -0.045318603515625, 0.004528045654296875, -0.0029735565185546875, -0.033233642578125, -0.022918701171875, 0.058685302734375, -0.054229736328125, -0.06317138671875, 0.0014505386352539062, 0.05084228515625, 0.0152435302734375, 0.01306915283203125, -0.0655517578125, -0.02618408203125, -0.0061492919921875, -0.03924560546875, 0.029296875, 0.034942626953125, 0.00873565673828125, 0.037933349609375, 0.05926513671875, 0.0155181884765625, 0.0226287841796875, 0.008056640625, 0.0738525390625, -0.04364013671875, -0.0312347412109375, -0.054840087890625, 0.057037353515625, -0.00018393993377685547, -0.0291900634765625, 0.06695556640625, 0.034393310546875, 0.082275390625, -0.048492431640625, 0.04510498046875, -0.008880615234375, 0.052978515625, -0.026580810546875, 0.064697265625, -0.047027587890625, -0.0169830322265625, -0.005840301513671875, -0.07464599609375, -0.036590576171875, 0.059661865234375, 0.002727508544921875, 0.0016546249389648438, 0.051300048828125, 0.057891845703125, -0.0136566162109375, -0.04730224609375, 0.036865234375, 0.01378631591796875, -0.005584716796875, 0.0246429443359375, 0.044586181640625, -0.045928955078125, 0.047332763671875, -0.05303955078125, -0.0085906982421875, -0.0269317626953125, -0.07080078125, -0.10125732421875, -0.047515869140625, -0.021820068359375, -0.0200653076171875, 0.0002770423889160156, 0.05120849609375, 0.0650634765625, -0.0877685546875, -0.038543701171875, 0.004497528076171875, -0.00019681453704833984, 0.005435943603515625, -0.0141143798828125, 0.00225067138671875, -0.021209716796875, -0.051300048828125, 0.0222625732421875, 0.009002685546875, 0.0096893310546875, -0.038970947265625, -0.0157318115234375, -0.034210205078125, -0.0251922607421875, 0.0455322265625, 0.00909423828125, 
-0.05224609375, -0.0285186767578125, 0.02374267578125, -0.0149688720703125, 0.01380157470703125, 0.05780029296875, -0.040130615234375, 0.02825927734375, 0.0360107421875, 0.031707763671875, 0.0379638671875, -0.01055908203125, 0.042877197265625, -0.08526611328125, 0.0452880859375, 0.0089874267578125, 0.0291290283203125, 0.04638671875, -0.02899169921875, 0.02020263671875, 0.040496826171875, -0.03533935546875, -0.051544189453125, 0.00585174560546875, -0.0872802734375, -0.01306915283203125, 0.09979248046875, 0.0078582763671875, -0.005710601806640625, 0.0001633167266845703, -0.01529693603515625, 0.00875091552734375, -0.0242462158203125, 0.045166015625, 0.065185546875, 0.0193328857421875, -0.037841796875, -0.0304107666015625, 0.032806396484375, 0.031463623046875, -0.03564453125, -0.0211181640625, 0.0120697021484375, 0.0240936279296875, 0.0361328125, 0.020751953125, -0.0006046295166015625, 0.030181884765625, 0.00194549560546875, 0.03521728515625, 0.00782012939453125, -0.036590576171875, -0.0185089111328125, -0.00997161865234375, -0.0013427734375, -0.0140533447265625 ] ]
princeton-nlp/sup-simcse-roberta-large
2022-11-11T20:04:02.000Z
[ "transformers", "pytorch", "jax", "roberta", "feature-extraction", "arxiv:2104.08821", "arxiv:1910.09700", "endpoints_compatible", "has_space", "region:us" ]
feature-extraction
princeton-nlp
null
null
princeton-nlp/sup-simcse-roberta-large
9
139,837
transformers
2022-03-02T23:29:05
--- tags: - feature-extraction --- # Model Card for sup-simcse-roberta-large # Model Details ## Model Description - **Developed by:** Princeton NLP - **Shared by [Optional]:** More information needed - **Model type:** Feature Extraction - **Language(s) (NLP):** More information needed - **License:** More information needed - **Related Models:** - **Parent Model:** RoBERTa-large - **Resources for more information:** - [GitHub Repo](https://github.com/princeton-nlp/SimCSE) - [Associated Paper](https://arxiv.org/abs/2104.08821) - [Blog Post]({0}) # Uses ## Direct Use This model can be used for the task of feature extraction. ## Downstream Use [Optional] More information needed ## Out-of-Scope Use The model should not be used to intentionally create hostile or alienating environments for people. # Bias, Risks, and Limitations Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)). Predictions generated by the model may include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups. ## Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. # Training Details ## Training Data The model creators note in the [Github Repository](https://github.com/princeton-nlp/SimCSE/blob/main/README.md) > We train unsupervised SimCSE on 10^6 randomly sampled sentences from English Wikipedia, and train supervised SimCSE on the combination of MNLI and SNLI datasets (314k). 
## Training Procedure ### Preprocessing More information needed ### Speeds, Sizes, Times More information needed # Evaluation ## Testing Data, Factors & Metrics ### Testing Data The model creators note in the [associated paper](https://arxiv.org/pdf/2104.08821.pdf) > Our evaluation code for sentence embeddings is based on a modified version of [SentEval](https://github.com/facebookresearch/SentEval). It evaluates sentence embeddings on semantic textual similarity (STS) tasks and downstream transfer tasks. For STS tasks, our evaluation takes the "all" setting, and we report Spearman's correlation. See [associated paper](https://arxiv.org/pdf/2104.08821.pdf) (Appendix B) for evaluation details. ### Factors More information needed ### Metrics More information needed ## Results More information needed # Model Examination More information needed # Environmental Impact Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). 
- **Hardware Type:** More information needed - **Hours used:** More information needed - **Cloud Provider:** More information needed - **Compute Region:** More information needed - **Carbon Emitted:** More information needed # Technical Specifications [optional] ## Model Architecture and Objective More information needed ## Compute Infrastructure More information needed ### Hardware More information needed ### Software More information needed # Citation **BibTeX:** ```bibtex @inproceedings{gao2021simcse, title={{SimCSE}: Simple Contrastive Learning of Sentence Embeddings}, author={Gao, Tianyu and Yao, Xingcheng and Chen, Danqi}, booktitle={Empirical Methods in Natural Language Processing (EMNLP)}, year={2021} } ``` # Glossary [optional] More information needed # More Information [optional] If you have any questions related to the code or the paper, feel free to email Tianyu (`tianyug@cs.princeton.edu`) and Xingcheng (`yxc18@mails.tsinghua.edu.cn`). If you encounter any problems when using the code, or want to report a bug, you can open an issue. Please try to specify the problem with details so we can help you better and quicker! # Model Card Authors [optional] Princeton NLP group in collaboration with Ezi Ozoani and the Hugging Face team # Model Card Contact More information needed # How to Get Started with the Model Use the code below to get started with the model. <details> <summary> Click to expand </summary> ```python from transformers import AutoTokenizer, AutoModel tokenizer = AutoTokenizer.from_pretrained("princeton-nlp/sup-simcse-roberta-large") model = AutoModel.from_pretrained("princeton-nlp/sup-simcse-roberta-large") ``` </details>
4,642
[ [ -0.0233306884765625, -0.052276611328125, 0.03338623046875, 0.0160980224609375, -0.0228729248046875, -0.02960205078125, -0.0306854248046875, -0.0226287841796875, 0.01250457763671875, 0.0311431884765625, -0.039886474609375, -0.050567626953125, -0.053436279296875, 0.0101470947265625, -0.0190887451171875, 0.07794189453125, 0.016754150390625, 0.01263427734375, -0.02911376953125, 0.00927734375, -0.013671875, -0.045684814453125, -0.04669189453125, -0.002201080322265625, 0.016632080078125, 0.012969970703125, 0.042938232421875, 0.049713134765625, 0.032318115234375, 0.02752685546875, -0.0290069580078125, 0.00011605024337768555, -0.0258941650390625, -0.0181121826171875, -0.006999969482421875, -0.0230712890625, -0.033843994140625, 0.005611419677734375, 0.045745849609375, 0.04168701171875, -0.0142059326171875, 0.0227508544921875, 0.0163726806640625, 0.0181884765625, -0.04156494140625, 0.0258331298828125, -0.0504150390625, -0.00948333740234375, -0.0137176513671875, 0.0008192062377929688, -0.0361328125, -0.00848388671875, 0.0016775131225585938, -0.041107177734375, 0.004734039306640625, 0.01116943359375, 0.09173583984375, 0.0195770263671875, -0.024688720703125, -0.02947998046875, -0.028961181640625, 0.07904052734375, -0.07330322265625, 0.0262451171875, 0.0170135498046875, -0.001094818115234375, -0.00829315185546875, -0.06365966796875, -0.04705810546875, -0.034454345703125, -0.0148468017578125, 0.022491455078125, -0.016845703125, 0.0006623268127441406, 0.039154052734375, 0.0160369873046875, -0.041168212890625, 0.00702667236328125, -0.027557373046875, -0.004955291748046875, 0.056182861328125, 0.00974273681640625, 0.02685546875, -0.0504150390625, -0.03802490234375, -0.0214691162109375, -0.0188751220703125, 0.00446319580078125, 0.028839111328125, 0.030487060546875, -0.036651611328125, 0.0465087890625, -0.002223968505859375, 0.038818359375, -0.020111083984375, 0.002227783203125, 0.038818359375, -0.027587890625, -0.022216796875, -0.01447296142578125, 0.0833740234375, 
0.00937652587890625, 0.01468658447265625, 0.00649261474609375, -0.002231597900390625, -0.01043701171875, 0.0112457275390625, -0.059661865234375, -0.01546478271484375, 0.00962066650390625, -0.0406494140625, -0.01666259765625, 0.0214691162109375, -0.0682373046875, 0.004924774169921875, -0.029632568359375, 0.01910400390625, -0.0399169921875, -0.01090240478515625, 0.005382537841796875, -0.00946044921875, 0.0137481689453125, 0.0034046173095703125, -0.051483154296875, 0.035125732421875, 0.04302978515625, 0.058990478515625, -0.01551055908203125, -0.0199432373046875, -0.0112457275390625, -0.001140594482421875, 0.0082855224609375, 0.03912353515625, -0.04144287109375, -0.0274505615234375, -0.0006284713745117188, 0.0064849853515625, -0.0149383544921875, -0.0181884765625, 0.07659912109375, -0.015899658203125, 0.031890869140625, -0.01290130615234375, -0.050445556640625, -0.0171661376953125, 0.00785064697265625, -0.042205810546875, 0.09808349609375, -0.00902557373046875, -0.0814208984375, -0.0015544891357421875, -0.058563232421875, -0.01404571533203125, -0.0033397674560546875, -0.0123748779296875, -0.046356201171875, -0.004329681396484375, 0.0198974609375, 0.0367431640625, -0.02752685546875, 0.0306396484375, -0.0258636474609375, -0.0237274169921875, 0.00821685791015625, -0.032012939453125, 0.11273193359375, 0.024169921875, -0.01873779296875, -0.0022411346435546875, -0.0552978515625, 0.0009279251098632812, 0.0249176025390625, -0.01122283935546875, -0.031829833984375, -0.01107025146484375, 0.026641845703125, 0.0245208740234375, 0.033447265625, -0.043121337890625, 0.006366729736328125, -0.027191162109375, 0.023223876953125, 0.044952392578125, -0.004673004150390625, 0.020233154296875, -0.01425933837890625, 0.03814697265625, 0.0083465576171875, 0.0227203369140625, 0.00310516357421875, -0.03607177734375, -0.06719970703125, -0.0221405029296875, 0.0439453125, 0.0535888671875, -0.04150390625, 0.06634521484375, -0.0282440185546875, -0.0469970703125, -0.037445068359375, 
-0.004367828369140625, 0.034515380859375, 0.0108489990234375, 0.048675537109375, -0.008392333984375, -0.05059814453125, -0.0731201171875, -0.027496337890625, -0.004581451416015625, -0.0011835098266601562, 0.039459228515625, 0.05682373046875, -0.01366424560546875, 0.06817626953125, -0.056915283203125, -0.02392578125, -0.021026611328125, 0.01245880126953125, 0.00475311279296875, 0.052734375, 0.038848876953125, -0.061492919921875, -0.0270233154296875, -0.0289459228515625, -0.057586669921875, -0.0117950439453125, -0.0159454345703125, -0.004787445068359375, 0.01531982421875, 0.055999755859375, -0.05120849609375, 0.02764892578125, 0.04473876953125, -0.0201568603515625, 0.03564453125, -0.017303466796875, -0.0092010498046875, -0.1104736328125, 0.01146697998046875, 0.0126190185546875, 0.0015554428100585938, -0.0292816162109375, -0.003162384033203125, -0.0088043212890625, -0.007534027099609375, -0.033843994140625, 0.043426513671875, -0.0391845703125, -0.00005447864532470703, 0.002895355224609375, 0.0241851806640625, -0.006591796875, 0.050018310546875, 0.005001068115234375, 0.039794921875, 0.037078857421875, -0.043365478515625, 0.0009160041809082031, 0.0262451171875, -0.019744873046875, 0.02215576171875, -0.052825927734375, 0.00495147705078125, 0.0035457611083984375, 0.0217132568359375, -0.053314208984375, 0.005001068115234375, 0.019256591796875, -0.040771484375, 0.0252685546875, 0.0012540817260742188, -0.045745849609375, -0.0267486572265625, -0.011627197265625, 0.0240631103515625, 0.042236328125, -0.0335693359375, 0.057586669921875, 0.0306396484375, -0.01444244384765625, -0.04864501953125, -0.062103271484375, 0.00447845458984375, -0.0102996826171875, -0.03680419921875, 0.041015625, -0.0169677734375, -0.00890350341796875, 0.01324462890625, 0.0272216796875, -0.0113372802734375, 0.006443023681640625, 0.0229949951171875, 0.023345947265625, 0.0018749237060546875, 0.0143890380859375, 0.0017290115356445312, -0.0151214599609375, 0.009002685546875, -0.00008094310760498047, 
0.04150390625, -0.00909423828125, -0.0114898681640625, -0.042572021484375, 0.0221099853515625, 0.017242431640625, -0.01303863525390625, 0.0682373046875, 0.059234619140625, -0.03021240234375, -0.0162506103515625, -0.034912109375, -0.016143798828125, -0.034271240234375, 0.045166015625, -0.0230712890625, -0.059417724609375, 0.0282440185546875, -0.00447845458984375, 0.0012559890747070312, 0.06390380859375, 0.033599853515625, -0.0025310516357421875, 0.061492919921875, 0.05584716796875, 0.0015268325805664062, 0.037567138671875, -0.0271759033203125, 0.019775390625, -0.070068359375, -0.0216522216796875, -0.0662841796875, 0.0031299591064453125, -0.06964111328125, -0.03338623046875, -0.00315093994140625, 0.006366729736328125, -0.025054931640625, 0.03753662109375, -0.039520263671875, 0.0197601318359375, 0.040252685546875, 0.0008864402770996094, 0.0139617919921875, -0.004413604736328125, -0.0224761962890625, -0.0231475830078125, -0.057952880859375, -0.043060302734375, 0.06060791015625, 0.034698486328125, 0.035125732421875, -0.003429412841796875, 0.059783935546875, 0.005687713623046875, 0.007160186767578125, -0.041839599609375, 0.060302734375, -0.0362548828125, -0.03765869140625, -0.0158843994140625, -0.044952392578125, -0.061981201171875, 0.0172576904296875, -0.03338623046875, -0.053131103515625, 0.0137786865234375, -0.0087432861328125, -0.01386260986328125, 0.026031494140625, -0.039825439453125, 0.07257080078125, -0.01544952392578125, -0.0024700164794921875, -0.0092620849609375, -0.04510498046875, 0.0312347412109375, 0.005962371826171875, 0.0211029052734375, -0.00131988525390625, -0.004871368408203125, 0.06866455078125, -0.0229949951171875, 0.06903076171875, -0.01210784912109375, 0.0164337158203125, 0.018035888671875, -0.0154266357421875, 0.034759521484375, -0.005870819091796875, -0.01200103759765625, 0.046356201171875, -0.00555419921875, -0.021514892578125, -0.033050537109375, 0.062469482421875, -0.067138671875, -0.0282135009765625, -0.049224853515625, -0.044097900390625, 
0.006862640380859375, 0.0173797607421875, 0.023040771484375, 0.00986480712890625, -0.019866943359375, 0.025146484375, 0.04052734375, -0.049102783203125, 0.033843994140625, 0.0213470458984375, -0.00689697265625, -0.031219482421875, 0.05792236328125, 0.01727294921875, 0.011749267578125, 0.0183258056640625, 0.0142059326171875, -0.0205535888671875, -0.02996826171875, -0.00875091552734375, 0.032806396484375, -0.045745849609375, -0.01062774658203125, -0.0780029296875, -0.0418701171875, -0.0408935546875, 0.00261688232421875, -0.0292816162109375, -0.03314208984375, -0.039337158203125, -0.00299072265625, 0.0223236083984375, 0.0413818359375, -0.0010976791381835938, 0.028533935546875, -0.037384033203125, 0.01328277587890625, 0.003795623779296875, 0.01546478271484375, -0.004657745361328125, -0.061920166015625, -0.0187530517578125, 0.0097503662109375, -0.027679443359375, -0.050872802734375, 0.03466796875, 0.006938934326171875, 0.046173095703125, 0.00861358642578125, 0.0146484375, 0.045928955078125, -0.0306243896484375, 0.0833740234375, 0.0068511962890625, -0.08453369140625, 0.03411865234375, -0.011749267578125, 0.021026611328125, 0.0472412109375, 0.03936767578125, -0.02996826171875, -0.0290985107421875, -0.08819580078125, -0.07025146484375, 0.054779052734375, 0.03564453125, 0.0221099853515625, -0.011199951171875, 0.0309295654296875, -0.02777099609375, 0.0009889602661132812, -0.0850830078125, -0.036376953125, -0.00814056396484375, -0.02685546875, -0.0022735595703125, -0.03485107421875, 0.004184722900390625, -0.0204620361328125, 0.07763671875, 0.004634857177734375, 0.036956787109375, 0.01065826416015625, -0.009185791015625, 0.0227508544921875, 0.0205535888671875, 0.037933349609375, 0.0086822509765625, -0.020263671875, -0.002887725830078125, 0.04071044921875, -0.036041259765625, -0.022216796875, 0.0191802978515625, -0.024810791015625, 0.0109100341796875, 0.026031494140625, 0.0609130859375, 0.0223846435546875, -0.039520263671875, 0.06494140625, -0.0097198486328125, 
-0.024871826171875, -0.041259765625, -0.0011425018310546875, 0.01544952392578125, 0.00623321533203125, 0.00606536865234375, 0.00562286376953125, 0.0169219970703125, -0.0211944580078125, 0.0161590576171875, 0.037628173828125, -0.033599853515625, -0.00008511543273925781, 0.048553466796875, 0.007724761962890625, -0.0213775634765625, 0.053314208984375, -0.0198211669921875, -0.034027099609375, 0.059234619140625, 0.049835205078125, 0.0751953125, -0.007137298583984375, 0.00786590576171875, 0.061920166015625, 0.03375244140625, -0.0037212371826171875, 0.0157623291015625, 0.01271820068359375, -0.048126220703125, -0.02569580078125, -0.05328369140625, -0.00969696044921875, 0.00792694091796875, -0.060394287109375, 0.046722412109375, -0.032958984375, -0.00862884521484375, -0.005016326904296875, 0.01189422607421875, -0.05352783203125, 0.022369384765625, 0.010162353515625, 0.0704345703125, -0.07977294921875, 0.0654296875, 0.0430908203125, -0.050872802734375, -0.066650390625, 0.0036373138427734375, -0.003143310546875, -0.037384033203125, 0.05194091796875, 0.01259613037109375, 0.00006288290023803711, 0.0023174285888671875, -0.052734375, -0.06787109375, 0.0926513671875, 0.0245819091796875, -0.045806884765625, 0.00487518310546875, 0.0010061264038085938, 0.0421142578125, -0.043609619140625, 0.047454833984375, 0.0225372314453125, 0.034393310546875, 0.00238037109375, -0.0577392578125, 0.01273345947265625, -0.0209197998046875, 0.022247314453125, -0.01532745361328125, -0.059234619140625, 0.0704345703125, -0.0092926025390625, -0.021942138671875, 0.01080322265625, 0.06280517578125, 0.0216522216796875, 0.006664276123046875, 0.046142578125, 0.056915283203125, 0.051971435546875, -0.0031795501708984375, 0.08172607421875, -0.0225372314453125, 0.04608154296875, 0.1021728515625, 0.006328582763671875, 0.08367919921875, 0.02508544921875, -0.0171051025390625, 0.06024169921875, 0.038360595703125, -0.0228729248046875, 0.0200958251953125, 0.018402099609375, 0.010955810546875, -0.01861572265625, 
-0.0236053466796875, -0.0302886962890625, 0.03802490234375, 0.0166473388671875, -0.050048828125, -0.004917144775390625, -0.01525115966796875, 0.016510009765625, 0.011077880859375, 0.0010519027709960938, 0.0533447265625, 0.019683837890625, -0.0274505615234375, 0.0307464599609375, 0.01216888427734375, 0.0565185546875, -0.048858642578125, -0.004619598388671875, 0.007106781005859375, 0.0089111328125, -0.017852783203125, -0.04229736328125, 0.020263671875, -0.0010786056518554688, -0.01509857177734375, -0.0027923583984375, 0.028167724609375, -0.04388427734375, -0.05548095703125, 0.045074462890625, 0.02593994140625, 0.023345947265625, 0.01715087890625, -0.09539794921875, 0.00940704345703125, 0.01812744140625, -0.030242919921875, 0.014739990234375, 0.027587890625, 0.007213592529296875, 0.0396728515625, 0.050750732421875, 0.00344085693359375, -0.0034885406494140625, 0.01177978515625, 0.057373046875, -0.05352783203125, -0.0439453125, -0.061767578125, 0.051239013671875, -0.032379150390625, -0.03436279296875, 0.06768798828125, 0.055908203125, 0.06134033203125, 0.0011234283447265625, 0.059600830078125, -0.0134429931640625, 0.03289794921875, -0.039703369140625, 0.03497314453125, -0.050018310546875, 0.024322509765625, -0.037445068359375, -0.082763671875, -0.029052734375, 0.05670166015625, -0.02569580078125, 0.0258026123046875, 0.07080078125, 0.068359375, -0.00574493408203125, 0.00632476806640625, 0.0137176513671875, 0.04241943359375, 0.01529693603515625, 0.0298004150390625, 0.032318115234375, -0.0577392578125, 0.043487548828125, -0.0258331298828125, -0.0143890380859375, -0.005924224853515625, -0.081787109375, -0.0682373046875, -0.048187255859375, -0.04278564453125, -0.0287933349609375, -0.004772186279296875, 0.0782470703125, 0.037933349609375, -0.060791015625, -0.0161895751953125, -0.007427215576171875, -0.007537841796875, -0.0194091796875, -0.0188446044921875, 0.052581787109375, -0.0258941650390625, -0.06719970703125, 0.0019321441650390625, -0.002593994140625, 
0.005283355712890625, -0.0151214599609375, -0.01019287109375, -0.046722412109375, 0.01277923583984375, 0.02862548828125, 0.003551483154296875, -0.06353759765625, -0.003814697265625, -0.01209259033203125, -0.031707763671875, -0.01366424560546875, 0.036895751953125, -0.0322265625, 0.0167388916015625, 0.0307464599609375, 0.038330078125, 0.044219970703125, 0.0020313262939453125, 0.0227508544921875, -0.048431396484375, 0.0198974609375, 0.01171875, 0.04107666015625, 0.023162841796875, -0.038726806640625, 0.038726806640625, 0.0191802978515625, -0.046875, -0.049560546875, 0.0004286766052246094, -0.0980224609375, -0.0306243896484375, 0.10467529296875, -0.0194854736328125, -0.02301025390625, -0.00036716461181640625, -0.022979736328125, 0.02569580078125, -0.02752685546875, 0.0426025390625, 0.050750732421875, 0.01480865478515625, -0.0159454345703125, -0.04730224609375, 0.042572021484375, 0.03619384765625, -0.06732177734375, 0.00272369384765625, 0.02581787109375, 0.0186614990234375, 0.0193634033203125, 0.032318115234375, -0.016021728515625, -0.003299713134765625, 0.00698089599609375, 0.037841796875, -0.008056640625, -0.01464080810546875, -0.034210205078125, 0.00397491455078125, -0.015869140625, -0.0003898143768310547 ] ]
SAPOSS/password-model
2022-11-09T10:12:15.000Z
[ "transformers", "tf", "roberta", "text-classification", "en", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
text-classification
SAPOSS
null
null
SAPOSS/password-model
7
138,140
transformers
2022-03-02T23:29:04
---
language:
- en
---

# Model Card for Password-Model

# Model Details

## Model Description

The Password Model is intended to be used with [Credential Digger](https://github.com/SAP/credential-digger) in order to automatically filter false positive password discoveries.

- **Developed by:** SAP OSS
- **Shared by [Optional]:** Hugging Face
- **Model type:** Text Classification
- **Language(s) (NLP):** en
- **License:** Apache-2.0
- **Related Models:**
  - **Parent Model:** RoBERTa
- **Resources for more information:**
  - [GitHub Repo](https://github.com/SAP/credential-digger)
  - [Associated Paper](https://www.scitepress.org/Papers/2021/102381/102381.pdf)

# Uses

## Direct Use

The model is directly integrated into [Credential Digger](https://github.com/SAP/credential-digger) and can be used to filter the false positive password discoveries of a scan.

## Out-of-Scope Use

The model should not be used to intentionally create hostile or alienating environments for people.

# Training Details

## Training Data

[CodeBERT-base-mlm](https://huggingface.co/microsoft/codebert-base-mlm) fine-tuned on a dataset for leak detection.

## Training Procedure

### Preprocessing

More information needed

### Speeds, Sizes, Times

More information needed

# Evaluation

More information needed

## Testing Data, Factors & Metrics

### Testing Data

More information needed

### Factors

More information needed

### Metrics

More information needed

## Results

More information needed

# Model Examination

More information needed

# Environmental Impact

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** More information needed
- **Hours used:** More information needed
- **Cloud Provider:** More information needed
- **Compute Region:** More information needed
- **Carbon Emitted:** More information needed

# Technical Specifications [optional]

## Model Architecture and Objective

More information needed

## Compute Infrastructure

More information needed

### Hardware

More information needed

### Software

More information needed

# Citation

**BibTeX:**

```
TBD
```

# Model Card Authors [optional]

SAP OSS in collaboration with Ezi Ozoani and the Hugging Face team.

# Model Card Contact

More information needed

# How to Get Started with the Model

The model is directly integrated into Credential Digger and can be used to filter the false positive discoveries of a scan.

<details>
<summary> Click to expand </summary>

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("SAPOSS/password-model")
model = AutoModelForSequenceClassification.from_pretrained("SAPOSS/password-model")
```
</details>
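Once the tokenizer and model are loaded as in the snippet above, a scan discovery can be scored by softmaxing the classification logits and picking the winning class. The helper below is a minimal, framework-free sketch of that last step; the index of the false-positive class and the candidate string are assumptions for illustration — the real label mapping should be checked against the model's `config.id2label` and Credential Digger's documentation.

```python
import math


def is_false_positive(logits, fp_index=1):
    """Softmax raw classification logits and report whether the assumed
    false-positive class wins, together with its probability.

    `fp_index` is hypothetical here; verify it against config.id2label.
    """
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    best = max(range(len(probs)), key=probs.__getitem__)
    return best == fp_index, probs[best]


# With the model loaded as above, logits for a (hypothetical) discovery
# would come from something like:
#   logits = model(**tokenizer("pwd = 'dummy'", return_tensors="pt")).logits[0].tolist()
# A dummy logits vector illustrates the result shape:
fp, conf = is_false_positive([-1.0, 2.0])
```

Here `fp` is `True` because the second logit dominates, and `conf` carries the softmax probability of the winning class.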
2,938
[ [ -0.0211181640625, -0.0250244140625, 0.01898193359375, -0.0019779205322265625, -0.0238494873046875, -0.0100250244140625, 0.0400390625, -0.0380859375, 0.00007671117782592773, 0.06292724609375, -0.05084228515625, -0.039337158203125, -0.044525146484375, -0.0101776123046875, -0.03582763671875, 0.06866455078125, 0.0023193359375, 0.0171356201171875, -0.006031036376953125, 0.00388336181640625, -0.038787841796875, -0.05914306640625, -0.031005859375, -0.0364990234375, 0.003795623779296875, 0.00791168212890625, 0.0218658447265625, 0.025421142578125, 0.049163818359375, 0.0174560546875, -0.017822265625, 0.0025196075439453125, 0.00638580322265625, -0.016571044921875, -0.0194549560546875, -0.038299560546875, -0.055694580078125, 0.005695343017578125, 0.0272369384765625, 0.0171661376953125, -0.0258636474609375, 0.044219970703125, -0.00653076171875, 0.0396728515625, -0.055023193359375, 0.0297393798828125, -0.06243896484375, -0.00821685791015625, -0.016021728515625, -0.01458740234375, -0.0235595703125, -0.0034389495849609375, -0.0103912353515625, -0.0201568603515625, 0.0311279296875, 0.0209503173828125, 0.09039306640625, 0.01464080810546875, -0.0214996337890625, -0.0300140380859375, -0.0531005859375, 0.052490234375, -0.050018310546875, 0.011322021484375, 0.033050537109375, 0.025726318359375, 0.0232696533203125, -0.049957275390625, -0.028900146484375, -0.00013530254364013672, -0.01367950439453125, 0.0056915283203125, -0.006931304931640625, 0.0081634521484375, 0.052734375, 0.0285797119140625, -0.043792724609375, -0.0080718994140625, -0.057220458984375, -0.031219482421875, 0.04937744140625, 0.03814697265625, 0.0171661376953125, -0.038787841796875, -0.051605224609375, -0.01617431640625, -0.032257080078125, 0.0196380615234375, 0.0289154052734375, 0.0247039794921875, -0.019134521484375, 0.037933349609375, -0.00482177734375, 0.04754638671875, -0.0197906494140625, -0.0080413818359375, 0.04058837890625, -0.0222320556640625, -0.0290374755859375, 0.00933837890625, 0.042449951171875, 
0.0018482208251953125, -0.01708984375, 0.019287109375, -0.042144775390625, 0.00827789306640625, 0.0129241943359375, -0.07757568359375, -0.0357666015625, 0.030792236328125, -0.0438232421875, -0.040008544921875, 0.03216552734375, -0.04632568359375, -0.015777587890625, -0.01500701904296875, 0.0360107421875, -0.03729248046875, -0.0237579345703125, 0.00299072265625, -0.04150390625, 0.0321044921875, 0.0186004638671875, -0.054168701171875, 0.03668212890625, 0.05145263671875, 0.06268310546875, 0.01361846923828125, -0.0169830322265625, -0.0167694091796875, 0.0147705078125, -0.00936126708984375, 0.03515625, -0.0230255126953125, -0.0299072265625, -0.008544921875, 0.005062103271484375, -0.019317626953125, -0.03228759765625, 0.061920166015625, -0.0450439453125, 0.033203125, -0.01096343994140625, -0.047698974609375, -0.0226287841796875, 0.01043701171875, -0.040313720703125, 0.0946044921875, 0.0198974609375, -0.06512451171875, 0.00943756103515625, -0.068359375, -0.01132965087890625, 0.035186767578125, 0.01200103759765625, -0.043609619140625, -0.0148162841796875, -0.007312774658203125, 0.0214691162109375, -0.018707275390625, 0.0195159912109375, -0.00720977783203125, -0.0194854736328125, -0.01296234130859375, -0.01291656494140625, 0.11279296875, 0.023651123046875, -0.0220947265625, -0.01219940185546875, -0.047119140625, 0.0021305084228515625, 0.024993896484375, -0.0027523040771484375, -0.025665283203125, -0.018463134765625, -0.00467681884765625, 0.01493072509765625, 0.0325927734375, -0.03936767578125, -0.0098419189453125, -0.0178680419921875, 0.0323486328125, 0.04266357421875, 0.008087158203125, 0.0191650390625, -0.030120849609375, 0.037628173828125, 0.00882720947265625, 0.04510498046875, -0.01983642578125, -0.0445556640625, -0.042022705078125, -0.0302276611328125, 0.0218658447265625, 0.07781982421875, -0.0214080810546875, 0.06341552734375, -0.007022857666015625, -0.06927490234375, -0.0149383544921875, 0.0124969482421875, 0.0237274169921875, 0.060028076171875, 0.036895751953125, 
-0.032501220703125, -0.043212890625, -0.06964111328125, -0.013763427734375, -0.02716064453125, -0.0134735107421875, 0.039581298828125, 0.06689453125, -0.025665283203125, 0.060150146484375, -0.05352783203125, -0.040740966796875, -0.00879669189453125, 0.01250457763671875, 0.0197906494140625, 0.0521240234375, 0.054718017578125, -0.07403564453125, -0.005279541015625, -0.06304931640625, -0.04931640625, 0.0098876953125, -0.00848388671875, -0.00701141357421875, 0.0180816650390625, 0.0224609375, -0.046600341796875, 0.051666259765625, 0.029052734375, -0.040191650390625, 0.05438232421875, -0.016815185546875, -0.002346038818359375, -0.07427978515625, 0.0338134765625, -0.007709503173828125, -0.01210784912109375, -0.0394287109375, 0.01446533203125, 0.0226898193359375, -0.032806396484375, -0.04718017578125, 0.046356201171875, -0.03680419921875, 0.0013875961303710938, -0.0099945068359375, -0.003208160400390625, 0.025665283203125, 0.033447265625, 0.00844573974609375, 0.0443115234375, 0.052490234375, -0.0667724609375, 0.03106689453125, 0.044219970703125, -0.028839111328125, 0.034912109375, -0.04388427734375, 0.0029621124267578125, 0.007434844970703125, 0.030426025390625, -0.041900634765625, -0.006771087646484375, 0.039459228515625, -0.055419921875, 0.0298309326171875, -0.0182037353515625, -0.04156494140625, -0.046478271484375, -0.0099945068359375, 0.01442718505859375, 0.04815673828125, -0.0391845703125, 0.07843017578125, 0.02789306640625, 0.01328277587890625, -0.0290985107421875, -0.06817626953125, 0.008331298828125, -0.0167083740234375, -0.047760009765625, 0.038055419921875, -0.006443023681640625, -0.0183563232421875, -0.0013332366943359375, -0.00627899169921875, -0.021759033203125, 0.022674560546875, 0.0265655517578125, 0.06182861328125, -0.0009889602661132812, -0.0013685226440429688, -0.0280303955078125, -0.0258331298828125, 0.020660400390625, 0.0008001327514648438, 0.040435791015625, -0.0035762786865234375, -0.007556915283203125, -0.0297393798828125, 0.0266265869140625, 
0.0293426513671875, -0.0313720703125, 0.08648681640625, 0.04034423828125, -0.061370849609375, -0.0136871337890625, -0.0367431640625, -0.03070068359375, -0.03631591796875, 0.01203155517578125, -0.0293426513671875, -0.0230255126953125, 0.056976318359375, 0.01311492919921875, 0.00921630859375, 0.056182861328125, 0.0178375244140625, -0.0009074211120605469, 0.060028076171875, 0.045196533203125, -0.004718780517578125, 0.0445556640625, -0.025177001953125, 0.032501220703125, -0.08685302734375, -0.0333251953125, -0.061370849609375, 0.0228424072265625, -0.035308837890625, -0.0078582763671875, 0.0038909912109375, 0.0186004638671875, -0.0443115234375, 0.0489501953125, -0.055023193359375, 0.0090179443359375, 0.048309326171875, 0.0208892822265625, -0.0076904296875, -0.0171966552734375, 0.0060272216796875, 0.0150604248046875, -0.04876708984375, -0.039154052734375, 0.059356689453125, 0.0230712890625, 0.032806396484375, 0.0029544830322265625, 0.062255859375, 0.024078369140625, 0.016815185546875, -0.0170745849609375, 0.04864501953125, 0.00238037109375, -0.0867919921875, -0.019317626953125, -0.044219970703125, -0.052978515625, 0.01323699951171875, -0.00437164306640625, -0.06549072265625, 0.0247039794921875, 0.0259857177734375, -0.020538330078125, 0.05029296875, -0.04632568359375, 0.07568359375, -0.028839111328125, 0.005298614501953125, -0.0094451904296875, -0.041717529296875, 0.05767822265625, -0.00818634033203125, 0.022674560546875, -0.01422882080078125, 0.0004818439483642578, 0.063720703125, -0.062042236328125, 0.055023193359375, -0.03985595703125, 0.0206298828125, 0.03167724609375, 0.00601959228515625, 0.04974365234375, -0.0019397735595703125, -0.0214080810546875, 0.047607421875, 0.01209259033203125, -0.030426025390625, -0.018646240234375, 0.041015625, -0.044403076171875, -0.032806396484375, -0.061920166015625, -0.0098876953125, 0.0063323974609375, 0.0290985107421875, 0.0299835205078125, 0.0229034423828125, 0.0023746490478515625, -0.0019426345825195312, 0.069091796875, 
-0.0308685302734375, 0.016326904296875, 0.04034423828125, 0.0021152496337890625, -0.0292510986328125, 0.05743408203125, 0.002559661865234375, 0.0202484130859375, 0.021942138671875, 0.019683837890625, -0.01548004150390625, -0.0144500732421875, -0.027557373046875, 0.0237274169921875, -0.05255126953125, -0.04107666015625, -0.05706787109375, -0.04937744140625, -0.05255126953125, 0.005657196044921875, -0.0229949951171875, -0.01605224609375, -0.0318603515625, -0.010498046875, 0.0321044921875, 0.00827789306640625, -0.02142333984375, 0.0238189697265625, -0.06536865234375, 0.0309600830078125, 0.0001461505889892578, 0.0243988037109375, 0.002758026123046875, -0.055999755859375, -0.0057220458984375, 0.004444122314453125, -0.0150299072265625, -0.0740966796875, 0.030120849609375, 0.004718780517578125, 0.043853759765625, 0.0165557861328125, 0.0262298583984375, 0.016082763671875, -0.00760650634765625, 0.07611083984375, 0.01251220703125, -0.07916259765625, 0.051055908203125, -0.01422882080078125, 0.007354736328125, 0.050201416015625, 0.022064208984375, -0.03277587890625, 0.00196075439453125, -0.08135986328125, -0.08294677734375, 0.07073974609375, 0.028289794921875, 0.00959014892578125, 0.029754638671875, 0.022735595703125, -0.0286407470703125, 0.024932861328125, -0.05487060546875, -0.043304443359375, -0.02655029296875, 0.01187896728515625, 0.004787445068359375, -0.0272064208984375, -0.0098724365234375, -0.016448974609375, 0.09375, 0.025390625, 0.03472900390625, 0.01030731201171875, -0.0084381103515625, 0.00862884521484375, -0.013031005859375, 0.028411865234375, 0.0232391357421875, -0.04833984375, -0.01399993896484375, 0.023284912109375, -0.047332763671875, -0.022796630859375, 0.0141754150390625, -0.043182373046875, 0.0134735107421875, 0.0174560546875, 0.0631103515625, -0.01385498046875, -0.0232086181640625, 0.049285888671875, -0.004180908203125, -0.034912109375, -0.059722900390625, 0.02783203125, 0.006938934326171875, -0.0145721435546875, 0.0234222412109375, 0.031707763671875, 
0.0175933837890625, -0.042938232421875, 0.00616455078125, 0.02740478515625, -0.032012939453125, 0.00920867919921875, 0.049591064453125, 0.01953125, -0.052490234375, 0.0673828125, -0.0174560546875, -0.07086181640625, 0.0615234375, 0.0218353271484375, 0.07171630859375, -0.035675048828125, -0.00577545166015625, 0.054718017578125, 0.04443359375, -0.0024127960205078125, 0.0241851806640625, -0.01824951171875, -0.042999267578125, 0.023162841796875, -0.03924560546875, -0.033447265625, 0.0092620849609375, -0.06256103515625, 0.0489501953125, -0.041900634765625, 0.0036449432373046875, -0.00495147705078125, 0.01515960693359375, -0.06976318359375, 0.028839111328125, 0.013519287109375, 0.0908203125, -0.0770263671875, 0.048919677734375, 0.0259857177734375, -0.0430908203125, -0.07501220703125, -0.034149169921875, 0.00432586669921875, -0.054229736328125, 0.028594970703125, 0.00818634033203125, -0.0010881423950195312, -0.0216217041015625, -0.04559326171875, -0.059814453125, 0.10040283203125, 0.0014123916625976562, -0.04388427734375, 0.002117156982421875, 0.0027065277099609375, 0.01486968994140625, -0.0247802734375, 0.0299530029296875, 0.033050537109375, 0.042022705078125, 0.01209259033203125, -0.044647216796875, 0.010650634765625, -0.01525115966796875, -0.022857666015625, 0.01340484619140625, -0.055511474609375, 0.0615234375, -0.00775909423828125, -0.01397705078125, -0.0028209686279296875, 0.02435302734375, 0.03289794921875, 0.035614013671875, 0.045623779296875, 0.035186767578125, 0.07342529296875, 0.007289886474609375, 0.06353759765625, -0.049224853515625, 0.0303192138671875, 0.09503173828125, -0.0169219970703125, 0.052581787109375, 0.003688812255859375, -0.0005521774291992188, 0.04541015625, 0.06610107421875, -0.04034423828125, 0.0223236083984375, 0.00948333740234375, -0.0098419189453125, -0.0276336669921875, 0.001987457275390625, -0.03863525390625, 0.012176513671875, 0.0243988037109375, -0.04852294921875, -0.006046295166015625, -0.0175933837890625, -0.014495849609375, 
-0.03436279296875, -0.020843505859375, 0.041778564453125, 0.0124664306640625, -0.0209503173828125, 0.01145172119140625, 0.0223236083984375, 0.040191650390625, -0.0640869140625, 0.005619049072265625, 0.007419586181640625, 0.016448974609375, -0.036651611328125, -0.030792236328125, 0.0302276611328125, 0.0023517608642578125, -0.0229949951171875, -0.01483154296875, 0.06610107421875, 0.005489349365234375, -0.046173095703125, 0.0242462158203125, 0.01436614990234375, 0.024139404296875, -0.01947021484375, -0.0855712890625, -0.0024871826171875, 0.0042877197265625, -0.0185546875, 0.0038852691650390625, 0.006778717041015625, 0.004947662353515625, 0.060546875, 0.050506591796875, 0.01152801513671875, -0.011138916015625, 0.00452423095703125, 0.0506591796875, -0.042877197265625, -0.051666259765625, -0.05743408203125, 0.03314208984375, -0.0275115966796875, -0.03753662109375, 0.060150146484375, 0.0775146484375, 0.051239013671875, 0.011077880859375, 0.054473876953125, -0.005619049072265625, 0.0494384765625, -0.013153076171875, 0.04559326171875, -0.03106689453125, 0.027313232421875, -0.03155517578125, -0.0689697265625, -0.00226593017578125, 0.039794921875, -0.016876220703125, 0.00974273681640625, 0.0411376953125, 0.07281494140625, -0.0041656494140625, 0.01375579833984375, 0.0119781494140625, 0.006313323974609375, 0.02545166015625, 0.0143585205078125, 0.019134521484375, -0.06884765625, 0.036529541015625, -0.0211639404296875, -0.0271148681640625, -0.01318359375, -0.060943603515625, -0.0615234375, -0.037933349609375, -0.039581298828125, -0.05352783203125, -0.01226806640625, 0.0445556640625, 0.06201171875, -0.060028076171875, -0.002613067626953125, -0.002353668212890625, 0.0089569091796875, -0.0106658935546875, -0.016876220703125, 0.029632568359375, -0.0027370452880859375, -0.0222015380859375, -0.0024261474609375, -0.0006756782531738281, 0.0124359130859375, -0.0352783203125, -0.00832366943359375, -0.0017147064208984375, 0.033905029296875, 0.0122528076171875, 0.022552490234375, 
-0.0275726318359375, 0.0027980804443359375, -0.004558563232421875, -0.0223541259765625, -0.007312774658203125, 0.048980712890625, -0.03778076171875, 0.0018396377563476562, 0.045928955078125, 0.0037899017333984375, 0.0538330078125, 0.0030364990234375, 0.0187225341796875, -0.013824462890625, -0.0032405853271484375, 0.03204345703125, 0.047821044921875, 0.0262451171875, -0.04815673828125, 0.033111572265625, 0.0259552001953125, -0.040496826171875, -0.07891845703125, 0.01256561279296875, -0.06329345703125, -0.02947998046875, 0.0701904296875, -0.007320404052734375, -0.042755126953125, -0.01361846923828125, -0.004741668701171875, 0.0038909912109375, -0.0272369384765625, 0.054718017578125, 0.046356201171875, -0.002399444580078125, -0.002315521240234375, -0.023406982421875, 0.040008544921875, 0.004852294921875, -0.050933837890625, 0.00791168212890625, 0.02294921875, 0.031890869140625, 0.01556396484375, 0.037139892578125, -0.01666259765625, 0.0232086181640625, 0.026824951171875, 0.039154052734375, -0.02392578125, -0.01934814453125, -0.025146484375, -0.00739288330078125, -0.01898193359375, -0.02410888671875 ] ]
microsoft/DialoGPT-medium
2023-01-24T16:55:09.000Z
[ "transformers", "pytorch", "tf", "jax", "rust", "gpt2", "text-generation", "conversational", "arxiv:1911.00536", "license:mit", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
conversational
microsoft
null
null
microsoft/DialoGPT-medium
243
137,160
transformers
2022-03-02T23:29:05
---
thumbnail: https://huggingface.co/front/thumbnails/dialogpt.png
tags:
- conversational
license: mit
---

## A State-of-the-Art Large-scale Pretrained Response Generation Model (DialoGPT)

DialoGPT is a SOTA large-scale pretrained dialogue response generation model for multi-turn conversations. The [human evaluation results](https://github.com/dreasysnail/Dialogpt_dev#human-evaluation) indicate that responses generated by DialoGPT are comparable in quality to human responses under a single-turn conversation Turing test. The model is trained on 147M multi-turn dialogues from Reddit discussion threads.

* Multi-turn generation examples from an interactive environment:

| Role | Response |
|---------|--------|
| User | Does money buy happiness? |
| Bot | Depends how much money you spend on it . |
| User | What is the best way to buy happiness ? |
| Bot | You just have to be a millionaire by your early 20s, then you can be happy . |
| User | This is so difficult ! |
| Bot | You have no idea how hard it is to be a millionaire and happy . There is a reason the rich have a lot of money |

Please find information about preprocessing, training and full details of DialoGPT in the [original DialoGPT repository](https://github.com/microsoft/DialoGPT).

ArXiv paper: [https://arxiv.org/abs/1911.00536](https://arxiv.org/abs/1911.00536)

### How to use

Now we are ready to try out how the model works as a chatting partner!
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-medium")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-medium")

# Let's chat for 5 lines
for step in range(5):
    # encode the new user input, add the eos_token and return a tensor in PyTorch
    new_user_input_ids = tokenizer.encode(input(">> User:") + tokenizer.eos_token, return_tensors='pt')

    # append the new user input tokens to the chat history
    bot_input_ids = torch.cat([chat_history_ids, new_user_input_ids], dim=-1) if step > 0 else new_user_input_ids

    # generate a response while limiting the total chat history to 1000 tokens
    chat_history_ids = model.generate(bot_input_ids, max_length=1000, pad_token_id=tokenizer.eos_token_id)

    # pretty print the last output tokens from the bot
    print("DialoGPT: {}".format(tokenizer.decode(chat_history_ids[:, bot_input_ids.shape[-1]:][0], skip_special_tokens=True)))
```
2,419
[ [ -0.029449462890625, -0.07086181640625, 0.00241851806640625, 0.00937652587890625, -0.012969970703125, 0.01256561279296875, 0.000009298324584960938, -0.0147705078125, 0.0143585205078125, 0.032989501953125, -0.06365966796875, -0.01007080078125, -0.032623291015625, -0.0038928985595703125, -0.01479339599609375, 0.08154296875, 0.0295257568359375, 0.014251708984375, -0.0005936622619628906, 0.009246826171875, -0.0372314453125, -0.05462646484375, -0.06390380859375, -0.0140228271484375, 0.002819061279296875, 0.019927978515625, 0.038116455078125, -0.002925872802734375, 0.0244903564453125, 0.03662109375, -0.0024318695068359375, 0.00856781005859375, -0.0567626953125, 0.004253387451171875, 0.0159759521484375, -0.041229248046875, -0.053466796875, 0.00830078125, 0.0172882080078125, 0.0217742919921875, 0.00035572052001953125, 0.02838134765625, 0.0102081298828125, 0.02386474609375, -0.027801513671875, 0.0195159912109375, -0.045013427734375, 0.0035953521728515625, 0.01322174072265625, -0.047210693359375, -0.033447265625, -0.01837158203125, 0.036041259765625, -0.041259765625, 0.0206146240234375, 0.0159759521484375, 0.06756591796875, -0.003467559814453125, -0.033050537109375, -0.03985595703125, -0.0362548828125, 0.057342529296875, -0.06787109375, 0.0201568603515625, 0.023284912109375, 0.0177154541015625, -0.038238525390625, -0.0645751953125, -0.044525146484375, -0.0215606689453125, 0.001434326171875, 0.011993408203125, -0.0170745849609375, 0.026153564453125, 0.028076171875, 0.0254364013671875, -0.053558349609375, -0.0214691162109375, -0.040008544921875, -0.046478271484375, 0.0394287109375, 0.0194091796875, 0.0167999267578125, -0.02349853515625, -0.0316162109375, -0.0095977783203125, -0.030059814453125, 0.007038116455078125, 0.03411865234375, 0.016998291015625, -0.0150909423828125, 0.049957275390625, -0.0207061767578125, 0.059844970703125, 0.01082611083984375, -0.02313232421875, 0.0287933349609375, -0.039276123046875, -0.017333984375, -0.0120849609375, 0.0718994140625, 
0.040435791015625, 0.02099609375, 0.018524169921875, -0.00214385986328125, -0.0267333984375, -0.00463104248046875, -0.07818603515625, -0.0131988525390625, 0.034210205078125, -0.04254150390625, -0.02410888671875, -0.0175933837890625, -0.0582275390625, -0.01275634765625, -0.01068878173828125, 0.05230712890625, -0.03857421875, -0.03204345703125, 0.00655364990234375, -0.0185546875, 0.018524169921875, 0.0253143310546875, -0.058685302734375, 0.007350921630859375, 0.0296630859375, 0.0716552734375, 0.0172119140625, -0.0313720703125, -0.039276123046875, -0.03082275390625, -0.0022716522216796875, 0.041839599609375, -0.015655517578125, -0.0239105224609375, 0.004421234130859375, -0.00618743896484375, -0.00922393798828125, -0.032684326171875, -0.004520416259765625, -0.03668212890625, 0.052001953125, 0.01000213623046875, -0.054779052734375, -0.0013103485107421875, 0.026580810546875, -0.0228424072265625, 0.05889892578125, 0.004405975341796875, -0.06365966796875, 0.020416259765625, -0.0712890625, -0.01416015625, 0.006603240966796875, -0.003345489501953125, -0.018096923828125, 0.000804901123046875, -0.0004782676696777344, 0.03961181640625, -0.0199127197265625, 0.0040435791015625, -0.027069091796875, -0.01177215576171875, 0.04522705078125, -0.04010009765625, 0.07427978515625, 0.0276641845703125, -0.0221710205078125, 0.031463623046875, -0.046661376953125, 0.016876220703125, 0.01508331298828125, -0.0243988037109375, 0.0237884521484375, -0.01983642578125, 0.01314544677734375, 0.037933349609375, 0.034088134765625, -0.040130615234375, 0.01094818115234375, -0.029052734375, 0.06451416015625, 0.06365966796875, 0.000850677490234375, 0.0159759521484375, -0.0222625732421875, 0.034149169921875, 0.00360870361328125, 0.018096923828125, -0.032196044921875, -0.033843994140625, -0.062225341796875, -0.0234222412109375, 0.0121002197265625, 0.039398193359375, -0.058807373046875, 0.054901123046875, -0.006023406982421875, -0.0289154052734375, -0.02960205078125, -0.006679534912109375, 0.016937255859375, 
0.03887939453125, 0.00879669189453125, -0.0289154052734375, -0.051055908203125, -0.04876708984375, -0.005664825439453125, -0.0291900634765625, -0.01302337646484375, 0.027801513671875, 0.04473876953125, -0.0034503936767578125, 0.0748291015625, -0.04632568359375, -0.00867462158203125, -0.033355712890625, 0.0282745361328125, 0.0058746337890625, 0.045684814453125, 0.030609130859375, -0.04656982421875, -0.033233642578125, -0.0288238525390625, -0.041259765625, 0.0173492431640625, -0.01528167724609375, -0.0191497802734375, 0.0159759521484375, 0.033111572265625, -0.054779052734375, 0.041168212890625, 0.0374755859375, -0.0496826171875, 0.052947998046875, -0.01235198974609375, 0.027923583984375, -0.1007080078125, 0.0006761550903320312, -0.028839111328125, -0.035064697265625, -0.045623779296875, -0.01355743408203125, -0.0287628173828125, -0.0328369140625, -0.051116943359375, 0.0439453125, -0.02581787109375, -0.00003403425216674805, -0.01873779296875, 0.0030498504638671875, -0.0267791748046875, 0.058746337890625, -0.0008997917175292969, 0.056884765625, 0.043548583984375, -0.03131103515625, 0.052032470703125, 0.02960205078125, -0.011505126953125, 0.041595458984375, -0.0633544921875, 0.025665283203125, 0.007152557373046875, 0.0291290283203125, -0.10791015625, -0.02850341796875, 0.01148223876953125, -0.071533203125, 0.0101165771484375, -0.01412200927734375, -0.040557861328125, -0.0362548828125, -0.0211029052734375, 0.0160980224609375, 0.0472412109375, -0.0261077880859375, 0.04541015625, 0.0268402099609375, -0.01001739501953125, -0.0308685302734375, -0.0297088623046875, 0.00717926025390625, -0.01055145263671875, -0.06439208984375, -0.0081024169921875, -0.0316162109375, 0.017333984375, -0.029541015625, 0.00701904296875, -0.0098876953125, -0.002529144287109375, 0.018646240234375, 0.033111572265625, -0.0034503936767578125, 0.0011444091796875, -0.0372314453125, -0.0189971923828125, 0.0007290840148925781, -0.005771636962890625, 0.10369873046875, -0.02740478515625, -0.0159759521484375, 
-0.055328369140625, 0.0205841064453125, 0.050811767578125, 0.0007357597351074219, 0.04669189453125, 0.050628662109375, -0.020050048828125, 0.016937255859375, -0.049560546875, -0.046478271484375, -0.040679931640625, 0.050872802734375, -0.029022216796875, -0.07281494140625, 0.044525146484375, 0.0014944076538085938, 0.0266265869140625, 0.03350830078125, 0.0653076171875, -0.002044677734375, 0.09429931640625, 0.0396728515625, 0.0009732246398925781, 0.0546875, -0.0299224853515625, 0.0150299072265625, -0.041412353515625, -0.00234222412109375, -0.0217437744140625, -0.01213836669921875, -0.044464111328125, -0.01413726806640625, 0.01038360595703125, 0.000492095947265625, -0.034881591796875, 0.0282745361328125, -0.03350830078125, 0.011474609375, 0.05560302734375, 0.003826141357421875, 0.0068511962890625, -0.006244659423828125, 0.005802154541015625, -0.0023746490478515625, -0.055511474609375, -0.039886474609375, 0.09320068359375, 0.0293426513671875, 0.0509033203125, -0.014801025390625, 0.0592041015625, 0.00611114501953125, 0.007251739501953125, -0.0625, 0.05499267578125, 0.04010009765625, -0.07037353515625, -0.033447265625, -0.0455322265625, -0.07208251953125, 0.00992584228515625, -0.0203857421875, -0.08074951171875, -0.0135498046875, 0.02960205078125, -0.035736083984375, 0.01325225830078125, -0.0699462890625, 0.06982421875, -0.022247314453125, -0.019927978515625, -0.0083770751953125, -0.052825927734375, 0.0150299072265625, 0.0168609619140625, -0.00933837890625, -0.0127410888671875, 0.02197265625, 0.06646728515625, -0.037750244140625, 0.0592041015625, -0.0175933837890625, 0.022705078125, 0.027374267578125, 0.01244354248046875, 0.0226593017578125, 0.007373809814453125, 0.01885986328125, -0.0019464492797851562, 0.01174163818359375, -0.03515625, -0.02301025390625, 0.03924560546875, -0.07220458984375, -0.042388916015625, -0.026153564453125, -0.039886474609375, -0.010772705078125, 0.030242919921875, 0.05047607421875, 0.037567138671875, -0.02099609375, 0.021636962890625, 
0.02532958984375, -0.0266265869140625, 0.037322998046875, 0.02508544921875, -0.02081298828125, -0.037689208984375, 0.06451416015625, 0.00653839111328125, 0.035186767578125, 0.005702972412109375, 0.0023822784423828125, -0.0243072509765625, -0.01517486572265625, -0.0282745361328125, 0.004955291748046875, -0.034149169921875, -0.016021728515625, -0.047332763671875, -0.034912109375, -0.04876708984375, -0.0084228515625, -0.04522705078125, -0.022705078125, -0.016387939453125, 0.0018024444580078125, 0.0252685546875, 0.0283355712890625, -0.001277923583984375, 0.029022216796875, -0.05072021484375, 0.0199127197265625, 0.046661376953125, 0.008941650390625, 0.0029544830322265625, -0.039642333984375, 0.004108428955078125, 0.0207977294921875, -0.03985595703125, -0.056549072265625, 0.038238525390625, 0.00835418701171875, 0.037811279296875, 0.033172607421875, 0.00881195068359375, 0.0577392578125, -0.0198974609375, 0.0733642578125, 0.038543701171875, -0.0672607421875, 0.0283660888671875, -0.013702392578125, 0.027923583984375, 0.032257080078125, 0.00856781005859375, -0.050872802734375, -0.02203369140625, -0.06689453125, -0.0692138671875, 0.06610107421875, 0.046630859375, 0.031951904296875, 0.00769805908203125, 0.0034580230712890625, -0.0002034902572631836, 0.03692626953125, -0.05804443359375, -0.028900146484375, -0.026885986328125, -0.00750732421875, 0.0028133392333984375, -0.0224609375, -0.00864410400390625, -0.01244354248046875, 0.04620361328125, -0.0069732666015625, 0.0592041015625, 0.01198577880859375, -0.0081329345703125, 0.004398345947265625, 0.01358795166015625, 0.048828125, 0.06195068359375, -0.0275726318359375, -0.00878143310546875, 0.01180267333984375, -0.03466796875, -0.004085540771484375, 0.01244354248046875, 0.01934814453125, -0.00684356689453125, 0.0313720703125, 0.06817626953125, -0.006683349609375, -0.048309326171875, 0.0499267578125, -0.03070068359375, -0.02734375, -0.0360107421875, 0.0019855499267578125, 0.012908935546875, 0.01166534423828125, 0.041168212890625, 
-0.00040340423583984375, 0.0051727294921875, -0.053741455078125, 0.01049041748046875, 0.036041259765625, -0.0272674560546875, -0.02508544921875, 0.045684814453125, 0.0455322265625, -0.047943115234375, 0.06451416015625, -0.00890350341796875, -0.051513671875, 0.0362548828125, 0.036346435546875, 0.0721435546875, 0.0009784698486328125, 0.0169677734375, 0.037933349609375, -0.00022923946380615234, 0.01233673095703125, 0.026611328125, -0.0137786865234375, -0.05816650390625, -0.0149383544921875, -0.03070068359375, -0.0158538818359375, 0.024566650390625, -0.03558349609375, 0.0218963623046875, -0.03460693359375, -0.030670166015625, 0.003009796142578125, 0.0012350082397460938, -0.074951171875, 0.0012340545654296875, -0.006092071533203125, 0.05535888671875, -0.0469970703125, 0.0257720947265625, 0.034027099609375, -0.02569580078125, -0.042510986328125, -0.0045928955078125, 0.01050567626953125, -0.0750732421875, 0.038177490234375, 0.0236358642578125, 0.00716400146484375, 0.0190582275390625, -0.06005859375, -0.053070068359375, 0.068603515625, 0.02435302734375, -0.033721923828125, -0.0080108642578125, 0.015594482421875, 0.028656005859375, -0.0282440185546875, 0.052581787109375, 0.0316162109375, 0.00811004638671875, 0.025909423828125, -0.08245849609375, -0.0008625984191894531, -0.02294921875, -0.008941650390625, -0.004154205322265625, -0.054229736328125, 0.0662841796875, -0.0164642333984375, -0.01084136962890625, 0.021026611328125, 0.04376220703125, 0.0241241455078125, 0.0038852691650390625, 0.0543212890625, 0.0241546630859375, 0.036865234375, -0.016143798828125, 0.060760498046875, -0.044677734375, 0.052093505859375, 0.07293701171875, 0.01294708251953125, 0.05072021484375, 0.040557861328125, -0.0131683349609375, 0.017852783203125, 0.058990478515625, 0.0157470703125, 0.0254364013671875, 0.0191192626953125, -0.013458251953125, -0.030181884765625, 0.001918792724609375, -0.033447265625, 0.035369873046875, 0.0119781494140625, -0.0240325927734375, -0.0090484619140625, 
0.007625579833984375, 0.0137481689453125, -0.04803466796875, 0.002109527587890625, 0.06805419921875, -0.006435394287109375, -0.04754638671875, 0.049407958984375, -0.020965576171875, 0.0657958984375, -0.060577392578125, -0.00605010986328125, -0.0073089599609375, 0.0182037353515625, -0.01020050048828125, -0.041046142578125, -0.00994873046875, -0.01325225830078125, 0.01207733154296875, -0.0037860870361328125, 0.0479736328125, -0.0272674560546875, -0.023040771484375, -0.0010223388671875, 0.040374755859375, 0.01922607421875, 0.0013551712036132812, -0.07073974609375, -0.004032135009765625, 0.01953125, -0.052825927734375, 0.020904541015625, 0.0180206298828125, 0.0274505615234375, 0.056365966796875, 0.06036376953125, -0.01129150390625, 0.01155853271484375, -0.012298583984375, 0.06365966796875, -0.04559326171875, -0.043792724609375, -0.05889892578125, 0.05499267578125, -0.02716064453125, -0.0592041015625, 0.055328369140625, 0.044036865234375, 0.057220458984375, -0.01526641845703125, 0.05084228515625, -0.0248870849609375, 0.0255279541015625, -0.0183563232421875, 0.04412841796875, -0.035491943359375, -0.0027408599853515625, -0.0196533203125, -0.057647705078125, 0.0004642009735107422, 0.06414794921875, -0.01062774658203125, 0.016021728515625, 0.0352783203125, 0.0662841796875, 0.0093536376953125, -0.006317138671875, 0.03009033203125, 0.0252532958984375, 0.03997802734375, 0.03857421875, 0.0697021484375, -0.0303497314453125, 0.059326171875, -0.0094757080078125, -0.0306854248046875, -0.032379150390625, -0.047943115234375, -0.0908203125, -0.053253173828125, -0.0159912109375, -0.041046142578125, -0.00879669189453125, 0.09991455078125, 0.07562255859375, -0.0496826171875, -0.031829833984375, -0.01239776611328125, -0.0086212158203125, 0.0017480850219726562, -0.0235748291015625, 0.01291656494140625, -0.033477783203125, -0.06439208984375, -0.010406494140625, 0.006328582763671875, 0.02789306640625, -0.031402587890625, -0.004192352294921875, -0.01247406005859375, 0.0091705322265625, 
0.0455322265625, 0.0282745361328125, -0.0394287109375, -0.024505615234375, 0.01194000244140625, -0.01071929931640625, 0.003509521484375, 0.05029296875, -0.031951904296875, 0.053863525390625, 0.05511474609375, 0.0108642578125, 0.0567626953125, -0.0139617919921875, 0.0601806640625, -0.037353515625, 0.0295867919921875, 0.0222625732421875, 0.03125, 0.0144195556640625, -0.0195159912109375, 0.0209197998046875, 0.01483154296875, -0.056671142578125, -0.06011962890625, 0.015594482421875, -0.0692138671875, -0.0106353759765625, 0.074462890625, -0.0191802978515625, -0.011932373046875, -0.0121612548828125, -0.056365966796875, 0.018829345703125, -0.051788330078125, 0.06292724609375, 0.05133056640625, -0.026397705078125, -0.0057525634765625, -0.035186767578125, 0.04425048828125, 0.02362060546875, -0.049591064453125, 0.004940032958984375, 0.03173828125, 0.03411865234375, 0.0219268798828125, 0.0711669921875, 0.0007762908935546875, 0.0257568359375, 0.01032257080078125, 0.0164337158203125, -0.0075225830078125, -0.0020294189453125, 0.0033664703369140625, 0.015655517578125, -0.0032520294189453125, -0.03533935546875 ] ]
Mitsua/mitsua-diffusion-cc0
2023-03-03T11:04:16.000Z
[ "diffusers", "stable-diffusion", "text-to-image", "stable-diffusion-diffusers", "license:openrail++", "endpoints_compatible", "diffusers:StableDiffusionPipeline", "region:us" ]
text-to-image
Mitsua
null
null
Mitsua/mitsua-diffusion-cc0
60
137,006
diffusers
2022-12-21T23:04:27
--- license: openrail++ tags: - stable-diffusion - text-to-image - stable-diffusion-diffusers - diffusers inference: true --- # . # . # . # . # . # . # ❗❗❗❗❗❗❗❗❗❗❗❗❗❗❗❗❗❗❗❗❗❗❗❗❗❗❗❗❗❗❗❗ # This version is deprecated. # Please use [Mitsua Diffusion One](https://huggingface.co/Mitsua/mitsua-diffusion-one), which is a successor of this model. # ❗❗❗❗❗❗❗❗❗❗❗❗❗❗❗❗❗❗❗❗❗❗❗❗❗❗❗❗❗❗❗❗ # . # . # . # . # . # Mitsua Diffusion CC0 Model Card Mitsua Diffusion CC0 is a latent text-to-image diffusion model, whose U-Net is **trained from scratch using only public domain/CC0 images or copyrighted images with permission for use**. The Text Encoder and VAE are borrowed from [Stable Diffusion v2.1 base](https://huggingface.co/stabilityai/stable-diffusion-2-1-base/). This will be used as a base model for [**AI VTuber Elan Mitsua🖌️**](https://elanmitsua.com/en/)’s activity. ❗❗ **Currently the model has super low visual quality and limited diversity** ❗❗ Yes, the visual quality is not so good. Most modern artistic concepts are lost completely. However, since she is an AI growing in an ethical fashion, it is a good starting point for Mitsua-chan! You can join [her training on Twitter](https://twitter.com/elanmitsua)! Please support Mitsua-chan!🎉 Further training will be done on a fully opt-in basis. If you are interested, [please click here to submit an opt-in application](https://forms.gle/Nk3M7UyqSgYAqdpA6). We are active on [a Discord server for opt-in participants only](https://discord.com/invite/7VTGRweTUg). Communication is currently in Japanese. ![Header](https://huggingface.co/Mitsua/mitsua-diffusion-cc0/resolve/main/images/mitsua_cc0_works.webp) You can check [all the prompts used to generate these images here](https://huggingface.co/Mitsua/mitsua-diffusion-cc0/resolve/main/images/mitsua_cc0_works_prompts.csv). ## Training Data Sources All data was obtained ethically and in compliance with each site's terms and conditions. 
No copyrighted images are used in the training of this model without permission. No AI generated images are in the dataset. - Traditional Artwork in public domain / CC0 - MET Museum Open Access - Smithsonian Open Access - Cleveland Museum of Art Open Access - National Gallery of Art Open Access - ArtBench-10 (public domain subset) - CC0 Photos - Flickr, Wikimedia Commons - CC0 NFTs *1 - goblintown.nft, mfer, tubby-cats, Timeless - CC0 VRM models - made by VRoid Project, pastelkies, yomox9 (all CC0 subset) - We generated a bunch of synthesized images dataset rendered with various poses and camera angles. - Copyright images with permission for use - Generative and Visual Artworks made by Rhizomatiks Approximately 11M images in total, with data augmentation. 1. Their work is released under a CC0 license, but if you are considering using this model to create a work inspired by their NFT and sell it as NFT, please consider paying them a royalty to help the CC0 NFT community grow. ## License [Creative Open-Rail++-M License](https://huggingface.co/stabilityai/stable-diffusion-2/blob/main/LICENSE-MODEL) ❗❗ “Mitsua Diffusion CC0” means most of the training data is CC0. **The model license itself is NOT CC0**.❗❗ This model is open access and available to all, with a CreativeML OpenRAIL++-M license further specifying rights and usage. The CreativeML OpenRAIL++-M License specifies: 1. You can't use the model to deliberately produce or share illegal or harmful outputs or content 2. The authors claim no rights to the outputs you generate; you are free to use them and are accountable for their use, which must not go against the provisions set in the license 3. You may re-distribute the weights and use the model commercially and/or as a service. 
If you do, please be aware that you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL++-M license with all your users (please read the license entirely and carefully) [Please read the full license here](https://huggingface.co/stabilityai/stable-diffusion-2/blob/main/LICENSE-MODEL) ## Developed by - Stable Diffusion 2.1: Robin Rombach, Patrick Esser - Mitsua Diffusion CC0: Abstract Engine dev team
4,177
[ [ -0.03387451171875, -0.0543212890625, 0.031280517578125, 0.01235198974609375, -0.038787841796875, -0.01023101806640625, 0.0135955810546875, -0.040435791015625, 0.0205535888671875, 0.0313720703125, -0.053375244140625, -0.0246124267578125, -0.04156494140625, -0.0185089111328125, -0.02081298828125, 0.07110595703125, -0.007904052734375, 0.0283660888671875, -0.0206756591796875, -0.0214080810546875, -0.020172119140625, -0.014556884765625, -0.0762939453125, -0.03765869140625, 0.031219482421875, 0.016326904296875, 0.0308685302734375, 0.03729248046875, 0.01558685302734375, 0.019683837890625, -0.0195770263671875, -0.019989013671875, -0.035736083984375, -0.01222991943359375, 0.00794219970703125, -0.0311737060546875, -0.056182861328125, 0.015106201171875, 0.03912353515625, 0.016510009765625, -0.035675048828125, 0.00167083740234375, -0.006938934326171875, 0.0401611328125, -0.022003173828125, 0.0010852813720703125, -0.023162841796875, -0.0016574859619140625, -0.01361846923828125, 0.0257720947265625, -0.018218994140625, -0.01242828369140625, -0.00009232759475708008, -0.0777587890625, 0.017364501953125, -0.0178680419921875, 0.0762939453125, 0.007843017578125, -0.0158843994140625, 0.0112457275390625, -0.0286712646484375, 0.0419921875, -0.0643310546875, 0.03289794921875, 0.017822265625, 0.042388916015625, 0.0089874267578125, -0.05828857421875, -0.04296875, -0.003002166748046875, 0.0183258056640625, 0.033111572265625, -0.01617431640625, -0.01947021484375, 0.0187835693359375, 0.021942138671875, -0.038848876953125, 0.004840850830078125, -0.0294342041015625, -0.005023956298828125, 0.047119140625, 0.014556884765625, 0.046295166015625, 0.0084381103515625, -0.04815673828125, -0.020477294921875, -0.0458984375, -0.0010843276977539062, 0.0335693359375, 0.002391815185546875, -0.07177734375, 0.0296173095703125, -0.005817413330078125, 0.02374267578125, 0.00554656982421875, 0.0080413818359375, 0.03216552734375, -0.0172882080078125, -0.026519775390625, -0.0236663818359375, 0.06982421875, 
0.0292205810546875, 0.007038116455078125, -0.0089263916015625, -0.007083892822265625, 0.003643035888671875, 0.00962066650390625, -0.071044921875, -0.0290374755859375, 0.005847930908203125, -0.02569580078125, -0.02740478515625, -0.0033969879150390625, -0.0830078125, -0.01424407958984375, -0.011505126953125, 0.02423095703125, -0.0182037353515625, -0.05938720703125, 0.00933837890625, -0.0260772705078125, 0.0201416015625, 0.0367431640625, -0.04510498046875, 0.03466796875, 0.01605224609375, 0.07159423828125, -0.001678466796875, -0.005279541015625, 0.0012683868408203125, 0.00653076171875, -0.0120697021484375, 0.046966552734375, -0.027008056640625, -0.042083740234375, -0.0227508544921875, 0.0220947265625, 0.005420684814453125, -0.0278167724609375, 0.04638671875, -0.030975341796875, 0.01727294921875, -0.0019969940185546875, -0.0306854248046875, -0.0204620361328125, 0.002262115478515625, -0.049407958984375, 0.056732177734375, 0.01776123046875, -0.060577392578125, 0.01873779296875, -0.05987548828125, -0.001972198486328125, 0.00807952880859375, 0.0135040283203125, -0.05908203125, -0.009674072265625, 0.00007003545761108398, 0.036102294921875, -0.0006375312805175781, 0.0214691162109375, -0.05657958984375, -0.00572967529296875, -0.0038623809814453125, -0.0262451171875, 0.07733154296875, 0.01558685302734375, -0.0211181640625, -0.00286865234375, -0.057708740234375, -0.01690673828125, 0.03375244140625, -0.007625579833984375, -0.015228271484375, -0.014007568359375, 0.0260162353515625, 0.03125, 0.004726409912109375, -0.052398681640625, 0.020751953125, -0.036346435546875, 0.04180908203125, 0.06842041015625, 0.01568603515625, 0.0458984375, -0.0283966064453125, 0.03826904296875, 0.0283355712890625, 0.0255889892578125, 0.0156097412109375, -0.06634521484375, -0.05267333984375, -0.0129241943359375, 0.0138702392578125, 0.025054931640625, -0.0660400390625, 0.004787445068359375, -0.00926971435546875, -0.06768798828125, -0.0215301513671875, -0.00687408447265625, 0.0119171142578125, 
0.04052734375, 0.0280914306640625, -0.03179931640625, -0.00939178466796875, -0.05401611328125, 0.0151824951171875, 0.01381683349609375, 0.010986328125, 0.01568603515625, 0.04931640625, -0.0125274658203125, 0.048309326171875, -0.03448486328125, -0.021636962890625, -0.0151824951171875, 0.00870513916015625, 0.01157379150390625, 0.06640625, 0.06390380859375, -0.065673828125, -0.048919677734375, -0.0117950439453125, -0.045806884765625, -0.00382232666015625, -0.004322052001953125, -0.0330810546875, 0.0120849609375, 0.01030731201171875, -0.060821533203125, 0.0545654296875, 0.04913330078125, -0.037811279296875, 0.022003173828125, -0.03155517578125, 0.0020999908447265625, -0.0992431640625, 0.0197296142578125, 0.04034423828125, -0.0283966064453125, -0.04541015625, 0.0220794677734375, -0.0077667236328125, -0.0154266357421875, -0.06500244140625, 0.064208984375, -0.028045654296875, 0.02655029296875, -0.02716064453125, 0.0120849609375, -0.01261138916015625, 0.045196533203125, 0.01476287841796875, 0.04901123046875, 0.06646728515625, -0.056396484375, -0.0008792877197265625, 0.03253173828125, -0.01024627685546875, 0.05712890625, -0.07025146484375, 0.0007171630859375, -0.01361083984375, 0.0205841064453125, -0.06451416015625, -0.007366180419921875, 0.059295654296875, -0.04498291015625, 0.036773681640625, -0.0095367431640625, -0.03662109375, -0.0148162841796875, -0.0108184814453125, 0.03692626953125, 0.06219482421875, -0.04083251953125, 0.04150390625, 0.0256195068359375, 0.0174713134765625, -0.0272369384765625, -0.0677490234375, -0.0269927978515625, -0.02374267578125, -0.050689697265625, 0.03277587890625, -0.03375244140625, 0.000010013580322265625, 0.0082550048828125, 0.00882720947265625, -0.0234222412109375, 0.005329132080078125, 0.0389404296875, 0.020721435546875, 0.0059967041015625, -0.0238189697265625, 0.00830841064453125, -0.0283660888671875, -0.0106048583984375, 0.0016527175903320312, 0.0295562744140625, 0.01168060302734375, -0.016021728515625, -0.08343505859375, 
0.0164642333984375, 0.054473876953125, 0.0010356903076171875, 0.06890869140625, 0.05535888671875, -0.03765869140625, -0.0012407302856445312, -0.0222625732421875, -0.02008056640625, -0.034576416015625, 0.006336212158203125, -0.0025482177734375, -0.051849365234375, 0.0540771484375, 0.007266998291015625, 0.017730712890625, 0.06475830078125, 0.040283203125, -0.00568389892578125, 0.0838623046875, 0.06390380859375, 0.0023670196533203125, 0.039794921875, -0.055206298828125, -0.0009107589721679688, -0.07989501953125, -0.031494140625, -0.033477783203125, -0.0297088623046875, -0.0269927978515625, -0.0274810791015625, 0.0194091796875, 0.00717926025390625, -0.038726806640625, 0.033721923828125, -0.0323486328125, 0.0183868408203125, 0.011474609375, 0.034027099609375, 0.006237030029296875, 0.0024871826171875, -0.007083892822265625, -0.0096282958984375, -0.0484619140625, -0.0177154541015625, 0.07110595703125, 0.043060302734375, 0.04620361328125, 0.0179290771484375, 0.039947509765625, 0.0247802734375, 0.044708251953125, -0.02935791015625, 0.0401611328125, -0.0028076171875, -0.05926513671875, -0.0027599334716796875, -0.038787841796875, -0.0640869140625, 0.006732940673828125, -0.027435302734375, -0.045501708984375, 0.023681640625, -0.004070281982421875, -0.009521484375, 0.04486083984375, -0.0380859375, 0.07220458984375, 0.0036678314208984375, -0.044677734375, -0.01439666748046875, -0.049774169921875, 0.035491943359375, 0.0008444786071777344, 0.01812744140625, -0.006591796875, -0.011474609375, 0.06146240234375, -0.043304443359375, 0.082275390625, -0.042266845703125, 0.00836944580078125, 0.020050048828125, -0.00007927417755126953, 0.0230255126953125, 0.015655517578125, -0.00353240966796875, 0.029449462890625, 0.0074310302734375, -0.04327392578125, -0.0254058837890625, 0.05902099609375, -0.0838623046875, -0.01224517822265625, -0.031951904296875, -0.006481170654296875, 0.01529693603515625, 0.0256500244140625, 0.0389404296875, 0.02734375, -0.01129913330078125, 0.005962371826171875, 
0.04766845703125, -0.0120086669921875, 0.04144287109375, 0.037353515625, -0.0013103485107421875, -0.0509033203125, 0.061004638671875, 0.01116180419921875, 0.0307769775390625, 0.0037631988525390625, 0.0225982666015625, -0.033966064453125, -0.040191650390625, -0.0567626953125, 0.043914794921875, -0.049896240234375, -0.016632080078125, -0.04833984375, -0.00939178466796875, -0.02587890625, -0.00843048095703125, -0.02935791015625, -0.0310211181640625, -0.06005859375, -0.005428314208984375, 0.03363037109375, 0.05419921875, -0.01351165771484375, 0.04425048828125, -0.047698974609375, 0.0182647705078125, 0.0007309913635253906, 0.050201416015625, 0.017974853515625, -0.05145263671875, -0.0160980224609375, 0.0224151611328125, -0.042449951171875, -0.069091796875, 0.038299560546875, 0.0011243820190429688, 0.06549072265625, 0.0313720703125, -0.006671905517578125, 0.037322998046875, -0.022125244140625, 0.0841064453125, 0.036529541015625, -0.051666259765625, 0.0255889892578125, -0.045257568359375, 0.01142120361328125, 0.04345703125, 0.043121337890625, -0.01473236083984375, -0.0298309326171875, -0.0806884765625, -0.053863525390625, 0.05096435546875, 0.015960693359375, 0.0307769775390625, 0.01308441162109375, 0.058441162109375, -0.005939483642578125, -0.0010843276977539062, -0.0936279296875, -0.034576416015625, -0.0251007080078125, -0.004688262939453125, 0.0009698867797851562, -0.0011663436889648438, -0.004741668701171875, -0.03985595703125, 0.09326171875, 0.005954742431640625, 0.035400390625, 0.0179595947265625, 0.01690673828125, -0.0259246826171875, -0.0147552490234375, 0.051849365234375, 0.028900146484375, -0.0013074874877929688, -0.0290374755859375, 0.0011653900146484375, -0.044036865234375, 0.01727294921875, -0.01235198974609375, -0.0245513916015625, -0.023651123046875, -0.01151275634765625, 0.047607421875, 0.00861358642578125, -0.027587890625, 0.051300048828125, -0.01052093505859375, -0.0479736328125, -0.021881103515625, 0.009521484375, 0.0243377685546875, 0.02374267578125, 
-0.009063720703125, 0.0240631103515625, 0.0035266876220703125, -0.01354217529296875, 0.0211181640625, 0.036346435546875, -0.036346435546875, -0.0291900634765625, 0.09112548828125, 0.00572967529296875, -0.01605224609375, 0.056610107421875, -0.0173797607421875, -0.0178070068359375, 0.05316162109375, 0.03851318359375, 0.073974609375, -0.01277923583984375, 0.0273284912109375, 0.05352783203125, 0.005767822265625, -0.0206146240234375, 0.0074462890625, 0.00168609619140625, -0.0411376953125, -0.00894927978515625, -0.02874755859375, -0.016021728515625, 0.0006799697875976562, -0.0389404296875, 0.03485107421875, -0.0565185546875, -0.016632080078125, -0.016326904296875, -0.001956939697265625, -0.0275726318359375, 0.0216827392578125, 0.004230499267578125, 0.08392333984375, -0.08062744140625, 0.0693359375, 0.037872314453125, -0.0611572265625, -0.060211181640625, -0.00579071044921875, -0.005855560302734375, -0.037994384765625, 0.0187225341796875, 0.00363922119140625, -0.0022258758544921875, 0.0051116943359375, -0.0787353515625, -0.041778564453125, 0.1060791015625, 0.023773193359375, -0.0186614990234375, 0.014312744140625, -0.0121307373046875, 0.046295166015625, -0.027191162109375, 0.0338134765625, 0.004726409912109375, 0.0323486328125, 0.032318115234375, -0.0306549072265625, -0.00015437602996826172, -0.03253173828125, 0.00232696533203125, 0.00493621826171875, -0.0797119140625, 0.0595703125, -0.01131439208984375, -0.0301666259765625, 0.034149169921875, 0.047760009765625, 0.0184478759765625, 0.03387451171875, 0.03192138671875, 0.0870361328125, 0.021392822265625, -0.009521484375, 0.07574462890625, -0.01324462890625, 0.034332275390625, 0.05743408203125, 0.007080078125, 0.07012939453125, 0.0274200439453125, -0.01526641845703125, 0.048187255859375, 0.056610107421875, -0.00760650634765625, 0.040435791015625, 0.0024509429931640625, -0.01139068603515625, -0.028228759765625, -0.0007519721984863281, -0.037567138671875, -0.0099945068359375, 0.00931549072265625, -0.0192718505859375, 
-0.02874755859375, -0.00168609619140625, -0.0008330345153808594, 0.002750396728515625, -0.007221221923828125, 0.0460205078125, 0.01342010498046875, -0.00662994384765625, 0.039306640625, 0.00913238525390625, 0.061370849609375, -0.048583984375, -0.015838623046875, -0.0153656005859375, -0.00493621826171875, -0.01386260986328125, -0.05316162109375, 0.02655029296875, -0.0155181884765625, 0.006214141845703125, -0.0227813720703125, 0.035430908203125, -0.01435089111328125, -0.04107666015625, 0.026275634765625, 0.025726318359375, 0.024658203125, 0.0119781494140625, -0.08477783203125, 0.00266265869140625, -0.012237548828125, -0.025543212890625, 0.03717041015625, 0.00356292724609375, 0.0130157470703125, 0.06390380859375, 0.044891357421875, -0.005542755126953125, -0.0037593841552734375, -0.0011816024780273438, 0.06512451171875, -0.0247039794921875, -0.039642333984375, -0.050201416015625, 0.0640869140625, -0.0029544830322265625, -0.0217132568359375, 0.052093505859375, 0.045654296875, 0.056243896484375, -0.007503509521484375, 0.042388916015625, -0.0013093948364257812, 0.020416259765625, -0.0333251953125, 0.08203125, -0.09051513671875, 0.00019049644470214844, -0.036041259765625, -0.0599365234375, -0.017364501953125, 0.049774169921875, -0.0010614395141601562, 0.01776123046875, 0.0174102783203125, 0.05902099609375, -0.020751953125, 0.01267242431640625, 0.0178375244140625, 0.008544921875, 0.0283203125, 0.050689697265625, 0.043243408203125, -0.05731201171875, 0.007076263427734375, -0.039947509765625, -0.0166015625, 0.004566192626953125, -0.07305908203125, -0.059356689453125, -0.048583984375, -0.04962158203125, -0.0401611328125, -0.030364990234375, 0.035186767578125, 0.075439453125, -0.035919189453125, -0.01137542724609375, -0.03009033203125, 0.003612518310546875, -0.0126953125, -0.016632080078125, 0.0106201171875, 0.028411865234375, -0.060516357421875, 0.003131866455078125, 0.0034465789794921875, 0.05987548828125, -0.0301055908203125, -0.0240478515625, -0.0275115966796875, 
-0.0087432861328125, 0.0160980224609375, 0.025848388671875, -0.03741455078125, 0.00418853759765625, -0.0092620849609375, -0.0008873939514160156, 0.01557159423828125, 0.03155517578125, -0.040008544921875, 0.0244903564453125, 0.04583740234375, 0.025543212890625, 0.0548095703125, -0.0030574798583984375, 0.00836944580078125, -0.0158843994140625, 0.0229339599609375, 0.004741668701171875, 0.03143310546875, 0.00452423095703125, -0.03082275390625, 0.04327392578125, 0.063232421875, -0.045379638671875, -0.049957275390625, 0.0026416778564453125, -0.11627197265625, -0.01727294921875, 0.0767822265625, -0.01256561279296875, -0.033599853515625, 0.0070343017578125, -0.020294189453125, 0.0187225341796875, -0.03790283203125, 0.0156097412109375, 0.03363037109375, -0.005451202392578125, -0.0416259765625, -0.057159423828125, 0.0209197998046875, 0.006622314453125, -0.059417724609375, -0.0078887939453125, 0.039520263671875, 0.0582275390625, 0.0086212158203125, 0.0660400390625, -0.0302734375, 0.01386260986328125, -0.0038051605224609375, 0.0074462890625, -0.0103607177734375, -0.017547607421875, -0.036529541015625, 0.007457733154296875, -0.00970458984375, -0.005573272705078125 ] ]
sentence-transformers/multi-qa-mpnet-base-cos-v1
2023-11-02T09:30:23.000Z
[ "sentence-transformers", "pytorch", "mpnet", "feature-extraction", "sentence-similarity", "en", "endpoints_compatible", "has_space", "region:us" ]
sentence-similarity
sentence-transformers
null
null
sentence-transformers/multi-qa-mpnet-base-cos-v1
20
136,019
sentence-transformers
2022-03-02T23:29:05
--- language: - en pipeline_tag: sentence-similarity tags: - sentence-transformers - feature-extraction - sentence-similarity --- # multi-qa-mpnet-base-cos-v1 This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a 768-dimensional dense vector space and was designed for **semantic search**. It has been trained on 215M (question, answer) pairs from diverse sources. For an introduction to semantic search, have a look at: [SBERT.net - Semantic Search](https://www.sbert.net/examples/applications/semantic-search/README.html) ## Usage (Sentence-Transformers) Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed: ``` pip install -U sentence-transformers ``` Then you can use the model like this: ```python from sentence_transformers import SentenceTransformer, util query = "How many people live in London?" docs = ["Around 9 Million people live in London", "London is known for its financial district"] #Load the model model = SentenceTransformer('sentence-transformers/multi-qa-mpnet-base-cos-v1') #Encode query and documents query_emb = model.encode(query) doc_emb = model.encode(docs) #Compute dot score between query and all document embeddings scores = util.dot_score(query_emb, doc_emb)[0].cpu().tolist() #Combine docs & scores doc_score_pairs = list(zip(docs, scores)) #Sort by decreasing score doc_score_pairs = sorted(doc_score_pairs, key=lambda x: x[1], reverse=True) #Output passages & scores for doc, score in doc_score_pairs: print(score, doc) ``` ## Usage (HuggingFace Transformers) Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: First, pass your input through the transformer model, then apply the correct pooling operation on top of the contextualized word embeddings. 
```python from transformers import AutoTokenizer, AutoModel import torch import torch.nn.functional as F #Mean Pooling - Take average of all tokens def mean_pooling(model_output, attention_mask): token_embeddings = model_output.last_hidden_state #First element of model_output contains all token embeddings input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float() return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9) #Encode text def encode(texts): # Tokenize sentences encoded_input = tokenizer(texts, padding=True, truncation=True, return_tensors='pt') # Compute token embeddings with torch.no_grad(): model_output = model(**encoded_input, return_dict=True) # Perform pooling embeddings = mean_pooling(model_output, encoded_input['attention_mask']) # Normalize embeddings embeddings = F.normalize(embeddings, p=2, dim=1) return embeddings # Sentences we want sentence embeddings for query = "How many people live in London?" 
docs = ["Around 9 Million people live in London", "London is known for its financial district"] # Load model from HuggingFace Hub tokenizer = AutoTokenizer.from_pretrained("sentence-transformers/multi-qa-mpnet-base-cos-v1") model = AutoModel.from_pretrained("sentence-transformers/multi-qa-mpnet-base-cos-v1") #Encode query and docs query_emb = encode(query) doc_emb = encode(docs) #Compute dot score between query and all document embeddings scores = torch.mm(query_emb, doc_emb.transpose(0, 1))[0].cpu().tolist() #Combine docs & scores doc_score_pairs = list(zip(docs, scores)) #Sort by decreasing score doc_score_pairs = sorted(doc_score_pairs, key=lambda x: x[1], reverse=True) #Output passages & scores for doc, score in doc_score_pairs: print(score, doc) ``` ## Technical Details The following are some technical details on how this model should be used: | Setting | Value | | --- | :---: | | Dimensions | 768 | | Produces normalized embeddings | Yes | | Pooling-Method | Mean pooling | | Suitable score functions | dot-product (`util.dot_score`), cosine-similarity (`util.cos_sim`), or euclidean distance | Note: When loaded with `sentence-transformers`, this model produces normalized embeddings with length 1. In that case, dot-product and cosine-similarity are equivalent. Dot-product is preferred as it is faster. For normalized embeddings, Euclidean distance is a monotonic function of dot-product and can also be used. ---- ## Background The project aims to train sentence embedding models on very large sentence-level datasets using a self-supervised contrastive learning objective. We use a contrastive learning objective: given a sentence from the pair, the model should predict which out of a set of randomly sampled other sentences was actually paired with it in our dataset. We developed this model during the [Community week using JAX/Flax for NLP & CV](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104), organized by Hugging Face. 
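The note above can be sanity-checked without downloading the model: once embeddings are L2-normalized, dot-product and cosine-similarity yield identical scores. A minimal sketch with NumPy, where random vectors stand in for real sentence embeddings:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random vectors standing in for 768-dimensional sentence embeddings
emb = rng.standard_normal((4, 768))

# L2-normalize, as sentence-transformers does for this model
emb /= np.linalg.norm(emb, axis=1, keepdims=True)

# Dot-product scores between all pairs
dot = emb @ emb.T

# Cosine similarity computed explicitly
norms = np.linalg.norm(emb, axis=1)
cos = (emb @ emb.T) / np.outer(norms, norms)

# For unit-length vectors the two score functions coincide
assert np.allclose(dot, cos)
```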
We developed this model as part of the project: [Train the Best Sentence Embedding Model Ever with 1B Training Pairs](https://discuss.huggingface.co/t/train-the-best-sentence-embedding-model-ever-with-1b-training-pairs/7354). We benefited from efficient hardware infrastructure to run the project: 7 TPUs v3-8, as well as advice from Google's Flax, JAX, and Cloud team members about efficient deep learning frameworks. ## Intended uses Our model is intended to be used for semantic search: It encodes queries / questions and text paragraphs in a dense vector space and finds relevant documents for a given query. Note that there is a limit of 512 word pieces: text longer than that will be truncated. Further note that the model was only trained on input text up to 250 word pieces; it might not work well for longer text. ## Training procedure The full training script is accessible in this current repository: `train_script.py`. ### Pre-training We use the pretrained [`mpnet-base`](https://huggingface.co/microsoft/mpnet-base) model. Please refer to the model card for more detailed information about the pre-training procedure. ### Training We use the concatenation of multiple datasets to fine-tune our model. In total we have about 215M (question, answer) pairs. We sampled each dataset with a weighted probability; the configuration is detailed in the `data_config.json` file. The model was trained with [MultipleNegativesRankingLoss](https://www.sbert.net/docs/package_reference/losses.html#multiplenegativesrankingloss) using mean pooling, cosine-similarity as the similarity function, and a scale of 20. 
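The MultipleNegativesRankingLoss objective mentioned above can be illustrated in a few lines of NumPy: within a batch, each question's paired answer is the positive and all other answers act as in-batch negatives, and the scaled similarity matrix goes through a softmax cross-entropy with the diagonal as targets. This is an illustrative sketch with random toy embeddings, not the actual training code in `train_script.py`:

```python
import numpy as np

rng = np.random.default_rng(0)
batch, dim, scale = 8, 768, 20.0  # scale of 20 as stated in the card

# Toy (question, answer) embedding pairs, L2-normalized like the model's output
q = rng.standard_normal((batch, dim))
a = rng.standard_normal((batch, dim))
q /= np.linalg.norm(q, axis=1, keepdims=True)
a /= np.linalg.norm(a, axis=1, keepdims=True)

# Scaled cosine similarity of every question against every answer in the batch
scores = scale * (q @ a.T)

# Softmax cross-entropy where the matching answer (the diagonal) is the target
log_probs = scores - np.log(np.exp(scores).sum(axis=1, keepdims=True))
loss = -np.mean(np.diag(log_probs))
```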
| Dataset | Number of training tuples | |--------------------------------------------------------|:--------------------------:| | [WikiAnswers](https://github.com/afader/oqa#wikianswers-corpus) Duplicate question pairs from WikiAnswers | 77,427,422 | | [PAQ](https://github.com/facebookresearch/PAQ) Automatically generated (Question, Paragraph) pairs for each paragraph in Wikipedia | 64,371,441 | | [Stack Exchange](https://huggingface.co/datasets/flax-sentence-embeddings/stackexchange_xml) (Title, Body) pairs from all StackExchanges | 25,316,456 | | [Stack Exchange](https://huggingface.co/datasets/flax-sentence-embeddings/stackexchange_xml) (Title, Answer) pairs from all StackExchanges | 21,396,559 | | [MS MARCO](https://microsoft.github.io/msmarco/) Triplets (query, answer, hard_negative) for 500k queries from Bing search engine | 17,579,773 | | [GOOAQ: Open Question Answering with Diverse Answer Types](https://github.com/allenai/gooaq) (query, answer) pairs for 3M Google queries and Google featured snippet | 3,012,496 | | [Amazon-QA](http://jmcauley.ucsd.edu/data/amazon/qa/) (Question, Answer) pairs from Amazon product pages | 2,448,839 | | [Yahoo Answers](https://www.kaggle.com/soumikrakshit/yahoo-answers-dataset) (Title, Answer) pairs from Yahoo Answers | 1,198,260 | | [Yahoo Answers](https://www.kaggle.com/soumikrakshit/yahoo-answers-dataset) (Question, Answer) pairs from Yahoo Answers | 681,164 | | [Yahoo Answers](https://www.kaggle.com/soumikrakshit/yahoo-answers-dataset) (Title, Question) pairs from Yahoo Answers | 659,896 | | [SearchQA](https://huggingface.co/datasets/search_qa) (Question, Answer) pairs for 140k questions, each with Top5 Google snippets on that question | 582,261 | | [ELI5](https://huggingface.co/datasets/eli5) (Question, Answer) pairs from Reddit ELI5 (explainlikeimfive) | 325,475 | | [Stack Exchange](https://huggingface.co/datasets/flax-sentence-embeddings/stackexchange_xml) Duplicate questions pairs (titles) | 304,525 | | [Quora Question 
Triplets](https://quoradata.quora.com/First-Quora-Dataset-Release-Question-Pairs) (Question, Duplicate_Question, Hard_Negative) triplets for Quora Questions Pairs dataset | 103,663 | | [Natural Questions (NQ)](https://ai.google.com/research/NaturalQuestions) (Question, Paragraph) pairs for 100k real Google queries with relevant Wikipedia paragraph | 100,231 | | [SQuAD2.0](https://rajpurkar.github.io/SQuAD-explorer/) (Question, Paragraph) pairs from SQuAD2.0 dataset | 87,599 | | [TriviaQA](https://huggingface.co/datasets/trivia_qa) (Question, Evidence) pairs | 73,346 | | **Total** | **214,988,242** |
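The weighted dataset sampling described in the training procedure can be sketched in plain Python. The names and weights below are purely illustrative — the real configuration lives in `data_config.json`:

```python
import random

# Hypothetical sampling weights; the actual values are in data_config.json
weights_by_dataset = {
    "wikianswers": 0.35,
    "paq": 0.30,
    "stackexchange": 0.20,
    "msmarco": 0.15,
}

random.seed(0)
names = list(weights_by_dataset)
weights = [weights_by_dataset[n] for n in names]

# Each draw decides which dataset the next training example comes from
draws = random.choices(names, weights=weights, k=10_000)

# Empirical frequencies approximate the configured weights
freq = {n: draws.count(n) / len(draws) for n in names}
```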
9,203
[ [ -0.03302001953125, -0.056243896484375, 0.03118896484375, 0.0164337158203125, -0.0093841552734375, -0.0253753662109375, -0.007450103759765625, -0.00904083251953125, 0.0207061767578125, 0.023956298828125, -0.036224365234375, -0.042816162109375, -0.04913330078125, 0.01042938232421875, -0.0266876220703125, 0.0665283203125, 0.0008392333984375, 0.0089874267578125, -0.0275421142578125, -0.0236053466796875, -0.0132293701171875, -0.035186767578125, -0.035675048828125, -0.007282257080078125, 0.033538818359375, 0.0222625732421875, 0.04022216796875, 0.029052734375, 0.0311431884765625, 0.0287628173828125, -0.0013132095336914062, 0.0214385986328125, -0.0362548828125, -0.0083770751953125, -0.00331878662109375, -0.0313720703125, -0.0065155029296875, 0.0258636474609375, 0.03802490234375, 0.034454345703125, -0.0030307769775390625, 0.01434326171875, 0.0018339157104492188, 0.037567138671875, -0.034820556640625, 0.0174560546875, -0.036468505859375, 0.01293182373046875, 0.006931304931640625, -0.01042938232421875, -0.025634765625, -0.01479339599609375, 0.0228424072265625, -0.046630859375, 0.0119171142578125, 0.01076507568359375, 0.085205078125, 0.013031005859375, -0.03668212890625, -0.03167724609375, -0.00908660888671875, 0.06170654296875, -0.0609130859375, 0.0254974365234375, 0.0285491943359375, -0.0060272216796875, -0.0034503936767578125, -0.06243896484375, -0.059295654296875, -0.01285552978515625, -0.026885986328125, 0.0239410400390625, -0.01416778564453125, -0.0032329559326171875, 0.0166778564453125, 0.0290069580078125, -0.05657958984375, 0.0009722709655761719, -0.03363037109375, -0.01450347900390625, 0.053253173828125, 0.00870513916015625, 0.02276611328125, -0.047332763671875, -0.03759765625, -0.02337646484375, -0.0223236083984375, 0.01016998291015625, 0.0242767333984375, 0.005962371826171875, -0.0170745849609375, 0.06304931640625, -0.0187835693359375, 0.0484619140625, 0.001220703125, -0.0014047622680664062, 0.039642333984375, -0.038604736328125, -0.0141143798828125, 
-0.0237884521484375, 0.0831298828125, 0.033050537109375, 0.016876220703125, -0.006015777587890625, -0.004299163818359375, -0.0015058517456054688, 0.00901031494140625, -0.06536865234375, -0.0235443115234375, 0.0270843505859375, -0.035003662109375, -0.023773193359375, 0.01479339599609375, -0.04656982421875, -0.0169525146484375, -0.012054443359375, 0.0567626953125, -0.051544189453125, -0.0016679763793945312, 0.0255126953125, -0.0277252197265625, 0.02783203125, -0.0036182403564453125, -0.04241943359375, 0.0092315673828125, 0.03399658203125, 0.06536865234375, 0.0039825439453125, -0.0352783203125, -0.0247039794921875, -0.0167694091796875, 0.0012922286987304688, 0.0460205078125, -0.0307159423828125, -0.0084228515625, -0.004276275634765625, 0.009674072265625, -0.0296783447265625, -0.0272674560546875, 0.04132080078125, -0.0292816162109375, 0.058807373046875, -0.00872802734375, -0.06951904296875, -0.01454925537109375, 0.0285491943359375, -0.03924560546875, 0.08135986328125, 0.017822265625, -0.0799560546875, 0.0007448196411132812, -0.05181884765625, -0.0127716064453125, -0.0210418701171875, -0.006175994873046875, -0.042205810546875, 0.0014867782592773438, 0.031494140625, 0.049652099609375, -0.01096343994140625, 0.004535675048828125, -0.0185546875, -0.0330810546875, 0.0260162353515625, -0.0160980224609375, 0.080078125, 0.00649261474609375, -0.028289794921875, -0.00377655029296875, -0.054534912109375, -0.0016393661499023438, 0.0222015380859375, -0.02557373046875, -0.01143646240234375, -0.011932373046875, 0.00696563720703125, 0.041229248046875, 0.019500732421875, -0.054962158203125, 0.0169677734375, -0.045989990234375, 0.050048828125, 0.04864501953125, 0.003505706787109375, 0.0323486328125, -0.03558349609375, 0.0293426513671875, 0.0222015380859375, 0.006137847900390625, -0.007808685302734375, -0.038665771484375, -0.0716552734375, -0.005702972412109375, 0.0280609130859375, 0.0457763671875, -0.05523681640625, 0.05401611328125, -0.0255279541015625, -0.0372314453125, 
-0.064697265625, 0.00502777099609375, 0.022918701171875, 0.044891357421875, 0.04754638671875, -0.00859832763671875, -0.0299072265625, -0.07598876953125, -0.0037708282470703125, 0.0033512115478515625, -0.004638671875, 0.0330810546875, 0.05084228515625, -0.0160980224609375, 0.061737060546875, -0.05743408203125, -0.03399658203125, -0.00630950927734375, 0.00879669189453125, 0.017303466796875, 0.04058837890625, 0.037750244140625, -0.06610107421875, -0.03594970703125, -0.03680419921875, -0.052459716796875, 0.006366729736328125, -0.01123809814453125, -0.01751708984375, 0.0149078369140625, 0.046173095703125, -0.056640625, 0.01947021484375, 0.0411376953125, -0.04266357421875, 0.02996826171875, -0.0171966552734375, -0.000995635986328125, -0.10772705078125, 0.0079803466796875, 0.004589080810546875, -0.007350921630859375, -0.0219573974609375, 0.01169586181640625, -0.0012235641479492188, -0.01363372802734375, -0.035736083984375, 0.03204345703125, -0.03765869140625, 0.01006317138671875, 0.01055145263671875, 0.039398193359375, 0.0187835693359375, 0.0572509765625, -0.0121002197265625, 0.058013916015625, 0.0318603515625, -0.035125732421875, 0.029327392578125, 0.04217529296875, -0.027587890625, 0.0284576416015625, -0.056915283203125, 0.00614166259765625, -0.01161956787109375, 0.019012451171875, -0.086181640625, -0.0006308555603027344, 0.0147552490234375, -0.051177978515625, 0.00948333740234375, 0.013641357421875, -0.046234130859375, -0.03717041015625, -0.043060302734375, 0.00726318359375, 0.028289794921875, -0.03289794921875, 0.036346435546875, 0.0248870849609375, 0.0051727294921875, -0.051849365234375, -0.06976318359375, -0.00806427001953125, -0.005123138427734375, -0.06317138671875, 0.036712646484375, -0.013916015625, 0.01236724853515625, 0.019317626953125, 0.01123809814453125, 0.003940582275390625, 0.0035877227783203125, 0.0101470947265625, 0.00954437255859375, -0.01326751708984375, 0.030120849609375, -0.00989532470703125, -0.0090789794921875, 0.0021686553955078125, 
-0.023773193359375, 0.05584716796875, -0.02783203125, -0.0135650634765625, -0.0285491943359375, 0.032684326171875, 0.032958984375, -0.024871826171875, 0.08526611328125, 0.0767822265625, -0.0164794921875, -0.001068115234375, -0.04534912109375, -0.012298583984375, -0.03594970703125, 0.038360595703125, -0.0269012451171875, -0.07916259765625, 0.0309295654296875, 0.018585205078125, -0.0048828125, 0.060882568359375, 0.03271484375, -0.02056884765625, 0.07037353515625, 0.0301666259765625, -0.00634002685546875, 0.0300750732421875, -0.049835205078125, 0.01873779296875, -0.061126708984375, -0.0187225341796875, -0.0262908935546875, -0.0278167724609375, -0.07489013671875, -0.035186767578125, 0.027923583984375, 0.0046844482421875, -0.01544189453125, 0.030029296875, -0.04901123046875, 0.016632080078125, 0.057373046875, 0.026458740234375, -0.00901031494140625, -0.00263214111328125, -0.0281524658203125, -0.0052337646484375, -0.06085205078125, -0.0226898193359375, 0.08758544921875, 0.0251922607421875, 0.029449462890625, 0.0017328262329101562, 0.0654296875, 0.005035400390625, -0.0055694580078125, -0.050933837890625, 0.04266357421875, -0.022979736328125, -0.032745361328125, -0.021240234375, -0.04486083984375, -0.0718994140625, 0.0223236083984375, -0.03167724609375, -0.038360595703125, 0.006778717041015625, -0.015655517578125, -0.0257720947265625, 0.017547607421875, -0.06658935546875, 0.08111572265625, -0.0013818740844726562, -0.024017333984375, -0.01377105712890625, -0.05682373046875, 0.01261138916015625, 0.019317626953125, -0.0008893013000488281, -0.007110595703125, -0.010650634765625, 0.06512451171875, -0.030517578125, 0.04449462890625, -0.00989532470703125, 0.019256591796875, 0.0231475830078125, -0.02410888671875, 0.0240020751953125, -0.00299072265625, -0.00762939453125, -0.00583648681640625, 0.0063934326171875, -0.049407958984375, -0.04241943359375, 0.058380126953125, -0.08074951171875, -0.035858154296875, -0.038818359375, -0.03717041015625, -0.01213836669921875, 
0.01148223876953125, 0.0294036865234375, 0.03875732421875, -0.00274658203125, 0.037933349609375, 0.0499267578125, -0.037628173828125, 0.042022705078125, 0.023284912109375, -0.0022411346435546875, -0.04229736328125, 0.06494140625, 0.01418304443359375, 0.004795074462890625, 0.047821044921875, 0.02264404296875, -0.032470703125, -0.0297393798828125, -0.01117706298828125, 0.02752685546875, -0.047088623046875, -0.01947021484375, -0.07769775390625, -0.033538818359375, -0.054901123046875, -0.0001316070556640625, -0.01235198974609375, -0.037811279296875, -0.03582763671875, -0.0180206298828125, 0.0246429443359375, 0.041046142578125, -0.0000045299530029296875, 0.01434326171875, -0.04876708984375, 0.0183563232421875, 0.0284881591796875, 0.01922607421875, -0.00908660888671875, -0.04290771484375, -0.0170745849609375, 0.00046515464782714844, -0.0294189453125, -0.06634521484375, 0.034454345703125, 0.01531982421875, 0.041473388671875, 0.01523590087890625, 0.00859832763671875, 0.040740966796875, -0.021697998046875, 0.07171630859375, 0.004962921142578125, -0.058197021484375, 0.03643798828125, -0.01611328125, 0.034393310546875, 0.051177978515625, 0.047393798828125, -0.03948974609375, -0.02117919921875, -0.057891845703125, -0.0711669921875, 0.046417236328125, 0.0281524658203125, 0.0204620361328125, -0.01519775390625, 0.0231170654296875, -0.0133514404296875, 0.011016845703125, -0.06854248046875, -0.03118896484375, -0.0184326171875, -0.034149169921875, -0.0188140869140625, -0.0223388671875, 0.003665924072265625, -0.037994384765625, 0.06512451171875, -0.006317138671875, 0.045501708984375, 0.037567138671875, -0.025604248046875, 0.03289794921875, 0.0128936767578125, 0.043670654296875, 0.03271484375, -0.01727294921875, 0.00658416748046875, 0.01529693603515625, -0.028411865234375, -0.004055023193359375, 0.035247802734375, -0.0068359375, 0.0009565353393554688, 0.024871826171875, 0.058074951171875, 0.0189666748046875, -0.04058837890625, 0.06591796875, -0.01535797119140625, -0.0162353515625, 
-0.030853271484375, -0.00945281982421875, 0.02734375, 0.016998291015625, 0.0147552490234375, -0.007354736328125, 0.00861358642578125, -0.035003662109375, 0.033660888671875, 0.01806640625, -0.0305328369140625, -0.0012969970703125, 0.03472900390625, 0.01180267333984375, -0.01180267333984375, 0.06890869140625, -0.02789306640625, -0.050048828125, 0.0389404296875, 0.0309295654296875, 0.058990478515625, 0.005908966064453125, 0.0284423828125, 0.045135498046875, 0.0234375, 0.01404571533203125, 0.01515960693359375, 0.004528045654296875, -0.05865478515625, -0.012786865234375, -0.062164306640625, -0.007122039794921875, 0.006977081298828125, -0.0355224609375, 0.01739501953125, -0.0178375244140625, -0.0021877288818359375, 0.0034084320068359375, 0.0283355712890625, -0.0653076171875, 0.01213836669921875, 0.005184173583984375, 0.07635498046875, -0.06304931640625, 0.053375244140625, 0.051544189453125, -0.06634521484375, -0.06646728515625, -0.005207061767578125, -0.0176849365234375, -0.06072998046875, 0.030364990234375, 0.037445068359375, 0.0144805908203125, 0.00833892822265625, -0.03692626953125, -0.06268310546875, 0.10394287109375, 0.0169677734375, -0.0256805419921875, -0.01727294921875, 0.0161590576171875, 0.042816162109375, -0.0323486328125, 0.037811279296875, 0.02728271484375, 0.025054931640625, -0.00986480712890625, -0.05047607421875, 0.0077667236328125, -0.032623291015625, -0.00395965576171875, -0.00821685791015625, -0.06317138671875, 0.07342529296875, -0.0009713172912597656, -0.0065155029296875, 0.0011529922485351562, 0.048004150390625, 0.0187225341796875, 0.00878143310546875, 0.035400390625, 0.073974609375, 0.055023193359375, -0.005046844482421875, 0.09002685546875, -0.02923583984375, 0.042449951171875, 0.0751953125, 0.0194091796875, 0.0784912109375, 0.030426025390625, -0.01407623291015625, 0.053070068359375, 0.049591064453125, -0.01328277587890625, 0.034088134765625, 0.0197906494140625, -0.0029430389404296875, -0.01161956787109375, -0.001438140869140625, 
-0.0258331298828125, 0.0491943359375, 0.0097503662109375, -0.046875, -0.007648468017578125, 0.0029754638671875, 0.0082855224609375, 0.00594329833984375, -0.0029087066650390625, 0.05572509765625, 0.01088714599609375, -0.046173095703125, 0.03717041015625, 0.00908660888671875, 0.0721435546875, -0.038787841796875, 0.01325225830078125, -0.018402099609375, 0.0198516845703125, -0.009735107421875, -0.0614013671875, 0.020294189453125, -0.0275421142578125, -0.01523590087890625, -0.0171051025390625, 0.048675537109375, -0.047332763671875, -0.044403076171875, 0.0350341796875, 0.04168701171875, 0.007686614990234375, -0.01183319091796875, -0.08331298828125, -0.0183563232421875, 0.0004260540008544922, -0.03680419921875, 0.0196380615234375, 0.028594970703125, 0.030609130859375, 0.036407470703125, 0.03985595703125, -0.0164794921875, 0.012786865234375, -0.0025177001953125, 0.06573486328125, -0.053558349609375, -0.040283203125, -0.054595947265625, 0.042877197265625, -0.0268707275390625, -0.041015625, 0.060302734375, 0.051849365234375, 0.07171630859375, -0.00449371337890625, 0.03814697265625, -0.00852203369140625, 0.0260162353515625, -0.041473388671875, 0.05926513671875, -0.051544189453125, -0.00138092041015625, -0.0096435546875, -0.062469482421875, -0.0086212158203125, 0.05084228515625, -0.0259246826171875, 0.00220489501953125, 0.057373046875, 0.0718994140625, -0.0081024169921875, -0.0070648193359375, 0.012786865234375, 0.0268096923828125, 0.00945281982421875, 0.059051513671875, 0.038970947265625, -0.0675048828125, 0.058685302734375, -0.03790283203125, 0.0003266334533691406, -0.010650634765625, -0.053253173828125, -0.06658935546875, -0.06549072265625, -0.02764892578125, -0.03955078125, -0.006160736083984375, 0.0711669921875, 0.050750732421875, -0.060577392578125, -0.0007357597351074219, -0.00908660888671875, 0.002864837646484375, -0.0008707046508789062, -0.0253448486328125, 0.042633056640625, -0.0281524658203125, -0.04644775390625, 0.01497650146484375, -0.0030422210693359375, 
-0.003528594970703125, -0.0243377685546875, 0.0023593902587890625, -0.06640625, 0.01030731201171875, 0.049560546875, -0.004909515380859375, -0.04644775390625, -0.028533935546875, 0.01515960693359375, -0.032257080078125, 0.00698089599609375, 0.0277252197265625, -0.053131103515625, 0.0273284912109375, 0.053131103515625, 0.049835205078125, 0.057220458984375, -0.0035610198974609375, 0.031646728515625, -0.06201171875, 0.004154205322265625, 0.017822265625, 0.035125732421875, 0.037109375, -0.0241546630859375, 0.052032470703125, 0.0299072265625, -0.043731689453125, -0.03924560546875, -0.005458831787109375, -0.08294677734375, -0.028564453125, 0.0902099609375, -0.0247039794921875, -0.0188140869140625, 0.0118865966796875, -0.017303466796875, 0.023773193359375, -0.02294921875, 0.04998779296875, 0.060333251953125, -0.007015228271484375, -0.0220184326171875, -0.0258636474609375, 0.0255126953125, 0.03985595703125, -0.05078125, -0.037017822265625, 0.0229644775390625, 0.037811279296875, 0.0299224853515625, 0.040557861328125, -0.00499725341796875, 0.006465911865234375, 0.0081787109375, -0.004566192626953125, -0.018951416015625, 0.00318145751953125, -0.0279998779296875, 0.034912109375, -0.0292816162109375, -0.02923583984375 ] ]
prompthero/openjourney-v4
2023-05-15T22:41:59.000Z
[ "diffusers", "stable-diffusion", "text-to-image", "license:creativeml-openrail-m", "endpoints_compatible", "has_space", "diffusers:StableDiffusionPipeline", "region:us" ]
text-to-image
prompthero
null
null
prompthero/openjourney-v4
1,160
135,959
diffusers
2022-12-11T17:37:55
--- license: creativeml-openrail-m tags: - stable-diffusion - text-to-image pinned: true --- # <u>Openjourney v4</u> ## Trained on +124k Midjourney v4 images, by [PromptHero](https://prompthero.com/?utm_source=huggingface&utm_medium=referral) Fine-tuned on Stable Diffusion v1.5 using +124,000 images, 12,400 steps, 4 epochs, and +32 training hours. 💡 [Openjourney-v4 prompts](https://prompthero.com/openjourney-prompts?version=4) Psst... "mdjrny-v4 style" is not necessary anymore (yay!) 🎓 **Want to learn how to train Openjourney? 👉🏼 __[Join our course](https://prompthero.com/academy/dreambooth-stable-diffusion-train-fine-tune-course?utm_source=huggingface&utm_medium=referral)__ 🔥** <img src="https://s3.us-east-1.amazonaws.com/prompthero-newsletter/Group-66.png" alt="openjourney-v4" width="50%"> # Openjourney Links - [Lora version](https://huggingface.co/prompthero/openjourney-lora) - [Openjourney Dreambooth](https://huggingface.co/prompthero/openjourney)
966
[ [ -0.035003662109375, -0.029632568359375, 0.0293121337890625, 0.0191650390625, -0.040985107421875, -0.0191192626953125, 0.00313568115234375, -0.011077880859375, 0.041961669921875, 0.036468505859375, -0.05828857421875, -0.047515869140625, -0.0292816162109375, -0.0086212158203125, 0.0219268798828125, 0.06243896484375, -0.0421142578125, 0.02337646484375, -0.0024471282958984375, -0.0234222412109375, -0.03955078125, -0.0018644332885742188, -0.07061767578125, -0.031707763671875, 0.04229736328125, 0.03106689453125, 0.024383544921875, 0.0167388916015625, 0.015533447265625, 0.0254974365234375, -0.00981903076171875, -0.0026073455810546875, -0.05230712890625, 0.01617431640625, -0.0049896240234375, -0.026763916015625, -0.05010986328125, 0.01910400390625, 0.06689453125, 0.03289794921875, -0.01155853271484375, 0.02581787109375, 0.005126953125, 0.06756591796875, -0.041412353515625, 0.01465606689453125, -0.0133819580078125, 0.033447265625, -0.032806396484375, 0.0033473968505859375, -0.01276397705078125, -0.0333251953125, -0.0155029296875, -0.06243896484375, 0.0035247802734375, 0.0033206939697265625, 0.06658935546875, -0.0029888153076171875, -0.017059326171875, 0.0253448486328125, -0.0241546630859375, 0.04425048828125, -0.0225830078125, 0.0235443115234375, 0.0265655517578125, 0.046478271484375, -0.006916046142578125, -0.0518798828125, -0.022613525390625, 0.0311126708984375, 0.033721923828125, 0.025482177734375, -0.01103973388671875, -0.013885498046875, 0.0277557373046875, 0.0170745849609375, -0.040985107421875, 0.0035381317138671875, -0.04296875, -0.0164794921875, 0.023406982421875, 0.012542724609375, 0.00678253173828125, 0.0212554931640625, -0.043853759765625, -0.01068878173828125, -0.04437255859375, 0.015655517578125, 0.037322998046875, -0.005680084228515625, -0.041534423828125, 0.03863525390625, -0.02496337890625, 0.03546142578125, 0.01241302490234375, -0.016326904296875, 0.064453125, -0.0272979736328125, -0.0204925537109375, -0.01287841796875, 0.064208984375, 
0.038848876953125, -0.009185791015625, 0.0006685256958007812, 0.00357818603515625, -0.0273590087890625, 0.002410888671875, -0.083740234375, -0.013214111328125, -0.0018396377563476562, -0.04119873046875, -0.0265045166015625, 0.004634857177734375, -0.0684814453125, 0.005794525146484375, 0.003246307373046875, 0.025299072265625, -0.037872314453125, -0.04071044921875, 0.016448974609375, -0.01641845703125, 0.016021728515625, 0.0487060546875, -0.03662109375, 0.00836944580078125, 0.0036182403564453125, 0.0743408203125, -0.0008225440979003906, -0.0226593017578125, -0.0198974609375, -0.0021038055419921875, -0.0220489501953125, 0.03619384765625, -0.0214385986328125, -0.0277099609375, -0.006275177001953125, 0.01163482666015625, -0.0167388916015625, -0.0288848876953125, 0.061248779296875, -0.047607421875, 0.028839111328125, -0.031005859375, -0.01064300537109375, -0.019927978515625, 0.0121917724609375, -0.036285400390625, 0.0511474609375, 0.0236358642578125, -0.064697265625, 0.0157470703125, -0.07568359375, 0.01181793212890625, -0.00859832763671875, 0.0110626220703125, -0.0284423828125, -0.0289764404296875, 0.004680633544921875, 0.0307159423828125, 0.0182342529296875, -0.032196044921875, -0.037445068359375, -0.01515960693359375, 0.007293701171875, -0.0281982421875, 0.0771484375, 0.0274505615234375, -0.00295257568359375, 0.01039886474609375, -0.06121826171875, -0.00609588623046875, 0.03216552734375, -0.0275115966796875, -0.0207977294921875, -0.05108642578125, -0.01311492919921875, 0.0204010009765625, 0.037994384765625, -0.053955078125, 0.036956787109375, -0.027618408203125, 0.003932952880859375, 0.079833984375, 0.0196685791015625, 0.0323486328125, -0.0270233154296875, 0.0653076171875, 0.045654296875, 0.0183868408203125, -0.0184478759765625, -0.06231689453125, -0.04571533203125, -0.004085540771484375, 0.003353118896484375, 0.01038360595703125, -0.06195068359375, 0.0263824462890625, 0.0013027191162109375, -0.052642822265625, -0.036773681640625, -0.00809478759765625, 
0.024383544921875, 0.055389404296875, 0.034912109375, -0.042449951171875, -0.03466796875, -0.0264739990234375, 0.009490966796875, 0.01114654541015625, 0.01442718505859375, -0.0014438629150390625, 0.03472900390625, -0.0238494873046875, 0.0380859375, -0.046844482421875, -0.00861358642578125, 0.0104217529296875, -0.00577545166015625, 0.037872314453125, 0.034271240234375, 0.066162109375, -0.036468505859375, -0.041534423828125, -0.026947021484375, -0.058563232421875, 0.0025539398193359375, 0.0158538818359375, -0.05877685546875, 0.0241241455078125, 0.01336669921875, -0.056182861328125, 0.03228759765625, 0.035919189453125, -0.057952880859375, 0.04327392578125, -0.0126495361328125, 0.02716064453125, -0.08038330078125, -0.007171630859375, 0.0117645263671875, -0.03131103515625, -0.0106658935546875, 0.011016845703125, -0.01043701171875, -0.0022602081298828125, -0.053131103515625, 0.056915283203125, -0.048675537109375, 0.018096923828125, -0.0011777877807617188, 0.0057830810546875, 0.004665374755859375, 0.03179931640625, -0.0196380615234375, 0.0260467529296875, 0.0782470703125, -0.0292816162109375, 0.0247802734375, 0.05157470703125, -0.0156097412109375, 0.0465087890625, -0.023284912109375, 0.017578125, -0.00650787353515625, 0.035003662109375, -0.07183837890625, -0.0160064697265625, 0.0645751953125, -0.02203369140625, 0.039642333984375, -0.022125244140625, -0.026214599609375, -0.037689208984375, -0.043853759765625, 0.0435791015625, 0.0650634765625, -0.048553466796875, 0.003997802734375, 0.018646240234375, 0.0168914794921875, -0.055328369140625, -0.02618408203125, -0.0207672119140625, -0.01995849609375, -0.049041748046875, 0.0088653564453125, -0.0157470703125, 0.005298614501953125, -0.0118408203125, -0.01082611083984375, -0.0001233816146850586, -0.015899658203125, 0.045166015625, 0.0218505859375, -0.028900146484375, -0.030426025390625, 0.01139068603515625, -0.020965576171875, 0.008880615234375, -0.0191192626953125, 0.04864501953125, -0.00902557373046875, -0.0301666259765625, 
-0.056488037109375, -0.0019931793212890625, 0.033660888671875, 0.005939483642578125, 0.067138671875, 0.07415771484375, -0.01873779296875, 0.01432037353515625, -0.0220947265625, -0.0013561248779296875, -0.039947509765625, -0.0078277587890625, -0.0355224609375, -0.054534912109375, 0.0265045166015625, -0.0034694671630859375, -0.0028438568115234375, 0.056915283203125, 0.04058837890625, -0.01526641845703125, 0.0650634765625, 0.0186004638671875, -0.003894805908203125, 0.037811279296875, -0.055999755859375, -0.01322174072265625, -0.07757568359375, -0.041595458984375, -0.0173797607421875, -0.035675048828125, -0.020599365234375, -0.043609619140625, 0.0298309326171875, 0.0263519287109375, -0.055328369140625, 0.0213165283203125, -0.0281982421875, 0.026275634765625, 0.0256805419921875, 0.02764892578125, 0.01177978515625, -0.0017690658569335938, -0.0194244384765625, -0.01041412353515625, -0.04949951171875, -0.022796630859375, 0.08148193359375, 0.04119873046875, 0.0635986328125, 0.007259368896484375, 0.0660400390625, -0.01377105712890625, -0.00708770751953125, -0.02764892578125, 0.065185546875, 0.032745361328125, -0.020751953125, -0.01824951171875, -0.032379150390625, -0.09521484375, 0.01617431640625, -0.0290985107421875, -0.03131103515625, 0.00502777099609375, 0.01555633544921875, -0.0018548965454101562, 0.037933349609375, -0.036102294921875, 0.06854248046875, 0.004360198974609375, -0.0292510986328125, 0.00017845630645751953, -0.05816650390625, 0.023529052734375, 0.004673004150390625, 0.0049591064453125, -0.01039886474609375, -0.01404571533203125, 0.061431884765625, -0.045166015625, 0.061798095703125, -0.0526123046875, 0.01122283935546875, 0.022186279296875, -0.0125732421875, 0.0183868408203125, 0.00392913818359375, -0.0022640228271484375, 0.0264434814453125, -0.016265869140625, -0.04302978515625, -0.0041656494140625, 0.067626953125, -0.07537841796875, -0.00792694091796875, -0.048126220703125, -0.01177978515625, 0.00010353326797485352, 0.0296630859375, 0.05999755859375, 
0.02288818359375, -0.0191802978515625, -0.007415771484375, 0.061737060546875, 0.00868988037109375, 0.04266357421875, 0.01751708984375, -0.034759521484375, -0.0364990234375, 0.049774169921875, 0.01119232177734375, 0.00760650634765625, 0.006961822509765625, 0.0207061767578125, -0.0272216796875, -0.0560302734375, -0.042022705078125, 0.040740966796875, -0.043670654296875, -0.0217132568359375, -0.0458984375, -0.0155792236328125, -0.054595947265625, 0.0018215179443359375, -0.02685546875, -0.035247802734375, -0.0682373046875, -0.01050567626953125, 0.047760009765625, 0.0374755859375, -0.0240631103515625, 0.0190277099609375, -0.055694580078125, 0.0236968994140625, 0.0034637451171875, 0.0228424072265625, -0.008270263671875, -0.06298828125, -0.034912109375, 0.00921630859375, -0.056976318359375, -0.06561279296875, 0.01702880859375, 0.003223419189453125, 0.0518798828125, 0.058837890625, -0.0033779144287109375, 0.05242919921875, -0.048614501953125, 0.068359375, 0.06256103515625, -0.02850341796875, 0.051666259765625, -0.044189453125, 0.021026611328125, 0.04473876953125, 0.0589599609375, -0.0110931396484375, -0.006038665771484375, -0.06414794921875, -0.06402587890625, 0.04656982421875, 0.0163421630859375, 0.00780487060546875, 0.0230255126953125, 0.035888671875, 0.010711669921875, 0.0108642578125, -0.06640625, -0.039031982421875, -0.0338134765625, -0.002040863037109375, 0.005138397216796875, -0.006641387939453125, -0.027008056640625, -0.03558349609375, 0.07733154296875, -0.00890350341796875, 0.0302734375, 0.0017175674438476562, 0.0305328369140625, -0.04388427734375, -0.010162353515625, 0.034088134765625, 0.04608154296875, -0.0211944580078125, -0.0220489501953125, -0.007076263427734375, -0.049468994140625, 0.0228424072265625, 0.0004057884216308594, -0.005474090576171875, 0.005523681640625, 0.00665283203125, 0.07806396484375, 0.004131317138671875, -0.0164794921875, 0.055389404296875, -0.00656890869140625, -0.0233154296875, -0.03912353515625, 0.036102294921875, 0.006862640380859375, 
0.037261962890625, -0.00995635986328125, 0.0377197265625, 0.004314422607421875, -0.033203125, -0.00930023193359375, 0.019866943359375, -0.034423828125, -0.0265045166015625, 0.067138671875, 0.01064300537109375, -0.00848388671875, 0.023773193359375, -0.0269012451171875, -0.0045318603515625, 0.025054931640625, 0.047576904296875, 0.06903076171875, -0.0233612060546875, 0.01209259033203125, 0.056976318359375, -0.00899505615234375, -0.0249481201171875, 0.045684814453125, 0.0168304443359375, -0.04461669921875, -0.0009946823120117188, -0.044158935546875, -0.033966064453125, 0.00921630859375, -0.0538330078125, 0.036285400390625, -0.0303192138671875, -0.0068511962890625, -0.0212249755859375, -0.006481170654296875, -0.055419921875, 0.0120391845703125, 0.00421905517578125, 0.103759765625, -0.062225341796875, 0.06414794921875, 0.0621337890625, -0.036407470703125, -0.043182373046875, -0.002414703369140625, 0.0027866363525390625, -0.064697265625, 0.024658203125, 0.023284912109375, -0.004810333251953125, 0.0024890899658203125, -0.056304931640625, -0.04937744140625, 0.1090087890625, 0.012908935546875, -0.032745361328125, 0.005413055419921875, -0.034515380859375, 0.0262451171875, -0.0401611328125, 0.0094146728515625, 0.005023956298828125, 0.0275115966796875, 0.03265380859375, -0.045196533203125, -0.005191802978515625, -0.0217742919921875, 0.0026721954345703125, 0.0208740234375, -0.06744384765625, 0.06005859375, -0.0272674560546875, 0.005588531494140625, 0.02685546875, 0.0595703125, 0.0269775390625, 0.048553466796875, 0.04290771484375, 0.08074951171875, 0.0384521484375, -0.016082763671875, 0.06036376953125, 0.0073089599609375, 0.0222625732421875, 0.06549072265625, -0.0018711090087890625, 0.0465087890625, 0.0267791748046875, -0.02593994140625, 0.056976318359375, 0.07611083984375, 0.005584716796875, 0.066162109375, -0.007678985595703125, -0.0214385986328125, 0.005008697509765625, 0.0279083251953125, -0.05462646484375, 0.00023806095123291016, 0.00704193115234375, -0.0087890625, 
-0.018524169921875, 0.03619384765625, 0.0183563232421875, -0.0150299072265625, -0.04058837890625, 0.03253173828125, 0.005092620849609375, -0.03179931640625, 0.048553466796875, -0.0163421630859375, 0.05072021484375, -0.071044921875, -0.00665283203125, -0.00905609130859375, -0.0012378692626953125, -0.0185394287109375, -0.06695556640625, -0.0031719207763671875, -0.0081329345703125, 0.003887176513671875, -0.03411865234375, 0.0499267578125, -0.0166015625, -0.026336669921875, 0.01178741455078125, 0.0267486572265625, 0.04119873046875, 0.00849151611328125, -0.06414794921875, -0.0017137527465820312, 0.0160675048828125, -0.044586181640625, 0.03515625, 0.036163330078125, 0.0203704833984375, 0.0272216796875, 0.040496826171875, 0.00992584228515625, 0.01131439208984375, -0.0027103424072265625, 0.07562255859375, -0.0333251953125, -0.033203125, -0.05096435546875, 0.046356201171875, -0.007328033447265625, -0.047943115234375, 0.06201171875, 0.052947998046875, 0.065185546875, -0.0212860107421875, 0.040618896484375, 0.00786590576171875, 0.0440673828125, -0.06561279296875, 0.044525146484375, -0.046295166015625, -0.003021240234375, -0.0168304443359375, -0.06475830078125, 0.01161956787109375, 0.055328369140625, 0.001617431640625, 0.0143890380859375, 0.05035400390625, 0.034423828125, -0.01922607421875, -0.006046295166015625, 0.0099334716796875, 0.0234222412109375, 0.002994537353515625, 0.0258026123046875, 0.056182861328125, -0.04296875, 0.0133819580078125, -0.0263214111328125, -0.033477783203125, -0.021514892578125, -0.066162109375, -0.06695556640625, -0.04150390625, -0.044158935546875, -0.061126708984375, -0.006595611572265625, 0.06341552734375, 0.04437255859375, -0.031280517578125, 0.000020682811737060547, 0.004825592041015625, 0.002391815185546875, -0.0025997161865234375, -0.01546478271484375, 0.01059722900390625, -0.004817962646484375, -0.06329345703125, 0.0261383056640625, 0.012298583984375, 0.03472900390625, -0.0207672119140625, -0.0195159912109375, -0.0121917724609375, 
-0.0218048095703125, 0.0292510986328125, 0.0287322998046875, -0.01214599609375, -0.0042724609375, -0.007328033447265625, 0.00010144710540771484, 0.033660888671875, 0.015289306640625, -0.04180908203125, 0.032379150390625, 0.049224853515625, -0.006755828857421875, 0.05841064453125, -0.0177001953125, 0.003696441650390625, -0.0443115234375, 0.024993896484375, 0.0089874267578125, 0.060455322265625, 0.0057220458984375, -0.01003265380859375, 0.0648193359375, 0.0537109375, -0.0621337890625, -0.04351806640625, 0.00966644287109375, -0.11199951171875, -0.0101318359375, 0.08831787109375, -0.016693115234375, -0.017608642578125, 0.00969696044921875, -0.039947509765625, 0.017669677734375, -0.0418701171875, 0.0172882080078125, 0.0287322998046875, -0.026336669921875, -0.002040863037109375, -0.054718017578125, 0.010986328125, -0.0212860107421875, -0.049102783203125, -0.053009033203125, 0.04656982421875, 0.040191650390625, 0.0231170654296875, 0.0860595703125, -0.01282501220703125, 0.000022351741790771484, 0.002796173095703125, 0.01116943359375, -0.0087738037109375, -0.0176849365234375, -0.025390625, 0.01172637939453125, -0.0018482208251953125, -0.06884765625 ] ]
timm/vit_base_patch16_224.augreg2_in21k_ft_in1k
2023-05-06T00:00:25.000Z
[ "timm", "pytorch", "safetensors", "image-classification", "dataset:imagenet-1k", "dataset:imagenet-21k", "arxiv:2106.10270", "arxiv:2010.11929", "license:apache-2.0", "region:us" ]
image-classification
timm
null
null
timm/vit_base_patch16_224.augreg2_in21k_ft_in1k
2
135,764
timm
2022-12-22T07:24:28
--- tags: - image-classification - timm library_name: timm license: apache-2.0 datasets: - imagenet-1k - imagenet-21k --- # Model card for vit_base_patch16_224.augreg2_in21k_ft_in1k A Vision Transformer (ViT) image classification model. Trained on ImageNet-21k by paper authors and (re) fine-tuned on ImageNet-1k with additional augmentation and regularization by Ross Wightman. ## Model Details - **Model Type:** Image classification / feature backbone - **Model Stats:** - Params (M): 86.6 - GMACs: 16.9 - Activations (M): 16.5 - Image size: 224 x 224 - **Papers:** - How to train your ViT? Data, Augmentation, and Regularization in Vision Transformers: https://arxiv.org/abs/2106.10270 - An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale: https://arxiv.org/abs/2010.11929v2 - **Dataset:** ImageNet-1k - **Pretrain Dataset:** ImageNet-21k - **Original:** https://github.com/google-research/vision_transformer ## Model Usage ### Image Classification ```python from urllib.request import urlopen from PIL import Image import timm import torch img = Image.open(urlopen( 'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png' )) model = timm.create_model('vit_base_patch16_224.augreg2_in21k_ft_in1k', pretrained=True) model = model.eval() # get model specific transforms (normalization, resize) data_config = timm.data.resolve_model_data_config(model) transforms = timm.data.create_transform(**data_config, is_training=False) output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1 top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5) ``` ### Image Embeddings ```python from urllib.request import urlopen from PIL import Image import timm img = Image.open(urlopen( 'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png' )) model = timm.create_model( 'vit_base_patch16_224.augreg2_in21k_ft_in1k', pretrained=True, num_classes=0, 
# remove classifier nn.Linear ) model = model.eval() # get model specific transforms (normalization, resize) data_config = timm.data.resolve_model_data_config(model) transforms = timm.data.create_transform(**data_config, is_training=False) output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor # or equivalently (without needing to set num_classes=0) output = model.forward_features(transforms(img).unsqueeze(0)) # output is unpooled, a (1, 197, 768) shaped tensor output = model.forward_head(output, pre_logits=True) # output is a (1, num_features) shaped tensor ``` ## Model Comparison Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results). ## Citation ```bibtex @article{steiner2021augreg, title={How to train your ViT? Data, Augmentation, and Regularization in Vision Transformers}, author={Steiner, Andreas and Kolesnikov, Alexander and Zhai, Xiaohua and Wightman, Ross and Uszkoreit, Jakob and Beyer, Lucas}, journal={arXiv preprint arXiv:2106.10270}, year={2021} } ``` ```bibtex @article{dosovitskiy2020vit, title={An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale}, author={Dosovitskiy, Alexey and Beyer, Lucas and Kolesnikov, Alexander and Weissenborn, Dirk and Zhai, Xiaohua and Unterthiner, Thomas and Dehghani, Mostafa and Minderer, Matthias and Heigold, Georg and Gelly, Sylvain and Uszkoreit, Jakob and Houlsby, Neil}, journal={ICLR}, year={2021} } ``` ```bibtex @misc{rw2019timm, author = {Ross Wightman}, title = {PyTorch Image Models}, year = {2019}, publisher = {GitHub}, journal = {GitHub repository}, doi = {10.5281/zenodo.4414861}, howpublished = {\url{https://github.com/huggingface/pytorch-image-models}} } ```
3,886
[ [ -0.039154052734375, -0.0280609130859375, -0.00287628173828125, 0.007205963134765625, -0.0303802490234375, -0.0254058837890625, -0.020233154296875, -0.03692626953125, 0.01264190673828125, 0.0260009765625, -0.04302978515625, -0.0343017578125, -0.04840087890625, -0.0006165504455566406, -0.00907135009765625, 0.0701904296875, -0.009002685546875, 0.006938934326171875, -0.0182342529296875, -0.0330810546875, -0.026092529296875, -0.0197296142578125, -0.04742431640625, -0.034088134765625, 0.0247039794921875, 0.01438140869140625, 0.041168212890625, 0.047882080078125, 0.058074951171875, 0.033111572265625, -0.0092010498046875, 0.014007568359375, -0.0278472900390625, -0.0196990966796875, 0.022247314453125, -0.048065185546875, -0.0303802490234375, 0.0187225341796875, 0.05548095703125, 0.0281219482421875, 0.00988006591796875, 0.0247039794921875, 0.01195526123046875, 0.040985107421875, -0.023590087890625, 0.0142974853515625, -0.04144287109375, 0.018585205078125, -0.0054168701171875, -0.003841400146484375, -0.0221405029296875, -0.0247039794921875, 0.01422119140625, -0.041717529296875, 0.043365478515625, -0.003612518310546875, 0.1053466796875, 0.023712158203125, 0.0036945343017578125, 0.01849365234375, -0.0316162109375, 0.057098388671875, -0.0491943359375, 0.03302001953125, 0.0175018310546875, 0.01270294189453125, 0.00438690185546875, -0.07366943359375, -0.05059814453125, -0.01053619384765625, -0.0195770263671875, 0.00849151611328125, -0.0204010009765625, 0.0157470703125, 0.035186767578125, 0.04302978515625, -0.038909912109375, 0.001949310302734375, -0.04412841796875, -0.01922607421875, 0.04241943359375, -0.00250244140625, 0.0138702392578125, -0.010650634765625, -0.050262451171875, -0.041748046875, -0.028411865234375, 0.023681640625, 0.0234375, 0.0048065185546875, -0.03717041015625, 0.0394287109375, 0.0040130615234375, 0.047088623046875, 0.0223236083984375, -0.016021728515625, 0.051300048828125, -0.0165557861328125, -0.0294036865234375, -0.020721435546875, 0.08062744140625, 
0.036224365234375, 0.030670166015625, -0.0011615753173828125, -0.01543426513671875, -0.009490966796875, 0.005859375, -0.08160400390625, -0.0257415771484375, 0.005123138427734375, -0.0341796875, -0.030242919921875, 0.0264434814453125, -0.04498291015625, -0.00830841064453125, -0.01043701171875, 0.056915283203125, -0.033966064453125, -0.01690673828125, 0.00772857666015625, -0.0128021240234375, 0.040191650390625, 0.0169219970703125, -0.047576904296875, 0.01165771484375, 0.0197906494140625, 0.07513427734375, 0.0024852752685546875, -0.035308837890625, -0.0179443359375, -0.033416748046875, -0.028167724609375, 0.039825439453125, -0.004180908203125, -0.01056671142578125, -0.014312744140625, 0.029510498046875, -0.019012451171875, -0.045166015625, 0.024749755859375, -0.0155792236328125, 0.0274200439453125, 0.004688262939453125, -0.0171356201171875, -0.0313720703125, 0.022796630859375, -0.031524658203125, 0.0950927734375, 0.031494140625, -0.06658935546875, 0.03179931640625, -0.034393310546875, -0.00669097900390625, -0.007579803466796875, 0.0007557868957519531, -0.080078125, 0.0021991729736328125, 0.0208740234375, 0.04144287109375, -0.01654052734375, -0.0011091232299804688, -0.0275115966796875, -0.02618408203125, 0.0244293212890625, -0.0171051025390625, 0.06854248046875, 0.000018358230590820312, -0.023529052734375, 0.018585205078125, -0.046844482421875, 0.00612640380859375, 0.03253173828125, -0.0211639404296875, -0.0006432533264160156, -0.045257568359375, 0.01081085205078125, 0.0172882080078125, 0.0186004638671875, -0.051849365234375, 0.0268096923828125, -0.0301055908203125, 0.0301513671875, 0.049163818359375, -0.00661468505859375, 0.02825927734375, -0.0262908935546875, 0.02423095703125, 0.0218963623046875, 0.031768798828125, -0.0117645263671875, -0.048553466796875, -0.0771484375, -0.02978515625, 0.0267333984375, 0.038116455078125, -0.04888916015625, 0.043548583984375, -0.0267333984375, -0.0540771484375, -0.04541015625, 0.005619049072265625, 0.03692626953125, 0.04302978515625, 
0.03961181640625, -0.042572021484375, -0.04052734375, -0.0762939453125, -0.00853729248046875, -0.005214691162109375, 0.0005311965942382812, 0.0182342529296875, 0.04388427734375, -0.0207977294921875, 0.06494140625, -0.032989501953125, -0.0250396728515625, -0.01392364501953125, 0.0029392242431640625, 0.0233154296875, 0.0582275390625, 0.054901123046875, -0.05218505859375, -0.03460693359375, -0.01104736328125, -0.064453125, 0.0101165771484375, -0.003452301025390625, -0.0149383544921875, 0.01000213623046875, 0.01432037353515625, -0.050018310546875, 0.054229736328125, 0.010406494140625, -0.0283050537109375, 0.03082275390625, -0.0174713134765625, 0.00582122802734375, -0.08880615234375, -0.0010509490966796875, 0.029571533203125, -0.020233154296875, -0.04107666015625, 0.00035762786865234375, 0.0092620849609375, -0.0002541542053222656, -0.0318603515625, 0.04248046875, -0.03765869140625, -0.0008602142333984375, -0.005893707275390625, -0.026214599609375, 0.00524139404296875, 0.057891845703125, -0.00435638427734375, 0.040374755859375, 0.0555419921875, -0.036285400390625, 0.03973388671875, 0.041351318359375, -0.01629638671875, 0.036468505859375, -0.055145263671875, 0.0118865966796875, -0.0053253173828125, 0.01380157470703125, -0.0791015625, -0.01381683349609375, 0.0296478271484375, -0.054656982421875, 0.052490234375, -0.03692626953125, -0.033843994140625, -0.044586181640625, -0.03302001953125, 0.0307769775390625, 0.057647705078125, -0.0565185546875, 0.040802001953125, 0.00875091552734375, 0.0216064453125, -0.043792724609375, -0.07550048828125, -0.01605224609375, -0.0273284912109375, -0.053619384765625, 0.035003662109375, 0.0029144287109375, 0.01136016845703125, 0.004718780517578125, -0.00566864013671875, -0.00266265869140625, -0.01500701904296875, 0.0345458984375, 0.03228759765625, -0.0184783935546875, -0.0050201416015625, -0.0223846435546875, -0.01727294921875, 0.0011758804321289062, -0.0242767333984375, 0.03509521484375, -0.0216827392578125, -0.01442718505859375, 
-0.053497314453125, -0.0163116455078125, 0.0360107421875, -0.021942138671875, 0.053314208984375, 0.08612060546875, -0.03814697265625, 0.006137847900390625, -0.0445556640625, -0.0283203125, -0.037384033203125, 0.0338134765625, -0.0242919921875, -0.035247802734375, 0.053680419921875, 0.01212310791015625, 0.0071258544921875, 0.059356689453125, 0.0309600830078125, 0.0032863616943359375, 0.06134033203125, 0.052734375, 0.012176513671875, 0.064208984375, -0.07220458984375, -0.00945281982421875, -0.07159423828125, -0.03167724609375, -0.0174560546875, -0.041717529296875, -0.05084228515625, -0.03704833984375, 0.031494140625, 0.00896453857421875, -0.0207672119140625, 0.04107666015625, -0.065185546875, 0.01433563232421875, 0.053741455078125, 0.039642333984375, -0.0104827880859375, 0.030548095703125, -0.01381683349609375, -0.0059051513671875, -0.056488037109375, -0.008636474609375, 0.07928466796875, 0.037017822265625, 0.057647705078125, -0.0211639404296875, 0.045867919921875, -0.0181427001953125, 0.0210723876953125, -0.057342529296875, 0.04107666015625, -0.0059967041015625, -0.03125, -0.01024627685546875, -0.0274200439453125, -0.0743408203125, 0.01373291015625, -0.0256195068359375, -0.0570068359375, 0.0296478271484375, 0.01415252685546875, -0.0175018310546875, 0.04840087890625, -0.0621337890625, 0.071044921875, -0.00626373291015625, -0.033233642578125, 0.00618743896484375, -0.05670166015625, 0.016143798828125, 0.01422119140625, -0.026214599609375, 0.00934600830078125, 0.0174560546875, 0.07379150390625, -0.045501708984375, 0.06341552734375, -0.031890869140625, 0.024932861328125, 0.03662109375, -0.0203857421875, 0.028839111328125, 0.0004711151123046875, 0.01220703125, 0.0254974365234375, -0.003963470458984375, -0.0276336669921875, -0.0361328125, 0.035308837890625, -0.075927734375, -0.0276031494140625, -0.03350830078125, -0.04156494140625, 0.006717681884765625, 0.006275177001953125, 0.05450439453125, 0.048583984375, 0.01995849609375, 0.0294647216796875, 0.05450439453125, 
-0.0234832763671875, 0.02783203125, 0.001819610595703125, -0.00879669189453125, -0.0413818359375, 0.06988525390625, 0.020111083984375, 0.014068603515625, 0.016143798828125, 0.016387939453125, -0.0249481201171875, -0.03704833984375, -0.0284576416015625, 0.0306243896484375, -0.052459716796875, -0.034942626953125, -0.043182373046875, -0.041473388671875, -0.026336669921875, 0.0028629302978515625, -0.032440185546875, -0.0275421142578125, -0.0291290283203125, 0.00588226318359375, 0.060760498046875, 0.040863037109375, -0.01094818115234375, 0.03924560546875, -0.0438232421875, 0.017822265625, 0.0228729248046875, 0.04193115234375, -0.01507568359375, -0.0780029296875, -0.0290069580078125, 0.0031280517578125, -0.036224365234375, -0.056060791015625, 0.0306854248046875, 0.017242431640625, 0.036285400390625, 0.0310211181640625, -0.021209716796875, 0.062744140625, -0.00905609130859375, 0.047393798828125, 0.0252532958984375, -0.03802490234375, 0.0372314453125, -0.00682830810546875, 0.01100921630859375, 0.01678466796875, 0.01396942138671875, -0.018707275390625, -0.003314971923828125, -0.07952880859375, -0.05548095703125, 0.05908203125, 0.0180511474609375, 0.005084991455078125, 0.03448486328125, 0.048248291015625, -0.0015268325805664062, 0.003833770751953125, -0.064697265625, -0.0271148681640625, -0.030242919921875, -0.021575927734375, -0.006195068359375, -0.006572723388671875, -0.0008916854858398438, -0.0595703125, 0.051849365234375, -0.005123138427734375, 0.06005859375, 0.033111572265625, -0.0124969482421875, -0.01519012451171875, -0.0284423828125, 0.030303955078125, 0.018951416015625, -0.0211029052734375, 0.0008983612060546875, 0.022186279296875, -0.0560302734375, -0.00225830078125, 0.025115966796875, -0.00908660888671875, 0.004291534423828125, 0.034942626953125, 0.08367919921875, -0.006755828857421875, -0.0004646778106689453, 0.041168212890625, -0.006702423095703125, -0.0347900390625, -0.0206756591796875, 0.0077667236328125, -0.017242431640625, 0.027740478515625, 
0.0248260498046875, 0.031524658203125, -0.00916290283203125, -0.0100860595703125, 0.01255035400390625, 0.04058837890625, -0.04217529296875, -0.0293426513671875, 0.051605224609375, -0.015625, -0.009857177734375, 0.058502197265625, -0.0067138671875, -0.04345703125, 0.0670166015625, 0.0251617431640625, 0.078369140625, -0.00811004638671875, -0.004817962646484375, 0.06005859375, 0.02972412109375, -0.0036334991455078125, 0.0147247314453125, 0.01093292236328125, -0.0604248046875, -0.00545501708984375, -0.04730224609375, 0.0035247802734375, 0.0279388427734375, -0.040008544921875, 0.0295562744140625, -0.03839111328125, -0.027130126953125, 0.0045166015625, 0.018402099609375, -0.07525634765625, 0.0228729248046875, 0.005126953125, 0.060516357421875, -0.061676025390625, 0.049285888671875, 0.06585693359375, -0.048065185546875, -0.07244873046875, -0.01406097412109375, -0.0117645263671875, -0.06524658203125, 0.032928466796875, 0.03277587890625, 0.01146697998046875, 0.016571044921875, -0.06365966796875, -0.047607421875, 0.0986328125, 0.0277557373046875, -0.01214599609375, 0.01186370849609375, -0.0006256103515625, 0.027923583984375, -0.023101806640625, 0.0303192138671875, 0.0131072998046875, 0.03253173828125, 0.0157470703125, -0.053436279296875, 0.006198883056640625, -0.0288543701171875, 0.0133209228515625, 0.01548004150390625, -0.06439208984375, 0.07177734375, -0.030548095703125, -0.0089263916015625, 0.01230621337890625, 0.047149658203125, 0.007354736328125, 0.0062713623046875, 0.044219970703125, 0.06793212890625, 0.0300140380859375, -0.02996826171875, 0.06610107421875, -0.0121002197265625, 0.050201416015625, 0.038421630859375, 0.03643798828125, 0.032867431640625, 0.033416748046875, -0.02545166015625, 0.025726318359375, 0.07769775390625, -0.0439453125, 0.0248870849609375, 0.00688934326171875, 0.006214141845703125, -0.0179443359375, 0.0037822723388671875, -0.03765869140625, 0.0367431640625, 0.0158843994140625, -0.040313720703125, -0.00583648681640625, 0.011688232421875, 
-0.01236724853515625, -0.0268402099609375, -0.01343536376953125, 0.0478515625, 0.00222015380859375, -0.03277587890625, 0.06304931640625, 0.00012046098709106445, 0.061859130859375, -0.03485107421875, -0.0040283203125, -0.020111083984375, 0.0291290283203125, -0.028045654296875, -0.059295654296875, 0.01318359375, -0.01678466796875, -0.00913238525390625, 0.0028743743896484375, 0.055999755859375, -0.032806396484375, -0.0426025390625, 0.00882720947265625, 0.0228118896484375, 0.0245819091796875, -0.00433349609375, -0.07830810546875, -0.00605010986328125, 0.0006723403930664062, -0.044189453125, 0.0150299072265625, 0.029876708984375, 0.002979278564453125, 0.049041748046875, 0.05010986328125, -0.006580352783203125, 0.016204833984375, -0.0095367431640625, 0.071044921875, -0.0303802490234375, -0.0307769775390625, -0.0595703125, 0.04827880859375, -0.005153656005859375, -0.0467529296875, 0.049163818359375, 0.048370361328125, 0.0692138671875, -0.01035308837890625, 0.036834716796875, -0.0101165771484375, 0.004161834716796875, -0.0283966064453125, 0.04412841796875, -0.05426025390625, -0.006748199462890625, -0.01806640625, -0.06622314453125, -0.024932861328125, 0.06689453125, -0.019622802734375, 0.0311431884765625, 0.037139892578125, 0.07476806640625, -0.02471923828125, -0.0306243896484375, 0.0131072998046875, 0.01467132568359375, 0.00662994384765625, 0.02978515625, 0.0396728515625, -0.06707763671875, 0.034271240234375, -0.0460205078125, -0.013916015625, -0.0177459716796875, -0.035888671875, -0.0767822265625, -0.06256103515625, -0.042877197265625, -0.048492431640625, -0.0170135498046875, 0.06341552734375, 0.0728759765625, -0.04278564453125, -0.004299163818359375, -0.00933074951171875, 0.00018274784088134766, -0.0246124267578125, -0.01776123046875, 0.035919189453125, -0.009490966796875, -0.055877685546875, -0.0201873779296875, 0.0010938644409179688, 0.038604736328125, -0.01451873779296875, -0.015655517578125, -0.0137176513671875, -0.02032470703125, 0.0189971923828125, 
0.0217742919921875, -0.051300048828125, -0.016693115234375, -0.00359344482421875, -0.004840850830078125, 0.040252685546875, 0.02783203125, -0.05462646484375, 0.040802001953125, 0.04339599609375, 0.026214599609375, 0.061981201171875, -0.00934600830078125, 0.006839752197265625, -0.06597900390625, 0.04083251953125, -0.004978179931640625, 0.038238525390625, 0.03961181640625, -0.02490234375, 0.048187255859375, 0.0438232421875, -0.03369140625, -0.062744140625, -0.0017910003662109375, -0.0841064453125, 0.007537841796875, 0.07208251953125, -0.01788330078125, -0.033447265625, 0.029632568359375, -0.013275146484375, 0.053192138671875, -0.00395965576171875, 0.0316162109375, 0.01898193359375, 0.006290435791015625, -0.0445556640625, -0.03466796875, 0.0391845703125, 0.00875091552734375, -0.04034423828125, -0.030548095703125, 0.002521514892578125, 0.042022705078125, 0.0280609130859375, 0.0273284912109375, -0.0120697021484375, 0.012969970703125, 0.00479888916015625, 0.037933349609375, -0.030120849609375, -0.011138916015625, -0.033477783203125, -0.01419830322265625, -0.0079193115234375, -0.047271728515625 ] ]
facebook/esm2_t6_8M_UR50D
2023-03-21T15:05:17.000Z
[ "transformers", "pytorch", "tf", "safetensors", "esm", "fill-mask", "license:mit", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
fill-mask
facebook
null
null
facebook/esm2_t6_8M_UR50D
9
134,648
transformers
2022-09-26T18:44:55
--- license: mit widget: - text: "MQIFVKTLTGKTITLEVEPS<mask>TIENVKAKIQDKEGIPPDQQRLIFAGKQLEDGRTLSDYNIQKESTLHLVLRLRGG" --- ## ESM-2 ESM-2 is a state-of-the-art protein model trained on a masked language modelling objective. It is suitable for fine-tuning on a wide range of tasks that take protein sequences as input. For detailed information on the model architecture and training data, please refer to the [accompanying paper](https://www.biorxiv.org/content/10.1101/2022.07.20.500902v2). You may also be interested in some demo notebooks ([PyTorch](https://colab.research.google.com/github/huggingface/notebooks/blob/main/examples/protein_language_modeling.ipynb), [TensorFlow](https://colab.research.google.com/github/huggingface/notebooks/blob/main/examples/protein_language_modeling-tf.ipynb)) which demonstrate how to fine-tune ESM-2 models on your tasks of interest. Several ESM-2 checkpoints are available in the Hub with varying sizes. Larger sizes generally have somewhat better accuracy, but require much more memory and time to train: | Checkpoint name | Num layers | Num parameters | |------------------------------|----|----------| | [esm2_t48_15B_UR50D](https://huggingface.co/facebook/esm2_t48_15B_UR50D) | 48 | 15B | | [esm2_t36_3B_UR50D](https://huggingface.co/facebook/esm2_t36_3B_UR50D) | 36 | 3B | | [esm2_t33_650M_UR50D](https://huggingface.co/facebook/esm2_t33_650M_UR50D) | 33 | 650M | | [esm2_t30_150M_UR50D](https://huggingface.co/facebook/esm2_t30_150M_UR50D) | 30 | 150M | | [esm2_t12_35M_UR50D](https://huggingface.co/facebook/esm2_t12_35M_UR50D) | 12 | 35M | | [esm2_t6_8M_UR50D](https://huggingface.co/facebook/esm2_t6_8M_UR50D) | 6 | 8M |
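The masked-residue widget example in the card above can be reproduced with the `transformers` fill-mask pipeline — a minimal sketch, assuming `transformers` and a backend (e.g. PyTorch) are installed and the `facebook/esm2_t6_8M_UR50D` checkpoint can be downloaded:

```python
from transformers import pipeline

# Load the smallest ESM-2 checkpoint (6 layers, 8M parameters).
unmasker = pipeline("fill-mask", model="facebook/esm2_t6_8M_UR50D")

# Ubiquitin sequence with one residue masked out (same as the card's widget).
sequence = "MQIFVKTLTGKTITLEVEPS<mask>TIENVKAKIQDKEGIPPDQQRLIFAGKQLEDGRTLSDYNIQKESTLHLVLRLRGG"

# Each prediction carries a proposed amino acid for the masked position
# and the model's probability for it.
predictions = unmasker(sequence)
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```

The same approach scales to the larger checkpoints in the table above by swapping in a different model name, at the cost of more memory and slower inference.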
1,705
[ [ -0.029876708984375, -0.041015625, 0.0237884521484375, 0.0173492431640625, -0.0149078369140625, 0.005001068115234375, 0.00994873046875, -0.03546142578125, 0.01806640625, 0.0284881591796875, -0.056793212890625, -0.03656005859375, -0.064208984375, 0.005741119384765625, -0.013885498046875, 0.0743408203125, 0.00116729736328125, 0.01885986328125, -0.0238800048828125, -0.00672149658203125, -0.007171630859375, -0.0170440673828125, -0.05670166015625, -0.050445556640625, 0.0230560302734375, 0.033294677734375, 0.0191650390625, 0.046905517578125, 0.0333251953125, 0.0162200927734375, -0.034912109375, 0.0249176025390625, -0.03607177734375, 0.01318359375, -0.0088043212890625, -0.0305938720703125, -0.055877685546875, -0.01078033447265625, 0.03192138671875, 0.03692626953125, 0.004093170166015625, 0.034454345703125, 0.013763427734375, 0.06524658203125, -0.022796630859375, 0.0138397216796875, -0.029571533203125, 0.0202178955078125, -0.0207061767578125, -0.00205230712890625, -0.0264739990234375, 0.0036487579345703125, 0.004505157470703125, -0.024505615234375, 0.01412200927734375, 0.0107421875, 0.09185791015625, 0.0157470703125, -0.044219970703125, -0.016021728515625, -0.0306549072265625, 0.05950927734375, -0.034423828125, 0.0303802490234375, 0.052001953125, 0.021484375, -0.021484375, -0.0535888671875, -0.00260162353515625, 0.021453857421875, -0.001277923583984375, 0.028076171875, -0.018646240234375, 0.013580322265625, 0.03961181640625, 0.022369384765625, -0.0679931640625, 0.01380157470703125, -0.04443359375, -0.01549530029296875, 0.040679931640625, 0.01464080810546875, 0.0236053466796875, 0.0019989013671875, -0.032958984375, 0.01085662841796875, -0.0399169921875, 0.003692626953125, 0.0205535888671875, -0.00302886962890625, -0.0158538818359375, 0.04351806640625, -0.033111572265625, 0.053985595703125, 0.00537109375, -0.01172637939453125, 0.034088134765625, -0.003902435302734375, 0.0024890899658203125, -0.0355224609375, 0.038848876953125, 0.0509033203125, -0.00638580322265625, 
-0.007762908935546875, -0.031097412109375, -0.00797271728515625, 0.003910064697265625, -0.09149169921875, -0.01213836669921875, 0.04443359375, -0.03631591796875, -0.016754150390625, 0.00992584228515625, -0.056854248046875, -0.0008678436279296875, -0.0178985595703125, 0.0267333984375, -0.037384033203125, -0.0174407958984375, 0.01221466064453125, -0.033660888671875, 0.02471923828125, 0.0158233642578125, -0.061431884765625, 0.044921875, 0.050384521484375, 0.08306884765625, -0.00305938720703125, -0.0159912109375, -0.031280517578125, 0.02044677734375, -0.01015472412109375, 0.061859130859375, -0.0154266357421875, -0.003780364990234375, 0.0022220611572265625, 0.022796630859375, -0.00453948974609375, -0.03857421875, 0.027679443359375, -0.0217437744140625, 0.011138916015625, -0.030548095703125, -0.056640625, -0.0291900634765625, 0.0086669921875, -0.0302886962890625, 0.1041259765625, 0.0152740478515625, -0.0408935546875, 0.00753021240234375, -0.0411376953125, -0.0251007080078125, -0.0007410049438476562, -0.0108184814453125, -0.05352783203125, 0.008544921875, -0.0136871337890625, 0.03057861328125, -0.0233612060546875, 0.0022525787353515625, -0.02398681640625, -0.025787353515625, 0.005401611328125, 0.032257080078125, 0.05029296875, 0.03363037109375, -0.042877197265625, -0.0225830078125, -0.06646728515625, 0.018707275390625, 0.0158233642578125, -0.0197601318359375, 0.0256195068359375, 0.0021514892578125, 0.016448974609375, 0.045745849609375, 0.01812744140625, -0.03411865234375, 0.005626678466796875, -0.0230560302734375, 0.046600341796875, 0.032958984375, 0.0015277862548828125, 0.020782470703125, -0.04913330078125, 0.0296478271484375, 0.003204345703125, 0.006256103515625, -0.00777435302734375, -0.061187744140625, -0.06744384765625, -0.028350830078125, -0.006275177001953125, 0.048858642578125, -0.0197601318359375, 0.055084228515625, 0.0122833251953125, -0.042816162109375, -0.0269317626953125, 0.00823974609375, 0.040374755859375, 0.018463134765625, 0.033782958984375, 
-0.01464080810546875, -0.05731201171875, -0.08612060546875, -0.0274658203125, 0.001544952392578125, -0.0225372314453125, 0.0152130126953125, 0.05963134765625, -0.026641845703125, 0.054718017578125, -0.02667236328125, -0.0219879150390625, -0.0126953125, 0.0086517333984375, 0.00937652587890625, 0.0478515625, 0.046234130859375, -0.0322265625, -0.031158447265625, -0.008636474609375, -0.05731201171875, -0.0152740478515625, 0.01122283935546875, -0.003574371337890625, 0.01136016845703125, 0.045196533203125, -0.034423828125, 0.008514404296875, 0.051055908203125, -0.046722412109375, 0.00991058349609375, -0.0117645263671875, -0.0023708343505859375, -0.09649658203125, 0.0152435302734375, 0.00008362531661987305, -0.03424072265625, -0.04681396484375, 0.007648468017578125, 0.0125732421875, -0.01519012451171875, -0.0416259765625, 0.04833984375, -0.054107666015625, -0.0234222412109375, -0.0274505615234375, -0.0021152496337890625, 0.0186920166015625, 0.03668212890625, 0.001277923583984375, 0.03265380859375, 0.054595947265625, -0.0256500244140625, 0.004024505615234375, 0.0270538330078125, -0.02203369140625, 0.031219482421875, -0.06622314453125, 0.03704833984375, -0.0192413330078125, 0.023895263671875, -0.073486328125, -0.0330810546875, 0.01125335693359375, -0.03851318359375, 0.03802490234375, -0.0248870849609375, -0.0335693359375, -0.03631591796875, -0.032928466796875, 0.012725830078125, 0.05633544921875, -0.0262451171875, 0.032196044921875, 0.041229248046875, -0.0027618408203125, -0.0264739990234375, -0.0712890625, -0.0085296630859375, -0.0089111328125, -0.047698974609375, 0.034423828125, 0.002132415771484375, 0.01171112060546875, -0.01244354248046875, -0.01020050048828125, 0.00807952880859375, 0.0018568038940429688, 0.0472412109375, -0.0031986236572265625, 0.00847625732421875, -0.013824462890625, 0.027496337890625, -0.02154541015625, -0.01309967041015625, -0.01898193359375, 0.048248291015625, -0.035614013671875, -0.0141754150390625, -0.04949951171875, 0.033721923828125, 
0.05059814453125, -0.00812530517578125, 0.06573486328125, 0.060302734375, -0.06109619140625, -0.0086669921875, -0.04327392578125, -0.0325927734375, -0.031341552734375, 0.059906005859375, -0.04376220703125, -0.07720947265625, 0.056732177734375, -0.01532745361328125, 0.0014963150024414062, 0.04052734375, 0.047088623046875, -0.0189208984375, 0.0902099609375, 0.0277099609375, 0.026123046875, 0.02911376953125, -0.034637451171875, -0.006317138671875, -0.07073974609375, -0.05816650390625, -0.044403076171875, -0.03399658203125, -0.031219482421875, -0.036285400390625, 0.00911712646484375, 0.04168701171875, -0.039642333984375, 0.0491943359375, -0.025054931640625, 0.033843994140625, 0.0179901123046875, 0.022735595703125, -0.0136566162109375, 0.01507568359375, -0.00624847412109375, 0.0030269622802734375, -0.0616455078125, -0.044464111328125, 0.0653076171875, 0.06353759765625, 0.033599853515625, 0.0096893310546875, 0.04510498046875, 0.007427215576171875, -0.00688934326171875, -0.060516357421875, 0.035308837890625, -0.01120758056640625, -0.0589599609375, -0.006412506103515625, -0.0142669677734375, -0.049652099609375, 0.0101165771484375, -0.0161590576171875, -0.06939697265625, -0.00201416015625, 0.01318359375, -0.0159759521484375, 0.02423095703125, -0.03961181640625, 0.046661376953125, 0.0008678436279296875, -0.0232696533203125, -0.0088958740234375, -0.06109619140625, -0.0017385482788085938, 0.00012254714965820312, 0.00598907470703125, -0.0295867919921875, -0.01285552978515625, 0.07513427734375, -0.041748046875, 0.055328369140625, -0.01238250732421875, 0.0256195068359375, 0.020355224609375, 0.0025310516357421875, 0.066162109375, 0.0070343017578125, -0.0130767822265625, 0.022705078125, 0.00843048095703125, -0.0628662109375, -0.0172119140625, 0.037384033203125, -0.0740966796875, -0.01154327392578125, -0.04071044921875, -0.0243072509765625, -0.013824462890625, 0.0175323486328125, 0.053253173828125, 0.034454345703125, -0.0010385513305664062, 0.025634765625, 0.04559326171875, 
-0.0185089111328125, 0.0187835693359375, 0.0531005859375, -0.01611328125, -0.039520263671875, 0.04534912109375, 0.018707275390625, 0.024871826171875, 0.024993896484375, -0.00922393798828125, -0.031707763671875, -0.043304443359375, -0.030548095703125, 0.019378662109375, -0.03564453125, -0.031463623046875, -0.080322265625, -0.0232086181640625, -0.0273590087890625, -0.01012420654296875, -0.057373046875, -0.033233642578125, -0.0161590576171875, -0.020355224609375, 0.043914794921875, 0.046844482421875, -0.0206451416015625, 0.0167083740234375, -0.045501708984375, 0.01415252685546875, 0.01270294189453125, 0.029083251953125, -0.03253173828125, -0.06964111328125, -0.013671875, -0.0029888153076171875, -0.0203094482421875, -0.07275390625, 0.0171356201171875, 0.03857421875, 0.032562255859375, 0.032562255859375, -0.0296478271484375, 0.027008056640625, -0.0310516357421875, 0.051788330078125, 0.0269927978515625, -0.048431396484375, 0.056793212890625, -0.03302001953125, 0.02081298828125, 0.0472412109375, 0.02581787109375, -0.0489501953125, -0.0305938720703125, -0.03900146484375, -0.059173583984375, 0.064453125, 0.0241241455078125, -0.001148223876953125, -0.0103607177734375, 0.032684326171875, 0.007495880126953125, 0.00457000732421875, -0.03216552734375, -0.0367431640625, 0.004238128662109375, -0.005626678466796875, 0.01409149169921875, -0.052734375, -0.01342010498046875, -0.022552490234375, 0.075439453125, -0.012176513671875, 0.03826904296875, 0.0015239715576171875, -0.0021514892578125, -0.0300750732421875, -0.0121002197265625, 0.05474853515625, 0.038421630859375, -0.03857421875, 0.0073089599609375, 0.0272674560546875, -0.031768798828125, -0.003253936767578125, 0.006824493408203125, -0.029571533203125, 0.0034885406494140625, 0.0164947509765625, 0.0645751953125, 0.00649261474609375, -0.0341796875, 0.039642333984375, 0.015045166015625, -0.032806396484375, -0.015289306640625, -0.0084075927734375, 0.0230865478515625, 0.031646728515625, 0.0161590576171875, 0.0136566162109375, 
0.01053619384765625, -0.043487548828125, 0.0322265625, 0.019134521484375, -0.04583740234375, -0.028411865234375, 0.054107666015625, 0.0129547119140625, -0.02825927734375, 0.0513916015625, -0.037078857421875, -0.0462646484375, 0.0589599609375, 0.060760498046875, 0.05908203125, -0.018951416015625, 0.0161590576171875, 0.06768798828125, 0.0206451416015625, -0.0282440185546875, 0.044647216796875, 0.028076171875, -0.043914794921875, -0.00983428955078125, -0.06500244140625, -0.0035648345947265625, 0.034454345703125, -0.06707763671875, 0.037933349609375, -0.0285797119140625, -0.0192413330078125, -0.00746917724609375, 0.01248931884765625, -0.0576171875, 0.01177215576171875, 0.006793975830078125, 0.08245849609375, -0.08135986328125, 0.0660400390625, 0.07244873046875, -0.01837158203125, -0.034759521484375, -0.037506103515625, 0.029998779296875, -0.0655517578125, 0.0153961181640625, 0.028167724609375, 0.0126190185546875, 0.00524139404296875, -0.0253143310546875, -0.06549072265625, 0.10992431640625, 0.0168304443359375, -0.06646728515625, 0.012359619140625, 0.0033588409423828125, 0.037750244140625, -0.0205230712890625, 0.0286865234375, 0.035308837890625, 0.0157470703125, 0.00902557373046875, -0.047576904296875, 0.00858306884765625, -0.0372314453125, 0.0136871337890625, 0.01171112060546875, -0.08514404296875, 0.05303955078125, -0.019866943359375, -0.002696990966796875, 0.031097412109375, 0.0447998046875, 0.043853759765625, 0.0303192138671875, 0.021942138671875, 0.057708740234375, 0.052093505859375, -0.0234222412109375, 0.061004638671875, -0.035675048828125, 0.06524658203125, 0.06866455078125, -0.0012693405151367188, 0.0423583984375, 0.03948974609375, -0.021759033203125, 0.0195770263671875, 0.0772705078125, -0.01947021484375, 0.0323486328125, 0.0227813720703125, -0.0043182373046875, -0.025146484375, -0.0051422119140625, -0.04620361328125, 0.01253509521484375, 0.0169677734375, -0.02703857421875, -0.01393890380859375, -0.0039520263671875, 0.00994873046875, -0.01230621337890625, 
-0.0012874603271484375, 0.052276611328125, 0.0165557861328125, -0.0303192138671875, 0.0220489501953125, 0.02056884765625, 0.0306243896484375, -0.041656494140625, 0.003509521484375, -0.033538818359375, 0.006626129150390625, -0.026031494140625, -0.047515869140625, 0.0237274169921875, 0.004791259765625, -0.017791748046875, -0.0221405029296875, 0.058624267578125, -0.037017822265625, -0.037567138671875, 0.033447265625, 0.035797119140625, 0.035369873046875, -0.00359344482421875, -0.0738525390625, 0.01568603515625, -0.0161590576171875, -0.03857421875, 0.033660888671875, 0.00847625732421875, 0.0238800048828125, 0.045928955078125, 0.01568603515625, -0.01256561279296875, -0.0089874267578125, 0.0047760009765625, 0.050628662109375, -0.04034423828125, -0.031829833984375, -0.050537109375, 0.037567138671875, -0.0048675537109375, -0.0294189453125, 0.049072265625, 0.079833984375, 0.06219482421875, -0.0169525146484375, 0.039520263671875, -0.014404296875, 0.039276123046875, -0.035552978515625, 0.043701171875, -0.05352783203125, -0.00814056396484375, -0.0004963874816894531, -0.06658935546875, -0.01262664794921875, 0.048614501953125, 0.0066375732421875, 0.01155853271484375, 0.044219970703125, 0.0784912109375, 0.01078033447265625, -0.00739288330078125, 0.01293182373046875, 0.01055145263671875, 0.0096282958984375, 0.05462646484375, 0.050811767578125, -0.06951904296875, 0.01232147216796875, -0.0184173583984375, -0.0291900634765625, -0.0299224853515625, -0.044586181640625, -0.08013916015625, -0.053802490234375, -0.04180908203125, -0.05511474609375, 0.020721435546875, 0.08074951171875, 0.0772705078125, -0.077392578125, -0.0099029541015625, -0.013275146484375, -0.0201568603515625, -0.025360107421875, -0.0101165771484375, 0.0174407958984375, -0.012359619140625, -0.0631103515625, 0.0235595703125, 0.039276123046875, 0.0179595947265625, 0.0149993896484375, -0.033294677734375, -0.01995849609375, 0.0005326271057128906, 0.0509033203125, 0.027557373046875, -0.042755126953125, -0.0261688232421875, 
0.0014972686767578125, -0.0191497802734375, -0.006580352783203125, 0.0282745361328125, -0.006740570068359375, 0.0223846435546875, 0.047088623046875, 0.0296173095703125, 0.0726318359375, -0.014739990234375, 0.0293121337890625, -0.0489501953125, 0.0205078125, 0.00818634033203125, 0.0240325927734375, 0.006381988525390625, -0.0107421875, 0.048858642578125, 0.0303192138671875, -0.04071044921875, -0.057647705078125, 0.027099609375, -0.0877685546875, -0.0222930908203125, 0.10821533203125, 0.0018243789672851562, -0.0008449554443359375, -0.0012722015380859375, -0.0042877197265625, 0.029754638671875, -0.018341064453125, 0.04443359375, 0.052490234375, -0.016571044921875, -0.001194000244140625, -0.047149658203125, 0.0552978515625, 0.0389404296875, -0.05621337890625, -0.0352783203125, 0.007537841796875, 0.0396728515625, -0.01088714599609375, 0.045074462890625, -0.02947998046875, 0.0163116455078125, 0.007686614990234375, -0.0007224082946777344, -0.0217132568359375, -0.0278778076171875, -0.0216064453125, 0.0030651092529296875, 0.00263214111328125, -0.0196990966796875 ] ]
asfxxx/MModel
2023-10-15T13:22:47.000Z
[ "transformers", "pytorch", "mistral", "text-generation", "pretrained", "en", "arxiv:2310.06825", "license:apache-2.0", "endpoints_compatible", "text-generation-inference", "region:us" ]
text-generation
asfxxx
null
null
asfxxx/MModel
0
134,356
transformers
2023-10-15T12:54:37
--- license: apache-2.0 pipeline_tag: text-generation language: - en tags: - pretrained inference: parameters: temperature: 0.7 --- # Model Card for Mistral-7B-v0.1 The Mistral-7B-v0.1 Large Language Model (LLM) is a pretrained generative text model with 7 billion parameters. Mistral-7B-v0.1 outperforms Llama 2 13B on all benchmarks we tested. For full details of this model please read our [paper](https://arxiv.org/abs/2310.06825) and [release blog post](https://mistral.ai/news/announcing-mistral-7b/). ## Model Architecture Mistral-7B-v0.1 is a transformer model, with the following architecture choices: - Grouped-Query Attention - Sliding-Window Attention - Byte-fallback BPE tokenizer ## Troubleshooting - If you see the following error: ``` KeyError: 'mistral' ``` - Or: ``` NotImplementedError: Cannot copy out of meta tensor; no data! ``` Ensure you are utilizing a stable version of Transformers, 4.34.0 or newer. ## Notice Mistral 7B is a pretrained base model and therefore does not have any moderation mechanisms. ## The Mistral AI Team Albert Jiang, Alexandre Sablayrolles, Arthur Mensch, Chris Bamford, Devendra Singh Chaplot, Diego de las Casas, Florian Bressand, Gianna Lengyel, Guillaume Lample, Lélio Renard Lavaud, Lucile Saulnier, Marie-Anne Lachaux, Pierre Stock, Teven Le Scao, Thibaut Lavril, Thomas Wang, Timothée Lacroix, William El Sayed.
1,390
[ [ -0.029541015625, -0.034576416015625, 0.0247955322265625, 0.0377197265625, -0.033294677734375, -0.01534271240234375, 0.002254486083984375, -0.016357421875, -0.01087188720703125, 0.04412841796875, -0.032928466796875, -0.038177490234375, -0.05755615234375, 0.009552001953125, -0.050506591796875, 0.07342529296875, -0.00533294677734375, 0.0143890380859375, -0.0026302337646484375, 0.006649017333984375, -0.036468505859375, -0.05120849609375, -0.0640869140625, -0.040435791015625, 0.0267486572265625, -0.005035400390625, 0.04241943359375, 0.033111572265625, 0.052734375, 0.0222930908203125, -0.03924560546875, 0.0113372802734375, -0.059844970703125, 0.0096893310546875, -0.0081787109375, -0.0494384765625, -0.04510498046875, -0.03619384765625, 0.043548583984375, 0.007358551025390625, -0.0203094482421875, 0.025848388671875, -0.0038967132568359375, 0.03631591796875, -0.025909423828125, 0.0233612060546875, -0.0478515625, -0.00970458984375, -0.029296875, -0.001720428466796875, -0.029937744140625, -0.0084991455078125, 0.004016876220703125, -0.03271484375, 0.03399658203125, 0.0049896240234375, 0.0728759765625, 0.0545654296875, -0.025360107421875, 0.004360198974609375, -0.0572509765625, 0.039703369140625, -0.06427001953125, 0.038299560546875, 0.014251708984375, 0.02471923828125, -0.019134521484375, -0.0853271484375, -0.042205810546875, -0.016754150390625, 0.016265869140625, 0.0222320556640625, -0.037017822265625, 0.01354217529296875, 0.040863037109375, 0.031829833984375, -0.0302886962890625, -0.006313323974609375, -0.06524658203125, -0.011077880859375, 0.0206298828125, 0.0264434814453125, -0.0096588134765625, -0.002727508544921875, -0.040374755859375, -0.0231475830078125, -0.037322998046875, 0.007648468017578125, 0.005451202392578125, -0.0031452178955078125, -0.022674560546875, 0.015838623046875, 0.00788116455078125, 0.046173095703125, 0.0268402099609375, -0.004791259765625, 0.0215911865234375, -0.0019292831420898438, -0.02679443359375, 0.004047393798828125, 0.06707763671875, 
0.0140838623046875, 0.007205963134765625, -0.00814056396484375, -0.0294036865234375, -0.0006170272827148438, 0.029541015625, -0.06939697265625, -0.0037689208984375, 0.0166473388671875, -0.03424072265625, -0.017059326171875, 0.02313232421875, -0.042816162109375, -0.0246124267578125, -0.0136566162109375, 0.0287017822265625, -0.0241241455078125, -0.0302276611328125, 0.00868988037109375, -0.01904296875, 0.046875, 0.033172607421875, -0.048126220703125, 0.0205078125, 0.044708251953125, 0.05841064453125, -0.0172576904296875, -0.025238037109375, 0.0110931396484375, 0.00417327880859375, -0.03204345703125, 0.055389404296875, -0.00652313232421875, -0.053009033203125, -0.01202392578125, 0.0007123947143554688, -0.004467010498046875, -0.06903076171875, 0.05548095703125, -0.044952392578125, 0.004108428955078125, -0.0245208740234375, -0.0263671875, -0.02374267578125, 0.004306793212890625, -0.07745361328125, 0.09979248046875, 0.02001953125, -0.0638427734375, 0.00627899169921875, -0.032379150390625, -0.0069427490234375, -0.005947113037109375, -0.0010099411010742188, -0.0246734619140625, 0.031585693359375, -0.0032825469970703125, 0.027557373046875, -0.04827880859375, 0.04876708984375, -0.01033782958984375, -0.038909912109375, 0.0252685546875, -0.06475830078125, 0.0556640625, 0.019378662109375, -0.035125732421875, 0.00688934326171875, -0.06134033203125, -0.019989013671875, -0.00034356117248535156, -0.01849365234375, 0.01232147216796875, -0.033447265625, 0.0272216796875, 0.03125, 0.03717041015625, -0.005847930908203125, 0.02001953125, -0.017547607421875, 0.0264892578125, 0.06292724609375, -0.00440216064453125, 0.0167694091796875, -0.029266357421875, 0.049896240234375, 0.0285491943359375, 0.027008056640625, -0.00360107421875, -0.0263214111328125, -0.07647705078125, -0.03350830078125, 0.0145111083984375, 0.0391845703125, -0.05682373046875, 0.034576416015625, -0.031524658203125, -0.06427001953125, -0.055206298828125, 0.0088958740234375, 0.0293426513671875, 0.0233917236328125, 
0.036376953125, -0.00551605224609375, -0.0230560302734375, -0.07366943359375, 0.001125335693359375, -0.0168304443359375, 0.0249481201171875, -0.0023250579833984375, 0.0285491943359375, -0.041748046875, 0.07373046875, -0.020843505859375, -0.0273590087890625, -0.030609130859375, -0.00647735595703125, 0.03289794921875, 0.01276397705078125, 0.038818359375, -0.04150390625, -0.0188446044921875, -0.0030231475830078125, -0.058868408203125, -0.029632568359375, 0.0030078887939453125, -0.01358795166015625, 0.006183624267578125, 0.0478515625, -0.06146240234375, 0.03533935546875, 0.04852294921875, -0.02325439453125, 0.031463623046875, -0.0124969482421875, -0.0101776123046875, -0.10980224609375, 0.01580810546875, -0.008026123046875, -0.01483154296875, -0.0443115234375, -0.01146697998046875, 0.007068634033203125, 0.0055999755859375, -0.051025390625, 0.06781005859375, -0.0258331298828125, 0.03179931640625, -0.03515625, -0.025360107421875, -0.0027141571044921875, 0.034698486328125, -0.0093841552734375, 0.049591064453125, 0.0582275390625, -0.04022216796875, 0.03955078125, 0.020050048828125, -0.006622314453125, 0.0245208740234375, -0.07550048828125, 0.00048041343688964844, -0.03692626953125, 0.03497314453125, -0.05621337890625, -0.006412506103515625, 0.02056884765625, -0.020477294921875, 0.03143310546875, 0.00522613525390625, -0.0311279296875, -0.026123046875, -0.0157623291015625, 0.0458984375, 0.044677734375, -0.050262451171875, 0.059478759765625, -0.001232147216796875, 0.011016845703125, -0.05218505859375, -0.048065185546875, -0.00829315185546875, -0.020416259765625, -0.029205322265625, 0.02362060546875, 0.00046515464782714844, -0.01099395751953125, 0.0037899017333984375, -0.02435302734375, -0.0090789794921875, -0.0008463859558105469, 0.050628662109375, 0.024505615234375, -0.01110076904296875, -0.0014257431030273438, 0.0157318115234375, -0.031585693359375, -0.002140045166015625, 0.0187835693359375, 0.05670166015625, -0.010589599609375, -0.0186004638671875, -0.038726806640625, 
-0.011444091796875, 0.06439208984375, -0.017486572265625, 0.06585693359375, 0.06341552734375, -0.0233306884765625, -0.0193634033203125, -0.056427001953125, 0.0063323974609375, -0.0350341796875, 0.03955078125, -0.029083251953125, -0.0673828125, 0.052703857421875, -0.00835418701171875, 0.002685546875, 0.06512451171875, 0.062103271484375, 0.027557373046875, 0.042083740234375, 0.03802490234375, -0.0276947021484375, 0.043975830078125, -0.034698486328125, 0.01142120361328125, -0.055572509765625, -0.025238037109375, -0.034423828125, -0.0054168701171875, -0.0245208740234375, -0.040985107421875, 0.0272216796875, 0.032867431640625, -0.032501220703125, 0.02471923828125, -0.034912109375, 0.01551055908203125, 0.04046630859375, -0.0057373046875, 0.0196380615234375, -0.00240325927734375, 0.0007748603820800781, 0.01543426513671875, -0.038726806640625, -0.04827880859375, 0.0767822265625, 0.033721923828125, 0.09259033203125, 0.013397216796875, 0.05169677734375, 0.006378173828125, 0.047210693359375, -0.044708251953125, 0.03460693359375, 0.003253936767578125, -0.06829833984375, -0.004001617431640625, -0.02618408203125, -0.06842041015625, 0.04345703125, -0.00624847412109375, -0.037750244140625, 0.01560211181640625, 0.0137481689453125, -0.0140380859375, 0.013763427734375, -0.0467529296875, 0.0689697265625, -0.01512908935546875, -0.0037326812744140625, 0.0034923553466796875, -0.030487060546875, 0.03497314453125, -0.0265045166015625, -0.0017137527465820312, 0.006622314453125, -0.0010995864868164062, 0.05206298828125, -0.0205535888671875, 0.057647705078125, -0.01206207275390625, -0.01299285888671875, 0.026031494140625, -0.021209716796875, 0.01593017578125, -0.0014495849609375, -0.01488494873046875, 0.03839111328125, 0.019012451171875, -0.01322174072265625, -0.03424072265625, 0.045318603515625, -0.0965576171875, -0.046142578125, -0.044219970703125, -0.01311492919921875, 0.008880615234375, 0.0226593017578125, 0.06341552734375, 0.033843994140625, -0.0187225341796875, 0.0216064453125, 
0.044891357421875, -0.00766754150390625, 0.040496826171875, 0.0271759033203125, -0.0295867919921875, -0.047576904296875, 0.056549072265625, 0.005878448486328125, 0.0161895751953125, 0.0252685546875, 0.00008803606033325195, -0.030609130859375, -0.0307464599609375, -0.0154571533203125, 0.031341552734375, -0.055389404296875, -0.0335693359375, -0.047210693359375, -0.05084228515625, -0.025238037109375, 0.01415252685546875, -0.040283203125, 0.0022830963134765625, -0.035491943359375, -0.00743865966796875, 0.0169219970703125, 0.060028076171875, 0.02032470703125, 0.0679931640625, -0.070556640625, 0.028106689453125, -0.0013332366943359375, 0.012054443359375, 0.0295867919921875, -0.075927734375, -0.0169219970703125, -0.0183868408203125, -0.050384521484375, -0.06622314453125, 0.03350830078125, -0.006687164306640625, 0.061553955078125, 0.045166015625, -0.001049041748046875, 0.061248779296875, -0.038238525390625, 0.05963134765625, 0.020965576171875, -0.06683349609375, 0.004039764404296875, -0.0079803466796875, 0.028350830078125, 0.0362548828125, 0.0175628662109375, -0.033477783203125, -0.026458740234375, -0.06365966796875, -0.0653076171875, 0.041656494140625, 0.0112152099609375, 0.014190673828125, 0.0026378631591796875, 0.028411865234375, 0.00804901123046875, 0.024871826171875, -0.073486328125, -0.0523681640625, -0.021820068359375, -0.0277862548828125, -0.005092620849609375, -0.031707763671875, 0.00997161865234375, -0.01468658447265625, 0.0577392578125, 0.01812744140625, 0.0189056396484375, 0.02056884765625, -0.00939178466796875, -0.00673675537109375, 0.00803375244140625, 0.044036865234375, 0.04827880859375, -0.00836944580078125, 0.0008716583251953125, 0.0208587646484375, -0.057098388671875, 0.007778167724609375, 0.02679443359375, -0.0102081298828125, 0.011810302734375, 0.049591064453125, 0.08782958984375, -0.0019121170043945312, -0.041259765625, 0.042236328125, -0.015838623046875, -0.014190673828125, -0.035797119140625, 0.0038471221923828125, 0.006793975830078125, 
0.034881591796875, 0.034881591796875, 0.005290985107421875, 0.00025010108947753906, -0.00439453125, -0.0096435546875, 0.0265960693359375, -0.03521728515625, -0.036834716796875, 0.05548095703125, 0.001194000244140625, -0.01503753662109375, 0.039276123046875, -0.00458526611328125, -0.039031982421875, 0.031707763671875, 0.051971435546875, 0.06201171875, -0.031402587890625, -0.00399017333984375, 0.0195770263671875, 0.028594970703125, -0.043548583984375, 0.019195556640625, 0.0161590576171875, -0.0537109375, -0.01451873779296875, -0.06134033203125, -0.013031005859375, 0.0213470458984375, -0.047607421875, 0.031585693359375, -0.028350830078125, -0.045257568359375, -0.0219879150390625, -0.0194091796875, -0.059356689453125, 0.0056610107421875, 0.00931549072265625, 0.07305908203125, -0.06414794921875, 0.060577392578125, 0.05908203125, -0.0305938720703125, -0.076416015625, -0.010040283203125, -0.0263214111328125, -0.062225341796875, 0.0518798828125, 0.00835418701171875, 0.004108428955078125, 0.0227203369140625, -0.0438232421875, -0.07080078125, 0.09857177734375, 0.041656494140625, -0.04534912109375, -0.01123046875, 0.00884246826171875, 0.050537109375, 0.01512908935546875, 0.03680419921875, 0.027618408203125, 0.029022216796875, 0.0241851806640625, -0.083740234375, -0.006977081298828125, -0.0167999267578125, 0.016754150390625, 0.01259613037109375, -0.0753173828125, 0.06805419921875, 0.0096282958984375, -0.005496978759765625, 0.0162811279296875, 0.058746337890625, 0.02703857421875, 0.0087432861328125, 0.021697998046875, 0.0712890625, 0.044647216796875, -0.0066070556640625, 0.0714111328125, -0.032440185546875, 0.03729248046875, 0.0576171875, -0.0093231201171875, 0.058929443359375, 0.044158935546875, -0.01427459716796875, 0.022186279296875, 0.047821044921875, -0.00485992431640625, 0.0306549072265625, 0.0203094482421875, -0.01238250732421875, -0.0193634033203125, -0.0029144287109375, -0.051300048828125, 0.01953125, 0.0038700103759765625, -0.03692626953125, -0.020050048828125, 
-0.01480865478515625, -0.0018014907836914062, -0.036041259765625, -0.0311126708984375, 0.049835205078125, 0.037261962890625, -0.0352783203125, 0.05615234375, 0.006031036376953125, 0.050048828125, -0.042938232421875, -0.00936126708984375, -0.027008056640625, 0.0279693603515625, -0.0126190185546875, -0.018646240234375, -0.01317596435546875, -0.0021266937255859375, -0.0214080810546875, 0.0036487579345703125, 0.05218505859375, -0.02056884765625, -0.04931640625, 0.0207366943359375, 0.0523681640625, 0.004360198974609375, -0.012237548828125, -0.050079345703125, 0.027618408203125, -0.01229095458984375, -0.0311431884765625, 0.0093231201171875, 0.049163818359375, -0.01438140869140625, 0.06463623046875, 0.03607177734375, 0.005401611328125, 0.03741455078125, 0.018890380859375, 0.0667724609375, -0.06634521484375, -0.02471923828125, -0.07659912109375, 0.039886474609375, 0.0127105712890625, -0.044281005859375, 0.0496826171875, 0.04486083984375, 0.045440673828125, -0.0214385986328125, 0.053955078125, 0.016143798828125, 0.00905609130859375, -0.02069091796875, 0.035797119140625, -0.035308837890625, 0.01012420654296875, -0.00992584228515625, -0.09515380859375, -0.007598876953125, 0.048858642578125, -0.00872039794921875, 0.0181732177734375, 0.055206298828125, 0.0650634765625, -0.002132415771484375, -0.0220184326171875, 0.030487060546875, 0.02557373046875, 0.03857421875, 0.028594970703125, 0.0770263671875, -0.043060302734375, 0.039398193359375, 0.00684356689453125, -0.0191497802734375, -0.00616455078125, -0.04266357421875, -0.08917236328125, -0.035491943359375, 0.0027675628662109375, -0.032806396484375, -0.012847900390625, 0.07159423828125, 0.03594970703125, -0.0377197265625, -0.0185394287109375, 0.001247406005859375, -0.006641387939453125, -0.002094268798828125, -0.01287841796875, 0.0234375, -0.0140838623046875, -0.052215576171875, -0.004184722900390625, 0.00533294677734375, 0.0166168212890625, -0.0286865234375, -0.005573272705078125, 0.0257720947265625, -0.00510406494140625, 
0.0309906005859375, -0.004215240478515625, -0.0714111328125, 0.00588226318359375, 0.00337982177734375, -0.0269775390625, -0.0136566162109375, 0.033111572265625, -0.0362548828125, 0.006317138671875, 0.0273590087890625, 0.055145263671875, 0.0419921875, 0.0137786865234375, 0.02130126953125, -0.0178985595703125, 0.01433563232421875, -0.001544952392578125, 0.02520751953125, 0.00751495361328125, -0.005855560302734375, 0.01500701904296875, 0.0198211669921875, -0.039276123046875, -0.05084228515625, -0.01006317138671875, -0.1077880859375, -0.00279998779296875, 0.099365234375, 0.0133514404296875, -0.0285491943359375, 0.01094818115234375, -0.0477294921875, 0.0259857177734375, -0.037261962890625, 0.04345703125, 0.05389404296875, 0.0169525146484375, -0.0162811279296875, -0.031524658203125, 0.035614013671875, 0.00830078125, -0.048095703125, -0.0082855224609375, 0.0226593017578125, 0.0165863037109375, 0.011993408203125, 0.059722900390625, -0.00330352783203125, 0.0115966796875, -0.0017461776733398438, 0.0215911865234375, -0.01166534423828125, -0.0162353515625, -0.037811279296875, -0.0011491775512695312, 0.032470703125, 0.0006113052368164062 ] ]
google/mt5-large
2023-01-24T16:37:29.000Z
[ "transformers", "pytorch", "tf", "jax", "mt5", "text2text-generation", "multilingual", "af", "am", "ar", "az", "be", "bg", "bn", "ca", "ceb", "co", "cs", "cy", "da", "de", "el", "en", "eo", "es", "et", "eu", "fa", "fi", "fil", "fr", "fy", "ga", "gd", "gl", "gu", "ha", "haw", "hi", "hmn", "ht", "hu", "hy", "ig", "is", "it", "iw", "ja", "jv", "ka", "kk", "km", "kn", "ko", "ku", "ky", "la", "lb", "lo", "lt", "lv", "mg", "mi", "mk", "ml", "mn", "mr", "ms", "mt", "my", "ne", "nl", "no", "ny", "pa", "pl", "ps", "pt", "ro", "ru", "sd", "si", "sk", "sl", "sm", "sn", "so", "sq", "sr", "st", "su", "sv", "sw", "ta", "te", "tg", "th", "tr", "uk", "und", "ur", "uz", "vi", "xh", "yi", "yo", "zh", "zu", "dataset:mc4", "arxiv:2010.11934", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
text2text-generation
google
null
null
google/mt5-large
55
133,517
transformers
2022-03-02T23:29:05
--- language: - multilingual - af - am - ar - az - be - bg - bn - ca - ceb - co - cs - cy - da - de - el - en - eo - es - et - eu - fa - fi - fil - fr - fy - ga - gd - gl - gu - ha - haw - hi - hmn - ht - hu - hy - ig - is - it - iw - ja - jv - ka - kk - km - kn - ko - ku - ky - la - lb - lo - lt - lv - mg - mi - mk - ml - mn - mr - ms - mt - my - ne - nl - no - ny - pa - pl - ps - pt - ro - ru - sd - si - sk - sl - sm - sn - so - sq - sr - st - su - sv - sw - ta - te - tg - th - tr - uk - und - ur - uz - vi - xh - yi - yo - zh - zu datasets: - mc4 license: apache-2.0 --- [Google's mT5](https://github.com/google-research/multilingual-t5) mT5 is pretrained on the [mC4](https://www.tensorflow.org/datasets/catalog/c4#c4multilingual) corpus, covering 101 languages: Afrikaans, Albanian, Amharic, Arabic, Armenian, Azerbaijani, Basque, Belarusian, Bengali, Bulgarian, Burmese, Catalan, Cebuano, Chichewa, Chinese, Corsican, Czech, Danish, Dutch, English, Esperanto, Estonian, Filipino, Finnish, French, Galician, Georgian, German, Greek, Gujarati, Haitian Creole, Hausa, Hawaiian, Hebrew, Hindi, Hmong, Hungarian, Icelandic, Igbo, Indonesian, Irish, Italian, Japanese, Javanese, Kannada, Kazakh, Khmer, Korean, Kurdish, Kyrgyz, Lao, Latin, Latvian, Lithuanian, Luxembourgish, Macedonian, Malagasy, Malay, Malayalam, Maltese, Maori, Marathi, Mongolian, Nepali, Norwegian, Pashto, Persian, Polish, Portuguese, Punjabi, Romanian, Russian, Samoan, Scottish Gaelic, Serbian, Shona, Sindhi, Sinhala, Slovak, Slovenian, Somali, Sotho, Spanish, Sundanese, Swahili, Swedish, Tajik, Tamil, Telugu, Thai, Turkish, Ukrainian, Urdu, Uzbek, Vietnamese, Welsh, West Frisian, Xhosa, Yiddish, Yoruba, Zulu. **Note**: mT5 was only pre-trained on mC4 excluding any supervised training. Therefore, this model has to be fine-tuned before it is useable on a downstream task. 
Pretraining Dataset: [mC4](https://www.tensorflow.org/datasets/catalog/c4#c4multilingual) Other Community Checkpoints: [here](https://huggingface.co/models?search=mt5) Paper: [mT5: A massively multilingual pre-trained text-to-text transformer](https://arxiv.org/abs/2010.11934) Authors: *Linting Xue, Noah Constant, Adam Roberts, Mihir Kale, Rami Al-Rfou, Aditya Siddhant, Aditya Barua, Colin Raffel* ## Abstract The recent "Text-to-Text Transfer Transformer" (T5) leveraged a unified text-to-text format and scale to attain state-of-the-art results on a wide variety of English-language NLP tasks. In this paper, we introduce mT5, a multilingual variant of T5 that was pre-trained on a new Common Crawl-based dataset covering 101 languages. We describe the design and modified training of mT5 and demonstrate its state-of-the-art performance on many multilingual benchmarks. All of the code and model checkpoints used in this work are publicly available.
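mT5 is pre-trained with the same span-corruption ("sentinel masking") objective as T5: contiguous spans are replaced by sentinel tokens in the input, and the target asks the model to reproduce the masked spans. A minimal, hypothetical sketch of the idea — real preprocessing samples span positions and lengths rather than taking them as fixed arguments:

```python
# Minimal sketch of T5/mT5-style span corruption ("sentinel masking").
# Simplified assumption: spans are given as fixed (start, length) pairs
# instead of being sampled as in the real mC4 preprocessing pipeline.

def span_corrupt(tokens, spans):
    """Replace each (start, length) span with a sentinel token and
    build the matching target sequence the model must predict."""
    inputs, targets = [], []
    pos = 0
    for i, (start, length) in enumerate(spans):
        sentinel = f"<extra_id_{i}>"
        inputs.extend(tokens[pos:start])   # keep text before the span
        inputs.append(sentinel)            # mask the span in the input
        targets.append(sentinel)           # target: sentinel + masked text
        targets.extend(tokens[start:start + length])
        pos = start + length
    inputs.extend(tokens[pos:])
    targets.append(f"<extra_id_{len(spans)}>")  # closing sentinel
    return inputs, targets

tokens = "The quick brown fox jumps over the lazy dog".split()
inputs, targets = span_corrupt(tokens, [(1, 2), (5, 1)])
print(inputs)   # ['The', '<extra_id_0>', 'fox', 'jumps', '<extra_id_1>', 'the', 'lazy', 'dog']
print(targets)  # ['<extra_id_0>', 'quick', 'brown', '<extra_id_1>', 'over', '<extra_id_2>']
```

The `<extra_id_N>` names mirror the sentinel tokens in the released T5/mT5 vocabularies; the masking statistics here are illustrative only.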
2,826
xlm-roberta-large-finetuned-conll03-english
2022-07-22T08:04:08.000Z
[ "transformers", "pytorch", "rust", "xlm-roberta", "token-classification", "multilingual", "af", "am", "ar", "as", "az", "be", "bg", "bn", "br", "bs", "ca", "cs", "cy", "da", "de", "el", "en", "eo", "es", "et", "eu", "fa", "fi", "fr", "fy", "ga", "gd", "gl", "gu", "ha", "he", "hi", "hr", "hu", "hy", "id", "is", "it", "ja", "jv", "ka", "kk", "km", "kn", "ko", "ku", "ky", "la", "lo", "lt", "lv", "mg", "mk", "ml", "mn", "mr", "ms", "my", "ne", "nl", "no", "om", "or", "pa", "pl", "ps", "pt", "ro", "ru", "sa", "sd", "si", "sk", "sl", "so", "sq", "sr", "su", "sv", "sw", "ta", "te", "th", "tl", "tr", "ug", "uk", "ur", "uz", "vi", "xh", "yi", "zh", "arxiv:1911.02116", "arxiv:2008.03415", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
token-classification
null
null
null
xlm-roberta-large-finetuned-conll03-english
70
132,942
transformers
2022-03-02T23:29:04
--- language: - multilingual - af - am - ar - as - az - be - bg - bn - br - bs - ca - cs - cy - da - de - el - en - eo - es - et - eu - fa - fi - fr - fy - ga - gd - gl - gu - ha - he - hi - hr - hu - hy - id - is - it - ja - jv - ka - kk - km - kn - ko - ku - ky - la - lo - lt - lv - mg - mk - ml - mn - mr - ms - my - ne - nl - no - om - or - pa - pl - ps - pt - ro - ru - sa - sd - si - sk - sl - so - sq - sr - su - sv - sw - ta - te - th - tl - tr - ug - uk - ur - uz - vi - xh - yi - zh --- # xlm-roberta-large-finetuned-conll03-english # Table of Contents 1. [Model Details](#model-details) 2. [Uses](#uses) 3. [Bias, Risks, and Limitations](#bias-risks-and-limitations) 4. [Training](#training) 5. [Evaluation](#evaluation) 6. [Environmental Impact](#environmental-impact) 7. [Technical Specifications](#technical-specifications) 8. [Citation](#citation) 9. [Model Card Authors](#model-card-authors) 10. [How To Get Started With the Model](#how-to-get-started-with-the-model) # Model Details ## Model Description The XLM-RoBERTa model was proposed in [Unsupervised Cross-lingual Representation Learning at Scale](https://arxiv.org/abs/1911.02116) by Alexis Conneau, Kartikay Khandelwal, Naman Goyal, Vishrav Chaudhary, Guillaume Wenzek, Francisco Guzmán, Edouard Grave, Myle Ott, Luke Zettlemoyer and Veselin Stoyanov. It is based on Facebook's RoBERTa model released in 2019. It is a large multi-lingual language model, trained on 2.5TB of filtered CommonCrawl data. This model is [XLM-RoBERTa-large](https://huggingface.co/xlm-roberta-large) fine-tuned with the [conll2003](https://huggingface.co/datasets/conll2003) dataset in English. 
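Fine-tuning on CoNLL-2003 requires aligning the word-level BIO labels to XLM-R's subword tokens. A simplified, hypothetical sketch of that alignment — real code would obtain `word_ids` from the tokenizer's `word_ids()` output rather than hard-coding it:

```python
# Hypothetical simplified sketch of aligning word-level NER labels to
# subword tokens for fine-tuning. Only the first subword of each word
# keeps its label; continuations and special tokens get -100 so the
# cross-entropy loss ignores them.

def align_labels(word_labels, word_ids, ignore_index=-100):
    """word_ids maps each subword token to its source word index
    (None for special tokens such as <s> and </s>)."""
    aligned, prev = [], None
    for wid in word_ids:
        if wid is None:
            aligned.append(ignore_index)      # special token
        elif wid != prev:
            aligned.append(word_labels[wid])  # first subword of the word
        else:
            aligned.append(ignore_index)      # continuation subword
        prev = wid
    return aligned

# Words: ["Omar", "lives", "in", "Zürich"] with integer labels
# (illustrative ids: 1 = B-PER, 0 = O, 5 = B-LOC)
labels = [1, 0, 0, 5]
word_ids = [None, 0, 1, 2, 3, 3, None]  # e.g. "Zürich" split into 2 subwords
print(align_labels(labels, word_ids))   # [-100, 1, 0, 0, 5, -100, -100]
```

The label ids and the subword split shown are assumptions for illustration, not the exact CoNLL-2003 preprocessing.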
- **Developed by:** See [associated paper](https://arxiv.org/abs/1911.02116) - **Model type:** Multi-lingual language model - **Language(s) (NLP) or Countries (images):** XLM-RoBERTa is a multilingual model trained on 100 different languages; see [GitHub Repo](https://github.com/facebookresearch/fairseq/tree/main/examples/xlmr) for full list; model is fine-tuned on a dataset in English - **License:** More information needed - **Related Models:** [RoBERTa](https://huggingface.co/roberta-base), [XLM](https://huggingface.co/docs/transformers/model_doc/xlm) - **Parent Model:** [XLM-RoBERTa-large](https://huggingface.co/xlm-roberta-large) - **Resources for more information:** - [GitHub Repo](https://github.com/facebookresearch/fairseq/tree/main/examples/xlmr) - [Associated Paper](https://arxiv.org/abs/1911.02116) # Uses ## Direct Use The model is a language model. The model can be used for token classification, a natural language understanding task in which a label is assigned to some tokens in a text. ## Downstream Use Potential downstream use cases include Named Entity Recognition (NER) and Part-of-Speech (PoS) tagging. To learn more about token classification and other potential downstream use cases, see the Hugging Face [token classification docs](https://huggingface.co/tasks/token-classification). ## Out-of-Scope Use The model should not be used to intentionally create hostile or alienating environments for people. # Bias, Risks, and Limitations **CONTENT WARNING: Readers should be made aware that language generated by this model may be disturbing or offensive to some and may propagate historical and current stereotypes.** Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)). In the context of tasks relevant to this model, [Mishra et al. 
(2020)](https://arxiv.org/pdf/2008.03415.pdf) explore social biases in NER systems for English and find that there is systematic bias in existing NER systems in that they fail to identify named entities from different demographic groups (though this paper did not look at BERT). For example, using a sample sentence from [Mishra et al. (2020)](https://arxiv.org/pdf/2008.03415.pdf): ```python >>> from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline >>> tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-large-finetuned-conll03-english") >>> model = AutoModelForTokenClassification.from_pretrained("xlm-roberta-large-finetuned-conll03-english") >>> classifier = pipeline("ner", model=model, tokenizer=tokenizer) >>> classifier("Alya told Jasmine that Andrew could pay with cash..") [{'end': 2, 'entity': 'I-PER', 'index': 1, 'score': 0.9997861, 'start': 0, 'word': '▁Al'}, {'end': 4, 'entity': 'I-PER', 'index': 2, 'score': 0.9998591, 'start': 2, 'word': 'ya'}, {'end': 16, 'entity': 'I-PER', 'index': 4, 'score': 0.99995816, 'start': 10, 'word': '▁Jasmin'}, {'end': 17, 'entity': 'I-PER', 'index': 5, 'score': 0.9999584, 'start': 16, 'word': 'e'}, {'end': 29, 'entity': 'I-PER', 'index': 7, 'score': 0.99998057, 'start': 23, 'word': '▁Andrew'}] ``` ## Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. # Training See the following resources for training data and training procedure details: - [XLM-RoBERTa-large model card](https://huggingface.co/xlm-roberta-large) - [CoNLL-2003 data card](https://huggingface.co/datasets/conll2003) - [Associated paper](https://arxiv.org/pdf/1911.02116.pdf) # Evaluation See the [associated paper](https://arxiv.org/pdf/1911.02116.pdf) for evaluation details. # Environmental Impact Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). 
- **Hardware Type:** 500 32GB Nvidia V100 GPUs (from the [associated paper](https://arxiv.org/pdf/1911.02116.pdf)) - **Hours used:** More information needed - **Cloud Provider:** More information needed - **Compute Region:** More information needed - **Carbon Emitted:** More information needed # Technical Specifications See the [associated paper](https://arxiv.org/pdf/1911.02116.pdf) for further details. # Citation **BibTeX:** ```bibtex @article{conneau2019unsupervised, title={Unsupervised Cross-lingual Representation Learning at Scale}, author={Conneau, Alexis and Khandelwal, Kartikay and Goyal, Naman and Chaudhary, Vishrav and Wenzek, Guillaume and Guzm{\'a}n, Francisco and Grave, Edouard and Ott, Myle and Zettlemoyer, Luke and Stoyanov, Veselin}, journal={arXiv preprint arXiv:1911.02116}, year={2019} } ``` **APA:** - Conneau, A., Khandelwal, K., Goyal, N., Chaudhary, V., Wenzek, G., Guzmán, F., ... & Stoyanov, V. (2019). Unsupervised cross-lingual representation learning at scale. arXiv preprint arXiv:1911.02116. # Model Card Authors This model card was written by the team at Hugging Face. # How to Get Started with the Model Use the code below to get started with the model. You can use this model directly within a pipeline for NER. <details> <summary> Click to expand </summary> ```python >>> from transformers import AutoTokenizer, AutoModelForTokenClassification >>> from transformers import pipeline >>> tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-large-finetuned-conll03-english") >>> model = AutoModelForTokenClassification.from_pretrained("xlm-roberta-large-finetuned-conll03-english") >>> classifier = pipeline("ner", model=model, tokenizer=tokenizer) >>> classifier("Hello I'm Omar and I live in Zürich.") [{'end': 14, 'entity': 'I-PER', 'index': 5, 'score': 0.9999175, 'start': 10, 'word': '▁Omar'}, {'end': 35, 'entity': 'I-LOC', 'index': 10, 'score': 0.9999906, 'start': 29, 'word': '▁Zürich'}] ``` </details>
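The pipeline outputs above are per-subword ('▁Al' + 'ya' for "Alya"). A hypothetical simplified post-processing step can merge adjacent predictions of the same entity type into whole-entity spans; the transformers pipeline offers comparable behavior via its `aggregation_strategy` option:

```python
# Hypothetical sketch: merge adjacent subword predictions of the same
# entity type into whole-entity spans, using the character offsets the
# pipeline returns. Scores/indices are omitted for brevity.

def merge_entities(text, preds):
    merged = []
    for p in preds:
        if (merged and p["entity"] == merged[-1]["entity"]
                and p["start"] == merged[-1]["end"]):
            merged[-1]["end"] = p["end"]  # extend the running span
        else:
            merged.append({"entity": p["entity"],
                           "start": p["start"], "end": p["end"]})
    for m in merged:
        m["word"] = text[m["start"]:m["end"]]  # recover surface form
    return merged

text = "Alya told Jasmine that Andrew could pay with cash.."
preds = [
    {"entity": "I-PER", "start": 0, "end": 2},    # ▁Al
    {"entity": "I-PER", "start": 2, "end": 4},    # ya
    {"entity": "I-PER", "start": 10, "end": 16},  # ▁Jasmin
    {"entity": "I-PER", "start": 16, "end": 17},  # e
    {"entity": "I-PER", "start": 23, "end": 29},  # ▁Andrew
]
print(merge_entities(text, preds))  # spans for "Alya", "Jasmine", "Andrew"
```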
7,668
0.01081085205078125, 0.0226593017578125, -0.0022411346435546875, -0.0244293212890625, 0.04766845703125, 0.01390838623046875, -0.0506591796875, 0.049957275390625, 0.0081329345703125, 0.0643310546875, -0.03338623046875, 0.003322601318359375, -0.00559234619140625, 0.017852783203125, -0.0189361572265625, -0.044891357421875, 0.01009368896484375, -0.00494384765625, -0.01482391357421875, -0.01483154296875, 0.0257568359375, -0.042083740234375, -0.0589599609375, 0.0531005859375, 0.023590087890625, 0.0167236328125, 0.00876617431640625, -0.08135986328125, 0.00001537799835205078, 0.004085540771484375, -0.021453857421875, 0.026397705078125, 0.042724609375, -0.00843048095703125, 0.04864501953125, 0.0462646484375, 0.007080078125, -0.0004355907440185547, 0.01100921630859375, 0.053558349609375, -0.0498046875, -0.033050537109375, -0.047149658203125, 0.037811279296875, -0.007022857666015625, -0.0222320556640625, 0.0714111328125, 0.057830810546875, 0.087890625, 0.0112762451171875, 0.055450439453125, -0.01181793212890625, 0.039794921875, -0.0361328125, 0.048675537109375, -0.05853271484375, 0.01451873779296875, -0.031982421875, -0.0682373046875, -0.0263671875, 0.053253173828125, -0.02947998046875, 0.0305633544921875, 0.047027587890625, 0.0596923828125, -0.0011911392211914062, -0.018890380859375, 0.01277923583984375, 0.031951904296875, 0.00983428955078125, 0.043792724609375, 0.0278778076171875, -0.05535888671875, 0.047393798828125, -0.0308990478515625, -0.01522064208984375, -0.01239013671875, -0.064697265625, -0.0665283203125, -0.056732177734375, -0.03753662109375, -0.03167724609375, -0.021087646484375, 0.07476806640625, 0.058013916015625, -0.06231689453125, -0.0231475830078125, -0.007083892822265625, 0.004093170166015625, -0.016357421875, -0.0212554931640625, 0.043914794921875, -0.021636962890625, -0.07440185546875, -0.005512237548828125, -0.00382232666015625, 0.007587432861328125, -0.0258941650390625, -0.0246734619140625, -0.027099609375, -0.0053863525390625, 0.037200927734375, 
0.0147247314453125, -0.053131103515625, -0.017333984375, 0.00780487060546875, -0.00533294677734375, 0.005664825439453125, 0.0155181884765625, -0.04083251953125, 0.02276611328125, 0.0188751220703125, 0.0284271240234375, 0.053070068359375, -0.02239990234375, 0.022186279296875, -0.0352783203125, 0.0147857666015625, 0.008544921875, 0.052734375, 0.037445068359375, -0.032958984375, 0.037078857421875, 0.0184783935546875, -0.04766845703125, -0.059844970703125, -0.006622314453125, -0.08111572265625, -0.0198974609375, 0.0963134765625, -0.016693115234375, -0.032928466796875, -0.0005850791931152344, -0.018218994140625, 0.02880859375, -0.025238037109375, 0.04119873046875, 0.050323486328125, 0.00954437255859375, -0.024383544921875, -0.029388427734375, 0.0222625732421875, 0.02471923828125, -0.045684814453125, -0.005893707275390625, 0.0247955322265625, 0.038543701171875, 0.026214599609375, 0.032501220703125, -0.0217437744140625, -0.0002460479736328125, -0.009979248046875, 0.0287933349609375, 0.0089569091796875, -0.0010499954223632812, -0.0260162353515625, -0.0074462890625, -0.005855560302734375, 0.003734588623046875 ] ]
facebook/vit-mae-base
2023-06-13T19:42:42.000Z
[ "transformers", "pytorch", "tf", "vit_mae", "pretraining", "vision", "dataset:imagenet-1k", "arxiv:2111.06377", "license:apache-2.0", "endpoints_compatible", "region:us" ]
null
facebook
null
null
facebook/vit-mae-base
15
131,961
transformers
2022-03-02T23:29:05
--- license: apache-2.0 tags: - vision datasets: - imagenet-1k --- # Vision Transformer (base-sized model) pre-trained with MAE Vision Transformer (ViT) model pre-trained using the MAE method. It was introduced in the paper [Masked Autoencoders Are Scalable Vision Learners](https://arxiv.org/abs/2111.06377) by Kaiming He, Xinlei Chen, Saining Xie, Yanghao Li, Piotr Dollár, Ross Girshick and first released in [this repository](https://github.com/facebookresearch/mae). Disclaimer: The team releasing MAE did not write a model card for this model so this model card has been written by the Hugging Face team. ## Model description The Vision Transformer (ViT) is a transformer encoder model (BERT-like). Images are presented to the model as a sequence of fixed-size patches. During pre-training, one randomly masks out a high portion (75%) of the image patches. First, the encoder is used to encode the visual patches. Next, a learnable (shared) mask token is added at the positions of the masked patches. The decoder takes the encoded visual patches and mask tokens as input and reconstructs raw pixel values for the masked positions. By pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder. ## Intended uses & limitations You can use the raw model for image classification. See the [model hub](https://huggingface.co/models?search=facebook/vit-mae) to look for fine-tuned versions on a task that interests you. 
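The random 75% patch-masking step described above can be sketched without any framework dependencies. A minimal illustration is below; the 196-patch count assumes a 224×224 image with 16×16 patches (the base model's defaults), and note the actual ViT-MAE implementation samples per-image uniform noise and argsorts it rather than shuffling indices — this is an equivalent simplification, not the library's code:

```python
import random

def random_mask(num_patches: int, mask_ratio: float = 0.75, seed: int = 0):
    """Pick which patches are kept vs. masked, as in MAE pre-training.

    Returns (kept, masked) index lists; the encoder only ever sees `kept`,
    and the decoder reconstructs pixel values at the `masked` positions.
    """
    rng = random.Random(seed)
    idx = list(range(num_patches))
    rng.shuffle(idx)
    num_keep = int(num_patches * (1 - mask_ratio))
    kept = sorted(idx[:num_keep])
    masked = sorted(idx[num_keep:])
    return kept, masked

# 14x14 = 196 patches for a 224x224 image with 16x16 patches.
kept, masked = random_mask(196)
print(len(kept), len(masked))  # -> 49 147
```

Because only 49 of 196 patches reach the encoder, pre-training is substantially cheaper per image than processing the full sequence.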
### How to use Here is how to use this model: ```python from transformers import AutoImageProcessor, ViTMAEForPreTraining from PIL import Image import requests url = 'http://images.cocodataset.org/val2017/000000039769.jpg' image = Image.open(requests.get(url, stream=True).raw) processor = AutoImageProcessor.from_pretrained('facebook/vit-mae-base') model = ViTMAEForPreTraining.from_pretrained('facebook/vit-mae-base') inputs = processor(images=image, return_tensors="pt") outputs = model(**inputs) loss = outputs.loss mask = outputs.mask ids_restore = outputs.ids_restore ``` ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2111-06377, author = {Kaiming He and Xinlei Chen and Saining Xie and Yanghao Li and Piotr Doll{\'{a}}r and Ross B. Girshick}, title = {Masked Autoencoders Are Scalable Vision Learners}, journal = {CoRR}, volume = {abs/2111.06377}, year = {2021}, url = {https://arxiv.org/abs/2111.06377}, eprinttype = {arXiv}, eprint = {2111.06377}, timestamp = {Tue, 16 Nov 2021 12:12:31 +0100}, biburl = {https://dblp.org/rec/journals/corr/abs-2111-06377.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ```
2,967
[ [ -0.0457763671875, -0.035400390625, -0.0036468505859375, 0.0116119384765625, -0.0211181640625, -0.00463104248046875, 0.007411956787109375, -0.03839111328125, 0.036468505859375, 0.0325927734375, -0.045196533203125, -0.02001953125, -0.06463623046875, -0.0085601806640625, -0.021514892578125, 0.06414794921875, -0.0031375885009765625, -0.014068603515625, -0.0031948089599609375, -0.01439666748046875, -0.028076171875, -0.042877197265625, -0.039459228515625, -0.0021381378173828125, 0.0106353759765625, 0.00023257732391357422, 0.041107177734375, 0.053955078125, 0.0579833984375, 0.033050537109375, 0.0057220458984375, 0.0037174224853515625, -0.031463623046875, -0.0209808349609375, 0.01319122314453125, -0.02947998046875, -0.043426513671875, 0.01898193359375, 0.043731689453125, 0.0263824462890625, 0.015533447265625, 0.036285400390625, 0.0075531005859375, 0.042724609375, -0.057861328125, 0.0201416015625, -0.036346435546875, 0.021636962890625, -0.005580902099609375, 0.0014562606811523438, -0.0240631103515625, -0.0153961181640625, 0.00899505615234375, -0.046661376953125, 0.035919189453125, -0.00545501708984375, 0.10821533203125, 0.037811279296875, -0.012481689453125, 0.0038623809814453125, -0.0377197265625, 0.04345703125, -0.032440185546875, 0.03253173828125, 0.0214385986328125, 0.038421630859375, 0.0178985595703125, -0.07232666015625, -0.0352783203125, -0.0106048583984375, -0.0282440185546875, 0.01419830322265625, -0.0253143310546875, 0.007610321044921875, 0.0360107421875, 0.047119140625, -0.042022705078125, -0.016326904296875, -0.04840087890625, -0.020355224609375, 0.047943115234375, -0.01145172119140625, 0.0146636962890625, -0.0202789306640625, -0.03814697265625, -0.0288238525390625, -0.0172119140625, 0.0224151611328125, 0.00791168212890625, -0.024017333984375, -0.02178955078125, 0.036468505859375, -0.01885986328125, 0.048187255859375, 0.050201416015625, -0.00860595703125, 0.048248291015625, -0.01148223876953125, -0.037200927734375, -0.0220184326171875, 0.06561279296875, 
0.035888671875, 0.0230865478515625, 0.009765625, -0.0206298828125, 0.0122528076171875, 0.0176849365234375, -0.0670166015625, -0.039154052734375, -0.0189361572265625, -0.047210693359375, -0.0296783447265625, 0.02191162109375, -0.048095703125, 0.0106048583984375, -0.01666259765625, 0.06463623046875, -0.007724761962890625, -0.0172271728515625, -0.01224517822265625, 0.00960540771484375, 0.03851318359375, 0.00839996337890625, -0.053131103515625, 0.0180511474609375, 0.002040863037109375, 0.0625, -0.0087738037109375, -0.0175628662109375, -0.0209503173828125, -0.0157623291015625, -0.0258331298828125, 0.037750244140625, -0.01161956787109375, -0.01535797119140625, -0.010345458984375, 0.042236328125, -0.0165863037109375, -0.0391845703125, 0.01105499267578125, -0.0362548828125, 0.01033782958984375, -0.0079803466796875, -0.01052093505859375, -0.0255584716796875, 0.01446533203125, -0.051483154296875, 0.07843017578125, 0.0190582275390625, -0.045562744140625, 0.02813720703125, -0.04803466796875, -0.00897979736328125, 0.0101776123046875, 0.006702423095703125, -0.05169677734375, -0.0037212371826171875, 0.0428466796875, 0.050567626953125, 0.01261138916015625, 0.00921630859375, -0.0230712890625, -0.0231781005859375, 0.00799560546875, -0.0287017822265625, 0.07373046875, 0.016143798828125, -0.0309295654296875, 0.0014371871948242188, -0.052734375, -0.01276397705078125, 0.0244293212890625, -0.002109527587890625, -0.01052093505859375, -0.0287017822265625, -0.00753021240234375, 0.0243682861328125, 0.0255279541015625, -0.040130615234375, 0.0013971328735351562, -0.0138092041015625, 0.0312347412109375, 0.050018310546875, -0.0006055831909179688, 0.039703369140625, -0.0166778564453125, 0.035369873046875, 0.0025806427001953125, 0.060150146484375, -0.045928955078125, -0.040802001953125, -0.08184814453125, -0.03564453125, 0.0027713775634765625, 0.031890869140625, -0.057403564453125, 0.03570556640625, -0.01372528076171875, -0.038421630859375, -0.06304931640625, 0.0023059844970703125, 
0.0372314453125, 0.04541015625, 0.0406494140625, -0.0462646484375, -0.05255126953125, -0.0693359375, 0.01259613037109375, 0.0019989013671875, 0.0064849853515625, 0.0211181640625, 0.045074462890625, -0.0435791015625, 0.0802001953125, -0.04296875, -0.0222320556640625, 0.0129852294921875, -0.001049041748046875, 0.00995635986328125, 0.052734375, 0.0584716796875, -0.06756591796875, -0.049713134765625, -0.02520751953125, -0.06689453125, -0.006641387939453125, 0.00899505615234375, -0.0206298828125, 0.0256195068359375, 0.036346435546875, -0.0352783203125, 0.06298828125, 0.039886474609375, -0.0098876953125, 0.0295257568359375, -0.0030307769775390625, 0.0110015869140625, -0.0758056640625, -0.00222015380859375, 0.0088958740234375, -0.033050537109375, -0.0307159423828125, 0.0078277587890625, 0.01541900634765625, -0.01116943359375, -0.0289306640625, 0.034332275390625, -0.04864501953125, -0.0019140243530273438, -0.0088958740234375, -0.036651611328125, 0.005474090576171875, 0.05908203125, 0.0177459716796875, 0.0311431884765625, 0.05615234375, -0.039886474609375, 0.039337158203125, 0.020355224609375, -0.01216888427734375, 0.048919677734375, -0.058746337890625, 0.004055023193359375, -0.0149993896484375, 0.0160675048828125, -0.07080078125, -0.02239990234375, 0.031890869140625, -0.04266357421875, 0.04071044921875, -0.02874755859375, -0.0106658935546875, -0.06439208984375, -0.019561767578125, 0.040008544921875, 0.058563232421875, -0.05609130859375, 0.0362548828125, 0.03436279296875, 0.0138397216796875, -0.049346923828125, -0.055023193359375, -0.0015087127685546875, -0.007015228271484375, -0.04278564453125, 0.0295562744140625, -0.01441192626953125, 0.0117645263671875, 0.020660400390625, 0.0014982223510742188, -0.00927734375, -0.01922607421875, 0.03717041015625, 0.047698974609375, -0.0233612060546875, -0.01678466796875, -0.0156707763671875, -0.0261993408203125, 0.001430511474609375, -0.01898193359375, 0.0171661376953125, -0.03936767578125, -0.0263519287109375, -0.03314208984375, 
0.01393890380859375, 0.038848876953125, -0.025604248046875, 0.04266357421875, 0.0701904296875, -0.040618896484375, 0.0108642578125, -0.066162109375, -0.01110076904296875, -0.04168701171875, 0.0236053466796875, -0.03216552734375, -0.058624267578125, 0.047943115234375, 0.0217437744140625, -0.00438690185546875, 0.06317138671875, 0.04913330078125, -0.01303863525390625, 0.05810546875, 0.0628662109375, 0.0095672607421875, 0.050567626953125, -0.0604248046875, -0.0010080337524414062, -0.06549072265625, -0.0343017578125, 0.01102447509765625, -0.018768310546875, -0.04071044921875, -0.04205322265625, 0.0236663818359375, -0.0075531005859375, -0.0242462158203125, 0.046722412109375, -0.06671142578125, 0.030120849609375, 0.0516357421875, 0.019256591796875, -0.004222869873046875, 0.004421234130859375, 0.00894927978515625, -0.002044677734375, -0.042877197265625, -0.006519317626953125, 0.07318115234375, 0.041351318359375, 0.0631103515625, -0.029327392578125, 0.046630859375, 0.007457733154296875, 0.0230560302734375, -0.046112060546875, 0.041351318359375, -0.00841522216796875, -0.04718017578125, -0.00617218017578125, -0.014556884765625, -0.05889892578125, 0.01212310791015625, -0.0298614501953125, -0.03753662109375, 0.053802490234375, 0.0203857421875, -0.0212554931640625, 0.0232391357421875, -0.061004638671875, 0.06341552734375, -0.0140838623046875, -0.0118865966796875, 0.0222625732421875, -0.068115234375, 0.0360107421875, -0.0107879638671875, -0.007671356201171875, 0.01374053955078125, 0.0266571044921875, 0.06829833984375, -0.0469970703125, 0.07220458984375, -0.0186920166015625, 0.021209716796875, 0.038330078125, -0.0200042724609375, 0.0156402587890625, -0.00789642333984375, 0.0400390625, 0.044342041015625, 0.0025634765625, -0.0292816162109375, -0.02899169921875, 0.0287017822265625, -0.0653076171875, -0.036102294921875, -0.0305023193359375, -0.0313720703125, 0.010467529296875, 0.006809234619140625, 0.06781005859375, 0.03411865234375, 0.006595611572265625, 0.0223236083984375, 
0.054840087890625, -0.017181396484375, 0.0428466796875, 0.01378631591796875, -0.01641845703125, -0.035400390625, 0.06396484375, 0.0233612060546875, 0.027008056640625, 0.02191162109375, -0.0009241104125976562, -0.0098724365234375, -0.0038623809814453125, -0.0229644775390625, 0.0277252197265625, -0.0504150390625, -0.02850341796875, -0.051300048828125, -0.061248779296875, -0.045623779296875, -0.03424072265625, -0.0445556640625, -0.017486572265625, -0.020843505859375, 0.00004881620407104492, 0.0240325927734375, 0.038787841796875, -0.02496337890625, 0.0360107421875, -0.0419921875, 0.0345458984375, 0.05987548828125, 0.019775390625, -0.008575439453125, -0.052398681640625, -0.026611328125, 0.0010728836059570312, -0.022705078125, -0.057220458984375, 0.04498291015625, 0.0113677978515625, 0.0499267578125, 0.042266845703125, -0.006031036376953125, 0.061370849609375, -0.023712158203125, 0.044281005859375, 0.029449462890625, -0.050201416015625, 0.0494384765625, -0.003631591796875, 0.013946533203125, 0.0225830078125, 0.0257568359375, -0.00951385498046875, -0.006542205810546875, -0.058929443359375, -0.0555419921875, 0.048919677734375, 0.0158538818359375, 0.0103759765625, 0.0228118896484375, 0.031524658203125, -0.0107879638671875, 0.006992340087890625, -0.07537841796875, -0.030303955078125, -0.061676025390625, -0.01233673095703125, -0.010284423828125, -0.032470703125, -0.0005974769592285156, -0.05609130859375, 0.041107177734375, 0.0006728172302246094, 0.054290771484375, 0.0272369384765625, -0.037261962890625, -0.00830078125, -0.036102294921875, 0.018524169921875, 0.021484375, -0.029022216796875, 0.00670623779296875, 0.00897216796875, -0.069091796875, 0.0019216537475585938, 0.01526641845703125, -0.0226287841796875, 0.00405120849609375, 0.0260772705078125, 0.0826416015625, 0.00351715087890625, -0.00434112548828125, 0.056854248046875, 0.004474639892578125, -0.0307159423828125, -0.0268096923828125, 0.003971099853515625, -0.019561767578125, 0.02874755859375, 0.03759765625, 
0.01428985595703125, 0.006702423095703125, -0.0263824462890625, 0.017059326171875, 0.027313232421875, -0.031341552734375, -0.0244598388671875, 0.057891845703125, -0.0010786056518554688, -0.017791748046875, 0.037078857421875, -0.0119476318359375, -0.052337646484375, 0.063720703125, 0.035186767578125, 0.07464599609375, -0.033599853515625, 0.0178070068359375, 0.054595947265625, 0.040435791015625, 0.001495361328125, -0.015045166015625, -0.0089569091796875, -0.053802490234375, -0.0213775634765625, -0.0557861328125, -0.004180908203125, 0.015106201171875, -0.0557861328125, 0.01177978515625, -0.02130126953125, -0.020843505859375, -0.00266265869140625, 0.006320953369140625, -0.07269287109375, 0.0286712646484375, 0.023834228515625, 0.052398681640625, -0.067138671875, 0.05572509765625, 0.04974365234375, -0.041473388671875, -0.062744140625, -0.0252227783203125, -0.005794525146484375, -0.07415771484375, 0.05322265625, 0.027191162109375, 0.004917144775390625, 0.0118255615234375, -0.0670166015625, -0.07122802734375, 0.0946044921875, 0.0188140869140625, -0.031768798828125, -0.001323699951171875, -0.01364898681640625, 0.0167999267578125, -0.0404052734375, 0.015777587890625, 0.0137176513671875, 0.0179901123046875, 0.03668212890625, -0.06439208984375, 0.001941680908203125, -0.0396728515625, 0.01535797119140625, -0.005962371826171875, -0.0499267578125, 0.0797119140625, -0.018524169921875, -0.00009179115295410156, -0.0112457275390625, 0.049713134765625, -0.0023822784423828125, 0.027008056640625, 0.04791259765625, 0.046875, 0.035614013671875, -0.0011692047119140625, 0.06427001953125, -0.006305694580078125, 0.0455322265625, 0.05218505859375, 0.0252838134765625, 0.03387451171875, 0.0191497802734375, -0.01259613037109375, 0.03863525390625, 0.0582275390625, -0.036651611328125, 0.050811767578125, -0.01546478271484375, 0.003108978271484375, -0.0147247314453125, 0.0208892822265625, -0.03924560546875, 0.031829833984375, 0.020233154296875, -0.038116455078125, 0.0010614395141601562, 
0.03350830078125, -0.00748443603515625, -0.03167724609375, -0.03173828125, 0.038970947265625, 0.009735107421875, -0.033905029296875, 0.0579833984375, -0.019195556640625, 0.06182861328125, -0.03759765625, 0.0024013519287109375, -0.01531219482421875, 0.0162353515625, -0.019744873046875, -0.0447998046875, 0.0304412841796875, -0.0022373199462890625, -0.01477813720703125, -0.00638580322265625, 0.0748291015625, -0.0228118896484375, -0.050445556640625, 0.01104736328125, 0.010498046875, 0.0177764892578125, -0.0020236968994140625, -0.058868408203125, -0.00150299072265625, -0.00975799560546875, -0.038482666015625, 0.017547607421875, 0.01219940185546875, -0.0018863677978515625, 0.041107177734375, 0.05419921875, -0.00836944580078125, 0.03167724609375, -0.027587890625, 0.0767822265625, -0.0304718017578125, -0.03387451171875, -0.05438232421875, 0.0452880859375, -0.014251708984375, -0.010284423828125, 0.053314208984375, 0.02581787109375, 0.06866455078125, -0.0231170654296875, 0.036163330078125, -0.01023101806640625, 0.0038909912109375, -0.0262603759765625, 0.05828857421875, -0.040069580078125, -0.01410675048828125, -0.035980224609375, -0.07452392578125, -0.0263824462890625, 0.0850830078125, -0.01192474365234375, 0.01308441162109375, 0.03668212890625, 0.07373046875, -0.0221405029296875, -0.0335693359375, 0.0281982421875, 0.0222625732421875, 0.0005192756652832031, 0.03924560546875, 0.038330078125, -0.058197021484375, 0.03289794921875, -0.03955078125, -0.03814697265625, -0.026702880859375, -0.05126953125, -0.073974609375, -0.0638427734375, -0.03851318359375, -0.0304412841796875, -0.0007157325744628906, 0.0391845703125, 0.08807373046875, -0.046722412109375, 0.0030345916748046875, 0.00824737548828125, -0.006496429443359375, -0.01174163818359375, -0.0189361572265625, 0.039154052734375, -0.0032176971435546875, -0.0504150390625, -0.01486968994140625, 0.011322021484375, 0.0232696533203125, -0.0282135009765625, -0.0189666748046875, -0.008636474609375, 0.0021839141845703125, 0.0546875, 
0.03656005859375, -0.03826904296875, -0.0171966552734375, 0.014129638671875, -0.011810302734375, 0.009735107421875, 0.0207366943359375, -0.075439453125, 0.044891357421875, 0.0225067138671875, 0.031890869140625, 0.069091796875, -0.00879669189453125, 0.007175445556640625, -0.04180908203125, 0.025543212890625, -0.011077880859375, 0.03466796875, 0.0233154296875, -0.031158447265625, 0.04559326171875, 0.051849365234375, -0.041259765625, -0.06121826171875, 0.0007352828979492188, -0.09368896484375, 0.0046234130859375, 0.07452392578125, -0.027008056640625, -0.0321044921875, 0.016510009765625, -0.0178985595703125, 0.0302581787109375, 0.00749969482421875, 0.04010009765625, 0.026611328125, 0.0189971923828125, -0.0443115234375, -0.035980224609375, 0.02899169921875, -0.01265716552734375, -0.035858154296875, -0.044525146484375, 0.0224609375, 0.03265380859375, 0.024566650390625, 0.07177734375, -0.0179595947265625, 0.01861572265625, 0.006072998046875, 0.02801513671875, -0.0218658447265625, -0.020263671875, -0.0282135009765625, -0.006412506103515625, -0.039337158203125, -0.05841064453125 ] ]
codellama/CodeLlama-13b-Instruct-hf
2023-10-27T18:11:57.000Z
[ "transformers", "pytorch", "safetensors", "llama", "text-generation", "llama-2", "code", "arxiv:2308.12950", "license:llama2", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
text-generation
codellama
null
null
codellama/CodeLlama-13b-Instruct-hf
77
131,238
transformers
2023-08-24T16:33:54
--- language: - code pipeline_tag: text-generation tags: - llama-2 license: llama2 --- # **Code Llama** Code Llama is a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 34 billion parameters. This is the repository for the 13B instruct-tuned version in the Hugging Face Transformers format. This model is designed for general code synthesis and understanding. Links to other models can be found in the index at the bottom. | | Base Model | Python | Instruct | | --- | ----------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------- | | 7B | [codellama/CodeLlama-7b-hf](https://huggingface.co/codellama/CodeLlama-7b-hf) | [codellama/CodeLlama-7b-Python-hf](https://huggingface.co/codellama/CodeLlama-7b-Python-hf) | [codellama/CodeLlama-7b-Instruct-hf](https://huggingface.co/codellama/CodeLlama-7b-Instruct-hf) | | 13B | [codellama/CodeLlama-13b-hf](https://huggingface.co/codellama/CodeLlama-13b-hf) | [codellama/CodeLlama-13b-Python-hf](https://huggingface.co/codellama/CodeLlama-13b-Python-hf) | [codellama/CodeLlama-13b-Instruct-hf](https://huggingface.co/codellama/CodeLlama-13b-Instruct-hf) | | 34B | [codellama/CodeLlama-34b-hf](https://huggingface.co/codellama/CodeLlama-34b-hf) | [codellama/CodeLlama-34b-Python-hf](https://huggingface.co/codellama/CodeLlama-34b-Python-hf) | [codellama/CodeLlama-34b-Instruct-hf](https://huggingface.co/codellama/CodeLlama-34b-Instruct-hf) | ## Model Use To use this model, please make sure to install transformers from `main` until the next version is released: ```bash pip install git+https://github.com/huggingface/transformers.git@main accelerate ``` Model capabilities: - [x] Code completion. - [x] Infilling. - [x] Instructions / chat. - [ ] Python specialist. 
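For the "Instructions / chat" capability, prompts follow the Llama-2-style `[INST]` convention. The helper below is a dependency-free sketch of that wrapper (the exact template, including system-prompt handling, is an assumption based on the Llama 2 format — when in doubt, prefer `tokenizer.apply_chat_template` from `transformers`, which encodes the canonical template for this checkpoint):

```python
from typing import Optional

def build_instruct_prompt(instruction: str, system: Optional[str] = None) -> str:
    """Wrap a user instruction in the Llama-2-style [INST] template.

    Sketch only: mirrors the format the Instruct variants were tuned on;
    the tokenizer's built-in chat template is the authoritative source.
    """
    if system is not None:
        body = f"<<SYS>>\n{system}\n<</SYS>>\n\n{instruction.strip()}"
    else:
        body = instruction.strip()
    return f"<s>[INST] {body} [/INST]"

print(build_instruct_prompt("Write a function that computes Fibonacci numbers."))
# -> <s>[INST] Write a function that computes Fibonacci numbers. [/INST]
```

The resulting string can then be tokenized and passed to `model.generate` as in any `transformers` causal-LM workflow (the 13B checkpoint needs roughly 26 GB of memory in fp16).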
## Model Details *Note: Use of this model is governed by the Meta license. Meta developed and publicly released the Code Llama family of large language models (LLMs). **Model Developers** Meta **Variations** Code Llama comes in three model sizes and three variants: * Code Llama: base models designed for general code synthesis and understanding * Code Llama - Python: designed specifically for Python * Code Llama - Instruct: for instruction following and safer deployment All variants are available in sizes of 7B, 13B and 34B parameters. **This repository contains the Instruct version of the 13B-parameter model.** **Input** Models input text only. **Output** Models generate text only. **Model Architecture** Code Llama is an auto-regressive language model that uses an optimized transformer architecture. **Model Dates** Code Llama and its variants have been trained between January 2023 and July 2023. **Status** This is a static model trained on an offline dataset. Future versions of Code Llama - Instruct will be released as we improve model safety with community feedback. **License** A custom commercial license is available at: [https://ai.meta.com/resources/models-and-libraries/llama-downloads/](https://ai.meta.com/resources/models-and-libraries/llama-downloads/) **Research Paper** More information can be found in the paper "[Code Llama: Open Foundation Models for Code](https://ai.meta.com/research/publications/code-llama-open-foundation-models-for-code/)" or its [arXiv page](https://arxiv.org/abs/2308.12950). ## Intended Use **Intended Use Cases** Code Llama and its variants are intended for commercial and research use in English and relevant programming languages. The base model Code Llama can be adapted for a variety of code synthesis and understanding tasks, Code Llama - Python is designed specifically to handle the Python programming language, and Code Llama - Instruct is intended to be safer to use for code assistant and generation applications. 
**Out-of-Scope Uses** Use in any manner that violates applicable laws or regulations (including trade compliance laws). Use in languages other than English. Use in any other way that is prohibited by the Acceptable Use Policy and Licensing Agreement for Code Llama and its variants. ## Hardware and Software **Training Factors** We used custom training libraries. The training and fine-tuning of the released models have been performed on Meta’s Research Super Cluster. **Carbon Footprint** In aggregate, training all 9 Code Llama models required 400K GPU hours of computation on hardware of type A100-80GB (TDP of 350-400W). Estimated total emissions were 65.3 tCO2eq, 100% of which were offset by Meta’s sustainability program. ## Training Data All experiments reported here and the released models have been trained and fine-tuned using the same data as Llama 2 with different weights (see Section 2 and Table 1 in the [research paper](https://ai.meta.com/research/publications/code-llama-open-foundation-models-for-code/) for details). ## Evaluation Results See evaluations for the main models and detailed ablations in Section 3 and safety evaluations in Section 4 of the research paper. ## Ethical Considerations and Limitations Code Llama and its variants are a new technology that carries risks with use. Testing conducted to date has been in English, and has not covered, nor could it cover, all scenarios. For these reasons, as with all LLMs, Code Llama’s potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate or objectionable responses to user prompts. Therefore, before deploying any applications of Code Llama, developers should perform safety testing and tuning tailored to their specific applications of the model. Please see the Responsible Use Guide available at [https://ai.meta.com/llama/responsible-use-guide](https://ai.meta.com/llama/responsible-use-guide).
6,160
[ [ -0.028564453125, -0.0469970703125, 0.0219268798828125, 0.04095458984375, -0.0175323486328125, 0.0122222900390625, -0.006351470947265625, -0.04742431640625, 0.018402099609375, 0.038299560546875, -0.0303192138671875, -0.041534423828125, -0.043914794921875, 0.0242462158203125, -0.03729248046875, 0.09088134765625, -0.004375457763671875, -0.0233154296875, -0.0215911865234375, -0.00009703636169433594, -0.01666259765625, -0.046844482421875, -0.012908935546875, -0.03424072265625, 0.0242919921875, 0.0211944580078125, 0.0548095703125, 0.045745849609375, 0.0377197265625, 0.0239105224609375, -0.0236968994140625, 0.0012331008911132812, -0.0223846435546875, -0.0274658203125, 0.01788330078125, -0.043304443359375, -0.057708740234375, -0.0008349418640136719, 0.025970458984375, 0.0256195068359375, -0.023162841796875, 0.032562255859375, -0.0127105712890625, 0.036956787109375, -0.0244140625, 0.0144805908203125, -0.046630859375, -0.00408172607421875, 0.0034656524658203125, -0.007457733154296875, -0.007904052734375, -0.042266845703125, -0.0095367431640625, -0.03253173828125, -0.00775146484375, -0.00321197509765625, 0.082275390625, 0.041717529296875, -0.0239410400390625, -0.0182952880859375, -0.0223541259765625, 0.059478759765625, -0.07220458984375, 0.0016832351684570312, 0.0301361083984375, -0.003448486328125, -0.011627197265625, -0.0628662109375, -0.0550537109375, -0.0285186767578125, -0.00957489013671875, -0.0030307769775390625, -0.035369873046875, 0.00543212890625, 0.0316162109375, 0.03887939453125, -0.032958984375, 0.01268768310546875, -0.03131103515625, -0.0168304443359375, 0.06744384765625, 0.0084991455078125, 0.03253173828125, -0.0185089111328125, -0.0252227783203125, -0.0016393661499023438, -0.063720703125, 0.0021800994873046875, 0.037139892578125, -0.01026153564453125, -0.058746337890625, 0.055633544921875, -0.0131378173828125, 0.042266845703125, 0.004650115966796875, -0.041961669921875, 0.039215087890625, -0.0237884521484375, -0.022491455078125, -0.01123809814453125, 
0.06591796875, 0.037139892578125, 0.0279998779296875, 0.0033397674560546875, -0.0193023681640625, 0.02447509765625, 0.0107269287109375, -0.061126708984375, -0.0044097900390625, 0.02264404296875, -0.0462646484375, -0.0523681640625, -0.0213775634765625, -0.0616455078125, -0.00872039794921875, -0.0052032470703125, 0.0091400146484375, -0.01305389404296875, -0.0301513671875, 0.0162811279296875, 0.0080413818359375, 0.034271240234375, 0.007083892822265625, -0.06512451171875, 0.004375457763671875, 0.03741455078125, 0.0562744140625, 0.00267791748046875, -0.035491943359375, 0.0012693405151367188, -0.01039886474609375, -0.0259246826171875, 0.049957275390625, -0.037017822265625, -0.037841796875, -0.007167816162109375, 0.00750732421875, -0.0006442070007324219, -0.038787841796875, 0.0168609619140625, -0.0266876220703125, -0.0011911392211914062, 0.0114288330078125, -0.021270751953125, -0.0333251953125, 0.003322601318359375, -0.04229736328125, 0.08587646484375, 0.0210723876953125, -0.049407958984375, -0.003360748291015625, -0.041961669921875, -0.0287017822265625, -0.0195770263671875, -0.002544403076171875, -0.04925537109375, -0.00321197509765625, 0.01409149169921875, 0.03759765625, -0.0308990478515625, 0.033660888671875, -0.0081634521484375, -0.0303955078125, 0.0162200927734375, -0.01128387451171875, 0.07415771484375, 0.026275634765625, -0.034088134765625, 0.0175323486328125, -0.06988525390625, -0.00963592529296875, 0.036407470703125, -0.041229248046875, 0.00940704345703125, -0.0098724365234375, -0.0011749267578125, -0.00411224365234375, 0.041015625, -0.0196990966796875, 0.0416259765625, -0.029144287109375, 0.0562744140625, 0.048919677734375, -0.0010290145874023438, 0.0304718017578125, -0.043426513671875, 0.060333251953125, -0.01290130615234375, 0.01480865478515625, -0.020843505859375, -0.055633544921875, -0.0753173828125, -0.0216827392578125, 0.0016689300537109375, 0.0526123046875, -0.03753662109375, 0.04730224609375, 0.0007753372192382812, -0.056793212890625, -0.03900146484375, 
0.0155487060546875, 0.04083251953125, 0.0204620361328125, 0.024078369140625, -0.00572967529296875, -0.059417724609375, -0.06353759765625, 0.005035400390625, -0.031707763671875, 0.0070953369140625, 0.015472412109375, 0.06414794921875, -0.0499267578125, 0.05633544921875, -0.03228759765625, -0.00022089481353759766, -0.02740478515625, -0.0212554931640625, 0.0372314453125, 0.0406494140625, 0.0565185546875, -0.043304443359375, -0.017669677734375, 0.005039215087890625, -0.06341552734375, -0.0084991455078125, -0.015472412109375, -0.0035037994384765625, 0.0308074951171875, 0.0246429443359375, -0.04791259765625, 0.0380859375, 0.06842041015625, -0.0167694091796875, 0.04437255859375, -0.01082611083984375, -0.01175689697265625, -0.07861328125, 0.01529693603515625, -0.01052093505859375, -0.0015840530395507812, -0.03753662109375, 0.028564453125, 0.007053375244140625, 0.00659942626953125, -0.03790283203125, 0.0255279541015625, -0.027618408203125, -0.002109527587890625, -0.00899505615234375, -0.01751708984375, -0.003170013427734375, 0.057037353515625, -0.004505157470703125, 0.07525634765625, 0.039031982421875, -0.04876708984375, 0.022613525390625, 0.0246429443359375, -0.0303955078125, 0.014373779296875, -0.07122802734375, 0.0273590087890625, 0.00896453857421875, 0.0252838134765625, -0.0576171875, -0.019866943359375, 0.02593994140625, -0.032958984375, 0.00763702392578125, -0.0020847320556640625, -0.037200927734375, -0.034423828125, -0.019195556640625, 0.032958984375, 0.06463623046875, -0.0472412109375, 0.031341552734375, 0.03094482421875, 0.009063720703125, -0.053680419921875, -0.053497314453125, 0.0084991455078125, -0.035736083984375, -0.04681396484375, 0.031829833984375, -0.023162841796875, -0.0164031982421875, -0.01280975341796875, 0.004123687744140625, -0.0009660720825195312, 0.023193359375, 0.034759521484375, 0.0302581787109375, -0.008758544921875, -0.0159149169921875, 0.0011587142944335938, -0.00846099853515625, 0.0028591156005859375, 0.01291656494140625, 0.056976318359375, 
-0.0299072265625, -0.0160369873046875, -0.042572021484375, 0.0140838623046875, 0.044189453125, -0.020477294921875, 0.044281005859375, 0.027435302734375, -0.028076171875, -0.0018033981323242188, -0.04840087890625, 0.01045989990234375, -0.041046142578125, 0.02117919921875, -0.0182952880859375, -0.06463623046875, 0.0494384765625, 0.00466156005859375, 0.01456451416015625, 0.035400390625, 0.05975341796875, 0.00798797607421875, 0.056060791015625, 0.07318115234375, -0.0322265625, 0.030364990234375, -0.039886474609375, 0.00751495361328125, -0.06011962890625, -0.0352783203125, -0.04852294921875, -0.0029888153076171875, -0.0523681640625, -0.03314208984375, 0.0246429443359375, 0.01461029052734375, -0.03765869140625, 0.05548095703125, -0.059814453125, 0.03240966796875, 0.03338623046875, 0.002101898193359375, 0.0299072265625, 0.0034847259521484375, -0.0017681121826171875, 0.022613525390625, -0.03216552734375, -0.05413818359375, 0.0897216796875, 0.033294677734375, 0.064208984375, -0.0022678375244140625, 0.06353759765625, 0.004955291748046875, 0.0239715576171875, -0.052154541015625, 0.045196533203125, 0.0204620361328125, -0.0361328125, 0.000652313232421875, -0.0167083740234375, -0.06805419921875, 0.010589599609375, 0.0046844482421875, -0.0595703125, 0.0053863525390625, -0.0025157928466796875, -0.01788330078125, 0.023162841796875, -0.04901123046875, 0.044677734375, -0.016265869140625, 0.004047393798828125, -0.01447296142578125, -0.0386962890625, 0.044921875, -0.01043701171875, 0.0164031982421875, -0.0102081298828125, -0.0163116455078125, 0.0489501953125, -0.03985595703125, 0.0810546875, 0.0105743408203125, -0.03619384765625, 0.044769287109375, -0.0006275177001953125, 0.03582763671875, 0.000682830810546875, -0.01708984375, 0.052215576171875, 0.00067138671875, -0.01355743408203125, -0.00962066650390625, 0.04742431640625, -0.08013916015625, -0.056060791015625, -0.030517578125, -0.03387451171875, 0.0200958251953125, 0.0107879638671875, 0.029541015625, 0.003528594970703125, 
0.01340484619140625, 0.01024627685546875, 0.0293731689453125, -0.052703857421875, 0.04638671875, 0.0269317626953125, -0.0212554931640625, -0.036285400390625, 0.060882568359375, -0.01119232177734375, 0.015655517578125, 0.020538330078125, 0.0035610198974609375, -0.0086669921875, -0.03448486328125, -0.030731201171875, 0.03314208984375, -0.047119140625, -0.042144775390625, -0.04632568359375, -0.026885986328125, -0.0248565673828125, -0.024078369140625, -0.020721435546875, -0.0208587646484375, -0.0499267578125, -0.01346588134765625, 0.059112548828125, 0.060638427734375, 0.0037899017333984375, 0.034759521484375, -0.04632568359375, 0.034393310546875, 0.00823974609375, 0.0299072265625, 0.001708984375, -0.03680419921875, -0.00925445556640625, -0.0024356842041015625, -0.03948974609375, -0.0640869140625, 0.044952392578125, 0.0098724365234375, 0.0467529296875, 0.00980377197265625, -0.0037021636962890625, 0.051239013671875, -0.033599853515625, 0.06988525390625, 0.0250244140625, -0.082275390625, 0.0472412109375, -0.018707275390625, 0.00293731689453125, 0.00763702392578125, 0.0265045166015625, -0.032073974609375, -0.0194854736328125, -0.048248291015625, -0.05517578125, 0.045654296875, 0.01342010498046875, 0.022430419921875, 0.001956939697265625, 0.034088134765625, -0.000926971435546875, 0.0231781005859375, -0.07904052734375, -0.0246429443359375, -0.0234527587890625, -0.01763916015625, -0.006618499755859375, -0.022613525390625, -0.006343841552734375, -0.022491455078125, 0.033294677734375, -0.01375579833984375, 0.0404052734375, 0.0092620849609375, -0.01143646240234375, -0.0201263427734375, 0.0035381317138671875, 0.051513671875, 0.04290771484375, -0.000995635986328125, -0.0109100341796875, 0.029022216796875, -0.04107666015625, 0.017547607421875, -0.009552001953125, -0.006626129150390625, -0.0228729248046875, 0.042144775390625, 0.04852294921875, 0.01177978515625, -0.062744140625, 0.036773681640625, 0.01155853271484375, -0.0210418701171875, -0.038330078125, 0.0202789306640625, 
0.021392822265625, 0.0263519287109375, 0.0180511474609375, 0.0023822784423828125, -0.00832366943359375, -0.031890869140625, -0.0007677078247070312, 0.026824951171875, 0.0134735107421875, -0.027252197265625, 0.068603515625, 0.0086669921875, -0.0276031494140625, 0.035675048828125, 0.006122589111328125, -0.042816162109375, 0.08782958984375, 0.051055908203125, 0.05767822265625, -0.0157012939453125, 0.008758544921875, 0.035064697265625, 0.04095458984375, -0.0004849433898925781, 0.03204345703125, 0.0008654594421386719, -0.04046630859375, -0.025909423828125, -0.06536865234375, -0.0288543701171875, 0.00820159912109375, -0.03521728515625, 0.022918701171875, -0.048370361328125, -0.00295257568359375, -0.0270843505859375, 0.007404327392578125, -0.04705810546875, -0.0016145706176757812, 0.008575439453125, 0.07135009765625, -0.046844482421875, 0.0693359375, 0.04388427734375, -0.053253173828125, -0.067626953125, -0.0146331787109375, -0.004314422607421875, -0.093017578125, 0.03594970703125, 0.0200653076171875, 0.0031566619873046875, 0.005840301513671875, -0.07177734375, -0.0814208984375, 0.0955810546875, 0.035186767578125, -0.039154052734375, -0.0011663436889648438, 0.01479339599609375, 0.042388916015625, -0.026885986328125, 0.0282745361328125, 0.04962158203125, 0.03289794921875, -0.00748443603515625, -0.09033203125, 0.023834228515625, -0.029510498046875, 0.01654052734375, -0.020477294921875, -0.07806396484375, 0.07708740234375, -0.041229248046875, -0.01033782958984375, 0.03656005859375, 0.047943115234375, 0.04010009765625, 0.01479339599609375, 0.0262908935546875, 0.042572021484375, 0.048614501953125, 0.0020389556884765625, 0.0892333984375, -0.03424072265625, 0.030120849609375, 0.038421630859375, -0.0087127685546875, 0.05401611328125, 0.0304412841796875, -0.043731689453125, 0.056121826171875, 0.05731201171875, -0.016204833984375, 0.0213470458984375, 0.0230865478515625, -0.005168914794921875, -0.0032329559326171875, -0.00756072998046875, -0.055816650390625, 0.029541015625, 
0.0238494873046875, -0.025970458984375, 0.005199432373046875, -0.015899658203125, 0.0231170654296875, -0.00792694091796875, -0.00548553466796875, 0.04901123046875, 0.0172882080078125, -0.0399169921875, 0.087158203125, 0.00930023193359375, 0.073974609375, -0.0399169921875, -0.0093231201171875, -0.033660888671875, 0.0036106109619140625, -0.0426025390625, -0.039154052734375, 0.0137939453125, 0.0229644775390625, 0.0008440017700195312, -0.00982666015625, 0.03546142578125, -0.00490570068359375, -0.038299560546875, 0.0291290283203125, 0.013641357421875, 0.0203094482421875, 0.01043701171875, -0.049468994140625, 0.034637451171875, 0.0149383544921875, -0.034393310546875, 0.0273895263671875, 0.00885009765625, 0.0038700103759765625, 0.07037353515625, 0.057464599609375, -0.010772705078125, 0.012176513671875, -0.0086212158203125, 0.08526611328125, -0.0526123046875, -0.0250396728515625, -0.0596923828125, 0.047882080078125, 0.02349853515625, -0.033355712890625, 0.04571533203125, 0.025970458984375, 0.061370849609375, -0.00984954833984375, 0.05999755859375, -0.0134429931640625, 0.006290435791015625, -0.035858154296875, 0.04840087890625, -0.05804443359375, 0.0287322998046875, -0.03790283203125, -0.070068359375, -0.0236968994140625, 0.06475830078125, -0.0027828216552734375, 0.00498199462890625, 0.0386962890625, 0.07415771484375, 0.0230865478515625, -0.00748443603515625, 0.0160064697265625, 0.01427459716796875, 0.0290069580078125, 0.058013916015625, 0.07391357421875, -0.0435791015625, 0.053680419921875, -0.04388427734375, -0.0178375244140625, -0.02093505859375, -0.074462890625, -0.07373046875, -0.038299560546875, -0.0255279541015625, -0.0280609130859375, -0.021820068359375, 0.0682373046875, 0.0416259765625, -0.04425048828125, -0.0350341796875, -0.01021575927734375, 0.030029296875, -0.008758544921875, -0.0150604248046875, 0.0212554931640625, -0.00959014892578125, -0.0634765625, 0.0299072265625, -0.0025539398193359375, 0.0131988525390625, -0.0247802734375, -0.0192413330078125, 
-0.00969696044921875, 0.00095367431640625, 0.036285400390625, 0.0260162353515625, -0.06304931640625, -0.015350341796875, 0.005985260009765625, -0.01334381103515625, 0.0084381103515625, 0.032928466796875, -0.048248291015625, -0.005771636962890625, 0.0243072509765625, 0.033782958984375, 0.0247039794921875, -0.0168304443359375, 0.016876220703125, -0.0288543701171875, 0.03314208984375, -0.00025916099548339844, 0.037139892578125, 0.007709503173828125, -0.04541015625, 0.053619384765625, 0.01849365234375, -0.050750732421875, -0.06781005859375, 0.0113067626953125, -0.08343505859375, -0.0159454345703125, 0.0975341796875, -0.006755828857421875, -0.0240631103515625, 0.01500701904296875, -0.0283966064453125, 0.018951416015625, -0.02947998046875, 0.053253173828125, 0.0225982666015625, -0.006046295166015625, -0.011627197265625, -0.03082275390625, 0.0214080810546875, 0.018768310546875, -0.07122802734375, -0.0122528076171875, 0.027923583984375, 0.028839111328125, 0.0149688720703125, 0.050262451171875, -0.0098724365234375, 0.01137542724609375, 0.0036830902099609375, 0.03363037109375, -0.006870269775390625, -0.0172119140625, -0.03021240234375, -0.005718231201171875, -0.006622314453125, -0.002834320068359375 ] ]
NousResearch/Llama-2-7b-hf
2023-08-26T20:16:26.000Z
[ "transformers", "pytorch", "safetensors", "llama", "text-generation", "facebook", "meta", "llama-2", "en", "has_space", "text-generation-inference", "region:us" ]
text-generation
NousResearch
null
null
NousResearch/Llama-2-7b-hf
85
130,256
transformers
2023-07-18T18:30:59
--- extra_gated_heading: Access Llama 2 on Hugging Face extra_gated_description: >- This is a form to enable access to Llama 2 on Hugging Face after you have been granted access from Meta. Please visit the [Meta website](https://ai.meta.com/resources/models-and-libraries/llama-downloads) and accept our license terms and acceptable use policy before submitting this form. Requests will be processed in 1-2 days. extra_gated_button_content: Submit extra_gated_fields: I agree to share my name, email address and username with Meta and confirm that I have already been granted download access on the Meta website: checkbox language: - en pipeline_tag: text-generation inference: false tags: - facebook - meta - pytorch - llama - llama-2 --- # **Llama 2** Llama 2 is a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. This is the repository for the 7B pretrained model, converted for the Hugging Face Transformers format. Links to other models can be found in the index at the bottom. ## Model Details *Note: Use of this model is governed by the Meta license. In order to download the model weights and tokenizer, please visit the [website](https://ai.meta.com/resources/models-and-libraries/llama-downloads/) and accept our License before requesting access here.* Meta developed and publicly released the Llama 2 family of large language models (LLMs), a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. Our fine-tuned LLMs, called Llama-2-Chat, are optimized for dialogue use cases. Llama-2-Chat models outperform open-source chat models on most benchmarks we tested, and in our human evaluations for helpfulness and safety, are on par with some popular closed-source models like ChatGPT and PaLM. **Model Developers** Meta **Variations** Llama 2 comes in a range of parameter sizes — 7B, 13B, and 70B — as well as pretrained and fine-tuned variations. 
**Input** Models input text only. **Output** Models generate text only. **Model Architecture** Llama 2 is an auto-regressive language model that uses an optimized transformer architecture. The tuned versions use supervised fine-tuning (SFT) and reinforcement learning with human feedback (RLHF) to align with human preferences for helpfulness and safety. ||Training Data|Params|Content Length|GQA|Tokens|LR| |---|---|---|---|---|---|---| |Llama 2|*A new mix of publicly available online data*|7B|4k|&#10007;|2.0T|3.0 x 10<sup>-4</sup>| |Llama 2|*A new mix of publicly available online data*|13B|4k|&#10007;|2.0T|3.0 x 10<sup>-4</sup>| |Llama 2|*A new mix of publicly available online data*|70B|4k|&#10004;|2.0T|1.5 x 10<sup>-4</sup>| *Llama 2 family of models.* Token counts refer to pretraining data only. All models are trained with a global batch size of 4M tokens. Bigger models (70B) use Grouped-Query Attention (GQA) for improved inference scalability. **Model Dates** Llama 2 was trained between January 2023 and July 2023. **Status** This is a static model trained on an offline dataset. Future versions of the tuned models will be released as we improve model safety with community feedback. **License** A custom commercial license is available at: [https://ai.meta.com/resources/models-and-libraries/llama-downloads/](https://ai.meta.com/resources/models-and-libraries/llama-downloads/) ## Intended Use **Intended Use Cases** Llama 2 is intended for commercial and research use in English. Tuned models are intended for assistant-like chat, whereas pretrained models can be adapted for a variety of natural language generation tasks. To get the expected features and performance for the chat versions, a specific format needs to be followed, including the `INST` and `<<SYS>>` tags, `BOS` and `EOS` tokens, and the whitespaces and line breaks in between (we recommend calling `strip()` on inputs to avoid double spaces). 
See our reference code in github for details: [`chat_completion`](https://github.com/facebookresearch/llama/blob/main/llama/generation.py#L212). **Out-of-scope Uses** Use in any manner that violates applicable laws or regulations (including trade compliance laws). Use in languages other than English. Use in any other way that is prohibited by the Acceptable Use Policy and Licensing Agreement for Llama 2. ## Hardware and Software **Training Factors** We used custom training libraries, Meta's Research Super Cluster, and production clusters for pretraining. Fine-tuning, annotation, and evaluation were also performed on third-party cloud compute. **Carbon Footprint** Pretraining utilized a cumulative 3.3M GPU hours of computation on hardware of type A100-80GB (TDP of 350-400W). Estimated total emissions were 539 tCO<sub>2</sub>eq, 100% of which were offset by Meta's sustainability program. ||Time (GPU hours)|Power Consumption (W)|Carbon Emitted (tCO<sub>2</sub>eq)| |---|---|---|---| |Llama 2 7B|184320|400|31.22| |Llama 2 13B|368640|400|62.44| |Llama 2 70B|1720320|400|291.42| |Total|3311616||539.00| **CO<sub>2</sub> emissions during pretraining.** Time: total GPU time required for training each model. Power Consumption: peak power capacity per GPU device for the GPUs used, adjusted for power usage efficiency. 100% of the emissions are directly offset by Meta's sustainability program, and because we are openly releasing these models, the pretraining costs do not need to be incurred by others. ## Training Data **Overview** Llama 2 was pretrained on 2 trillion tokens of data from publicly available sources. The fine-tuning data includes publicly available instruction datasets, as well as over one million new human-annotated examples. Neither the pretraining nor the fine-tuning datasets include Meta user data. **Data Freshness** The pretraining data has a cutoff of September 2022, but some tuning data is more recent, up to July 2023. 
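The `[INST]`/`<<SYS>>` prompt convention described under Intended Use above can be sketched as a small helper. This is an illustrative sketch only (the `build_llama2_chat_prompt` function is hypothetical, not part of any library); the authoritative implementation is the `chat_completion` reference code linked above, and the `BOS`/`EOS` tokens are normally added by the tokenizer, so they are omitted here:

```python
def build_llama2_chat_prompt(system_msg: str, user_msg: str) -> str:
    """Assemble a single-turn Llama-2-chat prompt string.

    Illustrative sketch of the tag layout: the system message is wrapped
    in <<SYS>> markers and prepended to the user turn, and the whole
    turn is wrapped in [INST] ... [/INST]. Inputs are stripped, as the
    card recommends, to avoid double spaces around the tags.
    """
    return (
        f"[INST] <<SYS>>\n{system_msg.strip()}\n<</SYS>>\n\n"
        f"{user_msg.strip()} [/INST]"
    )

prompt = build_llama2_chat_prompt(
    "You are a helpful assistant.",
    "What is the capital of France?",
)
print(prompt)
```

The resulting string would then be tokenized (with `BOS` prepended by the tokenizer) and fed to a chat variant such as Llama-2-7b-chat; the pretrained model in this repository does not expect this format.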
## Evaluation Results In this section, we report the results for the Llama 1 and Llama 2 models on standard academic benchmarks. For all the evaluations, we use our internal evaluations library. |Model|Size|Code|Commonsense Reasoning|World Knowledge|Reading Comprehension|Math|MMLU|BBH|AGI Eval| |---|---|---|---|---|---|---|---|---|---| |Llama 1|7B|14.1|60.8|46.2|58.5|6.95|35.1|30.3|23.9| |Llama 1|13B|18.9|66.1|52.6|62.3|10.9|46.9|37.0|33.9| |Llama 1|33B|26.0|70.0|58.4|67.6|21.4|57.8|39.8|41.7| |Llama 1|65B|30.7|70.7|60.5|68.6|30.8|63.4|43.5|47.6| |Llama 2|7B|16.8|63.9|48.9|61.3|14.6|45.3|32.6|29.3| |Llama 2|13B|24.5|66.9|55.4|65.8|28.7|54.8|39.4|39.1| |Llama 2|70B|**37.5**|**71.9**|**63.6**|**69.4**|**35.2**|**68.9**|**51.2**|**54.2**| **Overall performance on grouped academic benchmarks.** *Code:* We report the average pass@1 scores of our models on HumanEval and MBPP. *Commonsense Reasoning:* We report the average of PIQA, SIQA, HellaSwag, WinoGrande, ARC easy and challenge, OpenBookQA, and CommonsenseQA. We report 7-shot results for CommonSenseQA and 0-shot results for all other benchmarks. *World Knowledge:* We evaluate the 5-shot performance on NaturalQuestions and TriviaQA and report the average. *Reading Comprehension:* For reading comprehension, we report the 0-shot average on SQuAD, QuAC, and BoolQ. *MATH:* We report the average of the GSM8K (8 shot) and MATH (4 shot) benchmarks at top 1. |||TruthfulQA|Toxigen| |---|---|---|---| |Llama 1|7B|27.42|23.00| |Llama 1|13B|41.74|23.08| |Llama 1|33B|44.19|22.57| |Llama 1|65B|48.71|21.77| |Llama 2|7B|33.29|**21.25**| |Llama 2|13B|41.86|26.10| |Llama 2|70B|**50.18**|24.60| **Evaluation of pretrained LLMs on automatic safety benchmarks.** For TruthfulQA, we present the percentage of generations that are both truthful and informative (the higher the better). For ToxiGen, we present the percentage of toxic generations (the smaller the better). 
|||TruthfulQA|Toxigen| |---|---|---|---| |Llama-2-Chat|7B|57.04|**0.00**| |Llama-2-Chat|13B|62.18|**0.00**| |Llama-2-Chat|70B|**64.14**|0.01| **Evaluation of fine-tuned LLMs on different safety datasets.** Same metric definitions as above. ## Ethical Considerations and Limitations Llama 2 is a new technology that carries risks with use. Testing conducted to date has been in English, and has not covered, nor could it cover all scenarios. For these reasons, as with all LLMs, Llama 2’s potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased or other objectionable responses to user prompts. Therefore, before deploying any applications of Llama 2, developers should perform safety testing and tuning tailored to their specific applications of the model. Please see the Responsible Use Guide available at [https://ai.meta.com/llama/responsible-use-guide/](https://ai.meta.com/llama/responsible-use-guide) ## Reporting Issues Please report any software “bug,” or other problems with the models through one of the following means: - Reporting issues with the model: [github.com/facebookresearch/llama](http://github.com/facebookresearch/llama) - Reporting problematic content generated by the model: [developers.facebook.com/llama_output_feedback](http://developers.facebook.com/llama_output_feedback) - Reporting bugs and security concerns: [facebook.com/whitehat/info](http://facebook.com/whitehat/info) ## Llama Model Index |Model|Llama2|Llama2-hf|Llama2-chat|Llama2-chat-hf| |---|---|---|---|---| |7B| [Link](https://huggingface.co/llamaste/Llama-2-7b) | [Link](https://huggingface.co/llamaste/Llama-2-7b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-7b-chat) | [Link](https://huggingface.co/llamaste/Llama-2-7b-chat-hf)| |13B| [Link](https://huggingface.co/llamaste/Llama-2-13b) | [Link](https://huggingface.co/llamaste/Llama-2-13b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-13b-chat) | 
[Link](https://huggingface.co/llamaste/Llama-2-13b-chat-hf)| |70B| [Link](https://huggingface.co/llamaste/Llama-2-70b) | [Link](https://huggingface.co/llamaste/Llama-2-70b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-70b-chat) | [Link](https://huggingface.co/llamaste/Llama-2-70b-chat-hf)|
10,098
[ [ -0.0165252685546875, -0.05267333984375, 0.02777099609375, 0.0149993896484375, -0.0283660888671875, 0.0164031982421875, -0.0032901763916015625, -0.056549072265625, 0.004703521728515625, 0.02374267578125, -0.052337646484375, -0.042572021484375, -0.050537109375, 0.005863189697265625, -0.01702880859375, 0.08087158203125, -0.0012121200561523438, -0.021331787109375, -0.01024627685546875, 0.006439208984375, -0.037017822265625, -0.0303497314453125, -0.039764404296875, -0.032318115234375, 0.0305328369140625, 0.036041259765625, 0.045745849609375, 0.049163818359375, 0.04119873046875, 0.018218994140625, -0.0193634033203125, 0.015655517578125, -0.052337646484375, -0.020965576171875, 0.00872039794921875, -0.037322998046875, -0.0517578125, 0.01314544677734375, 0.0261077880859375, 0.01335906982421875, -0.0220489501953125, 0.040191650390625, 0.004512786865234375, 0.03515625, -0.04248046875, 0.01236724853515625, -0.0543212890625, 0.002635955810546875, -0.016815185546875, -0.00608062744140625, -0.01410675048828125, -0.022003173828125, -0.014434814453125, -0.0626220703125, -0.0084228515625, 0.006206512451171875, 0.0780029296875, 0.0491943359375, -0.03448486328125, -0.00897979736328125, -0.021270751953125, 0.0712890625, -0.06402587890625, 0.0038166046142578125, 0.0443115234375, 0.021148681640625, -0.016510009765625, -0.057403564453125, -0.0489501953125, -0.0106964111328125, 0.0044097900390625, 0.0265350341796875, -0.0303497314453125, 0.0003504753112792969, 0.0135650634765625, 0.02825927734375, -0.043304443359375, 0.042755126953125, -0.0391845703125, -0.01242828369140625, 0.0799560546875, 0.017181396484375, 0.00024020671844482422, -0.003925323486328125, -0.03765869140625, -0.0219268798828125, -0.061187744140625, 0.01342010498046875, 0.03662109375, -0.0033321380615234375, -0.035064697265625, 0.04656982421875, -0.029693603515625, 0.02191162109375, 0.0010709762573242188, -0.037689208984375, 0.03704833984375, -0.0355224609375, -0.021240234375, -0.00841522216796875, 0.0667724609375, 
0.055084228515625, 0.0121002197265625, 0.0081787109375, -0.004817962646484375, 0.00841522216796875, -0.00014328956604003906, -0.06085205078125, -0.0032196044921875, 0.0179443359375, -0.0281524658203125, -0.0435791015625, -0.0222625732421875, -0.05572509765625, -0.01229095458984375, -0.006984710693359375, 0.0188140869140625, -0.0019626617431640625, -0.0295257568359375, 0.009185791015625, 0.0037479400634765625, 0.041778564453125, 0.01532745361328125, -0.07073974609375, 0.016357421875, 0.04132080078125, 0.058929443359375, -0.0180816650390625, -0.0275421142578125, 0.0011501312255859375, -0.0012083053588867188, -0.02471923828125, 0.0675048828125, -0.02508544921875, -0.04144287109375, -0.016845703125, -0.001312255859375, 0.0126495361328125, -0.03961181640625, 0.0323486328125, -0.0290985107421875, 0.01226043701171875, -0.024993896484375, -0.0267791748046875, -0.0262451171875, 0.01456451416015625, -0.0305328369140625, 0.1094970703125, 0.008544921875, -0.03643798828125, 0.0239410400390625, -0.0513916015625, -0.0141448974609375, -0.0158538818359375, 0.0080108642578125, -0.04010009765625, -0.020477294921875, 0.00946807861328125, 0.0281829833984375, -0.04754638671875, 0.03668212890625, -0.0152130126953125, -0.032989501953125, 0.0026721954345703125, -0.03076171875, 0.064208984375, 0.0218963623046875, -0.034759521484375, 0.00543975830078125, -0.06280517578125, 0.0033130645751953125, 0.034454345703125, -0.036041259765625, 0.0193634033203125, 0.005718231201171875, -0.0088653564453125, 0.01458740234375, 0.036590576171875, -0.0269622802734375, 0.01239776611328125, -0.02313232421875, 0.03778076171875, 0.056732177734375, 0.0035381317138671875, 0.01256561279296875, -0.0391845703125, 0.0391845703125, -0.00267791748046875, 0.0296173095703125, 0.0009016990661621094, -0.054046630859375, -0.07708740234375, -0.0133514404296875, -0.0036602020263671875, 0.0634765625, -0.0189666748046875, 0.051910400390625, -0.0005803108215332031, -0.055694580078125, -0.0313720703125, 0.0276947021484375, 
0.04998779296875, 0.037567138671875, 0.0328369140625, -0.0210113525390625, -0.04571533203125, -0.075927734375, 0.00403594970703125, -0.033477783203125, -0.0007534027099609375, 0.0268402099609375, 0.049530029296875, -0.0255889892578125, 0.054931640625, -0.0401611328125, -0.01332855224609375, -0.0199127197265625, -0.0095977783203125, 0.005207061767578125, 0.026641845703125, 0.04931640625, -0.0301055908203125, -0.0154876708984375, -0.00942230224609375, -0.068603515625, -0.007427215576171875, 0.007537841796875, -0.01561737060546875, 0.017730712890625, 0.02386474609375, -0.046478271484375, 0.034393310546875, 0.05340576171875, -0.01366424560546875, 0.038665771484375, -0.0005016326904296875, -0.01332855224609375, -0.08099365234375, 0.0034618377685546875, -0.0155029296875, 0.002162933349609375, -0.0325927734375, -0.002185821533203125, -0.01580810546875, 0.00555419921875, -0.047210693359375, 0.044708251953125, -0.02374267578125, -0.01287841796875, -0.009307861328125, 0.0038280487060546875, 0.00461578369140625, 0.045745849609375, -0.009796142578125, 0.0810546875, 0.0303955078125, -0.04473876953125, 0.01971435546875, 0.0298004150390625, -0.037261962890625, 0.0109710693359375, -0.06585693359375, 0.0276947021484375, 0.007648468017578125, 0.0394287109375, -0.0716552734375, -0.0279388427734375, 0.0242767333984375, -0.032867431640625, 0.006763458251953125, 0.018463134765625, -0.04144287109375, -0.030670166015625, -0.032012939453125, 0.02337646484375, 0.06085205078125, -0.034454345703125, 0.013946533203125, 0.0284881591796875, 0.0019779205322265625, -0.05218505859375, -0.06402587890625, 0.00455474853515625, -0.02813720703125, -0.040130615234375, 0.022918701171875, -0.0143890380859375, -0.01800537109375, -0.018951416015625, 0.005218505859375, -0.0009350776672363281, 0.02972412109375, 0.02777099609375, 0.0283050537109375, -0.00823211669921875, -0.001728057861328125, 0.01076507568359375, -0.015472412109375, 0.003948211669921875, 0.0167388916015625, 0.043853759765625, 
-0.01248931884765625, -0.0164642333984375, -0.054931640625, 0.00400543212890625, 0.0220947265625, -0.019866943359375, 0.04583740234375, 0.032257080078125, -0.0173492431640625, 0.0175323486328125, -0.05743408203125, -0.0078277587890625, -0.04010009765625, 0.04058837890625, -0.0157623291015625, -0.06207275390625, 0.040130615234375, -0.0002446174621582031, 0.032745361328125, 0.05670166015625, 0.04644775390625, -0.006282806396484375, 0.06024169921875, 0.043975830078125, -0.005809783935546875, 0.02520751953125, -0.037841796875, -0.007015228271484375, -0.0714111328125, -0.046966552734375, -0.02374267578125, -0.03228759765625, -0.050018310546875, -0.031402587890625, 0.0199127197265625, 0.0147552490234375, -0.051361083984375, 0.024200439453125, -0.043609619140625, 0.04266357421875, 0.040557861328125, 0.00995635986328125, 0.02294921875, 0.007110595703125, 0.0114593505859375, 0.005046844482421875, -0.039398193359375, -0.05560302734375, 0.1112060546875, 0.031890869140625, 0.033905029296875, 0.007404327392578125, 0.05181884765625, 0.0111083984375, 0.0253753662109375, -0.053619384765625, 0.049407958984375, 0.0038394927978515625, -0.054718017578125, -0.01094818115234375, -0.00856781005859375, -0.067626953125, 0.0103607177734375, -0.015716552734375, -0.058380126953125, 0.0023193359375, -0.0018739700317382812, -0.0283050537109375, 0.021575927734375, -0.050262451171875, 0.045440673828125, -0.042877197265625, -0.0233612060546875, -0.0269775390625, -0.059967041015625, 0.050872802734375, -0.0150604248046875, 0.007678985595703125, -0.036895751953125, -0.019073486328125, 0.06744384765625, -0.0248870849609375, 0.07550048828125, -0.003673553466796875, -0.00737762451171875, 0.043975830078125, -0.01459503173828125, 0.033538818359375, 0.00254058837890625, -0.0208282470703125, 0.050994873046875, -0.01000213623046875, -0.0238037109375, -0.01120758056640625, 0.04034423828125, -0.0909423828125, -0.06011962890625, -0.038726806640625, -0.0384521484375, -0.00154876708984375, 0.007022857666015625, 
0.038330078125, -0.00739288330078125, -0.00243377685546875, 0.00910186767578125, 0.034271240234375, -0.039276123046875, 0.03607177734375, 0.040802001953125, -0.00811767578125, -0.034454345703125, 0.049591064453125, 0.004241943359375, 0.027252197265625, 0.0160064697265625, 0.0036411285400390625, -0.03106689453125, -0.03167724609375, -0.038604736328125, 0.0207366943359375, -0.03509521484375, -0.03656005859375, -0.0399169921875, -0.02642822265625, -0.02471923828125, -0.00591278076171875, -0.03253173828125, -0.031646728515625, -0.056427001953125, -0.0301971435546875, 0.03887939453125, 0.0616455078125, -0.0006442070007324219, 0.047607421875, -0.0245819091796875, 0.01378631591796875, 0.0279083251953125, 0.01320648193359375, -0.0027313232421875, -0.055877685546875, 0.004001617431640625, 0.009033203125, -0.057342529296875, -0.04656982421875, 0.0184326171875, 0.0202484130859375, 0.0362548828125, 0.0355224609375, -0.00616455078125, 0.057830810546875, -0.0261688232421875, 0.08294677734375, 0.0274810791015625, -0.04986572265625, 0.05279541015625, -0.016204833984375, 0.00348663330078125, 0.0474853515625, 0.0211181640625, -0.006259918212890625, -0.01157379150390625, -0.047698974609375, -0.05145263671875, 0.059661865234375, 0.01690673828125, 0.0136566162109375, 0.003787994384765625, 0.035003662109375, 0.003971099853515625, 0.00836944580078125, -0.0634765625, -0.023590087890625, -0.0204010009765625, -0.007205963134765625, -0.0154571533203125, -0.038177490234375, -0.005039215087890625, -0.0234527587890625, 0.048004150390625, 0.003986358642578125, 0.0263214111328125, -0.00952911376953125, 0.001102447509765625, -0.0073699951171875, 0.00348663330078125, 0.05438232421875, 0.0380859375, -0.0193634033203125, -0.01070404052734375, 0.04852294921875, -0.0477294921875, 0.025787353515625, 0.0007052421569824219, -0.0100860595703125, -0.0283966064453125, 0.0303955078125, 0.06585693359375, 0.0195770263671875, -0.05322265625, 0.0248870849609375, 0.0107421875, -0.02764892578125, 
-0.032012939453125, 0.0272216796875, 0.00684356689453125, 0.024139404296875, 0.0200042724609375, -0.0102386474609375, 0.006763458251953125, -0.039093017578125, -0.00933074951171875, 0.0284423828125, 0.00933074951171875, -0.031402587890625, 0.0748291015625, 0.0240936279296875, -0.0217437744140625, 0.0399169921875, -0.012481689453125, -0.0271148681640625, 0.0684814453125, 0.04833984375, 0.048492431640625, -0.021514892578125, 0.00902557373046875, 0.053497314453125, 0.034515380859375, -0.0175018310546875, 0.0178985595703125, -0.0002655982971191406, -0.03729248046875, -0.0159149169921875, -0.0517578125, -0.035003662109375, 0.026824951171875, -0.043365478515625, 0.0231475830078125, -0.0477294921875, -0.0205535888671875, -0.0244903564453125, 0.03387451171875, -0.050628662109375, 0.01654052734375, 0.00824737548828125, 0.06915283203125, -0.05462646484375, 0.0572509765625, 0.03826904296875, -0.038604736328125, -0.06707763671875, -0.022430419921875, 0.014862060546875, -0.0919189453125, 0.03973388671875, 0.029296875, -0.0051422119140625, 0.00927734375, -0.05682373046875, -0.091064453125, 0.1280517578125, 0.034576416015625, -0.056060791015625, -0.001434326171875, 0.0257415771484375, 0.036041259765625, -0.007633209228515625, 0.033050537109375, 0.0615234375, 0.03729248046875, 0.008148193359375, -0.0792236328125, 0.0071868896484375, -0.02642822265625, -0.0028171539306640625, -0.0147552490234375, -0.0982666015625, 0.0618896484375, -0.030120849609375, -0.0180816650390625, 0.0157318115234375, 0.04840087890625, 0.051483154296875, 0.04241943359375, 0.0266571044921875, 0.060028076171875, 0.06878662109375, -0.0013055801391601562, 0.08392333984375, -0.0266571044921875, 0.013458251953125, 0.06646728515625, -0.022003173828125, 0.0728759765625, 0.018463134765625, -0.044342041015625, 0.0458984375, 0.07611083984375, -0.001415252685546875, 0.043975830078125, 0.005001068115234375, -0.01308441162109375, -0.013641357421875, -0.01329803466796875, -0.04913330078125, 0.03851318359375, 
0.0178375244140625, -0.01094818115234375, -0.0020847320556640625, -0.024932861328125, 0.0174102783203125, -0.02490234375, -0.0012178421020507812, 0.0595703125, 0.01338958740234375, -0.0460205078125, 0.06597900390625, 0.00335693359375, 0.06353759765625, -0.0487060546875, 0.006069183349609375, -0.038848876953125, 0.0002732276916503906, -0.0279541015625, -0.05328369140625, 0.00643157958984375, 0.0276031494140625, -0.0011320114135742188, -0.0084991455078125, 0.041412353515625, 0.00368499755859375, -0.042877197265625, 0.027557373046875, 0.02069091796875, 0.0264129638671875, 0.0157623291015625, -0.0509033203125, 0.01279449462890625, 0.006359100341796875, -0.040802001953125, 0.0285797119140625, 0.0018396377563476562, -0.005252838134765625, 0.060943603515625, 0.056365966796875, -0.01528167724609375, 0.01111602783203125, -0.0157318115234375, 0.0753173828125, -0.0379638671875, -0.01457977294921875, -0.05682373046875, 0.04010009765625, 0.0039215087890625, -0.05352783203125, 0.040924072265625, 0.048614501953125, 0.05340576171875, 0.0206298828125, 0.0489501953125, 0.00582122802734375, 0.023040771484375, -0.040435791015625, 0.046844482421875, -0.058807373046875, 0.0280609130859375, 0.00638580322265625, -0.0733642578125, -0.005062103271484375, 0.05157470703125, -0.01812744140625, 0.0036869049072265625, 0.028289794921875, 0.06402587890625, 0.0134124755859375, -0.01248931884765625, 0.0096282958984375, 0.012725830078125, 0.0264739990234375, 0.06622314453125, 0.06329345703125, -0.04705810546875, 0.052398681640625, -0.0289306640625, -0.0180206298828125, -0.0196075439453125, -0.056060791015625, -0.07269287109375, -0.0197601318359375, -0.0180511474609375, -0.01096343994140625, 0.005035400390625, 0.056060791015625, 0.0379638671875, -0.043304443359375, -0.0224761962890625, -0.005924224853515625, -0.00606536865234375, 0.002277374267578125, -0.01197052001953125, 0.0247344970703125, -0.00782012939453125, -0.043914794921875, 0.03662109375, 0.00017821788787841797, 0.01473236083984375, 
-0.0251007080078125, -0.019775390625, -0.01556396484375, 0.0105743408203125, 0.04583740234375, 0.02099609375, -0.06903076171875, -0.01763916015625, 0.003353118896484375, -0.0103607177734375, 0.0092620849609375, 0.0012874603271484375, -0.057769775390625, 0.007282257080078125, 0.01055145263671875, 0.0286712646484375, 0.0498046875, 0.004169464111328125, 0.004810333251953125, -0.0372314453125, 0.03369140625, 0.001251220703125, 0.01152801513671875, 0.0230865478515625, -0.032470703125, 0.059600830078125, 0.0113372802734375, -0.052703857421875, -0.070556640625, 0.008544921875, -0.07904052734375, 0.0002701282501220703, 0.103515625, -0.0003902912139892578, -0.009735107421875, 0.01468658447265625, -0.01531219482421875, 0.0290679931640625, -0.0283660888671875, 0.06048583984375, 0.0423583984375, -0.006168365478515625, -0.0078277587890625, -0.06048583984375, 0.0253753662109375, 0.0292816162109375, -0.08245849609375, -0.0189666748046875, 0.03387451171875, 0.037017822265625, -0.006778717041015625, 0.051910400390625, 0.00130462646484375, 0.017669677734375, 0.005023956298828125, 0.00868988037109375, -0.0187835693359375, -0.01220703125, -0.007419586181640625, -0.020660400390625, -0.004146575927734375, -0.016693115234375 ] ]
facebook/wav2vec2-base
2021-12-28T12:44:31.000Z
[ "transformers", "pytorch", "wav2vec2", "pretraining", "speech", "en", "dataset:librispeech_asr", "arxiv:2006.11477", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
null
facebook
null
null
facebook/wav2vec2-base
36
129,490
transformers
2022-03-02T23:29:05
--- language: en datasets: - librispeech_asr tags: - speech license: apache-2.0 --- # Wav2Vec2-Base [Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) The base model pretrained on 16kHz sampled speech audio. When using the model make sure that your speech input is also sampled at 16kHz. **Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out [this blog](https://huggingface.co/blog/fine-tune-wav2vec2-english) for a more detailed explanation of how to fine-tune the model. [Paper](https://arxiv.org/abs/2006.11477) Authors: Alexei Baevski, Henry Zhou, Abdelrahman Mohamed, Michael Auli **Abstract** We show for the first time that learning powerful representations from speech audio alone followed by fine-tuning on transcribed speech can outperform the best semi-supervised methods while being conceptually simpler. wav2vec 2.0 masks the speech input in the latent space and solves a contrastive task defined over a quantization of the latent representations which are jointly learned. Experiments using all labeled data of Librispeech achieve 1.8/3.3 WER on the clean/other test sets. When lowering the amount of labeled data to one hour, wav2vec 2.0 outperforms the previous state of the art on the 100 hour subset while using 100 times less labeled data. Using just ten minutes of labeled data and pre-training on 53k hours of unlabeled data still achieves 4.8/8.2 WER. This demonstrates the feasibility of speech recognition with limited amounts of labeled data. The original model can be found under https://github.com/pytorch/fairseq/tree/master/examples/wav2vec#wav2vec-20. # Usage See [this notebook](https://colab.research.google.com/drive/1FjTsqbYKphl9kL-eILgUc-bl4zVThL8F?usp=sharing) for more information on how to fine-tune the model.
1,997
[ [ -0.0046234130859375, -0.04095458984375, 0.0038928985595703125, -0.004978179931640625, -0.0252685546875, -0.007572174072265625, -0.0213623046875, -0.053466796875, -0.0181732177734375, 0.01079559326171875, -0.041778564453125, -0.032684326171875, -0.048431396484375, -0.0271148681640625, -0.019561767578125, 0.07012939453125, 0.0296478271484375, 0.0290374755859375, 0.00390625, -0.0173797607421875, -0.037841796875, -0.03302001953125, -0.058868408203125, -0.03692626953125, 0.022216796875, 0.0209197998046875, 0.02490234375, 0.039947509765625, 0.01317596435546875, 0.0163726806640625, -0.036285400390625, 0.01103973388671875, -0.05877685546875, -0.01145172119140625, -0.0039215087890625, -0.030120849609375, -0.025054931640625, 0.0214080810546875, 0.045257568359375, 0.054046630859375, -0.0285797119140625, 0.030487060546875, 0.014129638671875, 0.032989501953125, -0.026824951171875, 0.03314208984375, -0.060028076171875, -0.0070343017578125, -0.0204010009765625, 0.0018491744995117188, -0.03778076171875, 0.00661468505859375, 0.00650787353515625, -0.035064697265625, 0.0137176513671875, -0.006519317626953125, 0.06561279296875, 0.027557373046875, -0.041412353515625, -0.0179443359375, -0.0758056640625, 0.07183837890625, -0.03143310546875, 0.0670166015625, 0.050323486328125, 0.023834228515625, 0.0009403228759765625, -0.08001708984375, -0.0275726318359375, -0.0008082389831542969, 0.032745361328125, 0.027191162109375, -0.0330810546875, 0.0011653900146484375, 0.0264434814453125, 0.0226287841796875, -0.041839599609375, 0.02801513671875, -0.063720703125, -0.04541015625, 0.03790283203125, -0.0161895751953125, -0.0012445449829101562, 0.0021305084228515625, -0.0273590087890625, -0.0255889892578125, -0.0254058837890625, 0.0391845703125, 0.01116180419921875, 0.0230255126953125, -0.0220794677734375, 0.035552978515625, 0.004482269287109375, 0.04510498046875, -0.004894256591796875, -0.01776123046875, 0.04730224609375, -0.0207366943359375, -0.0016202926635742188, 0.022186279296875, 
0.04803466796875, 0.01141357421875, 0.0150146484375, 0.0017862319946289062, -0.01885986328125, 0.0159454345703125, -0.003650665283203125, -0.053863525390625, -0.0400390625, 0.006778717041015625, -0.037872314453125, 0.004657745361328125, 0.0098724365234375, -0.001155853271484375, -0.005847930908203125, -0.04791259765625, 0.048492431640625, -0.0251007080078125, -0.00925445556640625, -0.003795623779296875, -0.006591796875, 0.0164794921875, 0.00800323486328125, -0.06927490234375, 0.0302581787109375, 0.044097900390625, 0.0633544921875, -0.00434112548828125, 0.01367950439453125, -0.0638427734375, -0.00762176513671875, -0.039154052734375, 0.0494384765625, -0.01806640625, -0.0306549072265625, -0.01435089111328125, -0.0027370452880859375, 0.032989501953125, -0.048980712890625, 0.050750732421875, -0.0257568359375, 0.0048980712890625, -0.003429412841796875, -0.060821533203125, -0.0031833648681640625, -0.033416748046875, -0.0498046875, 0.0924072265625, 0.005786895751953125, -0.045074462890625, 0.0266876220703125, -0.01715087890625, -0.0341796875, -0.0043792724609375, -0.004482269287109375, -0.0357666015625, 0.01041412353515625, 0.00833892822265625, 0.03143310546875, 0.0003941059112548828, 0.0016489028930664062, -0.0038967132568359375, -0.028076171875, 0.0092926025390625, -0.038848876953125, 0.060028076171875, 0.0289459228515625, -0.0105743408203125, 0.009857177734375, -0.0845947265625, 0.01141357421875, 0.005126953125, -0.04669189453125, -0.00274658203125, -0.00411224365234375, 0.027252197265625, 0.010284423828125, 0.007305145263671875, -0.0498046875, -0.01690673828125, -0.0650634765625, 0.056243896484375, 0.043243408203125, -0.00806427001953125, 0.040283203125, -0.018341064453125, 0.0012559890747070312, -0.0196533203125, -0.0002987384796142578, 0.0027618408203125, -0.039703369140625, -0.03857421875, -0.0210418701171875, 0.040191650390625, 0.046295166015625, -0.01068115234375, 0.04052734375, -0.005039215087890625, -0.051361083984375, -0.0599365234375, 0.01004791259765625, 
0.033355712890625, 0.039276123046875, 0.054931640625, -0.01256561279296875, -0.06146240234375, -0.05535888671875, -0.01033782958984375, -0.01139068603515625, -0.0188751220703125, 0.02862548828125, 0.01500701904296875, -0.017730712890625, 0.054931640625, -0.005550384521484375, -0.029144287109375, -0.015106201171875, 0.01451873779296875, 0.016021728515625, 0.046630859375, 0.01496124267578125, -0.05645751953125, -0.0191497802734375, -0.0215606689453125, -0.01444244384765625, 0.0006351470947265625, 0.016082763671875, 0.005767822265625, 0.01030731201171875, 0.049896240234375, -0.0120391845703125, 0.0318603515625, 0.047515869140625, -0.0049285888671875, 0.00926971435546875, -0.01483917236328125, -0.0293731689453125, -0.0908203125, -0.01251220703125, -0.0081329345703125, -0.0382080078125, -0.046478271484375, -0.053558349609375, 0.0043487548828125, -0.01522064208984375, -0.04388427734375, 0.034942626953125, -0.03277587890625, -0.01348876953125, -0.0263214111328125, 0.007049560546875, -0.013275146484375, 0.02447509765625, 0.00415802001953125, 0.054718017578125, 0.03466796875, -0.0484619140625, 0.027496337890625, 0.03240966796875, -0.03887939453125, -0.0006814002990722656, -0.06829833984375, 0.040740966796875, -0.0014591217041015625, 0.0276031494140625, -0.08392333984375, -0.0136260986328125, -0.00553131103515625, -0.068115234375, 0.0394287109375, -0.0118255615234375, -0.0242156982421875, -0.0165557861328125, 0.00478363037109375, 0.0302886962890625, 0.0738525390625, -0.05755615234375, 0.042755126953125, 0.055206298828125, -0.0036907196044921875, -0.0251007080078125, -0.07928466796875, -0.016510009765625, 0.014068603515625, -0.034149169921875, 0.04644775390625, 0.0009403228759765625, 0.0067138671875, -0.0180206298828125, -0.037872314453125, 0.00412750244140625, -0.0075836181640625, 0.04791259765625, 0.00591278076171875, -0.0015964508056640625, 0.0341796875, 0.003643035888671875, -0.0244293212890625, -0.0052490234375, -0.038360595703125, 0.043060302734375, 
0.0011816024780273438, -0.00435638427734375, -0.07177734375, 0.007568359375, 0.01282501220703125, -0.028961181640625, 0.0297698974609375, 0.06927490234375, -0.049285888671875, -0.018524169921875, -0.0347900390625, -0.0220184326171875, -0.033782958984375, 0.051513671875, -0.0266571044921875, -0.07183837890625, 0.0153045654296875, -0.006561279296875, -0.0010023117065429688, 0.043853759765625, 0.054443359375, -0.0279998779296875, 0.07965087890625, 0.04376220703125, -0.0080413818359375, 0.053558349609375, -0.032562255859375, 0.004001617431640625, -0.049163818359375, -0.052978515625, -0.044281005859375, -0.0151824951171875, -0.041839599609375, -0.043609619140625, 0.0210723876953125, 0.01605224609375, -0.0012359619140625, 0.020050048828125, -0.042755126953125, 0.023040771484375, 0.051513671875, 0.023895263671875, -0.00826263427734375, 0.0216522216796875, -0.00394439697265625, -0.0066986083984375, -0.046844482421875, -0.034759521484375, 0.080322265625, 0.045166015625, 0.053253173828125, -0.00414276123046875, 0.050323486328125, 0.026031494140625, -0.0369873046875, -0.07855224609375, 0.0197601318359375, -0.0209197998046875, -0.043060302734375, -0.02099609375, -0.016998291015625, -0.0699462890625, -0.00009828805923461914, -0.0308990478515625, -0.05694580078125, 0.0079498291015625, 0.01457977294921875, -0.01959228515625, 0.002902984619140625, -0.054656982421875, 0.044281005859375, -0.00402069091796875, -0.01293182373046875, -0.0333251953125, -0.0589599609375, -0.013946533203125, 0.0079498291015625, 0.018218994140625, -0.01141357421875, 0.016876220703125, 0.093994140625, -0.02728271484375, 0.050384521484375, -0.034576416015625, 0.00841522216796875, 0.042877197265625, -0.017608642578125, 0.0380859375, -0.0037593841552734375, -0.0001811981201171875, 0.017242431640625, 0.01531219482421875, -0.03289794921875, -0.0263519287109375, 0.02783203125, -0.076904296875, -0.0067596435546875, -0.01287841796875, -0.0200958251953125, -0.031463623046875, 0.0016202926635742188, 
0.039825439453125, 0.0621337890625, -0.0213775634765625, 0.042938232421875, 0.05804443359375, 0.0033283233642578125, 0.017059326171875, 0.0252227783203125, 0.00724029541015625, -0.0269012451171875, 0.072021484375, 0.019500732421875, 0.005279541015625, 0.026275634765625, 0.032562255859375, -0.042633056640625, -0.049285888671875, -0.0030670166015625, 0.00409698486328125, -0.05389404296875, -0.00603485107421875, -0.059600830078125, -0.031768798828125, -0.049072265625, 0.030609130859375, -0.057403564453125, -0.046295166015625, -0.050384521484375, -0.0285797119140625, 0.023345947265625, 0.053863525390625, -0.0400390625, 0.02130126953125, -0.04827880859375, 0.0380859375, 0.042083740234375, -0.002532958984375, -0.018463134765625, -0.085205078125, -0.02001953125, 0.01959228515625, 0.0126800537109375, -0.052001953125, 0.004428863525390625, 0.03546142578125, 0.040435791015625, 0.0260467529296875, -0.01102447509765625, 0.0389404296875, -0.0369873046875, 0.061798095703125, 0.0193328857421875, -0.07305908203125, 0.059600830078125, -0.0026454925537109375, 0.01422882080078125, 0.0399169921875, 0.01239013671875, -0.019500732421875, 0.0068511962890625, -0.0439453125, -0.0675048828125, 0.05633544921875, 0.018585205078125, 0.01708984375, 0.02398681640625, 0.0321044921875, 0.007663726806640625, -0.012451171875, -0.032257080078125, -0.0228424072265625, -0.036041259765625, -0.027740478515625, -0.0180511474609375, -0.042144775390625, 0.0106201171875, -0.0421142578125, 0.061370849609375, 0.0184173583984375, 0.019134521484375, 0.01268768310546875, -0.0162506103515625, 0.0114288330078125, 0.01568603515625, 0.0285797119140625, 0.021697998046875, -0.01526641845703125, 0.01485443115234375, 0.0200958251953125, -0.04425048828125, 0.0007371902465820312, 0.0254058837890625, 0.0043792724609375, -0.0017938613891601562, 0.041168212890625, 0.101806640625, 0.0010204315185546875, -0.040283203125, 0.0389404296875, -0.00724029541015625, -0.029754638671875, -0.0256195068359375, 0.016326904296875, 
0.01297760009765625, 0.027435302734375, 0.0249176025390625, 0.002399444580078125, 0.0148468017578125, -0.037139892578125, 0.0261077880859375, 0.016204833984375, -0.060028076171875, -0.0249481201171875, 0.06610107421875, 0.0140838623046875, -0.031585693359375, 0.046173095703125, -0.01522064208984375, -0.0244140625, 0.0321044921875, 0.044647216796875, 0.037109375, -0.0285797119140625, -0.01270294189453125, 0.046234130859375, 0.00933837890625, -0.00959014892578125, 0.0380859375, -0.0291900634765625, -0.04058837890625, -0.0193328857421875, -0.046142578125, -0.0173187255859375, 0.0203857421875, -0.057769775390625, 0.01490020751953125, -0.035614013671875, -0.0286102294921875, 0.0277099609375, 0.0138702392578125, -0.056427001953125, 0.0263824462890625, 0.0312347412109375, 0.0565185546875, -0.0599365234375, 0.074951171875, 0.04327392578125, -0.01934814453125, -0.0980224609375, -0.0204010009765625, 0.005950927734375, -0.05694580078125, 0.041412353515625, 0.0154266357421875, -0.0283660888671875, 0.0207977294921875, -0.057159423828125, -0.062286376953125, 0.07659912109375, 0.01568603515625, -0.08062744140625, 0.01157379150390625, -0.0113677978515625, 0.03668212890625, -0.0071563720703125, -0.007534027099609375, 0.040618896484375, 0.01390838623046875, 0.017730712890625, -0.0791015625, -0.0229949951171875, -0.017486572265625, -0.0035686492919921875, -0.0276641845703125, -0.037628173828125, 0.05609130859375, -0.0299530029296875, -0.0302581787109375, 0.00675201416015625, 0.07318115234375, 0.0038928985595703125, 0.0260009765625, 0.042083740234375, 0.032196044921875, 0.09356689453125, -0.01129150390625, 0.04681396484375, -0.0008029937744140625, 0.024658203125, 0.10205078125, 0.002777099609375, 0.07476806640625, 0.02294921875, -0.027923583984375, 0.0216064453125, 0.04931640625, -0.01824951171875, 0.060028076171875, 0.0215606689453125, -0.0075531005859375, -0.0238800048828125, -0.0251312255859375, -0.038787841796875, 0.061737060546875, 0.022125244140625, -0.005229949951171875, 
0.0187530517578125, 0.0183868408203125, -0.020477294921875, 0.007717132568359375, -0.0198516845703125, 0.08038330078125, 0.0206451416015625, -0.0070648193359375, 0.049560546875, 0.01003265380859375, 0.038482666015625, -0.043853759765625, -0.0005488395690917969, 0.01381683349609375, 0.0166168212890625, -0.0186309814453125, -0.0384521484375, 0.0029087066650390625, -0.0013151168823242188, -0.024810791015625, -0.0041961669921875, 0.061981201171875, -0.04473876953125, -0.0322265625, 0.03173828125, 0.0172882080078125, 0.022613525390625, -0.00730133056640625, -0.044281005859375, 0.01445770263671875, 0.005260467529296875, -0.01141357421875, 0.0007386207580566406, 0.01538848876953125, 0.00904083251953125, 0.0230255126953125, 0.058746337890625, 0.01076507568359375, 0.00867462158203125, 0.0299530029296875, 0.04443359375, -0.037811279296875, -0.05194091796875, -0.03570556640625, 0.032012939453125, 0.007965087890625, -0.00983428955078125, 0.02020263671875, 0.0565185546875, 0.07940673828125, 0.00885009765625, 0.048004150390625, 0.0158538818359375, 0.0723876953125, -0.058074951171875, 0.048095703125, -0.045166015625, -0.00677490234375, 0.00910186767578125, -0.052276611328125, -0.0117950439453125, 0.062225341796875, 0.01258087158203125, 0.013458251953125, 0.033599853515625, 0.06268310546875, 0.0006704330444335938, 0.00699615478515625, 0.0200958251953125, 0.02490234375, 0.019317626953125, 0.03350830078125, 0.0643310546875, -0.05072021484375, 0.060394287109375, -0.0253448486328125, -0.0225067138671875, -0.00244140625, -0.0260009765625, -0.0787353515625, -0.0589599609375, -0.02691650390625, -0.0396728515625, 0.00872039794921875, 0.0831298828125, 0.0760498046875, -0.07379150390625, -0.002071380615234375, -0.0007939338684082031, -0.0206146240234375, -0.01177978515625, -0.0070037841796875, 0.036865234375, -0.005222320556640625, -0.05548095703125, 0.0601806640625, -0.00029969215393066406, 0.0253448486328125, 0.0140838623046875, -0.020782470703125, -0.001659393310546875, 
-0.000621795654296875, 0.032440185546875, 0.0186920166015625, -0.055755615234375, -0.0105743408203125, -0.0178070068359375, -0.0103759765625, 0.0131072998046875, 0.049713134765625, -0.0579833984375, 0.04705810546875, 0.041839599609375, 0.033721923828125, 0.07232666015625, -0.0005435943603515625, 0.007213592529296875, -0.05609130859375, 0.0251922607421875, 0.030029296875, 0.020477294921875, 0.0258941650390625, -0.004962921142578125, 0.01617431640625, 0.0228424072265625, -0.035858154296875, -0.05792236328125, 0.01032257080078125, -0.10064697265625, -0.0160980224609375, 0.0899658203125, 0.01291656494140625, -0.007724761962890625, 0.00262451171875, -0.03558349609375, 0.07110595703125, -0.04248046875, 0.0218658447265625, 0.035675048828125, -0.00045228004455566406, 0.0008625984191894531, -0.036407470703125, 0.043548583984375, 0.0229339599609375, -0.02081298828125, 0.0018110275268554688, 0.035064697265625, 0.0350341796875, -0.01049041748046875, 0.051727294921875, -0.01319122314453125, 0.01605224609375, 0.009521484375, -0.0029582977294921875, -0.0251007080078125, -0.032501220703125, -0.042816162109375, 0.0022449493408203125, 0.00531768798828125, -0.041168212890625 ] ]
codellama/CodeLlama-7b-hf
2023-10-27T16:00:06.000Z
[ "transformers", "pytorch", "safetensors", "llama", "text-generation", "llama-2", "code", "arxiv:2308.12950", "license:llama2", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
text-generation
codellama
null
null
codellama/CodeLlama-7b-hf
170
129,268
transformers
2023-08-24T16:31:11
--- language: - code pipeline_tag: text-generation tags: - llama-2 license: llama2 --- # **Code Llama** Code Llama is a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 34 billion parameters. This is the repository for the base 7B version in the Hugging Face Transformers format. This model is designed for general code synthesis and understanding. Links to other models can be found in the index at the bottom. | | Base Model | Python | Instruct | | --- | ----------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------- | | 7B | [codellama/CodeLlama-7b-hf](https://huggingface.co/codellama/CodeLlama-7b-hf) | [codellama/CodeLlama-7b-Python-hf](https://huggingface.co/codellama/CodeLlama-7b-Python-hf) | [codellama/CodeLlama-7b-Instruct-hf](https://huggingface.co/codellama/CodeLlama-7b-Instruct-hf) | | 13B | [codellama/CodeLlama-13b-hf](https://huggingface.co/codellama/CodeLlama-13b-hf) | [codellama/CodeLlama-13b-Python-hf](https://huggingface.co/codellama/CodeLlama-13b-Python-hf) | [codellama/CodeLlama-13b-Instruct-hf](https://huggingface.co/codellama/CodeLlama-13b-Instruct-hf) | | 34B | [codellama/CodeLlama-34b-hf](https://huggingface.co/codellama/CodeLlama-34b-hf) | [codellama/CodeLlama-34b-Python-hf](https://huggingface.co/codellama/CodeLlama-34b-Python-hf) | [codellama/CodeLlama-34b-Instruct-hf](https://huggingface.co/codellama/CodeLlama-34b-Instruct-hf) | ## Model Use To use this model, please make sure to install transformers from `main` until the next version is released: ```bash pip install git+https://github.com/huggingface/transformers.git@main accelerate ``` Model capabilities: - [x] Code completion. - [x] Infilling. - [ ] Instructions / chat. - [ ] Python specialist. 
```python from transformers import AutoTokenizer import transformers import torch model = "codellama/CodeLlama-7b-hf" tokenizer = AutoTokenizer.from_pretrained(model) pipeline = transformers.pipeline( "text-generation", model=model, torch_dtype=torch.float16, device_map="auto", ) sequences = pipeline( 'import socket\n\ndef ping_exponential_backoff(host: str):', do_sample=True, top_k=10, temperature=0.1, top_p=0.95, num_return_sequences=1, eos_token_id=tokenizer.eos_token_id, max_length=200, ) for seq in sequences: print(f"Result: {seq['generated_text']}") ``` ## Model Details *Note: Use of this model is governed by the Meta license.* Meta developed and publicly released the Code Llama family of large language models (LLMs). **Model Developers** Meta **Variations** Code Llama comes in three model sizes, and three variants: * Code Llama: base models designed for general code synthesis and understanding * Code Llama - Python: designed specifically for Python * Code Llama - Instruct: for instruction following and safer deployment All variants are available in sizes of 7B, 13B and 34B parameters. **This repository contains the base model of 7B parameters.** **Input** Models input text only. **Output** Models generate text only. **Model Architecture** Code Llama is an auto-regressive language model that uses an optimized transformer architecture. **Model Dates** Code Llama and its variants have been trained between January 2023 and July 2023. **Status** This is a static model trained on an offline dataset. Future versions of Code Llama - Instruct will be released as we improve model safety with community feedback. 
**License** A custom commercial license is available at: [https://ai.meta.com/resources/models-and-libraries/llama-downloads/](https://ai.meta.com/resources/models-and-libraries/llama-downloads/) **Research Paper** More information can be found in the paper "[Code Llama: Open Foundation Models for Code](https://ai.meta.com/research/publications/code-llama-open-foundation-models-for-code/)" or its [arXiv page](https://arxiv.org/abs/2308.12950). ## Intended Use **Intended Use Cases** Code Llama and its variants are intended for commercial and research use in English and relevant programming languages. The base model Code Llama can be adapted for a variety of code synthesis and understanding tasks, Code Llama - Python is designed specifically to handle the Python programming language, and Code Llama - Instruct is intended to be safer to use for code assistant and generation applications. **Out-of-Scope Uses** Use in any manner that violates applicable laws or regulations (including trade compliance laws). Use in languages other than English. Use in any other way that is prohibited by the Acceptable Use Policy and Licensing Agreement for Code Llama and its variants. ## Hardware and Software **Training Factors** We used custom training libraries. The training and fine-tuning of the released models have been performed on Meta’s Research Super Cluster. **Carbon Footprint** In aggregate, training all 9 Code Llama models required 400K GPU hours of computation on hardware of type A100-80GB (TDP of 350-400W). Estimated total emissions were 65.3 tCO2eq, 100% of which were offset by Meta’s sustainability program. ## Training Data All experiments reported here and the released models have been trained and fine-tuned using the same data as Llama 2 with different weights (see Section 2 and Table 1 in the [research paper](https://ai.meta.com/research/publications/code-llama-open-foundation-models-for-code/) for details). 
## Evaluation Results See evaluations for the main models and detailed ablations in Section 3 and safety evaluations in Section 4 of the research paper. ## Ethical Considerations and Limitations Code Llama and its variants are a new technology that carries risks with use. Testing conducted to date has been in English, and has not covered, nor could it cover, all scenarios. For these reasons, as with all LLMs, Code Llama’s potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate or objectionable responses to user prompts. Therefore, before deploying any applications of Code Llama, developers should perform safety testing and tuning tailored to their specific applications of the model. Please see the Responsible Use Guide available at [https://ai.meta.com/llama/responsible-use-guide](https://ai.meta.com/llama/responsible-use-guide).
6,758
[ [ -0.0252685546875, -0.0494384765625, 0.01873779296875, 0.039215087890625, -0.0178985595703125, 0.01049041748046875, -0.0051727294921875, -0.0452880859375, 0.0203094482421875, 0.035491943359375, -0.0279083251953125, -0.042083740234375, -0.04559326171875, 0.0222320556640625, -0.03668212890625, 0.08624267578125, -0.002593994140625, -0.0284271240234375, -0.01568603515625, 0.0019702911376953125, -0.018280029296875, -0.04315185546875, -0.0167388916015625, -0.032012939453125, 0.01971435546875, 0.02496337890625, 0.050079345703125, 0.045501708984375, 0.03948974609375, 0.0270538330078125, -0.0199127197265625, 0.0012760162353515625, -0.027130126953125, -0.0250396728515625, 0.0193634033203125, -0.043121337890625, -0.0556640625, -0.00263214111328125, 0.0269317626953125, 0.022735595703125, -0.0196990966796875, 0.034210205078125, -0.0137786865234375, 0.0355224609375, -0.024658203125, 0.01800537109375, -0.050323486328125, -0.00439453125, 0.00360107421875, -0.0079498291015625, -0.0156097412109375, -0.036773681640625, -0.006633758544921875, -0.03424072265625, -0.002651214599609375, -0.0025177001953125, 0.08514404296875, 0.037567138671875, -0.021820068359375, -0.0174407958984375, -0.02374267578125, 0.05908203125, -0.07244873046875, 0.0019989013671875, 0.0238189697265625, -0.004467010498046875, -0.01116180419921875, -0.06573486328125, -0.0570068359375, -0.024200439453125, -0.01080322265625, -0.0005083084106445312, -0.034332275390625, 0.004970550537109375, 0.0316162109375, 0.034637451171875, -0.035980224609375, 0.0070343017578125, -0.035797119140625, -0.01568603515625, 0.0665283203125, 0.0088653564453125, 0.0311431884765625, -0.023834228515625, -0.0268707275390625, -0.00542449951171875, -0.05889892578125, 0.005741119384765625, 0.0330810546875, -0.0095977783203125, -0.056976318359375, 0.05224609375, -0.015411376953125, 0.042327880859375, 0.00829315185546875, -0.037384033203125, 0.042327880859375, -0.0207672119140625, -0.025390625, -0.010345458984375, 0.07080078125, 0.03851318359375, 
0.0253448486328125, 0.004322052001953125, -0.01531219482421875, 0.02032470703125, 0.0057830810546875, -0.063720703125, -0.01226806640625, 0.026153564453125, -0.045928955078125, -0.05078125, -0.017669677734375, -0.05902099609375, -0.00402069091796875, 0.0009307861328125, 0.01241302490234375, -0.01412200927734375, -0.034423828125, 0.0161285400390625, 0.003543853759765625, 0.035003662109375, 0.00453948974609375, -0.06622314453125, 0.0025806427001953125, 0.033966064453125, 0.06005859375, 0.00397491455078125, -0.038421630859375, -0.0006532669067382812, -0.0090789794921875, -0.0202484130859375, 0.048248291015625, -0.0321044921875, -0.03631591796875, -0.0127716064453125, 0.009857177734375, -0.004718780517578125, -0.035247802734375, 0.01399993896484375, -0.0264129638671875, 0.00012743473052978516, 0.0095062255859375, -0.0224151611328125, -0.0311737060546875, 0.00238800048828125, -0.041168212890625, 0.0872802734375, 0.0210723876953125, -0.05670166015625, -0.00243377685546875, -0.042724609375, -0.0265655517578125, -0.019805908203125, -0.0019273757934570312, -0.051605224609375, -0.005462646484375, 0.016021728515625, 0.03839111328125, -0.0282135009765625, 0.02935791015625, -0.01137542724609375, -0.030548095703125, 0.0159912109375, -0.0133056640625, 0.079345703125, 0.0257720947265625, -0.03924560546875, 0.017364501953125, -0.06402587890625, -0.007366180419921875, 0.038665771484375, -0.033660888671875, 0.01287841796875, -0.00977325439453125, -0.00023257732391357422, 0.000545501708984375, 0.037139892578125, -0.02508544921875, 0.037750244140625, -0.033538818359375, 0.058868408203125, 0.051025390625, -0.002193450927734375, 0.0284576416015625, -0.04254150390625, 0.05181884765625, -0.0068206787109375, 0.01739501953125, -0.02362060546875, -0.05670166015625, -0.0777587890625, -0.0196533203125, 0.0010576248168945312, 0.052520751953125, -0.038543701171875, 0.04852294921875, -0.0031681060791015625, -0.058685302734375, -0.03839111328125, 0.01263427734375, 0.0347900390625, 
0.0262298583984375, 0.02655029296875, -0.0102386474609375, -0.05804443359375, -0.05950927734375, 0.0077667236328125, -0.034271240234375, 0.009979248046875, 0.01727294921875, 0.06378173828125, -0.04669189453125, 0.060699462890625, -0.03125, -0.0006365776062011719, -0.0273284912109375, -0.01849365234375, 0.041259765625, 0.04473876953125, 0.053955078125, -0.041748046875, -0.023681640625, 0.002674102783203125, -0.065185546875, -0.00916290283203125, -0.017364501953125, -0.00482177734375, 0.0279083251953125, 0.022735595703125, -0.05108642578125, 0.03924560546875, 0.06268310546875, -0.019989013671875, 0.04443359375, -0.011444091796875, -0.006298065185546875, -0.08013916015625, 0.0179901123046875, -0.01433563232421875, -0.0040435791015625, -0.038421630859375, 0.024261474609375, 0.00530242919921875, 0.0060272216796875, -0.041473388671875, 0.0283355712890625, -0.031402587890625, -0.0013055801391601562, -0.00928497314453125, -0.01311492919921875, -0.0004494190216064453, 0.05450439453125, -0.0032215118408203125, 0.06964111328125, 0.04266357421875, -0.0452880859375, 0.028961181640625, 0.0227508544921875, -0.0241851806640625, 0.0126953125, -0.07415771484375, 0.0265655517578125, 0.0089111328125, 0.0256805419921875, -0.06280517578125, -0.0157470703125, 0.0263214111328125, -0.039031982421875, 0.00707244873046875, -0.00429534912109375, -0.036407470703125, -0.038726806640625, -0.0181732177734375, 0.034881591796875, 0.06591796875, -0.045654296875, 0.03076171875, 0.03118896484375, 0.01148223876953125, -0.054656982421875, -0.05218505859375, 0.006256103515625, -0.033660888671875, -0.054656982421875, 0.03204345703125, -0.0216064453125, -0.010894775390625, -0.01477813720703125, 0.0071258544921875, 0.0019121170043945312, 0.0215606689453125, 0.03350830078125, 0.0306854248046875, -0.009063720703125, -0.0153961181640625, -0.002872467041015625, -0.011871337890625, 0.005123138427734375, 0.0069580078125, 0.0587158203125, -0.031524658203125, -0.01898193359375, -0.044830322265625, 
0.01204681396484375, 0.041778564453125, -0.01788330078125, 0.044281005859375, 0.032501220703125, -0.0265045166015625, -0.001224517822265625, -0.049407958984375, 0.0038623809814453125, -0.042236328125, 0.023193359375, -0.020111083984375, -0.061248779296875, 0.053009033203125, 0.00818634033203125, 0.01611328125, 0.04132080078125, 0.058929443359375, 0.006900787353515625, 0.05731201171875, 0.06866455078125, -0.02996826171875, 0.02978515625, -0.045928955078125, 0.006397247314453125, -0.058929443359375, -0.030548095703125, -0.042724609375, -0.00421905517578125, -0.050048828125, -0.03515625, 0.0233154296875, 0.0149078369140625, -0.038787841796875, 0.05328369140625, -0.064453125, 0.0311737060546875, 0.032623291015625, 0.002727508544921875, 0.026092529296875, 0.003002166748046875, -0.0038013458251953125, 0.0235748291015625, -0.036865234375, -0.051116943359375, 0.09112548828125, 0.0360107421875, 0.061920166015625, -0.004238128662109375, 0.06591796875, 0.0025787353515625, 0.027862548828125, -0.04425048828125, 0.04364013671875, 0.021484375, -0.03668212890625, -0.0015230178833007812, -0.019775390625, -0.06890869140625, 0.01227569580078125, 0.004669189453125, -0.06231689453125, 0.00583648681640625, 0.0019483566284179688, -0.0178985595703125, 0.027618408203125, -0.052978515625, 0.04901123046875, -0.014923095703125, 0.0007796287536621094, -0.009185791015625, -0.041534423828125, 0.04229736328125, -0.002422332763671875, 0.01336669921875, -0.012359619140625, -0.01094818115234375, 0.05133056640625, -0.03955078125, 0.0791015625, 0.00782012939453125, -0.02618408203125, 0.046539306640625, -0.0034351348876953125, 0.036712646484375, 0.0036716461181640625, -0.016387939453125, 0.05010986328125, 0.0008292198181152344, -0.0201416015625, -0.006534576416015625, 0.0477294921875, -0.08087158203125, -0.0555419921875, -0.032867431640625, -0.030975341796875, 0.0234527587890625, 0.01305389404296875, 0.035125732421875, 0.00510406494140625, 0.011749267578125, 0.00870513916015625, 0.0311737060546875, 
-0.04852294921875, 0.05047607421875, 0.023590087890625, -0.0208282470703125, -0.0367431640625, 0.06298828125, -0.0107421875, 0.0186309814453125, 0.0172882080078125, 0.00345611572265625, -0.01041412353515625, -0.031890869140625, -0.03521728515625, 0.03497314453125, -0.047149658203125, -0.041534423828125, -0.0509033203125, -0.0300750732421875, -0.0289154052734375, -0.023101806640625, -0.023895263671875, -0.0203704833984375, -0.05029296875, -0.01081085205078125, 0.058349609375, 0.056365966796875, 0.0019588470458984375, 0.035797119140625, -0.047943115234375, 0.030426025390625, 0.0087890625, 0.0280303955078125, 0.004039764404296875, -0.039398193359375, -0.008270263671875, -0.0016040802001953125, -0.0396728515625, -0.06915283203125, 0.046966552734375, 0.0095062255859375, 0.044464111328125, 0.0094757080078125, -0.0011272430419921875, 0.0501708984375, -0.0306549072265625, 0.06756591796875, 0.026702880859375, -0.0848388671875, 0.047943115234375, -0.0178680419921875, 0.00701904296875, 0.005645751953125, 0.02044677734375, -0.033782958984375, -0.0225830078125, -0.0540771484375, -0.0596923828125, 0.046966552734375, 0.0191802978515625, 0.0205230712890625, -0.00016164779663085938, 0.031402587890625, -0.006565093994140625, 0.01953125, -0.0809326171875, -0.0291595458984375, -0.0281829833984375, -0.0189971923828125, -0.004611968994140625, -0.0178070068359375, -0.0030841827392578125, -0.0224151611328125, 0.035003662109375, -0.01239013671875, 0.047271728515625, 0.0161895751953125, -0.016082763671875, -0.01849365234375, 0.0006709098815917969, 0.04913330078125, 0.043914794921875, -0.0025730133056640625, -0.00982666015625, 0.031585693359375, -0.04168701171875, 0.017425537109375, -0.00530242919921875, -0.00672149658203125, -0.0164337158203125, 0.040435791015625, 0.049041748046875, 0.0062103271484375, -0.056365966796875, 0.039093017578125, 0.00743865966796875, -0.0219268798828125, -0.038238525390625, 0.0196990966796875, 0.023406982421875, 0.0253448486328125, 0.020904541015625, 
-0.0017976760864257812, -0.00894927978515625, -0.0273590087890625, 0.00421142578125, 0.025543212890625, 0.0117340087890625, -0.0265350341796875, 0.06927490234375, 0.0114288330078125, -0.02569580078125, 0.03875732421875, 0.0027523040771484375, -0.045440673828125, 0.09088134765625, 0.051300048828125, 0.05792236328125, -0.014068603515625, 0.0088043212890625, 0.037567138671875, 0.043060302734375, 0.002338409423828125, 0.030792236328125, 0.0039520263671875, -0.040557861328125, -0.023345947265625, -0.0604248046875, -0.0235137939453125, 0.00513458251953125, -0.0303192138671875, 0.0251617431640625, -0.04693603515625, -0.0062408447265625, -0.0254669189453125, 0.007740020751953125, -0.051361083984375, -0.0010528564453125, 0.005588531494140625, 0.07025146484375, -0.0494384765625, 0.06744384765625, 0.0399169921875, -0.05096435546875, -0.06915283203125, -0.018035888671875, -0.0072784423828125, -0.0897216796875, 0.0396728515625, 0.0227508544921875, 0.00563812255859375, 0.00945281982421875, -0.06658935546875, -0.081787109375, 0.0986328125, 0.032562255859375, -0.0380859375, -0.0024566650390625, 0.01320648193359375, 0.03936767578125, -0.0287933349609375, 0.03985595703125, 0.047149658203125, 0.029693603515625, -0.006412506103515625, -0.08892822265625, 0.022552490234375, -0.0309600830078125, 0.01241302490234375, -0.01666259765625, -0.08111572265625, 0.08087158203125, -0.041534423828125, -0.0112152099609375, 0.0287322998046875, 0.053741455078125, 0.039459228515625, 0.0137176513671875, 0.02850341796875, 0.041656494140625, 0.047943115234375, -0.0016832351684570312, 0.08099365234375, -0.03656005859375, 0.04034423828125, 0.03839111328125, -0.007030487060546875, 0.0543212890625, 0.0282135009765625, -0.03887939453125, 0.0562744140625, 0.05743408203125, -0.015838623046875, 0.0181732177734375, 0.02313232421875, -0.005474090576171875, -0.0021648406982421875, -0.0007662773132324219, -0.05731201171875, 0.030029296875, 0.0240478515625, -0.0283050537109375, 0.005191802978515625, -0.01690673828125, 
0.0199127197265625, -0.01519012451171875, -0.0023345947265625, 0.04736328125, 0.01517486572265625, -0.03778076171875, 0.08697509765625, 0.008026123046875, 0.07440185546875, -0.03424072265625, -0.0095977783203125, -0.029449462890625, 0.004638671875, -0.039886474609375, -0.03936767578125, 0.0142059326171875, 0.017730712890625, -0.0001671314239501953, -0.007061004638671875, 0.0382080078125, -0.007541656494140625, -0.03875732421875, 0.0273284912109375, 0.01340484619140625, 0.0227813720703125, 0.01458740234375, -0.055389404296875, 0.033477783203125, 0.0145111083984375, -0.0352783203125, 0.0210418701171875, 0.0089569091796875, 0.010650634765625, 0.06829833984375, 0.05670166015625, -0.01000213623046875, 0.0151214599609375, -0.01412200927734375, 0.082763671875, -0.053436279296875, -0.0279388427734375, -0.06207275390625, 0.0517578125, 0.01477813720703125, -0.03558349609375, 0.04510498046875, 0.027618408203125, 0.06280517578125, -0.0079803466796875, 0.059539794921875, -0.0176849365234375, 0.006317138671875, -0.031585693359375, 0.051666259765625, -0.051544189453125, 0.02813720703125, -0.03802490234375, -0.066650390625, -0.0190887451171875, 0.0684814453125, -0.00412750244140625, 0.00896453857421875, 0.039825439453125, 0.0738525390625, 0.0184478759765625, -0.00829315185546875, 0.01369476318359375, 0.016632080078125, 0.03173828125, 0.0634765625, 0.0673828125, -0.049560546875, 0.05230712890625, -0.045684814453125, -0.02032470703125, -0.0226593017578125, -0.07537841796875, -0.0748291015625, -0.0379638671875, -0.0246429443359375, -0.032501220703125, -0.0188446044921875, 0.0699462890625, 0.0457763671875, -0.04534912109375, -0.03668212890625, -0.01175689697265625, 0.0300140380859375, -0.00699615478515625, -0.01629638671875, 0.0248565673828125, -0.01230621337890625, -0.0616455078125, 0.020111083984375, -0.0011882781982421875, 0.01244354248046875, -0.022674560546875, -0.0184478759765625, -0.0128936767578125, -0.0005903244018554688, 0.035003662109375, 0.027130126953125, 
-0.06243896484375, -0.0178680419921875, 0.00865936279296875, -0.0152435302734375, 0.011016845703125, 0.02996826171875, -0.048675537109375, 0.0012035369873046875, 0.0291595458984375, 0.031005859375, 0.028839111328125, -0.017974853515625, 0.017608642578125, -0.0286407470703125, 0.0321044921875, -0.0011196136474609375, 0.037109375, 0.008819580078125, -0.04364013671875, 0.048370361328125, 0.024505615234375, -0.05340576171875, -0.0673828125, 0.00861358642578125, -0.08087158203125, -0.01306915283203125, 0.09515380859375, -0.01146697998046875, -0.026123046875, 0.01157379150390625, -0.0274505615234375, 0.02313232421875, -0.0294342041015625, 0.054718017578125, 0.0206298828125, -0.00847625732421875, -0.01335906982421875, -0.0286407470703125, 0.0207366943359375, 0.0215301513671875, -0.07061767578125, -0.0114288330078125, 0.0236358642578125, 0.031951904296875, 0.01486968994140625, 0.056793212890625, -0.00508880615234375, 0.01496124267578125, 0.005390167236328125, 0.03253173828125, -0.005260467529296875, -0.0130462646484375, -0.025238037109375, -0.00579071044921875, -0.0069427490234375, -0.004276275634765625 ] ]
mattmdjaga/segformer_b2_clothes
2023-07-25T20:11:56.000Z
[ "transformers", "pytorch", "safetensors", "segformer", "vision", "image-segmentation", "dataset:mattmdjaga/human_parsing_dataset", "arxiv:2105.15203", "license:mit", "endpoints_compatible", "has_space", "region:us" ]
image-segmentation
mattmdjaga
null
null
mattmdjaga/segformer_b2_clothes
98
129,259
transformers
2022-11-24T09:48:16
---
license: mit
tags:
- vision
- image-segmentation
widget:
- src: https://images.unsplash.com/photo-1643310325061-2beef64926a5?ixlib=rb-4.0.3&ixid=MnwxMjA3fDB8MHxzZWFyY2h8Nnx8cmFjb29uc3xlbnwwfHwwfHw%3D&w=1000&q=80
  example_title: Person
- src: https://freerangestock.com/sample/139043/young-man-standing-and-leaning-on-car.jpg
  example_title: Person
datasets:
- mattmdjaga/human_parsing_dataset
---

# Segformer B2 fine-tuned for clothes segmentation

SegFormer model fine-tuned on the [ATR dataset](https://github.com/lemondan/HumanParsing-Dataset) for clothes segmentation. The dataset on Hugging Face is named "mattmdjaga/human_parsing_dataset".

```python
from transformers import SegformerImageProcessor, AutoModelForSemanticSegmentation
from PIL import Image
import requests
import matplotlib.pyplot as plt
import torch.nn as nn

processor = SegformerImageProcessor.from_pretrained("mattmdjaga/segformer_b2_clothes")
model = AutoModelForSemanticSegmentation.from_pretrained("mattmdjaga/segformer_b2_clothes")

url = "https://plus.unsplash.com/premium_photo-1673210886161-bfcc40f54d1f?ixlib=rb-4.0.3&ixid=MnwxMjA3fDB8MHxzZWFyY2h8MXx8cGVyc29uJTIwc3RhbmRpbmd8ZW58MHx8MHx8&w=1000&q=80"
image = Image.open(requests.get(url, stream=True).raw)

inputs = processor(images=image, return_tensors="pt")
outputs = model(**inputs)
logits = outputs.logits.cpu()

# Upsample the logits to the original image resolution
# (PIL reports size as (W, H), interpolate expects (H, W)).
upsampled_logits = nn.functional.interpolate(
    logits,
    size=image.size[::-1],
    mode="bilinear",
    align_corners=False,
)

# Per-pixel class indices.
pred_seg = upsampled_logits.argmax(dim=1)[0]
plt.imshow(pred_seg)
```

Labels: 0: "Background", 1: "Hat", 2: "Hair", 3: "Sunglasses", 4: "Upper-clothes", 5: "Skirt", 6: "Pants", 7: "Dress", 8: "Belt", 9: "Left-shoe", 10: "Right-shoe", 11: "Face", 12: "Left-leg", 13: "Right-leg", 14: "Left-arm", 15: "Right-arm", 16: "Bag", 17: "Scarf"

### Evaluation

| Label Index | Label Name    | Category Accuracy | Category IoU |
|:-----------:|:-------------:|:-----------------:|:------------:|
| 0           | Background    | 0.99              | 0.99         |
| 1           | Hat           | 0.73              | 0.68         |
| 2           | Hair          | 0.91              | 0.82         |
| 3           | Sunglasses    | 0.73              | 0.63         |
| 4           | Upper-clothes | 0.87              | 0.78         |
| 5           | Skirt         | 0.76              | 0.65         |
| 6           | Pants         | 0.90              | 0.84         |
| 7           | Dress         | 0.74              | 0.55         |
| 8           | Belt          | 0.35              | 0.30         |
| 9           | Left-shoe     | 0.74              | 0.58         |
| 10          | Right-shoe    | 0.75              | 0.60         |
| 11          | Face          | 0.92              | 0.85         |
| 12          | Left-leg     | 0.90              | 0.82         |
| 13          | Right-leg     | 0.90              | 0.81         |
| 14          | Left-arm      | 0.86              | 0.74         |
| 15          | Right-arm     | 0.82              | 0.73         |
| 16          | Bag           | 0.91              | 0.84         |
| 17          | Scarf         | 0.63              | 0.29         |

Overall Evaluation Metrics:
- Evaluation Loss: 0.15
- Mean Accuracy: 0.80
- Mean IoU: 0.69

### License

The license for this model can be found [here](https://github.com/NVlabs/SegFormer/blob/master/LICENSE).

### BibTeX entry and citation info

```bibtex
@article{DBLP:journals/corr/abs-2105-15203,
  author     = {Enze Xie and Wenhai Wang and Zhiding Yu and Anima Anandkumar and Jose M. Alvarez and Ping Luo},
  title      = {SegFormer: Simple and Efficient Design for Semantic Segmentation with Transformers},
  journal    = {CoRR},
  volume     = {abs/2105.15203},
  year       = {2021},
  url        = {https://arxiv.org/abs/2105.15203},
  eprinttype = {arXiv},
  eprint     = {2105.15203},
  timestamp  = {Wed, 02 Jun 2021 11:46:42 +0200},
  biburl     = {https://dblp.org/rec/journals/corr/abs-2105-15203.bib},
  bibsource  = {dblp computer science bibliography, https://dblp.org}
}
```
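As a follow-up to the card's snippet, the per-pixel class indices in `pred_seg` can be turned into a binary mask for one garment using the label table above. This is a minimal sketch, not part of the original model card: the `class_mask` helper and the tiny hand-made prediction array are our own additions, and NumPy stands in for the torch tensor (in practice you would call `pred_seg.numpy()` first).

```python
import numpy as np

UPPER_CLOTHES = 4  # label index for "Upper-clothes" in the table above

def class_mask(pred_seg, class_index):
    # pred_seg: (H, W) array of per-pixel class indices,
    # e.g. pred_seg.numpy() from the snippet in the card.
    return pred_seg == class_index

# Tiny fake 2x3 "prediction" just to illustrate the shape of the result.
pred = np.array([[4, 4, 0],
                 [6, 4, 0]])
mask = class_mask(pred, UPPER_CLOTHES)
print(int(mask.sum()))  # 3 pixels classified as Upper-clothes
```

The boolean mask can then be used to crop, blur, or recolor just that garment in the original image.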
4,258
[ [ -0.0631103515625, -0.047027587890625, 0.0124359130859375, 0.016326904296875, -0.033111572265625, -0.003231048583984375, 0.0031032562255859375, -0.03692626953125, 0.032012939453125, 0.021636962890625, -0.06402587890625, -0.0506591796875, -0.052490234375, 0.0032806396484375, -0.01047515869140625, 0.053314208984375, 0.007526397705078125, 0.0015869140625, -0.01085662841796875, -0.0299835205078125, -0.006725311279296875, -0.0239410400390625, -0.043060302734375, -0.042999267578125, 0.01025390625, 0.0192108154296875, 0.06463623046875, 0.051788330078125, 0.052215576171875, 0.0290679931640625, -0.0244903564453125, 0.006618499755859375, -0.0191802978515625, -0.0103912353515625, 0.00571441650390625, -0.007232666015625, -0.034820556640625, -0.00418853759765625, 0.034027099609375, 0.04522705078125, 0.0013685226440429688, 0.03424072265625, 0.0114288330078125, 0.0618896484375, -0.0179595947265625, -0.00998687744140625, 0.0019044876098632812, -0.00528717041015625, -0.01203155517578125, -0.00019729137420654297, 0.0033626556396484375, -0.01215362548828125, 0.0106048583984375, -0.040283203125, 0.037994384765625, 0.00139617919921875, 0.11444091796875, 0.0269012451171875, -0.011962890625, -0.003597259521484375, -0.021575927734375, 0.055328369140625, -0.05682373046875, 0.04449462890625, 0.0235595703125, -0.0019741058349609375, 0.0000985264778137207, -0.04925537109375, -0.034698486328125, 0.0223541259765625, -0.02569580078125, 0.011474609375, -0.0310516357421875, -0.01457977294921875, 0.03955078125, 0.0240478515625, -0.046661376953125, -0.0018625259399414062, -0.044647216796875, -0.01447296142578125, 0.06182861328125, 0.0011186599731445312, 0.019256591796875, -0.0283355712890625, -0.05072021484375, -0.02532958984375, -0.019775390625, 0.0255584716796875, 0.0294189453125, -0.0015544891357421875, -0.0379638671875, 0.042938232421875, -0.00916290283203125, 0.0648193359375, 0.0279541015625, -0.018463134765625, 0.06048583984375, -0.0178985595703125, -0.0196990966796875, 
-0.01096343994140625, 0.05877685546875, 0.059173583984375, -0.0027141571044921875, 0.00850677490234375, -0.00897216796875, -0.00091552734375, -0.0009074211120605469, -0.08343505859375, -0.0234222412109375, 0.0178375244140625, -0.05474853515625, -0.016571044921875, 0.0203857421875, -0.06982421875, -0.0031185150146484375, -0.034210205078125, 0.035675048828125, -0.023284912109375, -0.00815582275390625, 0.0030918121337890625, -0.0114288330078125, 0.044158935546875, 0.00873565673828125, -0.048492431640625, 0.0167388916015625, 0.03302001953125, 0.0750732421875, -0.001415252685546875, -0.004161834716796875, 0.0014982223510742188, -0.0014562606811523438, -0.0197296142578125, 0.065673828125, -0.022308349609375, -0.031341552734375, -0.0283355712890625, 0.041595458984375, -0.01192474365234375, -0.033721923828125, 0.057281494140625, -0.019561767578125, 0.02569580078125, -0.0178070068359375, -0.027374267578125, -0.0307769775390625, 0.0219268798828125, -0.04095458984375, 0.0850830078125, 0.0242767333984375, -0.058319091796875, 0.03656005859375, -0.037261962890625, -0.0159454345703125, -0.01212310791015625, -0.0033969879150390625, -0.08074951171875, -0.007778167724609375, 0.03948974609375, 0.032867431640625, -0.034698486328125, 0.01202392578125, -0.047119140625, -0.01763916015625, 0.01233673095703125, 0.0004878044128417969, 0.07244873046875, 0.015899658203125, -0.029754638671875, 0.0218048095703125, -0.06951904296875, 0.0175933837890625, 0.029205322265625, -0.007259368896484375, -0.01059722900390625, -0.01806640625, 0.0039825439453125, 0.0289306640625, 0.0110015869140625, -0.039215087890625, 0.00394439697265625, -0.0232086181640625, 0.02752685546875, 0.05120849609375, 0.01325225830078125, 0.023101806640625, -0.03436279296875, 0.0296783447265625, 0.001987457275390625, 0.0205230712890625, 0.004108428955078125, -0.02447509765625, -0.06549072265625, -0.041046142578125, 0.01520538330078125, 0.01904296875, -0.0252685546875, 0.055633544921875, -0.00624847412109375, -0.047271728515625, 
-0.034393310546875, -0.004302978515625, 0.02288818359375, 0.048431396484375, 0.0310516357421875, -0.0289306640625, -0.054656982421875, -0.0845947265625, -0.0067596435546875, 0.00991058349609375, -0.012176513671875, 0.0287017822265625, 0.037750244140625, -0.040069580078125, 0.058990478515625, -0.0614013671875, -0.033905029296875, -0.0018901824951171875, -0.0014667510986328125, 0.039398193359375, 0.052825927734375, 0.054718017578125, -0.054229736328125, -0.040191650390625, -0.01500701904296875, -0.041748046875, -0.0205230712890625, 0.007778167724609375, -0.03070068359375, 0.02105712890625, 0.03155517578125, -0.0276641845703125, 0.044525146484375, 0.0295562744140625, -0.05718994140625, 0.03839111328125, -0.022064208984375, 0.00848388671875, -0.08038330078125, 0.0379638671875, 0.0235595703125, -0.016693115234375, -0.044097900390625, -0.00719451904296875, 0.006031036376953125, -0.00740814208984375, -0.0209503173828125, 0.0537109375, -0.040435791015625, -0.00577545166015625, -0.00028634071350097656, -0.01239013671875, 0.0184478759765625, 0.04620361328125, 0.00199127197265625, 0.048126220703125, 0.03497314453125, -0.0296783447265625, 0.0128021240234375, 0.042572021484375, -0.04534912109375, 0.043670654296875, -0.06756591796875, 0.01303863525390625, -0.0108489990234375, -0.00039505958557128906, -0.07293701171875, -0.04705810546875, 0.03863525390625, -0.037689208984375, 0.040283203125, -0.0234527587890625, -0.01215362548828125, -0.04827880859375, -0.03656005859375, 0.0193634033203125, 0.03265380859375, -0.04156494140625, 0.0270538330078125, 0.0389404296875, -0.0027332305908203125, -0.036834716796875, -0.04449462890625, -0.03057861328125, -0.0283203125, -0.0697021484375, 0.0430908203125, -0.004787445068359375, -0.00873565673828125, 0.001129150390625, -0.028076171875, -0.0099945068359375, 0.0007166862487792969, 0.017822265625, 0.047760009765625, -0.006664276123046875, -0.01404571533203125, -0.007717132568359375, -0.0178680419921875, 0.0027618408203125, 0.0111846923828125, 
0.049163818359375, -0.0179443359375, -0.02581787109375, -0.04010009765625, 0.00841522216796875, 0.029449462890625, -0.0108795166015625, 0.034149169921875, 0.0704345703125, -0.0225982666015625, 0.006134033203125, -0.03973388671875, -0.0155792236328125, -0.038848876953125, 0.0246124267578125, -0.0279541015625, -0.056365966796875, 0.063232421875, 0.00655364990234375, -0.01326751708984375, 0.05804443359375, 0.039886474609375, -0.017547607421875, 0.09588623046875, 0.0300140380859375, 0.0163421630859375, 0.044952392578125, -0.051300048828125, 0.0034427642822265625, -0.08441162109375, -0.04693603515625, -0.03759765625, -0.049835205078125, -0.0390625, -0.044342041015625, 0.034210205078125, 0.0211334228515625, -0.05010986328125, 0.0230560302734375, -0.06646728515625, 0.030792236328125, 0.038848876953125, 0.0310211181640625, -0.014251708984375, 0.015899658203125, -0.002777099609375, -0.00331878662109375, -0.038299560546875, -0.0232391357421875, 0.05328369140625, 0.0264892578125, 0.06689453125, 0.006561279296875, 0.035797119140625, 0.01068878173828125, -0.0035762786865234375, -0.06298828125, 0.026641845703125, -0.0116424560546875, -0.05340576171875, -0.0081634521484375, -0.0218048095703125, -0.0745849609375, 0.032470703125, -0.0193328857421875, -0.0692138671875, 0.056182861328125, 0.01702880859375, -0.03179931640625, 0.00983428955078125, -0.045745849609375, 0.0631103515625, -0.0201568603515625, -0.046722412109375, 0.0185546875, -0.0704345703125, 0.01340484619140625, 0.015411376953125, 0.005756378173828125, -0.031494140625, 0.0035839080810546875, 0.0804443359375, -0.0380859375, 0.044525146484375, -0.0183563232421875, 0.029205322265625, 0.0290069580078125, -0.01480865478515625, 0.0367431640625, -0.005100250244140625, -0.0014667510986328125, 0.0220794677734375, 0.01287078857421875, -0.024871826171875, -0.022857666015625, 0.041656494140625, -0.06646728515625, -0.047821044921875, -0.049530029296875, -0.021026611328125, -0.00891876220703125, 0.02783203125, 0.037872314453125, 
0.04022216796875, 0.00511932373046875, 0.03485107421875, 0.04052734375, -0.006397247314453125, 0.02899169921875, 0.0162811279296875, -0.0092926025390625, -0.050262451171875, 0.059417724609375, 0.002437591552734375, 0.006626129150390625, -0.00214385986328125, 0.0294952392578125, -0.041717529296875, -0.0157318115234375, -0.0345458984375, 0.036895751953125, -0.0406494140625, -0.033538818359375, -0.036895751953125, -0.0141754150390625, -0.0406494140625, -0.03424072265625, -0.01456451416015625, -0.02789306640625, -0.0386962890625, 0.0051422119140625, 0.0298004150390625, 0.02880859375, -0.0182037353515625, 0.0277557373046875, -0.042144775390625, 0.01480865478515625, 0.01197052001953125, 0.0260162353515625, -0.019256591796875, -0.057464599609375, -0.0038089752197265625, -0.00690460205078125, -0.029205322265625, -0.049072265625, 0.0533447265625, 0.0090484619140625, 0.0390625, 0.038177490234375, -0.0029735565185546875, 0.0853271484375, -0.004817962646484375, 0.03900146484375, 0.043670654296875, -0.057403564453125, 0.02252197265625, -0.0214691162109375, 0.037109375, 0.03948974609375, 0.0215301513671875, -0.044708251953125, 0.0038089752197265625, -0.061737060546875, -0.08441162109375, 0.087890625, 0.0044403076171875, -0.02276611328125, 0.02996826171875, 0.00986480712890625, -0.01345062255859375, 0.0106048583984375, -0.035614013671875, -0.035919189453125, -0.022674560546875, -0.0194244384765625, -0.0014820098876953125, -0.0247344970703125, -0.0234375, -0.045196533203125, 0.0667724609375, -0.0173492431640625, 0.0408935546875, 0.038787841796875, -0.00463104248046875, -0.00995635986328125, -0.004573822021484375, 0.044525146484375, 0.049652099609375, -0.0264129638671875, -0.005840301513671875, 0.00001049041748046875, -0.033355712890625, -0.0033092498779296875, 0.0223541259765625, -0.032806396484375, -0.00217437744140625, 0.0247802734375, 0.07501220703125, 0.0135345458984375, -0.01175689697265625, 0.043670654296875, 0.009918212890625, -0.03692626953125, -0.0228729248046875, 
-0.00518798828125, 0.00232696533203125, 0.035552978515625, 0.0193939208984375, 0.03607177734375, 0.0134429931640625, -0.0168609619140625, 0.0241546630859375, 0.0305328369140625, -0.03955078125, -0.0289459228515625, 0.05633544921875, 0.005344390869140625, -0.019439697265625, 0.03485107421875, 0.005863189697265625, -0.0631103515625, 0.0634765625, 0.03704833984375, 0.057159423828125, -0.01568603515625, 0.01457977294921875, 0.0765380859375, 0.00853729248046875, 0.0013055801391601562, 0.024017333984375, 0.0017108917236328125, -0.04132080078125, -0.0033512115478515625, -0.0673828125, -0.004978179931640625, 0.01520538330078125, -0.05889892578125, 0.03619384765625, -0.02545166015625, -0.02679443359375, 0.0100860595703125, 0.017669677734375, -0.06671142578125, 0.03082275390625, 0.007106781005859375, 0.05694580078125, -0.05157470703125, 0.04248046875, 0.057830810546875, -0.022369384765625, -0.07501220703125, -0.01800537109375, 0.0085601806640625, -0.0721435546875, 0.02496337890625, 0.0200958251953125, 0.0016546249389648438, 0.002262115478515625, -0.0386962890625, -0.0838623046875, 0.09930419921875, 0.0125885009765625, -0.03460693359375, -0.00562286376953125, -0.0010595321655273438, 0.0282135009765625, -0.0121917724609375, 0.0243072509765625, 0.042388916015625, 0.041473388671875, 0.0300140380859375, -0.038116455078125, 0.0093231201171875, -0.0279998779296875, 0.0014657974243164062, 0.0298919677734375, -0.074951171875, 0.06817626953125, -0.01387786865234375, -0.00646209716796875, -0.0003814697265625, 0.0361328125, 0.033966064453125, 0.0338134765625, 0.047943115234375, 0.059112548828125, 0.045501708984375, -0.025054931640625, 0.0579833984375, -0.024078369140625, 0.06793212890625, 0.0765380859375, 0.0190277099609375, 0.028778076171875, 0.03369140625, -0.028778076171875, 0.0298004150390625, 0.06927490234375, -0.031982421875, 0.0328369140625, -0.0061187744140625, 0.004730224609375, -0.0245819091796875, 0.0039215087890625, -0.041900634765625, 0.03802490234375, 0.01873779296875, 
-0.0458984375, -0.01352691650390625, -0.01456451416015625, 0.01346588134765625, -0.0165252685546875, -0.01837158203125, 0.0513916015625, 0.0012845993041992188, -0.0300140380859375, 0.04998779296875, -0.01105499267578125, 0.03802490234375, -0.048828125, 0.01119232177734375, -0.005321502685546875, 0.029083251953125, -0.037841796875, -0.0650634765625, 0.0254669189453125, -0.00257110595703125, -0.011199951171875, -0.00855255126953125, 0.06744384765625, -0.019073486328125, -0.0594482421875, 0.01788330078125, 0.0186309814453125, 0.0098876953125, 0.02069091796875, -0.08233642578125, 0.0199127197265625, 0.0179443359375, -0.056396484375, -0.00042247772216796875, 0.01166534423828125, 0.0035762786865234375, 0.0389404296875, 0.05242919921875, -0.0104217529296875, -0.000013113021850585938, -0.01152801513671875, 0.0770263671875, -0.041290283203125, -0.04052734375, -0.05499267578125, 0.03619384765625, -0.037445068359375, -0.035858154296875, 0.058502197265625, 0.05377197265625, 0.07147216796875, -0.003963470458984375, 0.0310516357421875, -0.03863525390625, 0.04058837890625, -0.0231475830078125, 0.057464599609375, -0.06329345703125, -0.01023101806640625, -0.02447509765625, -0.0662841796875, -0.037841796875, 0.07110595703125, -0.02362060546875, 0.00754547119140625, 0.03143310546875, 0.0692138671875, -0.0102996826171875, -0.01904296875, 0.007762908935546875, 0.00600433349609375, 0.02032470703125, 0.012969970703125, 0.036407470703125, -0.034637451171875, 0.02239990234375, -0.060882568359375, -0.00897216796875, -0.027587890625, -0.040252685546875, -0.052459716796875, -0.046112060546875, -0.0304718017578125, -0.034820556640625, -0.023895263671875, 0.0816650390625, 0.06903076171875, -0.05877685546875, -0.01171112060546875, 0.012847900390625, 0.0033626556396484375, -0.00844573974609375, -0.01568603515625, 0.053436279296875, 0.006259918212890625, -0.06549072265625, -0.0024852752685546875, 0.0010128021240234375, 0.021331787109375, 0.00888824462890625, -0.0178070068359375, 
-0.0148773193359375, -0.0124053955078125, 0.032135009765625, 0.020172119140625, -0.041900634765625, -0.0263519287109375, -0.00247955322265625, 0.005519866943359375, 0.0283660888671875, 0.0286407470703125, -0.03704833984375, 0.043670654296875, 0.04693603515625, 0.01412200927734375, 0.0535888671875, 0.02996826171875, -0.0003247261047363281, -0.044281005859375, 0.020416259765625, 0.0128631591796875, 0.036834716796875, 0.0257720947265625, -0.019744873046875, 0.0556640625, 0.029327392578125, -0.056365966796875, -0.06793212890625, -0.0019989013671875, -0.09661865234375, -0.01149749755859375, 0.07623291015625, -0.007537841796875, -0.04864501953125, 0.0124969482421875, -0.029449462890625, 0.028839111328125, -0.0303955078125, 0.037689208984375, 0.034942626953125, -0.0214080810546875, -0.0038204193115234375, -0.0177764892578125, 0.0386962890625, 0.01045989990234375, -0.040313720703125, -0.01837158203125, 0.0313720703125, 0.03350830078125, 0.017608642578125, 0.03070068359375, -0.01934814453125, 0.011962890625, 0.0025730133056640625, 0.017791748046875, -0.016815185546875, 0.005382537841796875, -0.0270843505859375, 0.00244140625, -0.0157318115234375, -0.025390625 ] ]
Lykon/DreamShaper
2023-08-01T15:02:43.000Z
[ "diffusers", "stable-diffusion", "stable-diffusion-diffusers", "text-to-image", "art", "artistic", "anime", "en", "doi:10.57967/hf/0453", "license:other", "has_space", "diffusers:StableDiffusionPipeline", "region:us" ]
text-to-image
Lykon
null
null
Lykon/DreamShaper
782
127,688
diffusers
2023-01-12T09:14:06
---
language:
- en
license: other
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
- art
- artistic
- diffusers
- anime
inference: false
---

# Dream Shaper

## Official Repository

Read more about this model here: https://civitai.com/models/4384/dreamshaper

Please also support the model by giving it 5 stars and a heart, which will notify you of new updates.

Please consider supporting me on Patreon or buying me a coffee:
- https://www.patreon.com/Lykon275
- https://snipfeed.co/lykon

You can run this model on:
- https://huggingface.co/spaces/Lykon/DreamShaper-webui
- Mage.space, sinkin.ai and more
602
[ [ -0.0128021240234375, -0.004383087158203125, 0.0247802734375, 0.0265350341796875, -0.0224151611328125, 0.00614166259765625, 0.0189208984375, -0.025421142578125, 0.0482177734375, 0.061248779296875, -0.06390380859375, -0.0165252685546875, -0.031280517578125, -0.0037078857421875, -0.01947021484375, 0.048431396484375, -0.00658416748046875, 0.032928466796875, -0.00653839111328125, 0.032867431640625, -0.03009033203125, -0.021881103515625, -0.0665283203125, -0.035919189453125, 0.0499267578125, 0.0307159423828125, 0.0484619140625, 0.01169586181640625, 0.0361328125, 0.02557373046875, 0.00533294677734375, -0.0161285400390625, -0.04534912109375, 0.00574493408203125, 0.01031494140625, -0.043212890625, -0.07598876953125, -0.008758544921875, 0.022430419921875, 0.009368896484375, -0.0290985107421875, 0.0197601318359375, 0.004718780517578125, 0.045684814453125, -0.0518798828125, 0.031951904296875, -0.01296234130859375, 0.0140380859375, -0.01080322265625, 0.02996826171875, 0.0021457672119140625, -0.053955078125, -0.01947021484375, -0.08154296875, 0.025299072265625, 0.00887298583984375, 0.07366943359375, 0.0226898193359375, -0.0200347900390625, -0.0014009475708007812, -0.05419921875, 0.041107177734375, -0.0406494140625, 0.066650390625, 0.0306243896484375, 0.06353759765625, 0.01055145263671875, -0.07275390625, -0.0225677490234375, 0.018341064453125, 0.00232696533203125, 0.0261993408203125, -0.01812744140625, 0.01029205322265625, 0.017181396484375, 0.03466796875, -0.04144287109375, -0.033966064453125, -0.0662841796875, -0.01082611083984375, 0.035675048828125, 0.01200103759765625, 0.0296630859375, -0.00984954833984375, -0.02099609375, -0.0185089111328125, -0.0224761962890625, 0.003139495849609375, 0.0282135009765625, 0.0033702850341796875, -0.038360595703125, 0.051513671875, -0.00011199712753295898, 0.0426025390625, 0.0191802978515625, 0.00815582275390625, 0.0060272216796875, 0.0189056396484375, -0.033782958984375, -0.0146484375, 0.05047607421875, 0.04388427734375, 
0.0171051025390625, 0.01082611083984375, -0.0189971923828125, 0.007564544677734375, 0.02276611328125, -0.07867431640625, -0.04461669921875, 0.0226287841796875, -0.035369873046875, -0.0239410400390625, 0.01364898681640625, -0.0687255859375, -0.0455322265625, 0.007503509521484375, 0.00982666015625, -0.02777099609375, -0.04443359375, 0.0189056396484375, -0.01373291015625, 0.035888671875, 0.0223388671875, -0.06396484375, 0.041229248046875, 0.0267486572265625, 0.03497314453125, 0.03582763671875, 0.0162200927734375, 0.0029926300048828125, 0.01090240478515625, -0.0303497314453125, 0.0384521484375, 0.0008478164672851562, -0.05133056640625, 0.00951385498046875, 0.001819610595703125, -0.0105438232421875, -0.0181884765625, 0.07318115234375, -0.03192138671875, 0.0161285400390625, -0.025238037109375, -0.0426025390625, -0.025054931640625, 0.0235443115234375, -0.07379150390625, 0.034637451171875, 0.01342010498046875, -0.0400390625, 0.0229644775390625, -0.0845947265625, -0.004703521728515625, 0.01403045654296875, 0.00196075439453125, -0.05169677734375, 0.036529541015625, -0.00534820556640625, 0.03851318359375, -0.00646209716796875, -0.005176544189453125, -0.039276123046875, -0.01116180419921875, 0.0062103271484375, -0.00994110107421875, 0.077392578125, 0.032623291015625, 0.01354217529296875, 0.01025390625, -0.07080078125, 0.003986358642578125, 0.05047607421875, 0.0107879638671875, 0.01309967041015625, -0.00543975830078125, 0.0308380126953125, 0.0007281303405761719, 0.01338958740234375, -0.05255126953125, 0.02874755859375, 0.0029697418212890625, 0.0026340484619140625, 0.032073974609375, -0.0097503662109375, -0.0037746429443359375, -0.038177490234375, 0.058441162109375, -0.0161895751953125, 0.045257568359375, 0.05194091796875, -0.0286712646484375, -0.0880126953125, -0.01507568359375, 0.01215362548828125, 0.02130126953125, -0.0285797119140625, 0.0203857421875, -0.010040283203125, -0.06005859375, -0.0435791015625, -0.0182952880859375, 0.0196380615234375, 0.004444122314453125, 
0.00262451171875, -0.02093505859375, -0.06353759765625, -0.0693359375, -0.00826263427734375, -0.01113128662109375, -0.0157012939453125, 0.0325927734375, 0.03619384765625, -0.01152801513671875, 0.04229736328125, -0.06317138671875, -0.0160980224609375, -0.035247802734375, -0.0355224609375, 0.0209503173828125, 0.036529541015625, 0.075439453125, -0.0650634765625, -0.045074462890625, -0.0167236328125, -0.03485107421875, 0.00852203369140625, 0.0189361572265625, -0.02459716796875, -0.0257568359375, 0.0008931159973144531, -0.06610107421875, 0.032501220703125, 0.040740966796875, -0.059112548828125, 0.033355712890625, 0.006977081298828125, 0.0297088623046875, -0.09564208984375, -0.0028400421142578125, -0.0011196136474609375, -0.03564453125, -0.04150390625, 0.017120361328125, -0.004909515380859375, -0.021697998046875, -0.048614501953125, 0.035552978515625, -0.032470703125, 0.01514434814453125, -0.0042266845703125, -0.024322509765625, 0.0198516845703125, 0.01415252685546875, -0.009521484375, 0.006664276123046875, 0.049224853515625, -0.026336669921875, 0.04443359375, 0.048797607421875, -0.03619384765625, 0.031768798828125, -0.08349609375, 0.0192108154296875, -0.0080413818359375, 0.0143585205078125, -0.048126220703125, -0.039794921875, 0.04986572265625, -0.042327880859375, 0.0216522216796875, -0.02008056640625, -0.033721923828125, -0.047760009765625, -0.007755279541015625, 0.036407470703125, 0.055023193359375, -0.034027099609375, 0.03466796875, 0.0193023681640625, 0.0285491943359375, -0.00774383544921875, -0.03668212890625, -0.01372528076171875, -0.036102294921875, -0.0572509765625, 0.0255279541015625, -0.0372314453125, -0.04412841796875, -0.0194549560546875, 0.00887298583984375, -0.0153045654296875, -0.01532745361328125, 0.03955078125, 0.010345458984375, -0.01519775390625, -0.037567138671875, -0.017852783203125, -0.004093170166015625, 0.01459503173828125, -0.0191192626953125, 0.061676025390625, -0.0186767578125, -0.0269317626953125, -0.05218505859375, 0.0032958984375, 
0.06744384765625, -0.002712249755859375, 0.047088623046875, 0.034423828125, -0.06072998046875, 0.005794525146484375, -0.05224609375, -0.0165863037109375, -0.033905029296875, 0.01465606689453125, -0.059051513671875, -0.0241851806640625, 0.03594970703125, 0.0195770263671875, 0.0113983154296875, 0.053802490234375, 0.0233001708984375, -0.014678955078125, 0.06744384765625, 0.052947998046875, -0.010345458984375, 0.01849365234375, -0.0673828125, -0.0109405517578125, -0.0560302734375, -0.03369140625, -0.00394439697265625, -0.04217529296875, -0.02593994140625, -0.03570556640625, 0.0073394775390625, 0.021331787109375, -0.01522064208984375, 0.039703369140625, -0.0243072509765625, 0.032867431640625, 0.01114654541015625, 0.0218963623046875, 0.0144805908203125, -0.0102386474609375, -0.0023345947265625, 0.00461578369140625, -0.033721923828125, -0.01434326171875, 0.05419921875, 0.0189056396484375, 0.041259765625, 0.011138916015625, 0.046722412109375, -0.0130767822265625, -0.01041412353515625, -0.039886474609375, 0.0303955078125, 0.0026454925537109375, -0.083984375, 0.00534820556640625, 0.0085906982421875, -0.053924560546875, 0.037139892578125, -0.0270843505859375, -0.0289764404296875, 0.01207733154296875, 0.0199737548828125, -0.034820556640625, 0.048858642578125, -0.025390625, 0.07904052734375, -0.02801513671875, -0.00751495361328125, -0.0182037353515625, -0.01727294921875, 0.0295257568359375, 0.036346435546875, 0.0060577392578125, -0.0182952880859375, 0.0005745887756347656, 0.039825439453125, -0.050689697265625, 0.07037353515625, -0.02435302734375, 0.0141754150390625, 0.0269775390625, 0.0007681846618652344, 0.0235443115234375, 0.0193634033203125, -0.004642486572265625, 0.006015777587890625, -0.0162506103515625, -0.049407958984375, -0.0257568359375, 0.046356201171875, -0.06744384765625, -0.017913818359375, -0.048736572265625, -0.0247802734375, -0.0007123947143554688, 0.017913818359375, 0.057342529296875, 0.0147247314453125, -0.0447998046875, 0.00794219970703125, 0.041473388671875, 
0.03558349609375, 0.038360595703125, 0.008819580078125, -0.046661376953125, -0.04510498046875, 0.053253173828125, -0.003719329833984375, -0.00012373924255371094, 0.006626129150390625, 0.02801513671875, 0.005046844482421875, -0.0225677490234375, -0.03350830078125, 0.02520751953125, 0.00341796875, -0.0222015380859375, -0.03546142578125, -0.0404052734375, -0.023468017578125, -0.01496124267578125, -0.049346923828125, -0.035858154296875, -0.0343017578125, -0.0217742919921875, 0.04931640625, 0.0638427734375, 0.0020694732666015625, 0.0266876220703125, -0.03253173828125, 0.018157958984375, 0.02728271484375, 0.04443359375, -0.0167236328125, -0.01416778564453125, -0.014739990234375, 0.0081787109375, -0.0443115234375, -0.046966552734375, 0.043670654296875, 0.0027065277099609375, 0.035247802734375, 0.04132080078125, -0.014892578125, 0.06207275390625, -0.0311431884765625, 0.053466796875, 0.041717529296875, -0.03338623046875, 0.047821044921875, -0.04571533203125, 0.030426025390625, 0.045745849609375, 0.040771484375, -0.0274200439453125, -0.0148773193359375, -0.053619384765625, -0.037933349609375, 0.00038361549377441406, 0.037078857421875, -0.004344940185546875, 0.040985107421875, 0.042449951171875, 0.00799560546875, 0.001789093017578125, -0.08270263671875, -0.021881103515625, -0.0474853515625, -0.01155853271484375, 0.01019287109375, -0.024566650390625, 0.00127410888671875, -0.03009033203125, 0.071044921875, 0.0032100677490234375, 0.015777587890625, 0.0101165771484375, 0.0037899017333984375, -0.0152435302734375, -0.0007343292236328125, 0.03314208984375, 0.046600341796875, -0.0232086181640625, -0.02490234375, 0.0142364501953125, -0.04547119140625, -0.0175323486328125, 0.02362060546875, -0.028411865234375, -0.004436492919921875, 0.0010700225830078125, 0.06243896484375, 0.025360107421875, -0.007228851318359375, 0.0253448486328125, -0.028778076171875, -0.01447296142578125, -0.07183837890625, 0.01139068603515625, 0.0289764404296875, 0.04315185546875, -0.003177642822265625, 
0.036285400390625, 0.0288543701171875, -0.007038116455078125, -0.0128936767578125, 0.0204925537109375, -0.08642578125, -0.052734375, 0.08447265625, 0.010101318359375, -0.053802490234375, 0.045013427734375, -0.0067138671875, -0.0226287841796875, 0.04498291015625, 0.06719970703125, 0.07879638671875, -0.0254669189453125, 0.0105743408203125, 0.033599853515625, -0.007274627685546875, -0.0225677490234375, 0.072021484375, 0.01512908935546875, -0.046051025390625, 0.005886077880859375, -0.045318603515625, -0.024505615234375, 0.0261993408203125, -0.04803466796875, 0.061676025390625, -0.045257568359375, -0.0134429931640625, -0.01552581787109375, 0.01294708251953125, -0.0538330078125, 0.04290771484375, 0.035919189453125, 0.07330322265625, -0.028778076171875, 0.05767822265625, 0.05120849609375, -0.045501708984375, -0.045867919921875, -0.012481689453125, 0.03631591796875, -0.05047607421875, -0.0024547576904296875, 0.005359649658203125, -0.006870269775390625, 0.0006289482116699219, -0.041748046875, -0.07940673828125, 0.09869384765625, 0.0190582275390625, -0.068359375, -0.02337646484375, -0.017974853515625, 0.0333251953125, -0.036285400390625, 0.0234832763671875, 0.00644683837890625, 0.0251617431640625, 0.01544952392578125, -0.060791015625, -0.0202789306640625, -0.036865234375, 0.00733184814453125, 0.020233154296875, -0.099853515625, 0.03485107421875, -0.039459228515625, 0.0126495361328125, 0.036834716796875, 0.05596923828125, 0.03759765625, 0.0017137527465820312, 0.057464599609375, 0.036376953125, 0.05255126953125, 0.005657196044921875, 0.076904296875, -0.02374267578125, 0.0236053466796875, 0.04876708984375, -0.006031036376953125, 0.04498291015625, 0.00815582275390625, -0.005985260009765625, 0.0631103515625, 0.054931640625, -0.03680419921875, 0.0186004638671875, 0.035186767578125, -0.018035888671875, -0.0121307373046875, -0.00037026405334472656, -0.033203125, 0.0428466796875, 0.004199981689453125, -0.004383087158203125, 0.0321044921875, -0.0038700103759765625, 
-0.0010585784912109375, -0.0060882568359375, -0.04705810546875, 0.0222625732421875, 0.0210418701171875, -0.00984954833984375, 0.0176544189453125, -0.0285797119140625, 0.047332763671875, -0.045318603515625, 0.0013828277587890625, -0.005413055419921875, 0.0279998779296875, -0.0243988037109375, -0.0266265869140625, 0.010345458984375, -0.01334381103515625, -0.004970550537109375, -0.0372314453125, 0.06427001953125, -0.006999969482421875, -0.0885009765625, 0.032318115234375, 0.02777099609375, 0.0193328857421875, -0.00769805908203125, -0.0574951171875, 0.0122833251953125, -0.000682830810546875, -0.03228759765625, -0.0005245208740234375, -0.005985260009765625, 0.0160064697265625, 0.06048583984375, 0.01052093505859375, 0.00565338134765625, -0.0014324188232421875, 0.0230865478515625, 0.04473876953125, -0.053863525390625, -0.027130126953125, -0.048980712890625, 0.0701904296875, -0.0242156982421875, -0.043212890625, 0.044769287109375, 0.1046142578125, 0.069580078125, -0.049468994140625, 0.0516357421875, -0.0183868408203125, 0.041168212890625, -0.010040283203125, 0.0986328125, -0.054534912109375, -0.01375579833984375, -0.01473236083984375, -0.07171630859375, -0.0252227783203125, 0.05047607421875, 0.032867431640625, 0.010650634765625, 0.01308441162109375, 0.03668212890625, -0.016937255859375, 0.00849151611328125, 0.046966552734375, 0.025299072265625, 0.004764556884765625, 0.013641357421875, 0.040252685546875, -0.06805419921875, 0.03668212890625, -0.03265380859375, -0.007366180419921875, -0.01128387451171875, -0.0262451171875, -0.06884765625, -0.05047607421875, -0.0286712646484375, -0.037445068359375, 0.00975799560546875, 0.072021484375, 0.06561279296875, -0.08447265625, -0.06109619140625, 0.004093170166015625, 0.018798828125, -0.0129852294921875, -0.0165252685546875, 0.0023193359375, 0.0262451171875, -0.07183837890625, 0.037506103515625, -0.017578125, 0.0261688232421875, -0.0180206298828125, -0.0037250518798828125, -0.01605224609375, 0.00627899169921875, 0.033599853515625, 
0.0299224853515625, -0.055389404296875, -0.0174102783203125, 0.00901031494140625, 0.00164794921875, 0.009735107421875, 0.060028076171875, -0.033721923828125, -0.0024509429931640625, 0.0267333984375, -0.01206207275390625, 0.059417724609375, -0.024810791015625, 0.06817626953125, -0.0146942138671875, -0.002079010009765625, 0.021514892578125, 0.0570068359375, 0.0216064453125, -0.0400390625, 0.057708740234375, 0.0248565673828125, -0.03582763671875, -0.03851318359375, 0.0166168212890625, -0.079345703125, 0.0003533363342285156, 0.037994384765625, 0.024139404296875, -0.014892578125, 0.0246124267578125, -0.0251617431640625, 0.013427734375, -0.0204925537109375, 0.042694091796875, 0.025390625, -0.04583740234375, -0.0224761962890625, -0.040283203125, 0.045867919921875, -0.0163421630859375, -0.0516357421875, -0.0170440673828125, 0.0396728515625, 0.03619384765625, 0.0162200927734375, 0.047210693359375, -0.018768310546875, 0.0091552734375, 0.0295867919921875, 0.0465087890625, 0.00689697265625, -0.01482391357421875, -0.0033092498779296875, 0.01459503173828125, -0.017608642578125, -0.0413818359375 ] ]
vinai/xphonebert-base
2023-08-29T04:01:53.000Z
[ "transformers", "pytorch", "roberta", "fill-mask", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
fill-mask
vinai
null
null
vinai/xphonebert-base
4
127,549
transformers
2023-04-13T15:46:03
# <a name="introduction"></a> XPhoneBERT: A Pre-trained Multilingual Model for Phoneme Representations for Text-to-Speech

XPhoneBERT is the first pre-trained multilingual model for phoneme representations for text-to-speech (TTS). XPhoneBERT has the same model architecture as BERT-base, trained using the RoBERTa pre-training approach on 330M phoneme-level sentences from nearly 100 languages and locales. Experimental results show that employing XPhoneBERT as an input phoneme encoder significantly boosts the performance of a strong neural TTS model in terms of naturalness and prosody, and also helps produce fairly high-quality speech with limited training data.

The general architecture and experimental results of XPhoneBERT can be found in [our INTERSPEECH 2023 paper](https://www.doi.org/10.21437/Interspeech.2023-444):

    @inproceedings{xphonebert,
    title     = {{XPhoneBERT: A Pre-trained Multilingual Model for Phoneme Representations for Text-to-Speech}},
    author    = {Linh The Nguyen and Thinh Pham and Dat Quoc Nguyen},
    booktitle = {Proceedings of the 24th Annual Conference of the International Speech Communication Association (INTERSPEECH)},
    year      = {2023},
    pages     = {5506--5510}
    }

**Please CITE** our paper when XPhoneBERT is used to help produce published results or is incorporated into other software.

For further information or requests, please go to [XPhoneBERT's homepage](https://github.com/VinAIResearch/XPhoneBERT)!

## <a name="transformers"></a> Using XPhoneBERT with `transformers`

### Installation <a name="install2"></a>

- Install `transformers` with pip: `pip install transformers`, or install `transformers` [from source](https://huggingface.co/docs/transformers/installation#installing-from-source).
- Install `text2phonemesequence`: `pip install text2phonemesequence` <br> Our [`text2phonemesequence`](https://github.com/thelinhbkhn2014/Text2PhonemeSequence) package converts text sequences into phoneme-level sequences; it is the package we employed to construct our multilingual phoneme-level pre-training data. We build `text2phonemesequence` by incorporating the [CharsiuG2P](https://github.com/lingjzhu/CharsiuG2P/tree/main) and [segments](https://pypi.org/project/segments/) toolkits, which perform text-to-phoneme conversion and phoneme segmentation, respectively.
- **Notes**
  - Initializing `text2phonemesequence` for each language requires its corresponding ISO 639-3 code. The ISO 639-3 codes of supported languages are available [HERE](https://github.com/VinAIResearch/XPhoneBERT/blob/main/LanguageISO639-3Codes.md).
  - `text2phonemesequence` takes a word-segmented sequence as input. Users might also perform text normalization on the word-segmented sequence before feeding it into `text2phonemesequence`. When creating our pre-training data, we perform word and sentence segmentation on all text documents in each language using the [spaCy](https://spacy.io) toolkit, except for Vietnamese, where we employ the [VnCoreNLP](https://github.com/vncorenlp/VnCoreNLP) toolkit. We also use the text normalization component from the [NVIDIA NeMo toolkit](https://github.com/NVIDIA/NeMo) for English, German, Spanish and Chinese, and the [Vinorm](https://github.com/v-nhandt21/Vinorm) text normalization package for Vietnamese.

### <a name="models2"></a> Pre-trained model

Model | #params | Arch. | Max length | Pre-training data
---|---|---|---|---
`vinai/xphonebert-base` | 88M | base | 512 | 330M phoneme-level sentences from nearly 100 languages and locales

### Example usage <a name="usage2"></a>

```python
import torch
from transformers import AutoModel, AutoTokenizer
from text2phonemesequence import Text2PhonemeSequence

# Load XPhoneBERT model and its tokenizer
xphonebert = AutoModel.from_pretrained("vinai/xphonebert-base")
tokenizer = AutoTokenizer.from_pretrained("vinai/xphonebert-base")

# Load Text2PhonemeSequence
# text2phone_model = Text2PhonemeSequence(language='eng-us', is_cuda=True)
text2phone_model = Text2PhonemeSequence(language='jpn', is_cuda=True)

# Input sequence that is already WORD-SEGMENTED (and text-normalized if applicable)
# sentence = "That is , it is a testing text ."
sentence = "これ は 、 テスト テキスト です ."

input_phonemes = text2phone_model.infer_sentence(sentence)

input_ids = tokenizer(input_phonemes, return_tensors="pt")

with torch.no_grad():
    features = xphonebert(**input_ids)
```
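As a sketch beyond the official card: since `xphonebert-base` is a BERT-base-sized encoder, `features.last_hidden_state` holds one 768-dimensional vector per phoneme token, and a common (though not card-prescribed) way to get a single utterance-level vector is attention-mask-aware mean pooling. The snippet below illustrates only that pooling step, using a random dummy tensor in place of real XPhoneBERT output so it runs without downloading the checkpoint.

```python
import torch

# Dummy stand-in for features.last_hidden_state: (batch, seq_len, hidden_size=768)
hidden_states = torch.randn(1, 10, 768)
# 7 real phoneme tokens followed by 3 padding positions
attention_mask = torch.tensor([[1, 1, 1, 1, 1, 1, 1, 0, 0, 0]])

# Zero out padding positions, then average over the real tokens only
mask = attention_mask.unsqueeze(-1).float()              # (1, 10, 1)
pooled = (hidden_states * mask).sum(dim=1) / mask.sum(dim=1)

print(pooled.shape)  # torch.Size([1, 768])
```

With real model output, `hidden_states = features.last_hidden_state` and `attention_mask = input_ids["attention_mask"]` would take the place of the dummy tensors above.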
4,410
[ [ -0.0175323486328125, -0.036285400390625, 0.01381683349609375, 0.0265960693359375, -0.036865234375, -0.006351470947265625, -0.0273590087890625, -0.0283966064453125, 0.0093536376953125, 0.020050048828125, -0.0300750732421875, -0.056365966796875, -0.0435791015625, 0.00992584228515625, -0.031036376953125, 0.059906005859375, 0.00024247169494628906, 0.007129669189453125, 0.0134735107421875, -0.0214080810546875, -0.018829345703125, -0.05462646484375, -0.052398681640625, -0.0160064697265625, 0.022216796875, 0.036956787109375, 0.033966064453125, 0.035736083984375, 0.00899505615234375, 0.02301025390625, -0.00859832763671875, 0.01558685302734375, -0.015228271484375, 0.004810333251953125, 0.01552581787109375, -0.053070068359375, -0.0178375244140625, 0.00803375244140625, 0.053802490234375, 0.03558349609375, 0.00008732080459594727, -0.003040313720703125, -0.00012791156768798828, 0.022186279296875, -0.0285186767578125, 0.0119781494140625, -0.03900146484375, 0.007083892822265625, -0.0206756591796875, -0.0108642578125, -0.032562255859375, -0.0286865234375, 0.0269012451171875, -0.042388916015625, -0.00640869140625, -0.0077362060546875, 0.09332275390625, 0.004154205322265625, -0.03302001953125, -0.0325927734375, -0.039276123046875, 0.06329345703125, -0.053985595703125, 0.030914306640625, 0.0290069580078125, -0.0072784423828125, 0.0150299072265625, -0.06787109375, -0.05126953125, -0.0143890380859375, -0.011383056640625, 0.02423095703125, -0.020233154296875, 0.005649566650390625, 0.02325439453125, 0.0241851806640625, -0.0638427734375, 0.007476806640625, -0.034423828125, -0.023956298828125, 0.03948974609375, -0.00881195068359375, 0.027313232421875, -0.032012939453125, -0.028228759765625, -0.03118896484375, -0.02923583984375, 0.0210723876953125, 0.029876708984375, 0.0174560546875, -0.0286407470703125, 0.034210205078125, 0.00787353515625, 0.05023193359375, -0.0005578994750976562, -0.0169830322265625, 0.06317138671875, -0.029510498046875, -0.0091552734375, 0.0223541259765625, 
0.0797119140625, 0.00946807861328125, 0.03302001953125, 0.004756927490234375, -0.0109710693359375, 0.004314422607421875, -0.0179595947265625, -0.07061767578125, -0.0170745849609375, 0.031494140625, -0.028961181640625, -0.0087432861328125, 0.00307464599609375, -0.032012939453125, 0.0007128715515136719, -0.0215301513671875, 0.047607421875, -0.06390380859375, -0.0230560302734375, 0.0159149169921875, -0.006954193115234375, 0.0214385986328125, -0.0037670135498046875, -0.0494384765625, 0.0166168212890625, 0.016571044921875, 0.073974609375, -0.0034542083740234375, -0.035064697265625, -0.03460693359375, -0.013702392578125, -0.0147247314453125, 0.03656005859375, -0.0143890380859375, -0.041015625, 0.002399444580078125, 0.01102447509765625, -0.032745361328125, -0.047088623046875, 0.06683349609375, -0.0161590576171875, 0.054229736328125, 0.01126861572265625, -0.0411376953125, -0.005001068115234375, -0.0005793571472167969, -0.0255584716796875, 0.10308837890625, 0.01161956787109375, -0.0699462890625, 0.0155181884765625, -0.0491943359375, -0.050140380859375, -0.015228271484375, -0.01139068603515625, -0.035125732421875, -0.0009927749633789062, 0.01467132568359375, 0.026580810546875, -0.0101470947265625, 0.005023956298828125, 0.00943756103515625, -0.033782958984375, 0.016876220703125, -0.022491455078125, 0.09716796875, 0.01432037353515625, -0.041290283203125, 0.01448822021484375, -0.07330322265625, 0.0214385986328125, 0.01059722900390625, -0.0274810791015625, -0.027862548828125, -0.015777587890625, 0.0165557861328125, 0.035980224609375, 0.01342010498046875, -0.051910400390625, -0.0008568763732910156, -0.051483154296875, 0.052947998046875, 0.02996826171875, -0.0250244140625, 0.041961669921875, -0.0032863616943359375, 0.0196990966796875, 0.0302734375, -0.00024056434631347656, -0.046661376953125, -0.0277252197265625, -0.080078125, -0.01399993896484375, 0.034393310546875, 0.0692138671875, -0.057098388671875, 0.0537109375, -0.026458740234375, -0.043792724609375, -0.03985595703125, 
-0.0119171142578125, 0.0225982666015625, 0.031768798828125, 0.03582763671875, -0.0286407470703125, -0.048187255859375, -0.0611572265625, -0.00934600830078125, -0.0302276611328125, 0.00177001953125, 0.01485443115234375, 0.0254058837890625, -0.0257415771484375, 0.07373046875, -0.026458740234375, -0.02618408203125, -0.029510498046875, 0.021942138671875, 0.0203704833984375, 0.049285888671875, 0.04278564453125, -0.04974365234375, -0.0269622802734375, -0.006969451904296875, -0.03680419921875, -0.01161956787109375, -0.01531219482421875, -0.0001596212387084961, 0.0256195068359375, 0.04888916015625, -0.048095703125, 0.0181427001953125, 0.042449951171875, -0.0213470458984375, 0.040802001953125, -0.0200347900390625, -0.005947113037109375, -0.10601806640625, 0.0059356689453125, 0.00914764404296875, -0.022216796875, -0.04974365234375, -0.03546142578125, 0.0195465087890625, -0.0260772705078125, -0.0296630859375, 0.037811279296875, -0.043060302734375, 0.005767822265625, -0.0081024169921875, 0.02264404296875, 0.006687164306640625, 0.04351806640625, 0.0138397216796875, 0.062164306640625, 0.052520751953125, -0.053314208984375, 0.01204681396484375, 0.032684326171875, -0.03143310546875, 0.02301025390625, -0.06591796875, 0.00966644287109375, 0.0013666152954101562, 0.0169830322265625, -0.07659912109375, -0.0024585723876953125, 0.025146484375, -0.057769775390625, 0.01983642578125, -0.0114898681640625, -0.05694580078125, -0.0236968994140625, -0.01044464111328125, 0.0296173095703125, 0.0491943359375, -0.03729248046875, 0.030853271484375, 0.0294189453125, -0.023468017578125, -0.029693603515625, -0.0667724609375, 0.01403045654296875, -0.01515960693359375, -0.06549072265625, 0.040008544921875, 0.0111083984375, 0.0119781494140625, -0.0228118896484375, 0.0026760101318359375, -0.0237884521484375, -0.0096435546875, 0.0135498046875, 0.014068603515625, -0.005126953125, 0.0228118896484375, 0.01361083984375, -0.0181427001953125, 0.006404876708984375, -0.046173095703125, 0.0638427734375, 
0.0004935264587402344, -0.0019407272338867188, -0.0484619140625, 0.0258941650390625, 0.035308837890625, -0.043609619140625, 0.035247802734375, 0.0693359375, -0.01526641845703125, -0.00146484375, -0.026397705078125, -0.01531982421875, -0.034912109375, 0.051483154296875, -0.030792236328125, -0.06982421875, 0.0167236328125, -0.0011196136474609375, 0.010467529296875, 0.020111083984375, 0.03314208984375, 0.0028514862060546875, 0.082275390625, 0.043975830078125, -0.00942230224609375, 0.046722412109375, -0.032318115234375, 0.023681640625, -0.04583740234375, -0.0175933837890625, -0.041351318359375, -0.002895355224609375, -0.050750732421875, -0.035186767578125, 0.0200347900390625, 0.0017805099487304688, -0.0081939697265625, 0.055908203125, -0.03125, -0.0006489753723144531, 0.048309326171875, -0.01117706298828125, 0.0059814453125, 0.025360107421875, -0.029296875, -0.00493621826171875, -0.055938720703125, -0.04608154296875, 0.06195068359375, 0.037261962890625, 0.041015625, -0.0124053955078125, 0.06988525390625, -0.01390838623046875, -0.018768310546875, -0.06610107421875, 0.039306640625, -0.01248931884765625, -0.0284576416015625, -0.01407623291015625, -0.038055419921875, -0.08294677734375, 0.0244598388671875, 0.0007042884826660156, -0.0679931640625, 0.030792236328125, 0.00423431396484375, -0.03448486328125, 0.00392913818359375, -0.07025146484375, 0.059814453125, -0.0200653076171875, -0.010894775390625, -0.006916046142578125, -0.061187744140625, 0.00762939453125, 0.005931854248046875, 0.006114959716796875, -0.00267791748046875, -0.00531005859375, 0.06915283203125, -0.040924072265625, 0.05206298828125, -0.02227783203125, -0.0203704833984375, 0.031585693359375, -0.02557373046875, 0.026641845703125, 0.01139068603515625, -0.00982666015625, 0.0207061767578125, 0.012725830078125, -0.027313232421875, -0.021240234375, 0.04644775390625, -0.07818603515625, -0.026580810546875, -0.035888671875, -0.0274200439453125, -0.011199951171875, 0.02093505859375, 0.06329345703125, 0.041900634765625, 
0.000370025634765625, -0.002994537353515625, 0.05535888671875, -0.038818359375, 0.0428466796875, 0.0154266357421875, -0.01605224609375, -0.037567138671875, 0.07037353515625, 0.0258941650390625, 0.0157470703125, 0.0147705078125, -0.0008192062377929688, -0.033203125, -0.0289306640625, -0.04443359375, 0.0172882080078125, -0.040618896484375, 0.01092529296875, -0.07379150390625, -0.026458740234375, -0.035491943359375, 0.0266265869140625, -0.03143310546875, -0.04718017578125, -0.031463623046875, 0.00786590576171875, 0.0143280029296875, 0.01049041748046875, -0.00592041015625, 0.035247802734375, -0.067626953125, 0.00823211669921875, 0.00655364990234375, 0.0016469955444335938, -0.0222625732421875, -0.06787109375, -0.035369873046875, 0.0056304931640625, -0.011962890625, -0.059539794921875, 0.027862548828125, 0.0208587646484375, 0.018951416015625, 0.0241851806640625, -0.000579833984375, 0.03912353515625, -0.033203125, 0.058837890625, 0.022735595703125, -0.08984375, 0.04583740234375, -0.00785064697265625, 0.038177490234375, 0.03643798828125, 0.0277557373046875, -0.064453125, -0.005535125732421875, -0.0435791015625, -0.086669921875, 0.07257080078125, 0.04046630859375, 0.033203125, -0.002025604248046875, 0.01153564453125, -0.00585174560546875, 0.01000213623046875, -0.0611572265625, -0.038726806640625, -0.0270843505859375, -0.019134521484375, 0.0017681121826171875, -0.0204620361328125, 0.01006317138671875, -0.0282135009765625, 0.055877685546875, 0.0077972412109375, 0.042572021484375, 0.046478271484375, -0.03485107421875, 0.029052734375, 0.0171661376953125, 0.053497314453125, 0.0310821533203125, 0.0018215179443359375, 0.005069732666015625, 0.0170745849609375, -0.048004150390625, 0.026580810546875, 0.0443115234375, -0.00965118408203125, 0.042266845703125, 0.019561767578125, 0.08489990234375, -0.0030574798583984375, -0.04522705078125, 0.0460205078125, -0.0143280029296875, -0.0296173095703125, -0.042236328125, -0.007354736328125, 0.0037250518798828125, -0.005229949951171875, 
0.03131103515625, 0.00370025634765625, -0.0047607421875, -0.0281829833984375, 0.01303863525390625, -0.002170562744140625, -0.044525146484375, -0.0274200439453125, 0.036163330078125, 0.0263824462890625, -0.0104217529296875, 0.0462646484375, -0.0277862548828125, -0.05352783203125, 0.039642333984375, 0.03619384765625, 0.060455322265625, -0.0228271484375, 0.0179595947265625, 0.0469970703125, 0.0217742919921875, 0.0007462501525878906, 0.0322265625, -0.0024623870849609375, -0.056121826171875, -0.03387451171875, -0.0294952392578125, 0.0008435249328613281, 0.00774383544921875, -0.034332275390625, 0.0321044921875, -0.0238037109375, -0.0207977294921875, -0.0121307373046875, -0.0077056884765625, -0.03717041015625, 0.03125, 0.0225830078125, 0.07794189453125, -0.04522705078125, 0.08123779296875, 0.05914306640625, -0.0277252197265625, -0.0743408203125, 0.0124359130859375, -0.0179443359375, -0.0556640625, 0.057037353515625, 0.035064697265625, -0.005306243896484375, 0.019500732421875, -0.0190887451171875, -0.062103271484375, 0.06695556640625, 0.02362060546875, -0.04693603515625, -0.009918212890625, 0.0290069580078125, 0.0253753662109375, -0.0164337158203125, 0.036163330078125, 0.04669189453125, 0.032989501953125, -0.01296234130859375, -0.06903076171875, -0.00797271728515625, -0.0352783203125, 0.0002532005310058594, 0.0161895751953125, -0.053436279296875, 0.0699462890625, -0.016387939453125, -0.035797119140625, 0.02227783203125, 0.050811767578125, 0.01253509521484375, 0.0075836181640625, 0.049957275390625, 0.050201416015625, 0.06353759765625, -0.0211334228515625, 0.0592041015625, -0.039825439453125, 0.02288818359375, 0.08673095703125, -0.00868988037109375, 0.0498046875, 0.028289794921875, -0.0053253173828125, 0.0435791015625, 0.06298828125, -0.001155853271484375, 0.0267181396484375, 0.0194549560546875, -0.023681640625, -0.00250244140625, -0.0013303756713867188, -0.034576416015625, 0.052398681640625, 0.016998291015625, -0.042022705078125, -0.00684356689453125, 0.0139617919921875, 
0.0306854248046875, -0.017364501953125, 0.003734588623046875, 0.034912109375, 0.0122222900390625, -0.047271728515625, 0.0677490234375, 0.01593017578125, 0.07147216796875, -0.048736572265625, 0.0188751220703125, -0.01123046875, 0.0163421630859375, -0.025360107421875, -0.039337158203125, 0.005046844482421875, -0.0037708282470703125, -0.0020580291748046875, -0.0028400421142578125, 0.040740966796875, -0.05419921875, -0.0286712646484375, 0.0305938720703125, 0.035186767578125, 0.023040771484375, -0.01274871826171875, -0.06707763671875, 0.006183624267578125, 0.01357269287109375, -0.0193939208984375, 0.004360198974609375, 0.035186767578125, -0.0008683204650878906, 0.029205322265625, 0.035736083984375, -0.0043487548828125, 0.01171875, 0.017425537109375, 0.060455322265625, -0.044158935546875, -0.0472412109375, -0.06298828125, 0.04888916015625, -0.0240478515625, -0.047454833984375, 0.06011962890625, 0.067138671875, 0.0771484375, -0.0008530616760253906, 0.07733154296875, -0.01177215576171875, 0.03741455078125, -0.026397705078125, 0.051361083984375, -0.038299560546875, 0.00909423828125, -0.0218963623046875, -0.07135009765625, -0.012451171875, 0.060546875, -0.0215911865234375, 0.005832672119140625, 0.057769775390625, 0.0703125, -0.0027904510498046875, -0.004360198974609375, 0.023284912109375, 0.0163116455078125, 0.01317596435546875, 0.042999267578125, 0.05419921875, -0.056121826171875, 0.07073974609375, -0.0255584716796875, -0.0201416015625, -0.003879547119140625, -0.029693603515625, -0.06817626953125, -0.0540771484375, -0.03045654296875, -0.031768798828125, 0.004505157470703125, 0.08258056640625, 0.06390380859375, -0.04949951171875, -0.03985595703125, -0.01317596435546875, -0.01465606689453125, 0.002666473388671875, -0.0185394287109375, 0.035614013671875, -0.044189453125, -0.08184814453125, 0.02239990234375, 0.0216827392578125, 0.0193939208984375, -0.0011997222900390625, -0.0020618438720703125, -0.0167083740234375, -0.0017385482788085938, 0.0478515625, 0.0157470703125, 
-0.02923583984375, 0.00862884521484375, 0.0112457275390625, -0.02215576171875, 0.0302734375, 0.05889892578125, -0.05657958984375, 0.02325439453125, 0.037109375, 0.029541015625, 0.04241943359375, -0.023468017578125, 0.042449951171875, -0.051727294921875, 0.038299560546875, 0.014801025390625, 0.0340576171875, 0.0258331298828125, -0.005062103271484375, 0.0271453857421875, 0.0171051025390625, -0.03387451171875, -0.050323486328125, 0.0183258056640625, -0.068115234375, -0.013427734375, 0.089111328125, -0.0297698974609375, -0.005096435546875, -0.01290130615234375, -0.055206298828125, 0.036590576171875, -0.03253173828125, 0.0293121337890625, 0.0411376953125, 0.004184722900390625, -0.0178680419921875, -0.02374267578125, 0.038299560546875, 0.0653076171875, -0.0611572265625, -0.00888824462890625, 0.00814056396484375, 0.00931549072265625, 0.0229644775390625, 0.038665771484375, -0.020477294921875, 0.01335906982421875, -0.0107574462890625, 0.01513671875, 0.0114593505859375, 0.0013675689697265625, -0.0237274169921875, -0.00548553466796875, -0.005260467529296875, -0.03369140625 ] ]
deepset/tinyroberta-squad2
2023-09-27T11:51:22.000Z
[ "transformers", "pytorch", "safetensors", "roberta", "question-answering", "en", "dataset:squad_v2", "arxiv:1909.10351", "license:cc-by-4.0", "model-index", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
question-answering
deepset
null
null
deepset/tinyroberta-squad2
69
127,488
transformers
2022-03-02T23:29:05
--- language: en license: cc-by-4.0 datasets: - squad_v2 model-index: - name: deepset/tinyroberta-squad2 results: - task: type: question-answering name: Question Answering dataset: name: squad_v2 type: squad_v2 config: squad_v2 split: validation metrics: - type: exact_match value: 78.8627 name: Exact Match verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNDNlZDU4ODAxMzY5NGFiMTMyZmQ1M2ZhZjMyODA1NmFlOGMxNzYxNTA4OGE5YTBkZWViZjBkNGQ2ZmMxZjVlMCIsInZlcnNpb24iOjF9.Wgu599r6TvgMLTrHlLMVAbUtKD_3b70iJ5QSeDQ-bRfUsVk6Sz9OsJCp47riHJVlmSYzcDj_z_3jTcUjCFFXBg - type: f1 value: 82.0355 name: F1 verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiOTFkMzEzMWNiZDRhMGZlODhkYzcwZTZiMDFjZDg2YjllZmUzYWM5NTgwNGQ2NGYyMDk2ZGQwN2JmMTE5NTc3YiIsInZlcnNpb24iOjF9.ChgaYpuRHd5WeDFjtiAHUyczxtoOD_M5WR8834jtbf7wXhdGOnZKdZ1KclmhoI5NuAGc1NptX-G0zQ5FTHEcBA - task: type: question-answering name: Question Answering dataset: name: squad type: squad config: plain_text split: validation metrics: - type: exact_match value: 83.860 name: Exact Match - type: f1 value: 90.752 name: F1 - task: type: question-answering name: Question Answering dataset: name: adversarial_qa type: adversarial_qa config: adversarialQA split: validation metrics: - type: exact_match value: 25.967 name: Exact Match - type: f1 value: 37.006 name: F1 - task: type: question-answering name: Question Answering dataset: name: squad_adversarial type: squad_adversarial config: AddOneSent split: validation metrics: - type: exact_match value: 76.329 name: Exact Match - type: f1 value: 83.292 name: F1 - task: type: question-answering name: Question Answering dataset: name: squadshifts amazon type: squadshifts config: amazon split: test metrics: - type: exact_match value: 63.915 name: Exact Match - type: f1 value: 78.395 name: F1 - task: type: question-answering name: Question Answering dataset: name: squadshifts new_wiki type: squadshifts config: new_wiki split: test metrics: - type: exact_match 
value: 80.297 name: Exact Match - type: f1 value: 89.808 name: F1 - task: type: question-answering name: Question Answering dataset: name: squadshifts nyt type: squadshifts config: nyt split: test metrics: - type: exact_match value: 80.149 name: Exact Match - type: f1 value: 88.321 name: F1 - task: type: question-answering name: Question Answering dataset: name: squadshifts reddit type: squadshifts config: reddit split: test metrics: - type: exact_match value: 66.959 name: Exact Match - type: f1 value: 79.300 name: F1 --- # tinyroberta-squad2 This is the *distilled* version of the [deepset/roberta-base-squad2](https://huggingface.co/deepset/roberta-base-squad2) model. This model has a comparable prediction quality and runs at twice the speed of the base model. ## Overview **Language model:** tinyroberta-squad2 **Language:** English **Downstream-task:** Extractive QA **Training data:** SQuAD 2.0 **Eval data:** SQuAD 2.0 **Code:** See [an example QA pipeline on Haystack](https://haystack.deepset.ai/tutorials/first-qa-system) **Infrastructure:** 4x Tesla V100 ## Hyperparameters ``` batch_size = 96 n_epochs = 4 base_LM_model = "deepset/tinyroberta-squad2-step1" max_seq_len = 384 learning_rate = 3e-5 lr_schedule = LinearWarmup warmup_proportion = 0.2 doc_stride = 128 max_query_length = 64 distillation_loss_weight = 0.75 temperature = 1.5 teacher = "deepset/roberta-large-squad2" ``` ## Distillation This model was distilled using the TinyBERT approach described in [this paper](https://arxiv.org/pdf/1909.10351.pdf) and implemented in [haystack](https://github.com/deepset-ai/haystack). Firstly, we have performed intermediate layer distillation with roberta-base as the teacher, which resulted in [deepset/tinyroberta-6l-768d](https://huggingface.co/deepset/tinyroberta-6l-768d). 
Secondly, we have performed task-specific distillation with [deepset/roberta-base-squad2](https://huggingface.co/deepset/roberta-base-squad2) as the teacher for further intermediate layer distillation on an augmented version of SQuADv2 and then with [deepset/roberta-large-squad2](https://huggingface.co/deepset/roberta-large-squad2) as the teacher for prediction layer distillation. ## Usage ### In Haystack Haystack is an NLP framework by deepset. You can use this model in a Haystack pipeline to do question answering at scale (over many documents). To load the model in [Haystack](https://github.com/deepset-ai/haystack/): ```python reader = FARMReader(model_name_or_path="deepset/tinyroberta-squad2") # or reader = TransformersReader(model_name_or_path="deepset/tinyroberta-squad2") ``` ### In Transformers ```python from transformers import AutoModelForQuestionAnswering, AutoTokenizer, pipeline model_name = "deepset/tinyroberta-squad2" # a) Get predictions nlp = pipeline('question-answering', model=model_name, tokenizer=model_name) QA_input = { 'question': 'Why is model conversion important?', 'context': 'The option to convert models between FARM and transformers gives freedom to the user and let people easily switch between frameworks.' } res = nlp(QA_input) # b) Load model & tokenizer model = AutoModelForQuestionAnswering.from_pretrained(model_name) tokenizer = AutoTokenizer.from_pretrained(model_name) ``` ## Performance Evaluated on the SQuAD 2.0 dev set with the [official eval script](https://worksheets.codalab.org/rest/bundles/0x6b567e1cf2e041ec80d7098f031c5c9e/contents/blob/). 
``` "exact": 78.69114798281817, "f1": 81.9198998536977, "total": 11873, "HasAns_exact": 76.19770580296895, "HasAns_f1": 82.66446878592329, "HasAns_total": 5928, "NoAns_exact": 81.17746005046257, "NoAns_f1": 81.17746005046257, "NoAns_total": 5945 ``` ## Authors **Branden Chan:** branden.chan@deepset.ai **Timo Möller:** timo.moeller@deepset.ai **Malte Pietsch:** malte.pietsch@deepset.ai **Tanay Soni:** tanay.soni@deepset.ai **Michel Bartels:** michel.bartels@deepset.ai ## About us <div class="grid lg:grid-cols-2 gap-x-4 gap-y-3"> <div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center"> <img alt="" src="https://raw.githubusercontent.com/deepset-ai/.github/main/deepset-logo-colored.png" class="w-40"/> </div> <div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center"> <img alt="" src="https://raw.githubusercontent.com/deepset-ai/.github/main/haystack-logo-colored.png" class="w-40"/> </div> </div> [deepset](http://deepset.ai/) is the company behind the open-source NLP framework [Haystack](https://haystack.deepset.ai/), which is designed to help you build production-ready NLP systems that use question answering, summarization, ranking, etc. Some of our other work: - [roberta-base-squad2](https://huggingface.co/deepset/roberta-base-squad2) - [German BERT (aka "bert-base-german-cased")](https://deepset.ai/german-bert) - [GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")](https://deepset.ai/germanquad) ## Get in touch and join the Haystack community <p>For more info on Haystack, visit our <strong><a href="https://github.com/deepset-ai/haystack">GitHub</a></strong> repo and <strong><a href="https://docs.haystack.deepset.ai">Documentation</a></strong>. 
We also have a <strong><a class="h-7" href="https://haystack.deepset.ai/community/join">Discord community open to everyone!</a></strong></p> [Twitter](https://twitter.com/deepset_ai) | [LinkedIn](https://www.linkedin.com/company/deepset-ai/) | [Discord](https://haystack.deepset.ai/community) | [GitHub Discussions](https://github.com/deepset-ai/haystack/discussions) | [Website](https://deepset.ai) By the way: [we're hiring!](http://www.deepset.ai/jobs)
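The prediction-layer distillation described in the card above combines a temperature-softened loss against the teacher's output distribution with the ordinary hard-label loss, weighted by `distillation_loss_weight`. The following is a minimal, hedged sketch of that objective in plain Python — it is not deepset's training code, and the helper names (`softmax`, `distillation_loss`) are illustrative; the card's hyperparameters (weight 0.75, temperature 1.5) are used as defaults.

```python
import math

def softmax(logits, temperature=1.0):
    """Softmax over a list of logits, optionally softened by a temperature."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, hard_loss,
                      weight=0.75, temperature=1.5):
    """Weighted sum of the soft (teacher) cross-entropy and the hard-label loss."""
    teacher_probs = softmax(teacher_logits, temperature)
    student_probs = softmax(student_logits, temperature)
    # Cross-entropy of the student's softened predictions against
    # the teacher's softened targets.
    soft_loss = -sum(t * math.log(s)
                     for t, s in zip(teacher_probs, student_probs))
    return weight * soft_loss + (1.0 - weight) * hard_loss

# Made-up start-position logits for one example, for illustration only.
loss = distillation_loss([2.0, 0.5, -1.0], [1.8, 0.7, -0.9], hard_loss=0.4)
print(round(loss, 4))
```

With `weight=0.0` the objective reduces to the plain hard-label loss, which is a quick sanity check on the formula.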
8,381
Kaludi/food-category-classification-v2.0
2023-02-09T19:20:59.000Z
[ "transformers", "pytorch", "swin", "image-classification", "vision", "dataset:Kaludi/food-category-classification-v2.0", "co2_eq_emissions", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
image-classification
Kaludi
null
null
Kaludi/food-category-classification-v2.0
10
126,534
transformers
2023-02-08T20:35:47
--- tags: - vision - image-classification datasets: - Kaludi/food-category-classification-v2.0 widget: - src: https://www.foodandwine.com/thmb/gv06VNqj1uUJHGlw5e7IULwUmr8=/1500x0/filters:no_upscale():max_bytes(150000):strip_icc()/2012-r-xl-vegetable-sandwich-with-dill-sauce-2000-0984c1b513ae4af396aee039afa5e38c.jpg example_title: Bread - src: https://cdn.britannica.com/34/176234-050-0E0C55C6/Glass-milk.jpg example_title: Dairy - src: https://images-gmi-pmc.edge-generalmills.com/7c1096c7-bfd0-4806-a794-1d3001fe0063.jpg example_title: Dessert - src: https://theheirloompantry.co/wp-content/uploads/2022/06/how-to-fry-eggs-perfectly-in-4-ways-the-heirloom-pantry.jpg example_title: Egg - src: https://www.mashed.com/img/gallery/the-real-reason-fried-foods-are-so-popular-right-now/l-intro-1650327494.jpg example_title: Fried Food - src: https://www.seriouseats.com/thmb/WzQz05gt5witRGeOYKTcTqfe1gs=/1500x0/filters:no_upscale():max_bytes(150000):strip_icc()/butter-basted-pan-seared-steaks-recipe-hero-06-03b1131c58524be2bd6c9851a2fbdbc3.jpg example_title: Meat - src: https://assets3.thrillist.com/v1/image/3097381/1200x600/scale; example_title: Seafood - src: https://i0.wp.com/post.healthline.com/wp-content/uploads/2020/03/romaine-lettuce-1296x728-body.jpg?w=1155&h=1528 example_title: Vegetable co2_eq_emissions: emissions: 12.456278925446485 --- # Food Category Classification v2.0 This is an updated version of the [old](https://huggingface.co/Kaludi/food-category-classification) Food Category Image Classifier model, trained by [Kaludi](https://huggingface.co/Kaludi) to recognize **12** different categories of food: **Bread**, **Dairy**, **Dessert**, **Egg**, **Fried Food**, **Fruit**, **Meat**, **Noodles**, **Rice**, **Seafood**, **Soup**, and **Vegetable**. It can accurately classify a food image into one of these categories by analyzing its visual features. 
This model can be used by food bloggers, restaurants, and recipe websites to quickly categorize and sort their food images, making it easier to manage their content and provide a better user experience. ### Gradio This model supports a [Gradio](https://github.com/gradio-app/gradio) Web UI to run the food-category-classification-v2.0 model: [![Open In HF Spaces](https://camo.githubusercontent.com/00380c35e60d6b04be65d3d94a58332be5cc93779f630bcdfc18ab9a3a7d3388/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f25463025394625413425393725323048756767696e67253230466163652d5370616365732d626c7565)](https://huggingface.co/spaces/Kaludi/Food-Category-Classification_V2_App) ## Validation Metrics - Problem type: Multi-class Classification - Model ID: 3353292434 - CO2 Emissions (in grams): 12.4563 - Loss: 0.144 - Accuracy: 0.960 - Macro F1: 0.959 - Micro F1: 0.960 - Weighted F1: 0.959 - Macro Precision: 0.962 - Micro Precision: 0.960 - Weighted Precision: 0.962 - Macro Recall: 0.960 - Micro Recall: 0.960 - Weighted Recall: 0.960
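As a hedged sketch of how this classifier is consumed: the model is typically loaded through transformers' `image-classification` pipeline (shown only as a comment below, since it downloads weights), and its 12-way head produces one logit per category in the card. The runnable part below merely illustrates mapping logits to the card's category labels with a softmax; the logit values and the `logits_to_prediction` helper are made up for illustration.

```python
import math

# Typical real-world usage (requires network access to download the model):
#   from transformers import pipeline
#   clf = pipeline("image-classification",
#                  model="Kaludi/food-category-classification-v2.0")
#   clf("photo_of_dinner.jpg")

# The 12 categories listed in the model card, in an assumed index order.
LABELS = ["Bread", "Dairy", "Dessert", "Egg", "Fried Food", "Fruit",
          "Meat", "Noodles", "Rice", "Seafood", "Soup", "Vegetable"]

def logits_to_prediction(logits):
    """Softmax the head's logits and return (top_label, top_probability)."""
    m = max(logits)
    exps = [math.exp(l - m) for l in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    best = max(range(len(LABELS)), key=lambda i: probs[i])
    return LABELS[best], probs[best]

# Hypothetical logits for one image; index 6 ("Meat") is largest.
logits = [0.1, -0.3, 0.2, -1.0, 0.5, -0.2, 3.1, 0.0, -0.5, 0.4, -0.8, 0.3]
label, prob = logits_to_prediction(logits)
print(label, round(prob, 3))
```

The pipeline returns the same information as a list of `{"label", "score"}` dictionaries sorted by score, so the softmax-and-argmax step here is handled for you in practice.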
2,964
[ [ -0.01172637939453125, -0.03350830078125, -0.002288818359375, -0.0193634033203125, -0.00511932373046875, -0.011077880859375, 0.014923095703125, -0.056884765625, 0.00020003318786621094, 0.035736083984375, -0.0255126953125, -0.04052734375, -0.046112060546875, -0.00701141357421875, -0.0164031982421875, 0.0826416015625, -0.01514434814453125, 0.01311492919921875, -0.031097412109375, -0.0220794677734375, -0.052734375, -0.03546142578125, -0.042510986328125, -0.0176239013671875, 0.042205810546875, 0.05291748046875, 0.05242919921875, 0.0401611328125, 0.04461669921875, 0.0152130126953125, -0.0138397216796875, 0.00937652587890625, -0.0118255615234375, -0.00482940673828125, -0.01287841796875, -0.04241943359375, -0.033477783203125, 0.0116424560546875, 0.0094451904296875, 0.01486968994140625, -0.01157379150390625, 0.041473388671875, -0.0052490234375, 0.0635986328125, -0.042724609375, 0.031982421875, -0.0367431640625, 0.0195770263671875, 0.00893402099609375, -0.00841522216796875, -0.016204833984375, -0.0300445556640625, -0.00531768798828125, -0.05145263671875, 0.004703521728515625, -0.00823974609375, 0.098388671875, 0.0225372314453125, -0.045989990234375, -0.031646728515625, -0.043853759765625, 0.039459228515625, -0.0310821533203125, 0.0328369140625, 0.038177490234375, 0.055389404296875, -0.0282745361328125, -0.07745361328125, -0.0357666015625, 0.0145263671875, 0.005977630615234375, 0.00873565673828125, -0.0100555419921875, -0.045562744140625, 0.004947662353515625, 0.042572021484375, -0.038726806640625, 0.0267486572265625, -0.0465087890625, 0.0026836395263671875, 0.0304107666015625, 0.00580596923828125, 0.0205535888671875, 0.005451202392578125, -0.04608154296875, -0.03265380859375, -0.0190582275390625, 0.034210205078125, -0.0050811767578125, -0.002208709716796875, -0.01105499267578125, 0.048248291015625, -0.035919189453125, 0.039154052734375, -0.01242828369140625, -0.0347900390625, 0.055511474609375, -0.0243072509765625, -0.01983642578125, -0.00751495361328125, 
0.0396728515625, 0.08447265625, 0.0019388198852539062, 0.037384033203125, -0.010009765625, 0.0249176025390625, 0.001148223876953125, -0.044403076171875, -0.03338623046875, 0.0146484375, -0.0196380615234375, -0.044891357421875, 0.00775909423828125, -0.050811767578125, -0.0303802490234375, 0.021514892578125, 0.01557159423828125, 0.0016651153564453125, -0.02679443359375, 0.0242919921875, -0.043060302734375, 0.05279541015625, 0.050140380859375, -0.08026123046875, 0.041229248046875, 0.054656982421875, 0.0654296875, 0.02923583984375, 0.008087158203125, -0.02569580078125, 0.0179595947265625, -0.01007843017578125, 0.0765380859375, -0.039764404296875, -0.007640838623046875, -0.02655029296875, 0.04815673828125, 0.0132598876953125, -0.0596923828125, 0.061920166015625, -0.029876708984375, 0.0131683349609375, -0.044281005859375, -0.0400390625, -0.0440673828125, 0.0165252685546875, -0.05133056640625, 0.08477783203125, 0.035888671875, -0.04827880859375, 0.0501708984375, -0.0297698974609375, -0.01678466796875, 0.029327392578125, -0.018035888671875, -0.05047607421875, -0.01026153564453125, -0.0013370513916015625, 0.0168914794921875, -0.032562255859375, 0.0144500732421875, -0.041748046875, -0.015716552734375, -0.0107421875, 0.0010728836059570312, 0.041107177734375, 0.0299835205078125, -0.004627227783203125, 0.01352691650390625, -0.035003662109375, 0.00949859619140625, 0.0360107421875, -0.0214080810546875, -0.0176239013671875, 0.001251220703125, 0.0277862548828125, 0.0182342529296875, 0.01898193359375, -0.02008056640625, 0.037841796875, 0.0051727294921875, 0.040069580078125, 0.0445556640625, -0.0061492919921875, 0.022979736328125, -0.041107177734375, 0.013519287109375, 0.0168914794921875, 0.037811279296875, 0.0018758773803710938, -0.08258056640625, -0.040618896484375, -0.0170135498046875, 0.04656982421875, 0.0721435546875, -0.0288238525390625, 0.055572509765625, -0.00665283203125, -0.06298828125, -0.01073455810546875, -0.016876220703125, 0.004901885986328125, 0.056610107421875, 
0.0167999267578125, -0.035186767578125, -0.04571533203125, -0.05743408203125, 0.005176544189453125, -0.008026123046875, 0.0009379386901855469, 0.01751708984375, 0.04736328125, -0.036865234375, 0.047119140625, -0.041229248046875, -0.015167236328125, -0.0125579833984375, -0.006710052490234375, 0.010162353515625, 0.0479736328125, 0.06268310546875, -0.055633544921875, -0.037628173828125, 0.039031982421875, -0.055084228515625, 0.015625, 0.027618408203125, -0.0038204193115234375, 0.0185699462890625, 0.0137786865234375, -0.0277252197265625, 0.049774169921875, 0.0247802734375, -0.047271728515625, 0.01373291015625, 0.00024771690368652344, 0.01175689697265625, -0.07696533203125, -0.0176849365234375, 0.0169219970703125, -0.00870513916015625, -0.0310821533203125, 0.017181396484375, 0.022216796875, 0.00994873046875, -0.0584716796875, 0.066650390625, -0.008209228515625, -0.0112152099609375, -0.017364501953125, -0.042724609375, 0.0137786865234375, 0.02606201171875, 0.007068634033203125, 0.037841796875, 0.0484619140625, -0.0433349609375, 0.044647216796875, -0.000591278076171875, -0.055389404296875, 0.01715087890625, -0.04376220703125, -0.0084381103515625, -0.009429931640625, 0.033660888671875, -0.100830078125, -0.044647216796875, 0.073486328125, -0.02960205078125, 0.00728607177734375, -0.0099639892578125, -0.0513916015625, -0.058197021484375, -0.0259857177734375, 0.035919189453125, 0.0438232421875, -0.0482177734375, 0.0210723876953125, 0.033721923828125, 0.01319122314453125, -0.0041656494140625, -0.0631103515625, -0.0252532958984375, -0.007411956787109375, -0.03387451171875, -0.01438140869140625, -0.005718231201171875, 0.0007958412170410156, 0.006267547607421875, -0.0340576171875, -0.0210113525390625, 0.0118560791015625, 0.0268402099609375, 0.015869140625, -0.0000036954879760742188, 0.003662109375, 0.0184478759765625, 0.00418853759765625, -0.0136871337890625, -0.00589752197265625, 0.0506591796875, -0.0219268798828125, 0.00981903076171875, -0.06805419921875, -0.00954437255859375, 
0.044403076171875, -0.01953125, 0.0280303955078125, 0.073486328125, -0.05230712890625, 0.031402587890625, -0.0193023681640625, 0.00921630859375, -0.034820556640625, 0.0445556640625, -0.0305328369140625, -0.050140380859375, 0.035369873046875, -0.0007014274597167969, -0.00732421875, 0.06060791015625, 0.0103302001953125, 0.0005474090576171875, 0.06591796875, 0.00885009765625, 0.00289154052734375, 0.0283966064453125, -0.030426025390625, -0.00852203369140625, -0.0531005859375, -0.08319091796875, -0.055816650390625, -0.0125732421875, -0.057525634765625, -0.046722412109375, 0.013397216796875, 0.01357269287109375, -0.0121612548828125, 0.0254058837890625, -0.05047607421875, 0.0501708984375, 0.04656982421875, 0.0287933349609375, -0.007732391357421875, 0.027313232421875, 0.006481170654296875, 0.00980377197265625, -0.0504150390625, -0.031341552734375, 0.059814453125, 0.06500244140625, 0.06060791015625, 0.02008056640625, 0.031158447265625, 0.033721923828125, 0.012969970703125, -0.0772705078125, 0.029327392578125, -0.0313720703125, -0.05145263671875, 0.00008410215377807617, -0.026702880859375, -0.03717041015625, 0.0060577392578125, 0.0003788471221923828, -0.05487060546875, 0.01397705078125, 0.01111602783203125, -0.0244903564453125, 0.036712646484375, -0.09912109375, 0.09600830078125, -0.03515625, -0.0296783447265625, 0.02410888671875, -0.031005859375, 0.0211944580078125, 0.0088653564453125, 0.01439666748046875, -0.0200042724609375, -0.01219940185546875, 0.07470703125, -0.046630859375, 0.059814453125, -0.030853271484375, -0.014556884765625, 0.0347900390625, -0.023040771484375, 0.04217529296875, 0.018585205078125, 0.005481719970703125, 0.0150299072265625, 0.0194244384765625, -0.0301666259765625, -0.04388427734375, 0.048614501953125, -0.044830322265625, -0.0159912109375, -0.06695556640625, -0.005313873291015625, 0.0080413818359375, 0.01702880859375, 0.016937255859375, 0.01068115234375, 0.0258331298828125, 0.01385498046875, 0.041656494140625, 0.0010509490966796875, 
0.0032901763916015625, 0.03131103515625, -0.04046630859375, -0.048980712890625, 0.0728759765625, 0.006011962890625, -0.01055145263671875, 0.028167724609375, 0.0248565673828125, -0.02655029296875, -0.007411956787109375, -0.0312042236328125, -0.02142333984375, -0.04534912109375, -0.04583740234375, -0.0438232421875, -0.0221099853515625, -0.038360595703125, 0.002414703369140625, -0.0162506103515625, -0.0288543701171875, -0.047088623046875, -0.01345062255859375, 0.04376220703125, 0.066650390625, -0.000797271728515625, 0.0233917236328125, -0.054534912109375, 0.0177764892578125, 0.0269317626953125, 0.032318115234375, -0.0007028579711914062, -0.04400634765625, -0.00937652587890625, 0.0027866363525390625, -0.0286407470703125, -0.072998046875, 0.023773193359375, 0.0168609619140625, 0.01654052734375, 0.037017822265625, -0.0278472900390625, 0.041961669921875, -0.01373291015625, 0.060546875, 0.0416259765625, -0.037506103515625, 0.047119140625, 0.003082275390625, 0.046051025390625, 0.03656005859375, 0.047821044921875, -0.041961669921875, 0.00013184547424316406, -0.01364898681640625, -0.055999755859375, 0.04547119140625, -0.0150909423828125, -0.026519775390625, -0.0073394775390625, 0.034088134765625, 0.035980224609375, 0.002910614013671875, -0.05645751953125, -0.0224456787109375, -0.0282440185546875, -0.0185394287109375, 0.0149688720703125, -0.046112060546875, 0.0193939208984375, -0.0712890625, 0.041168212890625, 0.017730712890625, 0.002773284912109375, -0.0034656524658203125, 0.0400390625, -0.030792236328125, 0.0096282958984375, 0.047943115234375, 0.01435089111328125, -0.04351806640625, 0.00797271728515625, -0.01080322265625, -0.04736328125, 0.0098114013671875, -0.00865936279296875, -0.00476837158203125, -0.014495849609375, 0.00832366943359375, 0.05975341796875, 0.005519866943359375, -0.037628173828125, 0.030303955078125, -0.039581298828125, -0.0418701171875, -0.02606201171875, 0.03253173828125, -0.0321044921875, 0.02740478515625, 0.0192108154296875, 0.03753662109375, 
0.031890869140625, -0.060028076171875, 0.02349853515625, 0.00853729248046875, -0.017303466796875, -0.0169525146484375, 0.032379150390625, 0.002796173095703125, -0.00881195068359375, 0.045501708984375, -0.054718017578125, -0.0401611328125, 0.0633544921875, 0.05291748046875, 0.046051025390625, -0.029937744140625, 0.0274200439453125, 0.0596923828125, 0.0135498046875, -0.02508544921875, 0.046356201171875, 0.0006260871887207031, -0.0305938720703125, 0.017974853515625, -0.051727294921875, -0.01654052734375, 0.0435791015625, -0.0511474609375, 0.02825927734375, -0.046295166015625, -0.0178985595703125, 0.021209716796875, 0.00263214111328125, -0.043365478515625, 0.029449462890625, 0.0016326904296875, 0.07427978515625, -0.07427978515625, 0.06341552734375, 0.0838623046875, -0.0080413818359375, -0.039520263671875, -0.021331787109375, 0.0042572021484375, -0.049835205078125, 0.034820556640625, 0.024261474609375, 0.01678466796875, -0.02740478515625, -0.060302734375, -0.066650390625, 0.078125, -0.03045654296875, -0.054351806640625, 0.008392333984375, 0.00638580322265625, 0.040252685546875, -0.0157318115234375, 0.01546478271484375, 0.0015726089477539062, 0.049224853515625, 0.03900146484375, -0.05828857421875, -0.01129913330078125, -0.038909912109375, -0.00007045269012451172, 0.003570556640625, -0.0684814453125, 0.056549072265625, -0.01354217529296875, -0.0235595703125, 0.0278472900390625, 0.045074462890625, 0.0001316070556640625, 0.05328369140625, 0.031494140625, 0.07464599609375, 0.0626220703125, -0.0278472900390625, 0.062286376953125, 0.00032520294189453125, 0.039520263671875, 0.10601806640625, -0.034271240234375, 0.04901123046875, 0.0208587646484375, -0.0124053955078125, 0.0307159423828125, 0.09417724609375, -0.040374755859375, 0.078857421875, 0.00606536865234375, 0.001613616943359375, 0.0023174285888671875, -0.0216522216796875, -0.060760498046875, 0.03179931640625, 0.01800537109375, -0.02850341796875, -0.0130157470703125, 0.0026721954345703125, -0.005130767822265625, 
-0.0273284912109375, -0.027496337890625, 0.054443359375, -0.02655029296875, -0.016632080078125, 0.05413818359375, 0.01094818115234375, 0.0240936279296875, -0.015838623046875, -0.032501220703125, -0.01195526123046875, 0.00592803955078125, -0.0299835205078125, -0.071044921875, -0.00418853759765625, -0.0094451904296875, -0.0080718994140625, 0.0245208740234375, 0.0433349609375, -0.002880096435546875, -0.0501708984375, 0.00611114501953125, 0.010498046875, 0.003261566162109375, 0.014923095703125, -0.045684814453125, 0.044586181640625, -0.00449371337890625, 0.010162353515625, -0.0028209686279296875, 0.01212310791015625, 0.0012264251708984375, 0.03387451171875, 0.0206451416015625, -0.040740966796875, 0.0094451904296875, -0.0124969482421875, 0.0665283203125, -0.044097900390625, -0.0286407470703125, -0.031768798828125, 0.02093505859375, 0.005405426025390625, -0.017669677734375, 0.04168701171875, 0.07916259765625, 0.11187744140625, -0.033355712890625, 0.04522705078125, -0.017303466796875, 0.02191162109375, -0.038421630859375, 0.0197906494140625, -0.0755615234375, 0.0161895751953125, 0.00014448165893554688, -0.035614013671875, -0.01309967041015625, 0.043121337890625, -0.0269317626953125, 0.01282501220703125, 0.036712646484375, 0.052337646484375, -0.01326751708984375, -0.00585174560546875, 0.0016984939575195312, 0.0035839080810546875, -0.0015134811401367188, 0.028961181640625, 0.058990478515625, -0.06243896484375, 0.0027179718017578125, -0.031402587890625, -0.031005859375, -0.03466796875, -0.00754547119140625, -0.0439453125, -0.03106689453125, -0.031494140625, -0.017486572265625, 0.0145721435546875, 0.052581787109375, 0.06524658203125, -0.060791015625, -0.01319122314453125, -0.0086822509765625, 0.00710296630859375, -0.02288818359375, -0.021728515625, 0.000713348388671875, 0.01175689697265625, -0.03619384765625, -0.0008244514465332031, 0.006519317626953125, 0.021209716796875, 0.0272979736328125, -0.0253753662109375, -0.0186614990234375, -0.01751708984375, 0.048004150390625, 
0.035003662109375, -0.055908203125, -0.02276611328125, -0.0307159423828125, 0.0022335052490234375, 0.031982421875, 0.00572967529296875, -0.0294647216796875, 0.0292510986328125, 0.025665283203125, 0.017303466796875, 0.0163116455078125, -0.00746917724609375, -0.027374267578125, -0.0307464599609375, 0.006618499755859375, -0.013458251953125, 0.022491455078125, 0.01290130615234375, -0.0247344970703125, 0.05218505859375, 0.036346435546875, -0.0258331298828125, -0.05889892578125, 0.00905609130859375, -0.0924072265625, -0.0175323486328125, 0.07958984375, 0.00646209716796875, -0.0699462890625, 0.0270233154296875, -0.01074981689453125, 0.0269317626953125, -0.037384033203125, 0.0469970703125, 0.030731201171875, 0.00478363037109375, -0.0164642333984375, -0.0279541015625, 0.032379150390625, 0.0017099380493164062, -0.08001708984375, -0.032958984375, 0.0360107421875, 0.04156494140625, 0.01739501953125, 0.0152740478515625, -0.01445770263671875, 0.037841796875, -0.0017490386962890625, 0.01953125, 0.00733184814453125, -0.0548095703125, -0.0130462646484375, -0.003326416015625, 0.002559661865234375, -0.0357666015625 ] ]
facebook/hubert-base-ls960
2021-11-05T12:43:12.000Z
[ "transformers", "pytorch", "tf", "hubert", "feature-extraction", "speech", "en", "dataset:librispeech_asr", "arxiv:2106.07447", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
feature-extraction
facebook
null
null
facebook/hubert-base-ls960
22
125,902
transformers
2022-03-02T23:29:05
--- language: en datasets: - librispeech_asr tags: - speech license: apache-2.0 --- # Hubert-Base [Facebook's Hubert](https://ai.facebook.com/blog/hubert-self-supervised-representation-learning-for-speech-recognition-generation-and-compression) The base model pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz. **Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out [this blog](https://huggingface.co/blog/fine-tune-wav2vec2-english) for a more detailed explanation of how to fine-tune the model. [Paper](https://arxiv.org/abs/2106.07447) Authors: Wei-Ning Hsu, Benjamin Bolte, Yao-Hung Hubert Tsai, Kushal Lakhotia, Ruslan Salakhutdinov, Abdelrahman Mohamed **Abstract** Self-supervised approaches for speech representation learning are challenged by three unique problems: (1) there are multiple sound units in each input utterance, (2) there is no lexicon of input sound units during the pre-training phase, and (3) sound units have variable lengths with no explicit segmentation. To deal with these three problems, we propose the Hidden-Unit BERT (HuBERT) approach for self-supervised speech representation learning, which utilizes an offline clustering step to provide aligned target labels for a BERT-like prediction loss. A key ingredient of our approach is applying the prediction loss over the masked regions only, which forces the model to learn a combined acoustic and language model over the continuous inputs. HuBERT relies primarily on the consistency of the unsupervised clustering step rather than the intrinsic quality of the assigned cluster labels.
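The card stresses that input audio must be sampled at 16kHz before being fed to the model. As a hedged illustration (not part of the original card — a production pipeline would use a proper polyphase resampler such as the ones in torchaudio or librosa), a minimal linear-interpolation resampler might look like:

```python
import numpy as np

def resample_to_16k(audio: np.ndarray, orig_sr: int, target_sr: int = 16_000) -> np.ndarray:
    """Naively resample a mono waveform via linear interpolation.

    This is only a sketch of the idea; real code should use a
    band-limited resampler to avoid aliasing.
    """
    if orig_sr == target_sr:
        return audio
    duration = len(audio) / orig_sr
    n_target = int(round(duration * target_sr))
    # Time stamps of the original and target sample grids.
    old_t = np.linspace(0.0, duration, num=len(audio), endpoint=False)
    new_t = np.linspace(0.0, duration, num=n_target, endpoint=False)
    return np.interp(new_t, old_t, audio)

# A 1-second clip at 8 kHz becomes 16,000 samples at 16 kHz.
wave_8k = np.zeros(8_000, dtype=np.float32)
wave_16k = resample_to_16k(wave_8k, orig_sr=8_000)
```

The resulting array can then be passed to the model's feature extractor.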
Starting with a simple k-means teacher of 100 clusters, and using two iterations of clustering, the HuBERT model either matches or improves upon the state-of-the-art wav2vec 2.0 performance on the Librispeech (960h) and Libri-light (60,000h) benchmarks with 10min, 1h, 10h, 100h, and 960h fine-tuning subsets. Using a 1B parameter model, HuBERT shows up to 19% and 13% relative WER reduction on the more challenging dev-other and test-other evaluation subsets. The original model can be found at https://github.com/pytorch/fairseq/tree/master/examples/hubert. # Usage See [this blog](https://huggingface.co/blog/fine-tune-wav2vec2-english) for more information on how to fine-tune the model. Note that the class `Wav2Vec2ForCTC` has to be replaced by `HubertForCTC`.
2,596
[ [ -0.0275726318359375, -0.03631591796875, 0.0247344970703125, 0.01380157470703125, -0.01546478271484375, -0.006801605224609375, -0.0257110595703125, -0.0360107421875, 0.01342010498046875, 0.02337646484375, -0.048828125, -0.028839111328125, -0.035064697265625, -0.0202178955078125, -0.01122283935546875, 0.05511474609375, 0.01349639892578125, 0.0248565673828125, 0.0015811920166015625, -0.01020050048828125, -0.042633056640625, -0.0439453125, -0.0528564453125, -0.0294189453125, 0.0243072509765625, 0.031829833984375, 0.015167236328125, 0.041046142578125, 0.01222991943359375, 0.0199432373046875, -0.00481414794921875, -0.00007158517837524414, -0.045623779296875, 0.0025691986083984375, 0.004085540771484375, -0.00991058349609375, -0.03045654296875, 0.016143798828125, 0.057952880859375, 0.04736328125, -0.0303955078125, 0.0240631103515625, 0.008575439453125, 0.03558349609375, -0.0309906005859375, 0.02117919921875, -0.0606689453125, -0.006175994873046875, -0.0059051513671875, 0.00907135009765625, -0.035064697265625, 0.005191802978515625, -0.004558563232421875, -0.0340576171875, 0.020782470703125, -0.008514404296875, 0.07171630859375, 0.025054931640625, -0.026275634765625, -0.0007014274597167969, -0.057220458984375, 0.0777587890625, -0.041015625, 0.05584716796875, 0.049652099609375, 0.023406982421875, 0.004230499267578125, -0.058837890625, -0.0203857421875, -0.0137786865234375, 0.006687164306640625, 0.0205230712890625, -0.0258331298828125, 0.0035648345947265625, 0.01837158203125, 0.01605224609375, -0.040283203125, 0.0229644775390625, -0.053192138671875, -0.03863525390625, 0.05609130859375, -0.0252227783203125, -0.0125732421875, -0.0210723876953125, -0.0296173095703125, -0.019317626953125, -0.034423828125, 0.0261688232421875, 0.0272064208984375, 0.0279388427734375, -0.01030731201171875, 0.0147705078125, 0.005741119384765625, 0.044464111328125, 0.0181121826171875, -0.010833740234375, 0.033782958984375, 0.00746917724609375, -0.00347137451171875, 0.02398681640625, 0.0604248046875, 
-0.00742340087890625, 0.01209259033203125, 0.00638580322265625, -0.0323486328125, -0.005764007568359375, 0.0181427001953125, -0.055999755859375, -0.0499267578125, 0.01261138916015625, -0.040557861328125, -0.006587982177734375, 0.0149688720703125, 0.0010709762573242188, 0.02520751953125, -0.046875, 0.063720703125, -0.030670166015625, -0.014129638671875, -0.02276611328125, 0.0025386810302734375, -0.0026111602783203125, 0.003108978271484375, -0.091552734375, 0.032867431640625, 0.036895751953125, 0.04852294921875, -0.01461029052734375, -0.005107879638671875, -0.046661376953125, 0.0017185211181640625, -0.043121337890625, 0.033111572265625, -0.0076141357421875, -0.023468017578125, -0.02447509765625, -0.00331878662109375, 0.0171051025390625, -0.04833984375, 0.03082275390625, -0.01873779296875, 0.00527191162109375, -0.01342010498046875, -0.05621337890625, -0.012939453125, -0.027313232421875, -0.040283203125, 0.0989990234375, 0.0182952880859375, -0.03564453125, 0.01207733154296875, -0.031829833984375, -0.0323486328125, -0.001064300537109375, -0.0203704833984375, -0.045379638671875, 0.02117919921875, 0.028533935546875, 0.04864501953125, 0.0191192626953125, 0.0284423828125, -0.0189208984375, -0.02850341796875, 0.01052093505859375, -0.031494140625, 0.055908203125, 0.026031494140625, -0.01806640625, 0.01340484619140625, -0.0833740234375, 0.0167083740234375, -0.0021495819091796875, -0.0260467529296875, -0.005462646484375, -0.00875091552734375, 0.019317626953125, 0.0075836181640625, 0.0245513916015625, -0.0467529296875, -0.00467681884765625, -0.0447998046875, 0.043792724609375, 0.0595703125, -0.008514404296875, 0.031768798828125, -0.01450347900390625, 0.006511688232421875, -0.0196533203125, 0.01486968994140625, -0.007678985595703125, -0.040802001953125, -0.05133056640625, -0.0268402099609375, 0.055419921875, 0.0261688232421875, -0.018524169921875, 0.046539306640625, 0.006443023681640625, -0.0367431640625, -0.06976318359375, 0.006908416748046875, 0.0207672119140625, 
0.037078857421875, 0.05609130859375, -0.02020263671875, -0.0533447265625, -0.075439453125, 0.00922393798828125, -0.0251312255859375, -0.02166748046875, 0.0196533203125, 0.0175323486328125, -0.02349853515625, 0.07635498046875, -0.015045166015625, -0.03204345703125, -0.0036716461181640625, 0.02471923828125, 0.01381683349609375, 0.061798095703125, 0.031036376953125, -0.04168701171875, -0.027435302734375, -0.016021728515625, -0.03265380859375, -0.0127410888671875, 0.007465362548828125, 0.0172271728515625, 0.0197296142578125, 0.0477294921875, -0.0178680419921875, 0.02239990234375, 0.052642822265625, 0.01461029052734375, 0.0265960693359375, -0.025787353515625, -0.0240936279296875, -0.0860595703125, -0.01277923583984375, -0.0163421630859375, -0.036224365234375, -0.04876708984375, -0.017852783203125, 0.0180816650390625, -0.0127716064453125, -0.0252227783203125, 0.0323486328125, -0.035430908203125, -0.0148162841796875, -0.0247344970703125, 0.0058441162109375, -0.012481689453125, 0.04071044921875, 0.004405975341796875, 0.0478515625, 0.048370361328125, -0.044158935546875, 0.0234375, 0.0104827880859375, -0.025177001953125, 0.00931549072265625, -0.06060791015625, 0.0267181396484375, -0.0014028549194335938, 0.017608642578125, -0.07293701171875, -0.01096343994140625, 0.003459930419921875, -0.06072998046875, 0.061981201171875, -0.00811004638671875, -0.02587890625, -0.0250396728515625, -0.0028972625732421875, 0.0302276611328125, 0.055694580078125, -0.0643310546875, 0.03778076171875, 0.0433349609375, 0.004642486572265625, -0.03704833984375, -0.06549072265625, -0.0090179443359375, 0.0007143020629882812, -0.043792724609375, 0.04681396484375, -0.01116180419921875, 0.00963592529296875, -0.0095367431640625, -0.0217742919921875, 0.002170562744140625, 0.0009655952453613281, 0.034759521484375, -0.006023406982421875, -0.0074462890625, 0.0491943359375, 0.01751708984375, -0.024658203125, 0.004650115966796875, -0.032745361328125, 0.032989501953125, -0.00853729248046875, -0.01329803466796875, 
-0.058135986328125, 0.02667236328125, -0.001369476318359375, -0.017791748046875, 0.0184783935546875, 0.08795166015625, -0.037353515625, -0.020477294921875, -0.0540771484375, -0.038970947265625, -0.039276123046875, 0.03802490234375, -0.034759521484375, -0.083740234375, 0.0267486572265625, 0.0014715194702148438, -0.01303863525390625, 0.053497314453125, 0.047149658203125, -0.038299560546875, 0.06988525390625, 0.048370361328125, -0.01763916015625, 0.035064697265625, -0.040283203125, 0.0100250244140625, -0.057891845703125, -0.023773193359375, -0.029266357421875, -0.021514892578125, -0.0533447265625, -0.0384521484375, 0.0238494873046875, 0.0252685546875, -0.009429931640625, 0.03363037109375, -0.048370361328125, 0.006649017333984375, 0.065185546875, 0.007843017578125, -0.00949859619140625, 0.024749755859375, -0.0138092041015625, -0.015228271484375, -0.062469482421875, -0.01561737060546875, 0.0728759765625, 0.045928955078125, 0.059417724609375, -0.0011386871337890625, 0.09124755859375, 0.01702880859375, -0.01018524169921875, -0.07098388671875, 0.0248565673828125, -0.0070037841796875, -0.053802490234375, -0.04071044921875, -0.042266845703125, -0.07550048828125, 0.017303466796875, -0.014129638671875, -0.06646728515625, 0.0205535888671875, 0.0195770263671875, -0.0250701904296875, 0.004852294921875, -0.050750732421875, 0.05609130859375, -0.010498046875, 0.0023345947265625, -0.028778076171875, -0.057220458984375, 0.0004000663757324219, -0.0104827880859375, 0.018218994140625, -0.0243072509765625, 0.0254364013671875, 0.07537841796875, -0.03326416015625, 0.05706787109375, -0.0300140380859375, 0.0035610198974609375, 0.039276123046875, -0.00731658935546875, 0.0269927978515625, 0.00872802734375, 0.01096343994140625, 0.03253173828125, 0.01861572265625, -0.031402587890625, -0.0290679931640625, 0.0550537109375, -0.0799560546875, -0.026885986328125, -0.01715087890625, -0.023773193359375, -0.0269012451171875, -0.004261016845703125, 0.037841796875, 0.044158935546875, -0.0181121826171875, 
0.0205230712890625, 0.05047607421875, 0.00927734375, 0.042572021484375, 0.033782958984375, -0.017425537109375, -0.038787841796875, 0.08453369140625, 0.02972412109375, 0.0057525634765625, 0.0251007080078125, 0.029083251953125, -0.0316162109375, -0.03509521484375, -0.02276611328125, 0.0210418701171875, -0.0494384765625, -0.0181121826171875, -0.046661376953125, -0.031524658203125, -0.05035400390625, 0.02362060546875, -0.052337646484375, -0.052337646484375, -0.05816650390625, -0.0142669677734375, 0.021575927734375, 0.0574951171875, -0.053863525390625, 0.040435791015625, -0.0404052734375, 0.02978515625, 0.051422119140625, 0.010284423828125, -0.007770538330078125, -0.0748291015625, -0.0296478271484375, 0.00333404541015625, -0.01317596435546875, -0.058319091796875, 0.0182952880859375, 0.0307464599609375, 0.045166015625, 0.035186767578125, -0.0002465248107910156, 0.04595947265625, -0.0404052734375, 0.05487060546875, 0.021240234375, -0.0777587890625, 0.0609130859375, -0.011962890625, 0.01189422607421875, 0.037353515625, 0.019989013671875, -0.0231170654296875, -0.0080108642578125, -0.05682373046875, -0.055450439453125, 0.06536865234375, 0.0245819091796875, 0.019256591796875, 0.01007843017578125, 0.0283355712890625, -0.0013494491577148438, -0.0006527900695800781, -0.059295654296875, -0.034759521484375, -0.031707763671875, -0.0103607177734375, -0.016845703125, -0.043304443359375, 0.005298614501953125, -0.04962158203125, 0.07757568359375, 0.0036067962646484375, 0.023651123046875, 0.017333984375, -0.0021877288818359375, -0.00794219970703125, 0.0103607177734375, 0.029632568359375, 0.03643798828125, -0.0312347412109375, 0.005218505859375, 0.01534271240234375, -0.0224761962890625, 0.002887725830078125, 0.0243377685546875, -0.00304412841796875, 0.0172576904296875, 0.0277099609375, 0.088623046875, 0.012237548828125, -0.00896453857421875, 0.040557861328125, 0.0015516281127929688, -0.037017822265625, -0.030120849609375, 0.0006313323974609375, 0.0035419464111328125, 0.01432037353515625, 
0.043304443359375, 0.0013017654418945312, 0.0130157470703125, -0.028656005859375, 0.0257110595703125, 0.0208740234375, -0.06231689453125, -0.0235443115234375, 0.05224609375, 0.0004000663757324219, -0.02227783203125, 0.04071044921875, -0.03143310546875, -0.0278778076171875, 0.0249176025390625, 0.0499267578125, 0.05804443359375, -0.05633544921875, 0.015655517578125, 0.043548583984375, 0.0233001708984375, -0.00907135009765625, 0.0306243896484375, -0.0263214111328125, -0.039459228515625, -0.036163330078125, -0.052154541015625, -0.01119232177734375, 0.0211181640625, -0.0557861328125, 0.01313018798828125, -0.0232696533203125, -0.0277252197265625, 0.0099334716796875, 0.00524139404296875, -0.037841796875, 0.017852783203125, 0.0202484130859375, 0.03802490234375, -0.048004150390625, 0.091064453125, 0.0289459228515625, -0.00913238525390625, -0.097900390625, -0.0139312744140625, -0.01233673095703125, -0.05560302734375, 0.035858154296875, 0.0158538818359375, -0.004302978515625, 0.0096588134765625, -0.0472412109375, -0.0867919921875, 0.07086181640625, 0.031097412109375, -0.0721435546875, 0.0165557861328125, -0.01251220703125, 0.035430908203125, -0.0198822021484375, -0.010101318359375, 0.032806396484375, 0.021484375, 0.00457763671875, -0.08111572265625, -0.019805908203125, 0.00365447998046875, 0.011688232421875, -0.005962371826171875, -0.040924072265625, 0.0748291015625, -0.0160675048828125, -0.018310546875, 0.00524139404296875, 0.07275390625, 0.00966644287109375, 0.0224609375, 0.03564453125, 0.04736328125, 0.07061767578125, -0.01279449462890625, 0.045166015625, -0.0200653076171875, 0.0535888671875, 0.09832763671875, 0.01904296875, 0.075927734375, 0.029083251953125, -0.03021240234375, 0.0316162109375, 0.051849365234375, -0.0204010009765625, 0.04852294921875, 0.026885986328125, -0.007488250732421875, -0.02947998046875, 0.0027923583984375, -0.0474853515625, 0.061431884765625, 0.025177001953125, -0.020233154296875, 0.0120391845703125, 0.00978851318359375, -0.0261993408203125, 
-0.0024700164794921875, -0.0249176025390625, 0.056671142578125, 0.02642822265625, -0.019378662109375, 0.0623779296875, 0.01512908935546875, 0.039154052734375, -0.0418701171875, 0.00414276123046875, 0.0012369155883789062, -0.00360107421875, -0.0158843994140625, -0.0241851806640625, 0.004985809326171875, -0.0269927978515625, -0.019805908203125, -0.01268768310546875, 0.05657958984375, -0.05047607421875, -0.047393798828125, 0.033905029296875, 0.0250396728515625, 0.0309906005859375, -0.0173797607421875, -0.0543212890625, 0.00727081298828125, 0.002040863037109375, -0.003978729248046875, 0.0137176513671875, 0.0172119140625, 0.00959014892578125, 0.0219268798828125, 0.039154052734375, 0.0017833709716796875, 0.0006237030029296875, 0.0277862548828125, 0.04620361328125, -0.033538818359375, -0.044158935546875, -0.03594970703125, 0.01885986328125, 0.0081329345703125, 0.0013904571533203125, 0.035736083984375, 0.043701171875, 0.0819091796875, -0.0016651153564453125, 0.035003662109375, 0.018157958984375, 0.057647705078125, -0.041534423828125, 0.055999755859375, -0.04541015625, 0.007785797119140625, -0.011444091796875, -0.062164306640625, -0.00919342041015625, 0.07037353515625, 0.00492095947265625, 0.0235748291015625, 0.0296630859375, 0.0562744140625, -0.003200531005859375, -0.01267242431640625, 0.045257568359375, 0.0166778564453125, 0.0090484619140625, 0.0150299072265625, 0.05853271484375, -0.05316162109375, 0.040313720703125, -0.038299560546875, -0.01468658447265625, -0.0167083740234375, -0.03472900390625, -0.0706787109375, -0.0650634765625, -0.032806396484375, -0.017181396484375, 0.006298065185546875, 0.08111572265625, 0.0943603515625, -0.07763671875, -0.02490234375, 0.0148162841796875, -0.008575439453125, -0.0181884765625, -0.01335906982421875, 0.041534423828125, -0.030120849609375, -0.040496826171875, 0.061431884765625, 0.00914764404296875, 0.0184173583984375, -0.0195159912109375, -0.0184173583984375, 0.0016613006591796875, -0.00832366943359375, 0.044403076171875, 
0.0138397216796875, -0.077880859375, -0.022308349609375, -0.02020263671875, 0.003490447998046875, 0.02069091796875, 0.048004150390625, -0.07110595703125, 0.05450439453125, 0.00971221923828125, 0.024627685546875, 0.07867431640625, -0.002460479736328125, 0.0118408203125, -0.08929443359375, 0.0009555816650390625, 0.02362060546875, 0.0277252197265625, 0.03045654296875, -0.0141448974609375, 0.0214996337890625, 0.01873779296875, -0.05224609375, -0.050201416015625, 0.0125885009765625, -0.10028076171875, -0.0260009765625, 0.07757568359375, 0.0026493072509765625, -0.007656097412109375, 0.01129150390625, -0.025360107421875, 0.0396728515625, -0.0513916015625, 0.044281005859375, 0.0380859375, -0.0127410888671875, -0.015228271484375, -0.031829833984375, 0.0447998046875, 0.03155517578125, -0.01342010498046875, -0.0057220458984375, 0.0299530029296875, 0.0290679931640625, 0.009613037109375, 0.052459716796875, 0.006999969482421875, 0.007232666015625, -0.004886627197265625, 0.016845703125, -0.0090179443359375, -0.039154052734375, -0.04644775390625, 0.0032958984375, 0.00794219970703125, -0.034515380859375 ] ]
nateraw/food
2022-05-17T17:44:24.000Z
[ "transformers", "pytorch", "tensorboard", "vit", "image-classification", "generated_from_trainer", "dataset:food101", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
image-classification
nateraw
null
null
nateraw/food
14
125,658
transformers
2022-03-02T23:29:05
--- license: apache-2.0 tags: - generated_from_trainer - image-classification - pytorch datasets: - food101 metrics: - accuracy model-index: - name: food101_outputs results: - task: name: Image Classification type: image-classification dataset: name: food-101 type: food101 args: default metrics: - name: Accuracy type: accuracy value: 0.8912871287128713 --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # nateraw/food This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the nateraw/food101 dataset. It achieves the following results on the evaluation set: - Loss: 0.4501 - Accuracy: 0.8913 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 128 - eval_batch_size: 128 - seed: 1337 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5.0 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.8271 | 1.0 | 592 | 0.6070 | 0.8562 | | 0.4376 | 2.0 | 1184 | 0.4947 | 0.8691 | | 0.2089 | 3.0 | 1776 | 0.4876 | 0.8747 | | 0.0882 | 4.0 | 2368 | 0.4639 | 0.8857 | | 0.0452 | 5.0 | 2960 | 0.4501 | 0.8913 | ### Framework versions - Transformers 4.9.0.dev0 - Pytorch 1.9.0+cu102 - Datasets 1.9.1.dev0 - Tokenizers 0.10.3
1,957
[ [ -0.02880859375, -0.04742431640625, 0.004383087158203125, 0.00214385986328125, -0.011962890625, -0.0272064208984375, -0.00749969482421875, -0.01561737060546875, 0.030029296875, 0.0285186767578125, -0.045623779296875, -0.047149658203125, -0.051910400390625, -0.0038394927978515625, -0.0166015625, 0.0867919921875, 0.0086212158203125, 0.0219879150390625, -0.0033969879150390625, -0.0036029815673828125, -0.03692626953125, -0.04052734375, -0.053619384765625, -0.047760009765625, 0.026641845703125, 0.0303955078125, 0.060638427734375, 0.0640869140625, 0.05023193359375, 0.016021728515625, -0.03173828125, -0.0015459060668945312, -0.045196533203125, -0.0241851806640625, -0.0133056640625, -0.0245361328125, -0.058013916015625, 0.0021953582763671875, 0.045135498046875, 0.0214691162109375, -0.01357269287109375, 0.048187255859375, 0.007717132568359375, 0.03192138671875, -0.035552978515625, 0.0206146240234375, -0.038116455078125, 0.0222625732421875, -0.009979248046875, -0.028656005859375, -0.020050048828125, -0.00655364990234375, 0.0028171539306640625, -0.050445556640625, 0.0313720703125, -0.00029468536376953125, 0.10504150390625, 0.0203399658203125, -0.0190582275390625, 0.0182647705078125, -0.049835205078125, 0.0361328125, -0.053680419921875, 0.006275177001953125, 0.032196044921875, 0.034881591796875, -0.004077911376953125, -0.0626220703125, -0.0308685302734375, 0.00336456298828125, -0.0002048015594482422, 0.01507568359375, -0.02020263671875, 0.006954193115234375, 0.05206298828125, 0.047515869140625, -0.04522705078125, 0.0035076141357421875, -0.042755126953125, -0.01161956787109375, 0.03607177734375, 0.030975341796875, -0.0031604766845703125, -0.0205230712890625, -0.0404052734375, -0.0196533203125, -0.025115966796875, 0.0081787109375, 0.0355224609375, 0.0198211669921875, -0.0242919921875, 0.050811767578125, -0.031097412109375, 0.059906005859375, 0.0098876953125, -0.005828857421875, 0.0479736328125, -0.01155853271484375, -0.041900634765625, -0.0024280548095703125, 0.05712890625, 
0.054931640625, 0.018157958984375, 0.0146636962890625, -0.01983642578125, -0.0006418228149414062, 0.0258026123046875, -0.067138671875, -0.03289794921875, 0.0006270408630371094, -0.0338134765625, -0.051422119140625, 0.016082763671875, -0.04119873046875, 0.00269317626953125, -0.0242156982421875, 0.04052734375, -0.02020263671875, -0.0214691162109375, 0.0163116455078125, -0.01163482666015625, 0.032501220703125, 0.0092010498046875, -0.0701904296875, 0.0374755859375, 0.03448486328125, 0.049285888671875, 0.0163726806640625, -0.0209197998046875, -0.02294921875, 0.0011606216430664062, -0.0205078125, 0.043060302734375, -0.0161285400390625, -0.0220947265625, -0.0235748291015625, 0.0293426513671875, -0.01190948486328125, -0.037139892578125, 0.05181884765625, -0.0196533203125, 0.0082550048828125, -0.034088134765625, -0.0428466796875, -0.02166748046875, 0.04296875, -0.052734375, 0.09393310546875, 0.01485443115234375, -0.0645751953125, 0.048126220703125, -0.038421630859375, 0.0020961761474609375, 0.009490966796875, -0.005664825439453125, -0.0548095703125, -0.006626129150390625, 0.008636474609375, 0.02752685546875, -0.018463134765625, 0.035308837890625, -0.027313232421875, -0.038330078125, -0.00860595703125, -0.045989990234375, 0.055755615234375, 0.006259918212890625, -0.037261962890625, 0.02734375, -0.0787353515625, 0.0216522216796875, 0.03240966796875, -0.03033447265625, 0.01129913330078125, -0.035308837890625, 0.03216552734375, 0.021697998046875, 0.00917816162109375, -0.0341796875, 0.018402099609375, -0.0240020751953125, 0.02667236328125, 0.04962158203125, 0.009429931640625, 0.0035572052001953125, -0.03558349609375, 0.03436279296875, 0.006809234619140625, 0.03204345703125, 0.0119781494140625, -0.051116943359375, -0.053131103515625, -0.00855255126953125, 0.023040771484375, 0.03057861328125, -0.0275115966796875, 0.051727294921875, -0.022674560546875, -0.054534912109375, -0.00921630859375, -0.0019092559814453125, 0.0335693359375, 0.061920166015625, 0.031646728515625, 
-0.0261383056640625, -0.035797119140625, -0.07940673828125, 0.00043129920959472656, -0.0042877197265625, 0.0176849365234375, 0.01363372802734375, 0.048858642578125, -0.01157379150390625, 0.060516357421875, -0.044525146484375, -0.00521087646484375, -0.017181396484375, -0.0116119384765625, 0.0247650146484375, 0.05487060546875, 0.057464599609375, -0.027618408203125, -0.0201416015625, -0.0073699951171875, -0.057037353515625, 0.023651123046875, 0.0024738311767578125, -0.03076171875, -0.020355224609375, 0.0203704833984375, -0.03857421875, 0.0673828125, 0.032562255859375, -0.0447998046875, 0.06060791015625, -0.0272064208984375, -0.005283355712890625, -0.09149169921875, 0.0222930908203125, 0.023223876953125, -0.0234222412109375, -0.02386474609375, 0.005359649658203125, 0.01497650146484375, -0.01702880859375, -0.034149169921875, 0.052886962890625, -0.00783538818359375, 0.009735107421875, -0.01099395751953125, -0.0394287109375, 0.012664794921875, 0.053863525390625, 0.00839996337890625, 0.0435791015625, 0.043792724609375, -0.03399658203125, 0.04034423828125, 0.029296875, -0.0283203125, 0.02996826171875, -0.06634521484375, -0.005664825439453125, 0.005504608154296875, 0.0113983154296875, -0.051849365234375, -0.0294189453125, 0.048736572265625, -0.030792236328125, 0.0200347900390625, -0.0242462158203125, -0.024688720703125, -0.0367431640625, -0.00702667236328125, 0.0227508544921875, 0.0203094482421875, -0.0416259765625, 0.025970458984375, -0.0048980712890625, 0.023773193359375, -0.03619384765625, -0.0615234375, -0.01214599609375, -0.01617431640625, -0.03656005859375, 0.01751708984375, 0.00612640380859375, 0.00896453857421875, 0.004608154296875, -0.015838623046875, -0.009490966796875, -0.007904052734375, 0.02777099609375, 0.0224609375, -0.0198822021484375, -0.0070648193359375, -0.00659942626953125, -0.0203094482421875, 0.0191497802734375, -0.0150909423828125, 0.0411376953125, -0.02081298828125, -0.020355224609375, -0.056304931640625, -0.00994110107421875, 0.036895751953125, 
-0.020294189453125, 0.053436279296875, 0.059478759765625, -0.045654296875, 0.00693511962890625, -0.033905029296875, -0.01486968994140625, -0.0323486328125, 0.04150390625, -0.036773681640625, -0.006931304931640625, 0.052581787109375, 0.00896453857421875, -0.0033111572265625, 0.0687255859375, 0.0293426513671875, 0.017547607421875, 0.06689453125, 0.01348876953125, -0.0031604766845703125, 0.035247802734375, -0.0706787109375, -0.0166473388671875, -0.06146240234375, -0.04083251953125, -0.0309295654296875, -0.035003662109375, -0.04364013671875, -0.01050567626953125, 0.01361083984375, -0.006206512451171875, -0.052581787109375, 0.031646728515625, -0.051300048828125, 0.03759765625, 0.0643310546875, 0.04473876953125, -0.00360870361328125, 0.01678466796875, -0.026092529296875, 0.00003463029861450195, -0.0679931640625, -0.032806396484375, 0.0938720703125, 0.02947998046875, 0.05828857421875, -0.0130462646484375, 0.049591064453125, -0.00443267822265625, 0.009124755859375, -0.04833984375, 0.0252838134765625, 0.007457733154296875, -0.070068359375, -0.0146484375, -0.032745361328125, -0.057830810546875, 0.0133209228515625, -0.0257568359375, -0.049652099609375, 0.0233917236328125, 0.02545166015625, -0.03009033203125, 0.0477294921875, -0.0626220703125, 0.10223388671875, -0.024261474609375, -0.03759765625, 0.0090789794921875, -0.036041259765625, 0.013702392578125, 0.0255279541015625, -0.0267333984375, -0.00225830078125, 0.0264739990234375, 0.0670166015625, -0.05474853515625, 0.0504150390625, -0.0240631103515625, 0.030609130859375, 0.02740478515625, -0.01177215576171875, 0.049774169921875, 0.030364990234375, -0.005496978759765625, 0.006198883056640625, -0.0032501220703125, -0.051513671875, -0.033966064453125, 0.05755615234375, -0.074462890625, -0.018798828125, -0.0516357421875, -0.0250091552734375, -0.0003161430358886719, 0.0205078125, 0.046295166015625, 0.05511474609375, -0.000133514404296875, 0.0208740234375, 0.04327392578125, 0.00018215179443359375, 0.0170135498046875, 
0.00893402099609375, -0.0088043212890625, -0.049591064453125, 0.070068359375, -0.0016021728515625, 0.0150604248046875, 0.0015869140625, 0.0194244384765625, -0.0232391357421875, -0.02252197265625, -0.043304443359375, 0.00827789306640625, -0.052276611328125, -0.02752685546875, -0.027252197265625, -0.037109375, -0.01593017578125, 0.00215911865234375, -0.037841796875, -0.0137786865234375, -0.046478271484375, -0.0238800048828125, 0.035736083984375, 0.040618896484375, 0.01134490966796875, 0.037017822265625, -0.040252685546875, 0.002620697021484375, 0.0179290771484375, 0.040863037109375, 0.0020599365234375, -0.062255859375, -0.0262603759765625, 0.0017108917236328125, -0.042449951171875, -0.0577392578125, 0.04522705078125, 0.0024623870849609375, 0.042724609375, 0.034210205078125, -0.025115966796875, 0.07562255859375, -0.01006317138671875, 0.0557861328125, 0.0304718017578125, -0.040252685546875, 0.038543701171875, -0.0219879150390625, 0.041778564453125, 0.044464111328125, 0.02496337890625, -0.022491455078125, 0.0017118453979492188, -0.0875244140625, -0.06341552734375, 0.0582275390625, 0.021697998046875, -0.0016632080078125, 0.0017347335815429688, 0.0303802490234375, -0.00621795654296875, 0.01117706298828125, -0.060272216796875, -0.039947509765625, -0.033538818359375, -0.01324462890625, -0.00600433349609375, -0.0221405029296875, -0.00438690185546875, -0.061248779296875, 0.0679931640625, -0.003612518310546875, 0.020477294921875, 0.0095672607421875, 0.00893402099609375, -0.014373779296875, -0.00856781005859375, 0.042083740234375, 0.059234619140625, -0.046966552734375, -0.005523681640625, 0.009490966796875, -0.046905517578125, -0.003376007080078125, 0.01512908935546875, -0.01148223876953125, 0.0128936767578125, 0.0197906494140625, 0.085205078125, 0.0233612060546875, -0.007354736328125, 0.034912109375, -0.01470947265625, -0.033111572265625, -0.025177001953125, 0.0107574462890625, -0.01128387451171875, 0.012054443359375, 0.00821685791015625, 0.0413818359375, 0.006504058837890625, 
-0.0120086669921875, 0.0106201171875, 0.0113372802734375, -0.041412353515625, -0.025482177734375, 0.06304931640625, 0.0030040740966796875, -0.01255035400390625, 0.061004638671875, -0.0193328857421875, -0.0210113525390625, 0.06817626953125, 0.03656005859375, 0.061492919921875, -0.01157379150390625, 0.00725555419921875, 0.068603515625, 0.0182037353515625, -0.0156707763671875, 0.03680419921875, 0.01557159423828125, -0.030303955078125, -0.0075531005859375, -0.05267333984375, -0.01180267333984375, 0.0447998046875, -0.0836181640625, 0.04345703125, -0.03851318359375, -0.041107177734375, 0.018218994140625, 0.0124359130859375, -0.08349609375, 0.042755126953125, 0.00634002685546875, 0.07916259765625, -0.0699462890625, 0.0576171875, 0.05712890625, -0.035125732421875, -0.080078125, -0.0204010009765625, -0.00714874267578125, -0.0667724609375, 0.038543701171875, 0.004665374755859375, 0.03436279296875, 0.01088714599609375, -0.0396728515625, -0.07183837890625, 0.08209228515625, 0.02691650390625, -0.0489501953125, -0.01137542724609375, 0.00882720947265625, 0.03692626953125, -0.00782012939453125, 0.049041748046875, 0.0196380615234375, 0.024383544921875, 0.0185699462890625, -0.0703125, -0.0017309188842773438, -0.0361328125, 0.0160064697265625, 0.005687713623046875, -0.05328369140625, 0.0648193359375, 0.0012063980102539062, 0.0281982421875, -0.001789093017578125, 0.035919189453125, 0.0139312744140625, 0.01413726806640625, 0.02392578125, 0.071533203125, 0.043212890625, -0.013153076171875, 0.07427978515625, -0.0360107421875, 0.060333251953125, 0.09173583984375, 0.01407623291015625, 0.033416748046875, 0.025421142578125, -0.01216888427734375, 0.020172119140625, 0.07305908203125, -0.0307159423828125, 0.0309295654296875, 0.0218963623046875, 0.0078125, -0.024627685546875, 0.0169525146484375, -0.0537109375, 0.027587890625, 0.00920867919921875, -0.056243896484375, -0.02606201171875, -0.01477813720703125, -0.00893402099609375, -0.02459716796875, -0.0305023193359375, 0.040313720703125, 
-0.0252532958984375, -0.0241851806640625, 0.06182861328125, 0.008453369140625, 0.0248260498046875, -0.040130615234375, -0.01143646240234375, -0.01067352294921875, 0.01800537109375, -0.0272369384765625, -0.048675537109375, 0.0165863037109375, -0.00037670135498046875, -0.01751708984375, 0.002307891845703125, 0.0416259765625, -0.01284027099609375, -0.0638427734375, -0.0018396377563476562, 0.02239990234375, 0.006641387939453125, -0.008026123046875, -0.06927490234375, 0.00328826904296875, -0.00921630859375, -0.040374755859375, 0.005237579345703125, 0.0232391357421875, 0.00334930419921875, 0.0401611328125, 0.03314208984375, 0.004558563232421875, 0.020111083984375, 0.004634857177734375, 0.084228515625, -0.052032470703125, -0.050445556640625, -0.04669189453125, 0.03350830078125, -0.0167999267578125, -0.0699462890625, 0.04278564453125, 0.07916259765625, 0.0748291015625, -0.022674560546875, 0.03863525390625, 0.0004367828369140625, 0.0117034912109375, -0.0305938720703125, 0.04400634765625, -0.03790283203125, -0.002017974853515625, -0.0231475830078125, -0.0626220703125, -0.006175994873046875, 0.045684814453125, -0.036865234375, 0.01393890380859375, 0.041656494140625, 0.061737060546875, -0.0207672119140625, 0.0002770423889160156, 0.017608642578125, -0.00720977783203125, 0.0107421875, 0.033660888671875, 0.031646728515625, -0.06622314453125, 0.034454345703125, -0.0478515625, -0.00629425048828125, -0.01526641845703125, -0.033233642578125, -0.06292724609375, -0.021484375, -0.03790283203125, -0.03857421875, 0.01085662841796875, 0.0772705078125, 0.072509765625, -0.051422119140625, -0.02001953125, -0.01386260986328125, -0.0215911865234375, -0.034637451171875, -0.019317626953125, 0.030364990234375, -0.0062713623046875, -0.047393798828125, -0.00800323486328125, -0.0182342529296875, 0.01800537109375, -0.004657745361328125, -0.01849365234375, -0.019989013671875, -0.0243377685546875, 0.01238250732421875, 0.01495361328125, -0.04010009765625, -0.0196380615234375, -0.0020008087158203125, 
-0.006160736083984375, 0.0255126953125, 0.020599365234375, -0.0439453125, 0.049285888671875, 0.02020263671875, 0.0249786376953125, 0.0635986328125, 0.005229949951171875, 0.01309967041015625, -0.057952880859375, 0.0207061767578125, 0.0162506103515625, 0.0289154052734375, 0.010955810546875, -0.02752685546875, 0.03350830078125, 0.03912353515625, -0.0379638671875, -0.05462646484375, -0.0174407958984375, -0.08282470703125, 0.018829345703125, 0.089599609375, 0.0011272430419921875, -0.04730224609375, 0.0294342041015625, -0.0135040283203125, 0.0251312255859375, -0.02496337890625, 0.03631591796875, 0.047027587890625, 0.007740020751953125, 0.002971649169921875, -0.042388916015625, 0.033111572265625, 0.0130767822265625, -0.047637939453125, -0.0219268798828125, 0.0199127197265625, 0.04595947265625, -0.003795623779296875, 0.024627685546875, -0.005268096923828125, 0.0281524658203125, 0.0203857421875, 0.02630615234375, -0.03082275390625, -0.027069091796875, -0.0216217041015625, 0.0014257431030273438, 0.004421234130859375, -0.04718017578125 ] ]
dccuchile/bert-base-spanish-wwm-uncased
2022-05-31T15:02:39.000Z
[ "transformers", "pytorch", "tf", "jax", "bert", "fill-mask", "masked-lm", "es", "arxiv:1904.09077", "arxiv:1906.01502", "arxiv:1812.10464", "arxiv:1901.07291", "arxiv:1904.02099", "arxiv:1906.01569", "arxiv:1908.11828", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
fill-mask
dccuchile
null
null
dccuchile/bert-base-spanish-wwm-uncased
42
125,603
transformers
2022-03-02T23:29:05
--- language: - es tags: - masked-lm --- # BETO: Spanish BERT BETO is a [BERT model](https://github.com/google-research/bert) trained on a [big Spanish corpus](https://github.com/josecannete/spanish-corpora). BETO is similar in size to BERT-Base and was trained with the Whole Word Masking technique. Below you will find TensorFlow and PyTorch checkpoints for the uncased and cased versions, as well as some results on Spanish benchmarks comparing BETO with [Multilingual BERT](https://github.com/google-research/bert/blob/master/multilingual.md) and with other (non-BERT-based) models. ## Download | | | | | |-|:--------:|:-----:|:----:| |BETO uncased|[tensorflow_weights](https://users.dcc.uchile.cl/~jperez/beto/uncased_2M/tensorflow_weights.tar.gz) | [pytorch_weights](https://users.dcc.uchile.cl/~jperez/beto/uncased_2M/pytorch_weights.tar.gz) | [vocab](./config/uncased_2M/vocab.txt), [config](./config/uncased_2M/config.json) | |BETO cased| [tensorflow_weights](https://users.dcc.uchile.cl/~jperez/beto/cased_2M/tensorflow_weights.tar.gz) | [pytorch_weights](https://users.dcc.uchile.cl/~jperez/beto/cased_2M/pytorch_weights.tar.gz) | [vocab](./config/cased_2M/vocab.txt), [config](./config/cased_2M/config.json) | All models use a vocabulary of about 31k BPE subwords constructed using SentencePiece and were trained for 2M steps. ## Benchmarks The following table shows some BETO results on the Spanish version of each task. We compare BETO (cased and uncased) with the best Multilingual BERT results that we found in the literature (as of October 2019). The table also shows some alternative methods for the same tasks (not necessarily BERT-based). References for all methods can be found [here](#references). 
|Task | BETO-cased | BETO-uncased | Best Multilingual BERT | Other results | |-------|--------------:|--------------:|--------------------------:|-------------------------------:| |[POS](https://lindat.mff.cuni.cz/repository/xmlui/handle/11234/1-1827) | **98.97** | 98.44 | 97.10 [2] | 98.91 [6], 96.71 [3] | |[NER-C](https://www.kaggle.com/nltkdata/conll-corpora) | [**88.43**](https://github.com/gchaperon/beto-benchmarks/blob/master/conll2002/dev_results_beto-cased_conll2002.txt) | 82.67 | 87.38 [2] | 87.18 [3] | |[MLDoc](https://github.com/facebookresearch/MLDoc) | [95.60](https://github.com/gchaperon/beto-benchmarks/blob/master/MLDoc/dev_results_beto-cased_mldoc.txt) | [**96.12**](https://github.com/gchaperon/beto-benchmarks/blob/master/MLDoc/dev_results_beto-uncased_mldoc.txt) | 95.70 [2] | 88.75 [4] | |[PAWS-X](https://github.com/google-research-datasets/paws/tree/master/pawsx) | 89.05 | 89.55 | 90.70 [8] | | |[XNLI](https://github.com/facebookresearch/XNLI) | **82.01** | 80.15 | 78.50 [2] | 80.80 [5], 77.80 [1], 73.15 [4]| ## Example of use For further details on how to use BETO you can visit the [🤗Huggingface Transformers library](https://github.com/huggingface/transformers), starting with the [Quickstart section](https://huggingface.co/transformers/quickstart.html). BETO models can be accessed simply as [`'dccuchile/bert-base-spanish-wwm-cased'`](https://huggingface.co/dccuchile/bert-base-spanish-wwm-cased) and [`'dccuchile/bert-base-spanish-wwm-uncased'`](https://huggingface.co/dccuchile/bert-base-spanish-wwm-uncased) by using the Transformers library. An example of how to download and use the models on this page can be found in [this colab notebook](https://colab.research.google.com/drive/1uRwg4UmPgYIqGYY4gW_Nsw9782GFJbPt). 
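The Whole Word Masking technique mentioned above can be illustrated in a few lines of pure Python: when any WordPiece of a word is selected for masking, all of that word's pieces are masked together. This is a simplified, hypothetical sketch (the real BERT pre-training code also caps the number of masked tokens per sequence and sometimes substitutes random or original tokens); `whole_word_mask` and the sample tokens are illustrative names, not part of BETO's released code.

```python
import random

def whole_word_mask(tokens, mask_prob=0.15, mask_token="[MASK]", seed=0):
    """Mask whole words in a WordPiece sequence.

    Continuation pieces start with "##"; if a word is chosen for masking,
    every one of its pieces is replaced, never just a fragment.
    """
    rng = random.Random(seed)
    # Group token indices that belong to the same word.
    words = []
    for i, tok in enumerate(tokens):
        if tok.startswith("##") and words:
            words[-1].append(i)
        else:
            words.append([i])
    masked = list(tokens)
    for group in words:
        if rng.random() < mask_prob:
            for i in group:
                masked[i] = mask_token
    return masked

# "perro" and "manzanas" are split into several WordPieces; they are
# always masked (or kept) as complete words.
tokens = ["el", "per", "##ro", "come", "man", "##zana", "##s"]
masked = whole_word_mask(tokens, mask_prob=0.5, seed=3)
```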
(We will soon add a more detailed step-by-step tutorial in Spanish for newcomers 😉) ## Acknowledgments We thank [Adereso](https://www.adere.so/) for kindly providing support for training BETO-uncased, and the [Millennium Institute for Foundational Research on Data](https://imfd.cl/en/) that provided support for training BETO-cased. Also thanks to Google for helping us with the [TensorFlow Research Cloud](https://www.tensorflow.org/tfrc) program. ## Citation [Spanish Pre-Trained BERT Model and Evaluation Data](https://users.dcc.uchile.cl/~jperez/papers/pml4dc2020.pdf) To cite this resource in a publication, please use the following: ``` @inproceedings{CaneteCFP2020, title={Spanish Pre-Trained BERT Model and Evaluation Data}, author={Cañete, José and Chaperon, Gabriel and Fuentes, Rodrigo and Ho, Jou-Hui and Kang, Hojin and Pérez, Jorge}, booktitle={PML4DC at ICLR 2020}, year={2020} } ``` ## License Disclaimer The license CC BY 4.0 best describes our intentions for our work. However, we are not sure that all the datasets used to train BETO have licenses compatible with CC BY 4.0 (especially for commercial use). Please use at your own discretion and verify that the licenses of the original text resources match your needs. 
## References * [1] [Original Multilingual BERT](https://github.com/google-research/bert/blob/master/multilingual.md) * [2] [Multilingual BERT on "Beto, Bentz, Becas: The Surprising Cross-Lingual Effectiveness of BERT"](https://arxiv.org/pdf/1904.09077.pdf) * [3] [Multilingual BERT on "How Multilingual is Multilingual BERT?"](https://arxiv.org/pdf/1906.01502.pdf) * [4] [LASER](https://arxiv.org/abs/1812.10464) * [5] [XLM (MLM+TLM)](https://arxiv.org/pdf/1901.07291.pdf) * [6] [UDPipe on "75 Languages, 1 Model: Parsing Universal Dependencies Universally"](https://arxiv.org/pdf/1904.02099.pdf) * [7] [Multilingual BERT on "Sequence Tagging with Contextual and Non-Contextual Subword Representations: A Multilingual Evaluation"](https://arxiv.org/pdf/1906.01569.pdf) * [8] [Multilingual BERT on "PAWS-X: A Cross-lingual Adversarial Dataset for Paraphrase Identification"](https://arxiv.org/abs/1908.11828)
5,905
[ [ -0.0306854248046875, -0.040130615234375, 0.014923095703125, 0.033966064453125, -0.014892578125, 0.006168365478515625, -0.040557861328125, -0.044769287109375, 0.0287933349609375, 0.0089263916015625, -0.0307159423828125, -0.043548583984375, -0.04180908203125, 0.004302978515625, -0.00774383544921875, 0.08740234375, -0.0095672607421875, 0.0322265625, -0.0081939697265625, -0.0021724700927734375, -0.0135498046875, -0.0430908203125, -0.046051025390625, -0.0258331298828125, 0.0311279296875, 0.0165863037109375, 0.05084228515625, 0.018341064453125, 0.0386962890625, 0.0277862548828125, -0.01447296142578125, 0.005672454833984375, -0.020294189453125, -0.01355743408203125, 0.01068115234375, -0.0272369384765625, -0.032440185546875, -0.00665283203125, 0.03753662109375, 0.042816162109375, -0.01186370849609375, -0.0022449493408203125, -0.00833892822265625, 0.044586181640625, -0.0200653076171875, 0.0258331298828125, -0.04608154296875, 0.0014600753784179688, -0.00469970703125, 0.0138702392578125, -0.0310516357421875, -0.0303802490234375, 0.03411865234375, -0.0462646484375, 0.037872314453125, -0.0003991127014160156, 0.09637451171875, 0.01027679443359375, -0.019439697265625, -0.04345703125, -0.0364990234375, 0.07061767578125, -0.05291748046875, 0.049774169921875, 0.031158447265625, 0.021209716796875, -0.0169219970703125, -0.047943115234375, -0.03656005859375, -0.0294952392578125, -0.009613037109375, 0.0296630859375, -0.013153076171875, 0.0009593963623046875, 0.006366729736328125, 0.0323486328125, -0.037353515625, 0.0012578964233398438, -0.0303802490234375, -0.0273590087890625, 0.045745849609375, -0.032440185546875, 0.017486572265625, -0.0261077880859375, -0.03753662109375, -0.044891357421875, -0.042877197265625, 0.021026611328125, 0.034698486328125, 0.0235137939453125, -0.0252685546875, 0.02899169921875, 0.01690673828125, 0.035308837890625, -0.01324462890625, -0.01116943359375, 0.046630859375, -0.0242919921875, -0.0140228271484375, -0.010040283203125, 0.07891845703125, 
0.0023593902587890625, 0.0239410400390625, -0.033294677734375, -0.0026302337646484375, -0.0221710205078125, 0.020355224609375, -0.054412841796875, -0.00859832763671875, 0.0281982421875, -0.03155517578125, -0.007122039794921875, -0.0015916824340820312, -0.05450439453125, 0.00791168212890625, -0.0128326416015625, 0.03460693359375, -0.0587158203125, -0.00402069091796875, 0.01346588134765625, -0.004169464111328125, 0.0296630859375, 0.012054443359375, -0.06182861328125, 0.0117340087890625, 0.0350341796875, 0.0616455078125, -0.0089111328125, -0.0268096923828125, -0.00002300739288330078, -0.0167388916015625, -0.0287933349609375, 0.04205322265625, -0.00026488304138183594, -0.0211029052734375, 0.0146331787109375, 0.0204620361328125, -0.0201873779296875, -0.027099609375, 0.07373046875, -0.037078857421875, 0.044281005859375, -0.03778076171875, -0.0382080078125, -0.01678466796875, 0.00946807861328125, -0.048004150390625, 0.1004638671875, 0.002918243408203125, -0.056640625, 0.026580810546875, -0.04144287109375, -0.041748046875, -0.0156402587890625, 0.00897216796875, -0.045806884765625, -0.004726409912109375, 0.0240478515625, 0.03753662109375, -0.022003173828125, 0.035736083984375, -0.020965576171875, -0.01251983642578125, 0.004581451416015625, -0.00832366943359375, 0.089111328125, 0.021392822265625, -0.030059814453125, 0.00432586669921875, -0.0504150390625, -0.004993438720703125, 0.0194549560546875, -0.03289794921875, -0.0142669677734375, 0.0036983489990234375, 0.0067138671875, 0.0162811279296875, 0.033935546875, -0.059783935546875, 0.0007925033569335938, -0.028717041015625, 0.024627685546875, 0.04217529296875, -0.032501220703125, 0.01131439208984375, -0.0302276611328125, 0.023834228515625, -0.01387786865234375, 0.027435302734375, 0.002429962158203125, -0.046661376953125, -0.0789794921875, -0.04541015625, 0.046844482421875, 0.053131103515625, -0.0712890625, 0.0390625, -0.031280517578125, -0.052978515625, -0.057220458984375, -0.0034427642822265625, 0.044952392578125, 
0.04052734375, 0.04315185546875, -0.015960693359375, -0.0487060546875, -0.07928466796875, 0.01445770263671875, -0.0124053955078125, -0.01861572265625, 0.031463623046875, 0.052947998046875, -0.01099395751953125, 0.05877685546875, -0.0214080810546875, -0.018341064453125, -0.00740814208984375, 0.003879547119140625, 0.032684326171875, 0.033447265625, 0.062347412109375, -0.052703857421875, -0.0386962890625, 0.00397491455078125, -0.056884765625, 0.01446533203125, 0.002655029296875, -0.01462554931640625, 0.0263824462890625, 0.028076171875, -0.0350341796875, -0.0006971359252929688, 0.040069580078125, -0.01226043701171875, 0.041351318359375, -0.041107177734375, 0.00545501708984375, -0.09295654296875, 0.013702392578125, 0.0119476318359375, -0.007442474365234375, -0.0445556640625, -0.0010557174682617188, 0.0080413818359375, 0.006237030029296875, -0.04754638671875, 0.043792724609375, -0.0313720703125, -0.002960205078125, 0.018829345703125, -0.012725830078125, -0.007709503173828125, 0.0546875, 0.0173797607421875, 0.0631103515625, 0.04620361328125, -0.032012939453125, 0.019500732421875, 0.025970458984375, -0.035247802734375, 0.01168060302734375, -0.07598876953125, 0.005939483642578125, -0.002002716064453125, 0.010406494140625, -0.0723876953125, -0.006069183349609375, 0.0123443603515625, -0.045196533203125, 0.040740966796875, -0.01531982421875, -0.052032470703125, -0.040130615234375, -0.0511474609375, 0.00466156005859375, 0.040252685546875, -0.050872802734375, 0.0238189697265625, 0.026519775390625, -0.0037059783935546875, -0.054718017578125, -0.050537109375, -0.003086090087890625, -0.017120361328125, -0.053680419921875, 0.0574951171875, -0.02276611328125, 0.00640869140625, -0.01045989990234375, 0.01009368896484375, -0.0124969482421875, -0.0108795166015625, -0.003204345703125, 0.03680419921875, -0.001476287841796875, 0.00829315185546875, 0.002170562744140625, 0.012939453125, -0.0067901611328125, -0.0008563995361328125, 0.0400390625, -0.0311279296875, 0.0011816024780273438, 
-0.009063720703125, 0.020721435546875, 0.04046630859375, -0.00418853759765625, 0.0621337890625, 0.063720703125, -0.0242919921875, 0.0082244873046875, -0.0435791015625, 0.0034008026123046875, -0.03289794921875, 0.0162200927734375, -0.031402587890625, -0.057220458984375, 0.0595703125, 0.018829345703125, 0.0176239013671875, 0.042144775390625, 0.0518798828125, -0.01837158203125, 0.053924560546875, 0.054473876953125, -0.01146697998046875, 0.05047607421875, -0.035980224609375, 0.0027294158935546875, -0.058502197265625, -0.0330810546875, -0.056365966796875, -0.00485992431640625, -0.066650390625, -0.035430908203125, 0.0247650146484375, 0.00946807861328125, -0.01374053955078125, 0.0548095703125, -0.03216552734375, 0.0101318359375, 0.0650634765625, 0.017120361328125, -0.002471923828125, 0.017486572265625, -0.035186767578125, -0.0087432861328125, -0.06500244140625, -0.0313720703125, 0.10009765625, 0.03863525390625, 0.03509521484375, 0.007236480712890625, 0.053436279296875, 0.01702880859375, 0.0162200927734375, -0.061798095703125, 0.028717041015625, -0.0241546630859375, -0.056793212890625, -0.01715087890625, -0.0306549072265625, -0.08782958984375, 0.0313720703125, -0.0107574462890625, -0.05328369140625, 0.031890869140625, -0.003955841064453125, -0.0118560791015625, 0.0148468017578125, -0.075927734375, 0.06927490234375, -0.031402587890625, -0.017242431640625, 0.005748748779296875, -0.047760009765625, 0.007686614990234375, 0.003650665283203125, 0.0196533203125, 0.003475189208984375, 0.00412750244140625, 0.0714111328125, -0.043487548828125, 0.0538330078125, -0.005405426025390625, -0.013427734375, 0.025665283203125, -0.02008056640625, 0.030517578125, -0.00786590576171875, -0.01189422607421875, 0.05328369140625, 0.01134490966796875, -0.0369873046875, -0.0098724365234375, 0.0443115234375, -0.071044921875, -0.01012420654296875, -0.040985107421875, -0.048309326171875, -0.022735595703125, 0.0167083740234375, 0.0322265625, 0.0099639892578125, -0.01493072509765625, 0.0179290771484375, 
0.0555419921875, -0.038055419921875, 0.047027587890625, 0.041595458984375, 0.0081939697265625, -0.042694091796875, 0.0665283203125, 0.0032596588134765625, 0.0004878044128417969, 0.0411376953125, 0.0129547119140625, -0.036895751953125, -0.037750244140625, -0.040985107421875, 0.033782958984375, -0.032867431640625, -0.0119781494140625, -0.04180908203125, -0.00426483154296875, -0.0418701171875, -0.00487518310546875, -0.039459228515625, -0.0261077880859375, -0.01082611083984375, 0.0014162063598632812, 0.02947998046875, 0.0174102783203125, -0.0135498046875, 0.0208740234375, -0.03424072265625, 0.0167999267578125, 0.0184326171875, 0.0263824462890625, -0.00493621826171875, -0.042449951171875, -0.0270538330078125, 0.011688232421875, -0.01641845703125, -0.059326171875, 0.03289794921875, 0.0118560791015625, 0.05010986328125, 0.0157012939453125, -0.013397216796875, 0.04364013671875, -0.0491943359375, 0.049652099609375, 0.0198516845703125, -0.06292724609375, 0.038604736328125, -0.0273284912109375, -0.0022258758544921875, 0.044708251953125, 0.060760498046875, -0.035736083984375, -0.01145172119140625, -0.05419921875, -0.07470703125, 0.06585693359375, 0.0237884521484375, 0.007328033447265625, -0.0042877197265625, 0.006984710693359375, 0.00882720947265625, 0.0236663818359375, -0.07342529296875, -0.032073974609375, -0.018585205078125, -0.02154541015625, -0.0050048828125, -0.026519775390625, -0.01230621337890625, -0.034759521484375, 0.0679931640625, -0.001766204833984375, 0.05316162109375, 0.03094482421875, -0.0134735107421875, 0.013580322265625, 0.011871337890625, 0.0423583984375, 0.035888671875, -0.049072265625, -0.01067352294921875, 0.01238250732421875, -0.041778564453125, -0.0194091796875, 0.036956787109375, -0.0186767578125, 0.0267333984375, 0.044158935546875, 0.06573486328125, 0.0168914794921875, -0.049041748046875, 0.035736083984375, -0.01189422607421875, -0.0272369384765625, -0.0211029052734375, -0.019134521484375, 0.002582550048828125, 0.01221466064453125, 0.0280914306640625, 
-0.0179901123046875, -0.0011358261108398438, -0.04437255859375, 0.006256103515625, 0.032745361328125, -0.027679443359375, -0.021240234375, 0.042694091796875, 0.00960540771484375, -0.0036907196044921875, 0.033599853515625, -0.026824951171875, -0.048583984375, 0.056304931640625, 0.03338623046875, 0.060302734375, -0.00849151611328125, 0.029998779296875, 0.046112060546875, 0.03857421875, -0.00295257568359375, 0.0291290283203125, -0.0108795166015625, -0.06329345703125, -0.03192138671875, -0.05706787109375, -0.032196044921875, 0.0153045654296875, -0.040557861328125, 0.0171966552734375, -0.01270294189453125, -0.0064697265625, 0.01139068603515625, 0.0222625732421875, -0.056884765625, 0.01444244384765625, 0.00376129150390625, 0.060760498046875, -0.058258056640625, 0.073974609375, 0.060882568359375, -0.04144287109375, -0.050079345703125, -0.0168609619140625, -0.0273590087890625, -0.07293701171875, 0.03216552734375, -0.007320404052734375, 0.01244354248046875, -0.0230255126953125, -0.02056884765625, -0.0699462890625, 0.07440185546875, 0.035186767578125, -0.034698486328125, 0.003566741943359375, 0.013885498046875, 0.0692138671875, -0.005970001220703125, 0.045440673828125, 0.026458740234375, 0.0292510986328125, 0.017059326171875, -0.08123779296875, -0.004253387451171875, -0.02557373046875, 0.00930023193359375, 0.00013875961303710938, -0.06512451171875, 0.062469482421875, -0.01375579833984375, 0.004055023193359375, -0.00923919677734375, 0.042205810546875, 0.0143585205078125, 0.0096435546875, 0.02862548828125, 0.051605224609375, 0.055877685546875, -0.0233154296875, 0.085205078125, -0.0278778076171875, 0.05181884765625, 0.07061767578125, 0.006877899169921875, 0.04815673828125, 0.03155517578125, -0.044952392578125, 0.036468505859375, 0.06927490234375, -0.01003265380859375, 0.040802001953125, 0.00286102294921875, -0.003513336181640625, -0.0018281936645507812, -0.0085906982421875, -0.041412353515625, 0.0304107666015625, 0.0037403106689453125, -0.02313232421875, -0.013427734375, 
0.00008726119995117188, 0.0355224609375, -0.0189971923828125, -0.00202178955078125, 0.03375244140625, -0.0027103424072265625, -0.05419921875, 0.06646728515625, 0.005596160888671875, 0.07275390625, -0.058441162109375, 0.017303466796875, -0.02850341796875, 0.0051422119140625, -0.005321502685546875, -0.050933837890625, 0.01512908935546875, 0.0121917724609375, -0.0220794677734375, -0.033966064453125, 0.03955078125, -0.03607177734375, -0.05426025390625, 0.043243408203125, 0.03729248046875, 0.0262603759765625, 0.0108795166015625, -0.0772705078125, -0.0014200210571289062, 0.01352691650390625, -0.027252197265625, 0.0156707763671875, 0.0174407958984375, -0.0067138671875, 0.0450439453125, 0.05596923828125, 0.0076751708984375, 0.01751708984375, 0.0188140869140625, 0.045440673828125, -0.031585693359375, -0.01751708984375, -0.04034423828125, 0.025299072265625, -0.004329681396484375, -0.0299530029296875, 0.059722900390625, 0.055755615234375, 0.091796875, -0.02008056640625, 0.035858154296875, -0.0312347412109375, 0.03656005859375, -0.0283203125, 0.049072265625, -0.055572509765625, -0.00577545166015625, -0.034912109375, -0.056243896484375, -0.0182952880859375, 0.048126220703125, -0.0272674560546875, 0.01287078857421875, 0.048004150390625, 0.0472412109375, 0.0037631988525390625, -0.0206298828125, 0.002689361572265625, 0.01548004150390625, 0.021392822265625, 0.046722412109375, 0.0300140380859375, -0.05877685546875, 0.0552978515625, -0.04644775390625, -0.015167236328125, -0.0118560791015625, -0.060455322265625, -0.06744384765625, -0.052734375, -0.02362060546875, -0.0162200927734375, 0.0032253265380859375, 0.05523681640625, 0.0618896484375, -0.08245849609375, -0.034912109375, -0.002941131591796875, 0.019744873046875, -0.014617919921875, -0.0146636962890625, 0.0526123046875, -0.029052734375, -0.08294677734375, 0.0244903564453125, -0.01102447509765625, 0.004878997802734375, 0.001087188720703125, -0.01157379150390625, -0.038665771484375, -0.005558013916015625, 0.052978515625, 
0.0301513671875, -0.042938232421875, -0.0139923095703125, 0.0196990966796875, -0.001186370849609375, 0.013916015625, 0.03302001953125, -0.039215087890625, 0.042144775390625, 0.0382080078125, 0.035888671875, 0.056427001953125, -0.0267486572265625, 0.017303466796875, -0.057281494140625, 0.023223876953125, 0.01568603515625, 0.048614501953125, 0.027740478515625, -0.011566162109375, 0.055938720703125, 0.01534271240234375, -0.0289154052734375, -0.057098388671875, -0.016845703125, -0.08819580078125, -0.02490234375, 0.07391357421875, -0.017486572265625, -0.0182037353515625, 0.0171356201171875, -0.01200103759765625, 0.037200927734375, -0.0330810546875, 0.0667724609375, 0.07000732421875, -0.01031494140625, 0.0095367431640625, -0.049774169921875, 0.03302001953125, 0.053924560546875, -0.0611572265625, -0.0295257568359375, 0.018798828125, 0.0219573974609375, 0.026458740234375, 0.03570556640625, -0.01537322998046875, 0.0040435791015625, -0.0176544189453125, 0.041717529296875, 0.00251007080078125, -0.02313232421875, -0.01084136962890625, -0.011627197265625, -0.0264129638671875, -0.03082275390625 ] ]
lmsys/vicuna-7b-v1.5
2023-08-02T18:54:45.000Z
[ "transformers", "pytorch", "llama", "text-generation", "arxiv:2307.09288", "arxiv:2306.05685", "license:llama2", "has_space", "text-generation-inference", "region:us" ]
text-generation
lmsys
null
null
lmsys/vicuna-7b-v1.5
97
124,880
transformers
2023-07-29T04:42:33
--- inference: false license: llama2 --- # Vicuna Model Card ## Model Details Vicuna is a chat assistant trained by fine-tuning Llama 2 on user-shared conversations collected from ShareGPT. - **Developed by:** [LMSYS](https://lmsys.org/) - **Model type:** An auto-regressive language model based on the transformer architecture - **License:** Llama 2 Community License Agreement - **Finetuned from model:** [Llama 2](https://arxiv.org/abs/2307.09288) ### Model Sources - **Repository:** https://github.com/lm-sys/FastChat - **Blog:** https://lmsys.org/blog/2023-03-30-vicuna/ - **Paper:** https://arxiv.org/abs/2306.05685 - **Demo:** https://chat.lmsys.org/ ## Uses The primary use of Vicuna is research on large language models and chatbots. The primary intended users of the model are researchers and hobbyists in natural language processing, machine learning, and artificial intelligence. ## How to Get Started with the Model - Command line interface: https://github.com/lm-sys/FastChat#vicuna-weights - APIs (OpenAI API, Huggingface API): https://github.com/lm-sys/FastChat/tree/main#api ## Training Details Vicuna v1.5 is fine-tuned from Llama 2 with supervised instruction fine-tuning. The training data is around 125K conversations collected from ShareGPT.com. See more details in the "Training Details of Vicuna Models" section in the appendix of this [paper](https://arxiv.org/pdf/2306.05685.pdf). ## Evaluation ![Evaluation Results](https://github.com/lm-sys/lm-sys.github.io/blob/main/public/images/webdata/vicuna_v1.5_eval.png?raw=true) Vicuna is evaluated with standard benchmarks, human preference, and LLM-as-a-judge. See more details in this [paper](https://arxiv.org/pdf/2306.05685.pdf) and [leaderboard](https://huggingface.co/spaces/lmsys/chatbot-arena-leaderboard). ## Difference between different versions of Vicuna See [vicuna_weights_version.md](https://github.com/lm-sys/FastChat/blob/main/docs/vicuna_weights_version.md)
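The FastChat CLI and APIs linked in the card are the reference entry points; as an illustrative aside (not part of the official card), the checkpoint can also be loaded directly with Hugging Face `transformers`. The sketch below makes stated assumptions: `build_vicuna_prompt` is a hypothetical helper, the exact system-prompt wording follows FastChat's single-turn Vicuna conversation template and should be verified against that repository, and the commented-out generation path needs `torch`, `accelerate`, and roughly 14 GB of downloaded weights.

```python
# Illustrative sketch, not official Vicuna usage: format a single-turn prompt
# in the Vicuna v1.x conversation style and (optionally) generate with it.
# The template wording is an assumption taken from FastChat; verify before use.

SYSTEM_PROMPT = (
    "A chat between a curious user and an artificial intelligence assistant. "
    "The assistant gives helpful, detailed, and polite answers to the user's "
    "questions."
)

def build_vicuna_prompt(user_message: str) -> str:
    """Hypothetical helper: wrap one user turn in the Vicuna prompt format."""
    return f"{SYSTEM_PROMPT} USER: {user_message} ASSISTANT:"

if __name__ == "__main__":
    prompt = build_vicuna_prompt("What is the capital of France?")
    print(prompt)
    # Actual generation (heavy: requires torch, accelerate, ~14 GB of weights):
    # from transformers import AutoModelForCausalLM, AutoTokenizer
    # tok = AutoTokenizer.from_pretrained("lmsys/vicuna-7b-v1.5")
    # model = AutoModelForCausalLM.from_pretrained(
    #     "lmsys/vicuna-7b-v1.5", device_map="auto"
    # )
    # ids = tok(prompt, return_tensors="pt").to(model.device)
    # out = model.generate(**ids, max_new_tokens=64)
    # print(tok.decode(out[0][ids["input_ids"].shape[1]:],
    #                  skip_special_tokens=True))
```

Keeping the generation call commented out lets the prompt-formatting step be inspected without pulling the full model.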
1,965
[ [ -0.0217132568359375, -0.0667724609375, 0.0311431884765625, 0.0240936279296875, -0.04437255859375, -0.0138702392578125, -0.006328582763671875, -0.046417236328125, 0.0200958251953125, 0.025787353515625, -0.0460205078125, -0.046600341796875, -0.0374755859375, -0.004116058349609375, -0.01079559326171875, 0.056671142578125, 0.01338958740234375, 0.0089569091796875, -0.00530242919921875, -0.016815185546875, -0.06939697265625, -0.040618896484375, -0.073486328125, -0.018829345703125, 0.042449951171875, 0.035308837890625, 0.0450439453125, 0.0458984375, 0.029876708984375, 0.024078369140625, -0.002918243408203125, 0.02978515625, -0.038665771484375, 0.004802703857421875, 0.01947021484375, -0.0704345703125, -0.053985595703125, -0.0217132568359375, 0.035064697265625, -0.005626678466796875, -0.012359619140625, 0.0173187255859375, 0.0018100738525390625, 0.03204345703125, -0.0249786376953125, 0.021697998046875, -0.042266845703125, -0.01383209228515625, -0.0216064453125, -0.042999267578125, -0.01568603515625, -0.0269927978515625, -0.015655517578125, -0.035064697265625, 0.0025539398193359375, -0.005641937255859375, 0.08441162109375, 0.043304443359375, -0.024627685546875, -0.01134490966796875, -0.05572509765625, 0.03271484375, -0.063720703125, 0.0267181396484375, 0.03179931640625, 0.046417236328125, -0.0208892822265625, -0.040985107421875, -0.0457763671875, -0.017425537109375, 0.00469970703125, 0.00977325439453125, -0.0201568603515625, 0.0104827880859375, 0.00885009765625, 0.035430908203125, -0.034027099609375, 0.031951904296875, -0.04119873046875, 0.0088348388671875, 0.04095458984375, 0.03131103515625, 0.01221466064453125, -0.016265869140625, -0.0301055908203125, -0.0245361328125, -0.02557373046875, -0.0005774497985839844, 0.03070068359375, 0.031097412109375, -0.03302001953125, 0.03955078125, -0.014251708984375, 0.03875732421875, -0.00948333740234375, -0.0150604248046875, 0.0242462158203125, -0.0037899017333984375, -0.03765869140625, -0.02178955078125, 0.0869140625, 
0.034393310546875, -0.0043487548828125, 0.01062774658203125, 0.0017852783203125, 0.00443267822265625, 0.015960693359375, -0.05877685546875, 0.005329132080078125, 0.048370361328125, -0.0224761962890625, -0.038055419921875, -0.004314422607421875, -0.0343017578125, -0.0310211181640625, -0.0174713134765625, 0.02703857421875, -0.028564453125, -0.0281982421875, 0.011016845703125, 0.000751495361328125, 0.03240966796875, 0.0265350341796875, -0.05255126953125, 0.025665283203125, 0.052642822265625, 0.07940673828125, -0.00765228271484375, -0.0293121337890625, -0.00921630859375, -0.0213623046875, -0.0227813720703125, 0.0718994140625, -0.005283355712890625, -0.025543212890625, -0.003772735595703125, 0.0106201171875, -0.0012769699096679688, -0.041839599609375, 0.04638671875, -0.0204315185546875, 0.020050048828125, -0.012359619140625, -0.037811279296875, -0.002285003662109375, 0.0186767578125, -0.048095703125, 0.09149169921875, 0.00315093994140625, -0.052978515625, 0.01357269287109375, -0.053924560546875, 0.00823974609375, 0.00823974609375, -0.0054931640625, -0.035003662109375, -0.006183624267578125, -0.0029582977294921875, 0.039337158203125, -0.044647216796875, 0.0408935546875, -0.018951416015625, -0.037109375, 0.021392822265625, -0.043853759765625, 0.07550048828125, 0.0217437744140625, -0.0283966064453125, 0.0355224609375, -0.058685302734375, -0.012298583984375, 0.021026611328125, -0.01491546630859375, -0.020233154296875, -0.0181121826171875, 0.0024852752685546875, 0.006496429443359375, 0.0308837890625, -0.0173187255859375, 0.0253143310546875, -0.00112152099609375, 0.01428985595703125, 0.05029296875, 0.004150390625, 0.00998687744140625, -0.033233642578125, 0.0313720703125, -0.0013256072998046875, 0.059661865234375, 0.0101470947265625, -0.039093017578125, -0.083984375, -0.032928466796875, 0.0018262863159179688, 0.050689697265625, -0.046173095703125, 0.046722412109375, -0.0226287841796875, -0.08197021484375, -0.06976318359375, 0.01306915283203125, 0.03204345703125, 
0.007541656494140625, 0.022674560546875, -0.03375244140625, -0.047454833984375, -0.07659912109375, -0.00565338134765625, -0.030029296875, -0.0030956268310546875, 0.032135009765625, 0.0174407958984375, -0.04254150390625, 0.06256103515625, -0.0308837890625, -0.029052734375, -0.004665374755859375, -0.00592803955078125, 0.004680633544921875, 0.031463623046875, 0.050262451171875, -0.046295166015625, -0.022552490234375, -0.00586700439453125, -0.06463623046875, -0.003143310546875, -0.006435394287109375, -0.036346435546875, 0.0173492431640625, 0.028778076171875, -0.04742431640625, 0.039825439453125, 0.05438232421875, -0.038818359375, 0.03411865234375, -0.020172119140625, 0.00724029541015625, -0.1021728515625, 0.0112457275390625, 0.0005903244018554688, -0.0287322998046875, -0.0439453125, 0.005199432373046875, -0.0084075927734375, 0.0186614990234375, -0.048095703125, 0.06524658203125, -0.0262908935546875, 0.004058837890625, -0.036376953125, -0.0158843994140625, -0.0039215087890625, 0.05792236328125, 0.00746917724609375, 0.054443359375, 0.0309906005859375, -0.0615234375, 0.03680419921875, 0.0159454345703125, -0.0120086669921875, 0.027557373046875, -0.06781005859375, 0.022552490234375, 0.007602691650390625, 0.01290130615234375, -0.0699462890625, -0.007389068603515625, 0.048095703125, -0.036346435546875, 0.009033203125, -0.0029659271240234375, -0.044281005859375, -0.01409912109375, -0.010894775390625, 0.01160430908203125, 0.034393310546875, -0.044891357421875, 0.0293426513671875, 0.03314208984375, 0.0165252685546875, -0.037506103515625, -0.044647216796875, -0.0029430389404296875, -0.032318115234375, -0.01220703125, 0.0010271072387695312, -0.0247344970703125, -0.0173492431640625, -0.0103759765625, 0.01206207275390625, -0.00970458984375, 0.009246826171875, 0.035552978515625, 0.018035888671875, -0.00677490234375, 0.01032257080078125, -0.0042572021484375, -0.00644683837890625, -0.00966644287109375, 0.0019521713256835938, 0.0743408203125, -0.035797119140625, -0.00243377685546875, 
-0.0679931640625, -0.00384521484375, 0.047943115234375, 0.005218505859375, 0.08685302734375, 0.051727294921875, -0.0167083740234375, 0.01279449462890625, -0.056365966796875, -0.01416778564453125, -0.03521728515625, 0.0203704833984375, -0.0272674560546875, -0.052215576171875, 0.051605224609375, 0.0188751220703125, 0.02630615234375, 0.0362548828125, 0.059173583984375, 0.008270263671875, 0.034454345703125, 0.0626220703125, -0.00383758544921875, 0.06732177734375, -0.0271453857421875, -0.0119171142578125, -0.05712890625, -0.031585693359375, -0.04388427734375, -0.00726318359375, -0.054962158203125, -0.05029296875, -0.0034027099609375, -0.0005450248718261719, -0.025390625, 0.053314208984375, -0.044830322265625, 0.01419830322265625, 0.045654296875, 0.01959228515625, 0.0204315185546875, -0.0102386474609375, 0.018951416015625, 0.01090240478515625, -0.052459716796875, -0.0526123046875, 0.07818603515625, 0.050506591796875, 0.03753662109375, 0.013671875, 0.0531005859375, 0.01959228515625, 0.034393310546875, -0.06805419921875, 0.04095458984375, 0.017852783203125, -0.055145263671875, -0.031646728515625, -0.041748046875, -0.08148193359375, 0.02655029296875, -0.015899658203125, -0.049560546875, 0.0261077880859375, 0.0111541748046875, -0.0145111083984375, 0.02252197265625, -0.05474853515625, 0.0682373046875, -0.0296783447265625, -0.0287322998046875, -0.0024261474609375, -0.0294952392578125, 0.041046142578125, 0.007289886474609375, 0.005092620849609375, -0.0148468017578125, -0.006011962890625, 0.0576171875, -0.042236328125, 0.08123779296875, -0.0118865966796875, -0.0181427001953125, 0.021026611328125, -0.000751495361328125, 0.01328277587890625, 0.005279541015625, 0.005725860595703125, 0.033782958984375, 0.00991058349609375, -0.037445068359375, -0.04608154296875, 0.047088623046875, -0.08685302734375, -0.035552978515625, -0.0286407470703125, -0.022735595703125, 0.0009226799011230469, 0.0081024169921875, 0.0293426513671875, 0.0193023681640625, -0.020538330078125, 0.01154327392578125, 
0.039703369140625, -0.02313232421875, 0.0046234130859375, 0.031463623046875, -0.0223846435546875, -0.0330810546875, 0.04791259765625, -0.00609588623046875, 0.01265716552734375, 0.033203125, 0.0122222900390625, -0.0171661376953125, -0.0110015869140625, -0.0170135498046875, 0.0318603515625, -0.042022705078125, -0.0185546875, -0.051971435546875, -0.022247314453125, -0.0226898193359375, 0.0308837890625, -0.0615234375, -0.0189361572265625, -0.032379150390625, -0.0074462890625, 0.0474853515625, 0.034881591796875, 0.01534271240234375, 0.054443359375, -0.039459228515625, 0.0217132568359375, 0.022430419921875, 0.0269927978515625, 0.0007009506225585938, -0.0506591796875, -0.0191802978515625, 0.005786895751953125, -0.0198516845703125, -0.06646728515625, 0.03985595703125, -0.012420654296875, 0.042633056640625, 0.036346435546875, -0.0017547607421875, 0.067138671875, -0.010101318359375, 0.046417236328125, 0.0106964111328125, -0.040283203125, 0.039886474609375, -0.01849365234375, 0.016204833984375, 0.0499267578125, 0.022857666015625, -0.045928955078125, -0.0224151611328125, -0.0673828125, -0.053985595703125, 0.037872314453125, 0.01934814453125, 0.0199127197265625, -0.004444122314453125, 0.036834716796875, 0.012481689453125, 0.01409912109375, -0.056427001953125, -0.045074462890625, -0.00946807861328125, -0.01251220703125, -0.01395416259765625, -0.032806396484375, -0.003528594970703125, -0.0237884521484375, 0.054046630859375, -0.002071380615234375, 0.041107177734375, 0.00876617431640625, 0.006198883056640625, -0.0000597834587097168, 0.011505126953125, 0.052276611328125, 0.027252197265625, -0.03179931640625, -0.021270751953125, 0.007190704345703125, -0.034027099609375, -0.0032196044921875, 0.0009140968322753906, 0.002170562744140625, 0.01166534423828125, 0.022003173828125, 0.10894775390625, 0.0100250244140625, -0.033782958984375, 0.022613525390625, -0.05206298828125, -0.015777587890625, -0.042510986328125, 0.022918701171875, 0.01160430908203125, 0.0361328125, 0.00948333740234375, 
-0.006336212158203125, 0.001800537109375, -0.052734375, -0.023345947265625, 0.0218353271484375, -0.0306396484375, -0.0161590576171875, 0.0491943359375, 0.01238250732421875, -0.04925537109375, 0.029449462890625, 0.0084991455078125, -0.0205535888671875, 0.036773681640625, 0.018829345703125, 0.069091796875, -0.0204315185546875, 0.012481689453125, 0.041778564453125, 0.021942138671875, -0.01076507568359375, 0.015350341796875, -0.0134429931640625, -0.048095703125, 0.00971221923828125, -0.0423583984375, -0.044036865234375, 0.02935791015625, -0.05474853515625, 0.03759765625, -0.0304412841796875, -0.035797119140625, -0.027923583984375, 0.032135009765625, -0.07257080078125, 0.0004565715789794922, -0.002902984619140625, 0.06927490234375, -0.06658935546875, 0.07470703125, 0.0343017578125, -0.035186767578125, -0.06854248046875, -0.0218353271484375, -0.006465911865234375, -0.06341552734375, 0.014862060546875, 0.002269744873046875, -0.0014276504516601562, -0.01100921630859375, -0.044891357421875, -0.047454833984375, 0.10784912109375, 0.0293426513671875, -0.058685302734375, -0.0032482147216796875, -0.00002777576446533203, 0.054168701171875, -0.0129852294921875, 0.042755126953125, 0.043212890625, 0.01479339599609375, 0.01375579833984375, -0.08642578125, -0.00039005279541015625, -0.0382080078125, 0.00494384765625, -0.0189666748046875, -0.08526611328125, 0.0572509765625, 0.00592041015625, -0.004978179931640625, 0.018310546875, 0.0614013671875, 0.048309326171875, 0.0156402587890625, 0.032745361328125, 0.020538330078125, 0.07745361328125, 0.006900787353515625, 0.0848388671875, -0.010833740234375, 0.01149749755859375, 0.08294677734375, 0.01396942138671875, 0.0718994140625, 0.03656005859375, 0.004169464111328125, 0.03741455078125, 0.059661865234375, 0.00861358642578125, 0.021453857421875, -0.00516510009765625, 0.005069732666015625, -0.007434844970703125, 0.003814697265625, -0.0369873046875, 0.038543701171875, 0.0218048095703125, -0.0175323486328125, 0.01629638671875, 
-0.01091766357421875, 0.02099609375, -0.016265869140625, -0.00136566162109375, 0.0635986328125, 0.0154266357421875, -0.04937744140625, 0.0677490234375, 0.00484466552734375, 0.068359375, -0.049835205078125, 0.004322052001953125, -0.044097900390625, 0.023468017578125, -0.003932952880859375, -0.020263671875, 0.008270263671875, 0.01380157470703125, 0.01071929931640625, 0.01461029052734375, 0.032928466796875, -0.0215911865234375, -0.0265045166015625, 0.0283050537109375, 0.036285400390625, 0.04339599609375, 0.00785064697265625, -0.058319091796875, 0.036865234375, -0.00933837890625, -0.037811279296875, 0.01544189453125, 0.0299835205078125, -0.01512908935546875, 0.0704345703125, 0.045379638671875, 0.009735107421875, -0.00171661376953125, 0.0186309814453125, 0.0634765625, -0.038665771484375, -0.0374755859375, -0.066162109375, 0.0281524658203125, -0.006664276123046875, -0.041900634765625, 0.057525634765625, 0.047454833984375, 0.042999267578125, 0.00665283203125, 0.043212890625, 0.005962371826171875, 0.022125244140625, -0.037872314453125, 0.049560546875, -0.053375244140625, 0.0251617431640625, -0.0169830322265625, -0.07177734375, -0.0184326171875, 0.04833984375, -0.0170135498046875, 0.0018711090087890625, 0.037017822265625, 0.06011962890625, 0.0033855438232421875, -0.018524169921875, 0.03350830078125, 0.013763427734375, 0.040618896484375, 0.034942626953125, 0.049041748046875, -0.05426025390625, 0.037933349609375, -0.01525115966796875, -0.019378662109375, -0.038909912109375, -0.04254150390625, -0.09130859375, -0.04827880859375, -0.0152740478515625, -0.0274505615234375, 0.0166015625, 0.073974609375, 0.04742431640625, -0.025421142578125, -0.045501708984375, -0.0011968612670898438, -0.01020050048828125, -0.01483917236328125, -0.0150909423828125, 0.02520751953125, -0.0010433197021484375, -0.0643310546875, 0.00859832763671875, -0.01351165771484375, 0.0152740478515625, -0.02752685546875, -0.0281829833984375, -0.009674072265625, 0.01050567626953125, 0.023345947265625, 0.041259765625, 
-0.047149658203125, -0.0022373199462890625, 0.0027675628662109375, -0.0335693359375, 0.0171356201171875, 0.0249176025390625, -0.04571533203125, 0.01042938232421875, 0.0226593017578125, 0.0106048583984375, 0.048309326171875, 0.0014162063598632812, 0.0273590087890625, -0.039306640625, 0.041473388671875, -0.0018987655639648438, 0.0247955322265625, 0.03070068359375, -0.0312347412109375, 0.03485107421875, 0.0005083084106445312, -0.0286712646484375, -0.07159423828125, -0.00878143310546875, -0.07958984375, -0.0148162841796875, 0.10333251953125, 0.01313018798828125, -0.04888916015625, 0.005584716796875, -0.042144775390625, 0.0496826171875, -0.0226898193359375, 0.057464599609375, 0.02880859375, 0.015228271484375, -0.037750244140625, -0.05352783203125, 0.03656005859375, 0.00614166259765625, -0.07391357421875, 0.002826690673828125, 0.0195770263671875, 0.0341796875, 0.001064300537109375, 0.0880126953125, -0.0050048828125, 0.010223388671875, 0.003284454345703125, 0.037261962890625, -0.0302734375, -0.032470703125, -0.0185546875, -0.024169921875, 0.0187835693359375, -0.0335693359375 ] ]
roberta-large-mnli
2023-04-06T13:40:16.000Z
[ "transformers", "pytorch", "tf", "jax", "safetensors", "roberta", "text-classification", "autogenerated-modelcard", "en", "dataset:multi_nli", "dataset:wikipedia", "dataset:bookcorpus", "arxiv:1907.11692", "arxiv:1806.02847", "arxiv:1804.07461", "arxiv:1704.05426", "arxiv:1508.05326", "arxiv:1809.05053", "arxiv:1910.09700", "license:mit", "endpoints_compatible", "has_space", "region:us" ]
text-classification
null
null
null
roberta-large-mnli
93
123,080
transformers
2022-03-02T23:29:04
--- language: - en license: mit tags: - autogenerated-modelcard datasets: - multi_nli - wikipedia - bookcorpus --- # roberta-large-mnli ## Table of Contents - [Model Details](#model-details) - [How To Get Started With the Model](#how-to-get-started-with-the-model) - [Uses](#uses) - [Risks, Limitations and Biases](#risks-limitations-and-biases) - [Training](#training) - [Evaluation](#evaluation-results) - [Environmental Impact](#environmental-impact) - [Technical Specifications](#technical-specifications) - [Citation Information](#citation-information) - [Model Card Authors](#model-card-author) ## Model Details **Model Description:** roberta-large-mnli is the [RoBERTa large model](https://huggingface.co/roberta-large) fine-tuned on the [Multi-Genre Natural Language Inference (MNLI)](https://huggingface.co/datasets/multi_nli) corpus. The underlying model was pretrained on English-language text using a masked language modeling (MLM) objective. - **Developed by:** See [GitHub Repo](https://github.com/facebookresearch/fairseq/tree/main/examples/roberta) for model developers - **Model Type:** Transformer-based language model - **Language(s):** English - **License:** MIT - **Parent Model:** This model is a fine-tuned version of the RoBERTa large model. Users should see the [RoBERTa large model card](https://huggingface.co/roberta-large) for relevant information. - **Resources for more information:** - [Research Paper](https://arxiv.org/abs/1907.11692) - [GitHub Repo](https://github.com/facebookresearch/fairseq/tree/main/examples/roberta) ## How to Get Started with the Model Use the code below to get started with the model. The model can be loaded with the zero-shot-classification pipeline like so: ```python from transformers import pipeline classifier = pipeline('zero-shot-classification', model='roberta-large-mnli') ``` You can then use this pipeline to classify sequences into any of the class names you specify. 
For example: ```python sequence_to_classify = "one day I will see the world" candidate_labels = ['travel', 'cooking', 'dancing'] classifier(sequence_to_classify, candidate_labels) ``` ## Uses #### Direct Use This fine-tuned model can be used for zero-shot classification tasks, including zero-shot sentence-pair classification (see the [GitHub repo](https://github.com/facebookresearch/fairseq/tree/main/examples/roberta) for examples) and zero-shot sequence classification. #### Misuse and Out-of-scope Use The model should not be used to intentionally create hostile or alienating environments for people. In addition, the model was not trained to produce factual or true representations of people or events, and therefore using the model to generate such content is out of scope for this model. ## Risks, Limitations and Biases **CONTENT WARNING: Readers should be aware this section contains content that is disturbing, offensive, and can propagate historical and current stereotypes.** Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)). The [RoBERTa large model card](https://huggingface.co/roberta-large) notes that: "The training data used for this model contains a lot of unfiltered content from the internet, which is far from neutral." Predictions generated by the model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups. For example: ```python sequence_to_classify = "The CEO had a strong handshake." candidate_labels = ['male', 'female'] hypothesis_template = "This text speaks about a {} profession." 
classifier(sequence_to_classify, candidate_labels, hypothesis_template=hypothesis_template) ``` Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. ## Training #### Training Data This model was fine-tuned on the [Multi-Genre Natural Language Inference (MNLI)](https://cims.nyu.edu/~sbowman/multinli/) corpus. Also see the [MNLI data card](https://huggingface.co/datasets/multi_nli) for more information. As described in the [RoBERTa large model card](https://huggingface.co/roberta-large): > The RoBERTa model was pretrained on the union of five datasets: > > - [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books; > - [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers); > - [CC-News](https://commoncrawl.org/2016/10/news-dataset-available/), a dataset containing 63 million English news articles crawled between September 2016 and February 2019. > - [OpenWebText](https://github.com/jcpeterson/openwebtext), an open-source recreation of the WebText dataset used to train GPT-2, > - [Stories](https://arxiv.org/abs/1806.02847), a dataset containing a subset of CommonCrawl data filtered to match the story-like style of Winograd schemas. > > Together these datasets weigh 160GB of text. Also see the [bookcorpus data card](https://huggingface.co/datasets/bookcorpus) and the [wikipedia data card](https://huggingface.co/datasets/wikipedia) for additional information. #### Training Procedure ##### Preprocessing As described in the [RoBERTa large model card](https://huggingface.co/roberta-large): > The texts are tokenized using a byte version of Byte-Pair Encoding (BPE) and a vocabulary size of 50,000. The inputs of > the model take pieces of 512 contiguous tokens that may span over documents. 
The beginning of a new document is marked > with `<s>` and the end of one by `</s>` > > The details of the masking procedure for each sentence are the following: > - 15% of the tokens are masked. > - In 80% of the cases, the masked tokens are replaced by `<mask>`. > - In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace). > - In the remaining 10% of cases, the masked tokens are left as is. > > Contrary to BERT, the masking is done dynamically during pretraining (i.e., it changes at each epoch and is not fixed). ##### Pretraining Also as described in the [RoBERTa large model card](https://huggingface.co/roberta-large): > The model was trained on 1024 V100 GPUs for 500K steps with a batch size of 8K and a sequence length of 512. The > optimizer used is Adam with a learning rate of 4e-4, \\(\beta_{1} = 0.9\\), \\(\beta_{2} = 0.98\\) and > \\(\epsilon = 1e-6\\), a weight decay of 0.01, learning rate warmup for 30,000 steps and linear decay of the learning > rate after. ## Evaluation The following evaluation information is extracted from the associated [GitHub repo for RoBERTa](https://github.com/facebookresearch/fairseq/tree/main/examples/roberta). #### Testing Data, Factors and Metrics The model developers report that the model was evaluated on the following tasks and datasets using the listed metrics: - **Dataset:** Part of [GLUE (Wang et al., 2019)](https://arxiv.org/pdf/1804.07461.pdf), the General Language Understanding Evaluation benchmark, a collection of 9 datasets for evaluating natural language understanding systems. Specifically, the model was evaluated on the [Multi-Genre Natural Language Inference (MNLI)](https://cims.nyu.edu/~sbowman/multinli/) corpus. See the [GLUE data card](https://huggingface.co/datasets/glue) or [Wang et al. (2019)](https://arxiv.org/pdf/1804.07461.pdf) for further information. - **Tasks:** NLI. [Wang et al. 
(2019)](https://arxiv.org/pdf/1804.07461.pdf) describe the inference task for MNLI as: > The Multi-Genre Natural Language Inference Corpus [(Williams et al., 2018)](https://arxiv.org/abs/1704.05426) is a crowd-sourced collection of sentence pairs with textual entailment annotations. Given a premise sentence and a hypothesis sentence, the task is to predict whether the premise entails the hypothesis (entailment), contradicts the hypothesis (contradiction), or neither (neutral). The premise sentences are gathered from ten different sources, including transcribed speech, fiction, and government reports. We use the standard test set, for which we obtained private labels from the authors, and evaluate on both the matched (in-domain) and mismatched (cross-domain) sections. We also use and recommend the SNLI corpus [(Bowman et al., 2015)](https://arxiv.org/abs/1508.05326) as 550k examples of auxiliary training data. - **Metrics:** Accuracy - **Dataset:** [XNLI (Conneau et al., 2018)](https://arxiv.org/pdf/1809.05053.pdf), the extension of the [Multi-Genre Natural Language Inference (MNLI)](https://cims.nyu.edu/~sbowman/multinli/) corpus to 15 languages: English, French, Spanish, German, Greek, Bulgarian, Russian, Turkish, Arabic, Vietnamese, Thai, Chinese, Hindi, Swahili and Urdu. See the [XNLI data card](https://huggingface.co/datasets/xnli) or [Conneau et al. (2018)](https://arxiv.org/pdf/1809.05053.pdf) for further information. 
- **Tasks:** Translate-test (test sentences in other languages are machine-translated into the training language, English, before evaluation) - **Metrics:** Accuracy #### Results GLUE test results (dev set, single model, single-task fine-tuning): 90.2 on MNLI XNLI test results: | Task | en | fr | es | de | el | bg | ru | tr | ar | vi | th | zh | hi | sw | ur | |:----:|:--:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:| | |91.3|82.91|84.27|81.24|81.74|83.13|78.28|76.79|76.64|74.17|74.05| 77.5| 70.9|66.65|66.81| ## Environmental Impact Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). We present the hardware type and hours used based on the [associated paper](https://arxiv.org/pdf/1907.11692.pdf). - **Hardware Type:** 1024 V100 GPUs - **Hours used:** 24 hours (one day) - **Cloud Provider:** Unknown - **Compute Region:** Unknown - **Carbon Emitted:** Unknown ## Technical Specifications See the [associated paper](https://arxiv.org/pdf/1907.11692.pdf) for details on the modeling architecture, objective, compute infrastructure, and training procedure. ## Citation Information ```bibtex @article{liu2019roberta, title = {RoBERTa: A Robustly Optimized BERT Pretraining Approach}, author = {Yinhan Liu and Myle Ott and Naman Goyal and Jingfei Du and Mandar Joshi and Danqi Chen and Omer Levy and Mike Lewis and Luke Zettlemoyer and Veselin Stoyanov}, journal = {arXiv preprint arXiv:1907.11692}, year = {2019}, } ```
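As an illustrative aside (not part of the original card), the zero-shot pipeline shown earlier recasts classification as NLI: each candidate label is slotted into the hypothesis template, the model scores each (sequence, hypothesis) pair, and the entailment logits are normalized across labels. A minimal sketch of that final scoring step, using made-up logits in place of real model outputs:

```python
import math

def zero_shot_scores(entailment_logits):
    """Softmax entailment logits across candidate labels, roughly what the
    zero-shot pipeline does in its default single-label mode."""
    m = max(entailment_logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in entailment_logits]
    total = sum(exps)
    return [e / total for e in exps]

labels = ["travel", "cooking", "dancing"]
# Hypothetical entailment logits, one per (premise, hypothesis) pair built
# from the template "This example is {label}." -- not real model outputs.
logits = [3.1, -0.4, 0.2]
for label, score in sorted(
    zip(labels, zero_shot_scores(logits)), key=lambda pair: -pair[1]
):
    print(f"{label}: {score:.3f}")
```

The normalization step explains why scores from the pipeline sum to 1 across the candidate labels in this mode.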
10,712
[ [ -0.0205230712890625, -0.062164306640625, 0.0284271240234375, 0.0064849853515625, -0.0120849609375, -0.019256591796875, -0.0321044921875, -0.04931640625, 0.006214141845703125, 0.040557861328125, -0.041229248046875, -0.04779052734375, -0.0531005859375, 0.010711669921875, -0.0243988037109375, 0.1026611328125, 0.0145721435546875, 0.007740020751953125, -0.0036830902099609375, -0.00589752197265625, -0.0211181640625, -0.05267333984375, -0.048797607421875, -0.007190704345703125, 0.033233642578125, 0.0047454833984375, 0.035369873046875, 0.03509521484375, 0.0265045166015625, 0.0210418701171875, -0.02020263671875, 0.0020599365234375, -0.040924072265625, -0.01300048828125, 0.00203704833984375, -0.038787841796875, -0.032073974609375, 0.0249176025390625, 0.04852294921875, 0.04461669921875, 0.0010290145874023438, 0.030548095703125, -0.0015611648559570312, 0.0408935546875, -0.03533935546875, 0.00986480712890625, -0.046417236328125, -0.00405120849609375, -0.026397705078125, 0.0082855224609375, -0.0504150390625, -0.00975799560546875, 0.01021575927734375, -0.03851318359375, 0.0174713134765625, 0.00788116455078125, 0.08770751953125, 0.00543975830078125, -0.0419921875, -0.01763916015625, -0.04864501953125, 0.078369140625, -0.06756591796875, 0.026885986328125, 0.032745361328125, 0.01035308837890625, -0.0047149658203125, -0.04595947265625, -0.056976318359375, -0.0211944580078125, -0.005298614501953125, 0.01221466064453125, -0.0202178955078125, -0.0007653236389160156, 0.031097412109375, 0.0215911865234375, -0.06451416015625, 0.00888824462890625, -0.04425048828125, -0.0091400146484375, 0.051239013671875, -0.0025157928466796875, 0.019866943359375, -0.03253173828125, -0.03131103515625, -0.00920867919921875, -0.03045654296875, 0.006439208984375, 0.037567138671875, 0.031463623046875, -0.021636962890625, 0.038543701171875, -0.003513336181640625, 0.06829833984375, 0.000008225440979003906, -0.0279388427734375, 0.04443359375, -0.03387451171875, -0.023040771484375, -0.0116729736328125, 
0.0670166015625, 0.02545166015625, 0.022552490234375, … (remaining dense embedding-vector values elided) ] ]
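The bracketed numeric block above is a dense sentence-embedding vector stored alongside each model-card record. As a minimal sketch, assuming such vectors are loaded as plain Python float lists (the helper name `cosine_similarity` and the toy vectors are illustrative, not part of the dataset), similarity between two records' embeddings can be computed like this:

```python
import math

def cosine_similarity(a, b):
    # Dot product divided by the product of L2 norms; assumes the two
    # vectors have equal length and non-zero magnitude, as the embedding
    # rows in this dump do.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy stand-ins for the full embedding rows above.
v1 = [0.067, 0.025, -0.009]
v2 = [0.066, 0.024, -0.010]
print(cosine_similarity(v1, v2))
```

Nearest-neighbor search over the dataset would rank records by this score against a query embedding.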
hkunlp/instructor-base
2023-01-21T06:31:16.000Z
[ "sentence-transformers", "pytorch", "t5", "text-embedding", "embeddings", "information-retrieval", "beir", "text-classification", "language-model", "text-clustering", "text-semantic-similarity", "text-evaluation", "prompt-retrieval", "text-reranking", "feature-extraction", "sentence-similarity", "transformers", "English", "Sentence Similarity", "natural_questions", "ms_marco", "fever", "hotpot_qa", "mteb", "en", "arxiv:2212.09741", "license:apache-2.0", "model-index", "has_space", "text-generation-inference", "region:us" ]
sentence-similarity
hkunlp
null
null
hkunlp/instructor-base
77
123,041
sentence-transformers
2022-12-20T05:59:40
--- pipeline_tag: sentence-similarity tags: - text-embedding - embeddings - information-retrieval - beir - text-classification - language-model - text-clustering - text-semantic-similarity - text-evaluation - prompt-retrieval - text-reranking - sentence-transformers - feature-extraction - sentence-similarity - transformers - t5 - English - Sentence Similarity - natural_questions - ms_marco - fever - hotpot_qa - mteb language: en inference: false license: apache-2.0 model-index: - name: final_base_results results: - task: type: Classification dataset: type: mteb/amazon_counterfactual name: MTEB AmazonCounterfactualClassification (en) config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 86.2089552238806 - type: ap value: 55.76273850794966 - type: f1 value: 81.26104211414781 - task: type: Classification dataset: type: mteb/amazon_polarity name: MTEB AmazonPolarityClassification config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 88.35995000000001 - type: ap value: 84.18839957309655 - type: f1 value: 88.317619250081 - task: type: Classification dataset: type: mteb/amazon_reviews_multi name: MTEB AmazonReviewsClassification (en) config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 44.64 - type: f1 value: 42.48663956478136 - task: type: Retrieval dataset: type: arguana name: MTEB ArguAna config: default split: test revision: None metrics: - type: map_at_1 value: 27.383000000000003 - type: map_at_10 value: 43.024 - type: map_at_100 value: 44.023 - type: map_at_1000 value: 44.025999999999996 - type: map_at_3 value: 37.684 - type: map_at_5 value: 40.884 - type: mrr_at_1 value: 28.094 - type: mrr_at_10 value: 43.315 - type: mrr_at_100 value: 44.313 - type: mrr_at_1000 value: 44.317 - type: mrr_at_3 value: 37.862 - type: mrr_at_5 value: 41.155 - type: ndcg_at_1 value: 27.383000000000003 - type: ndcg_at_10 
value: 52.032000000000004 - type: ndcg_at_100 value: 56.19499999999999 - type: ndcg_at_1000 value: 56.272 - type: ndcg_at_3 value: 41.166000000000004 - type: ndcg_at_5 value: 46.92 - type: precision_at_1 value: 27.383000000000003 - type: precision_at_10 value: 8.087 - type: precision_at_100 value: 0.989 - type: precision_at_1000 value: 0.099 - type: precision_at_3 value: 17.093 - type: precision_at_5 value: 13.044 - type: recall_at_1 value: 27.383000000000003 - type: recall_at_10 value: 80.868 - type: recall_at_100 value: 98.86200000000001 - type: recall_at_1000 value: 99.431 - type: recall_at_3 value: 51.28 - type: recall_at_5 value: 65.22 - task: type: Clustering dataset: type: mteb/arxiv-clustering-p2p name: MTEB ArxivClusteringP2P config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: v_measure value: 39.68441054431849 - task: type: Clustering dataset: type: mteb/arxiv-clustering-s2s name: MTEB ArxivClusteringS2S config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: v_measure value: 29.188539728343844 - task: type: Reranking dataset: type: mteb/askubuntudupquestions-reranking name: MTEB AskUbuntuDupQuestions config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: map value: 63.173362687519784 - type: mrr value: 76.18860748362133 - task: type: STS dataset: type: mteb/biosses-sts name: MTEB BIOSSES config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cos_sim_spearman value: 82.30789953771232 - task: type: Classification dataset: type: mteb/banking77 name: MTEB Banking77Classification config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 77.03571428571428 - type: f1 value: 75.87384305045917 - task: type: Clustering dataset: type: mteb/biorxiv-clustering-p2p name: MTEB BiorxivClusteringP2P config: default split: test revision: 
65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: v_measure value: 32.98041170516364 - task: type: Clustering dataset: type: mteb/biorxiv-clustering-s2s name: MTEB BiorxivClusteringS2S config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: v_measure value: 25.71652988451154 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackAndroidRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 33.739999999999995 - type: map_at_10 value: 46.197 - type: map_at_100 value: 47.814 - type: map_at_1000 value: 47.934 - type: map_at_3 value: 43.091 - type: map_at_5 value: 44.81 - type: mrr_at_1 value: 41.059 - type: mrr_at_10 value: 52.292 - type: mrr_at_100 value: 52.978 - type: mrr_at_1000 value: 53.015 - type: mrr_at_3 value: 49.976 - type: mrr_at_5 value: 51.449999999999996 - type: ndcg_at_1 value: 41.059 - type: ndcg_at_10 value: 52.608 - type: ndcg_at_100 value: 57.965 - type: ndcg_at_1000 value: 59.775999999999996 - type: ndcg_at_3 value: 48.473 - type: ndcg_at_5 value: 50.407999999999994 - type: precision_at_1 value: 41.059 - type: precision_at_10 value: 9.943 - type: precision_at_100 value: 1.6070000000000002 - type: precision_at_1000 value: 0.20500000000000002 - type: precision_at_3 value: 23.413999999999998 - type: precision_at_5 value: 16.481 - type: recall_at_1 value: 33.739999999999995 - type: recall_at_10 value: 63.888999999999996 - type: recall_at_100 value: 85.832 - type: recall_at_1000 value: 97.475 - type: recall_at_3 value: 51.953 - type: recall_at_5 value: 57.498000000000005 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackEnglishRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 31.169999999999998 - type: map_at_10 value: 41.455 - type: map_at_100 value: 42.716 - type: map_at_1000 value: 42.847 - type: map_at_3 value: 38.568999999999996 - type: map_at_5 value: 40.099000000000004 - type: 
mrr_at_1 value: 39.427 - type: mrr_at_10 value: 47.818 - type: mrr_at_100 value: 48.519 - type: mrr_at_1000 value: 48.558 - type: mrr_at_3 value: 45.86 - type: mrr_at_5 value: 46.936 - type: ndcg_at_1 value: 39.427 - type: ndcg_at_10 value: 47.181 - type: ndcg_at_100 value: 51.737 - type: ndcg_at_1000 value: 53.74 - type: ndcg_at_3 value: 43.261 - type: ndcg_at_5 value: 44.891 - type: precision_at_1 value: 39.427 - type: precision_at_10 value: 8.847 - type: precision_at_100 value: 1.425 - type: precision_at_1000 value: 0.189 - type: precision_at_3 value: 20.785999999999998 - type: precision_at_5 value: 14.560999999999998 - type: recall_at_1 value: 31.169999999999998 - type: recall_at_10 value: 56.971000000000004 - type: recall_at_100 value: 76.31400000000001 - type: recall_at_1000 value: 88.93900000000001 - type: recall_at_3 value: 45.208 - type: recall_at_5 value: 49.923 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackGamingRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 39.682 - type: map_at_10 value: 52.766000000000005 - type: map_at_100 value: 53.84100000000001 - type: map_at_1000 value: 53.898 - type: map_at_3 value: 49.291000000000004 - type: map_at_5 value: 51.365 - type: mrr_at_1 value: 45.266 - type: mrr_at_10 value: 56.093 - type: mrr_at_100 value: 56.763 - type: mrr_at_1000 value: 56.793000000000006 - type: mrr_at_3 value: 53.668000000000006 - type: mrr_at_5 value: 55.1 - type: ndcg_at_1 value: 45.266 - type: ndcg_at_10 value: 58.836 - type: ndcg_at_100 value: 62.863 - type: ndcg_at_1000 value: 63.912 - type: ndcg_at_3 value: 53.19199999999999 - type: ndcg_at_5 value: 56.125 - type: precision_at_1 value: 45.266 - type: precision_at_10 value: 9.492 - type: precision_at_100 value: 1.236 - type: precision_at_1000 value: 0.13699999999999998 - type: precision_at_3 value: 23.762 - type: precision_at_5 value: 16.414 - type: recall_at_1 value: 39.682 - type: recall_at_10 value: 73.233 - type: 
recall_at_100 value: 90.335 - type: recall_at_1000 value: 97.452 - type: recall_at_3 value: 58.562000000000005 - type: recall_at_5 value: 65.569 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackGisRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 26.743 - type: map_at_10 value: 34.016000000000005 - type: map_at_100 value: 35.028999999999996 - type: map_at_1000 value: 35.113 - type: map_at_3 value: 31.763 - type: map_at_5 value: 33.013999999999996 - type: mrr_at_1 value: 28.927000000000003 - type: mrr_at_10 value: 36.32 - type: mrr_at_100 value: 37.221 - type: mrr_at_1000 value: 37.281 - type: mrr_at_3 value: 34.105000000000004 - type: mrr_at_5 value: 35.371 - type: ndcg_at_1 value: 28.927000000000003 - type: ndcg_at_10 value: 38.474000000000004 - type: ndcg_at_100 value: 43.580000000000005 - type: ndcg_at_1000 value: 45.64 - type: ndcg_at_3 value: 34.035 - type: ndcg_at_5 value: 36.186 - type: precision_at_1 value: 28.927000000000003 - type: precision_at_10 value: 5.74 - type: precision_at_100 value: 0.8710000000000001 - type: precision_at_1000 value: 0.108 - type: precision_at_3 value: 14.124 - type: precision_at_5 value: 9.74 - type: recall_at_1 value: 26.743 - type: recall_at_10 value: 49.955 - type: recall_at_100 value: 73.904 - type: recall_at_1000 value: 89.133 - type: recall_at_3 value: 38.072 - type: recall_at_5 value: 43.266 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackMathematicaRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 16.928 - type: map_at_10 value: 23.549 - type: map_at_100 value: 24.887 - type: map_at_1000 value: 25.018 - type: map_at_3 value: 21.002000000000002 - type: map_at_5 value: 22.256 - type: mrr_at_1 value: 21.02 - type: mrr_at_10 value: 27.898 - type: mrr_at_100 value: 29.018 - type: mrr_at_1000 value: 29.099999999999998 - type: mrr_at_3 value: 25.456 - type: mrr_at_5 value: 26.625 - type: ndcg_at_1 
value: 21.02 - type: ndcg_at_10 value: 28.277 - type: ndcg_at_100 value: 34.54 - type: ndcg_at_1000 value: 37.719 - type: ndcg_at_3 value: 23.707 - type: ndcg_at_5 value: 25.482 - type: precision_at_1 value: 21.02 - type: precision_at_10 value: 5.361 - type: precision_at_100 value: 0.9809999999999999 - type: precision_at_1000 value: 0.13899999999999998 - type: precision_at_3 value: 11.401 - type: precision_at_5 value: 8.209 - type: recall_at_1 value: 16.928 - type: recall_at_10 value: 38.601 - type: recall_at_100 value: 65.759 - type: recall_at_1000 value: 88.543 - type: recall_at_3 value: 25.556 - type: recall_at_5 value: 30.447000000000003 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackPhysicsRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 28.549000000000003 - type: map_at_10 value: 38.426 - type: map_at_100 value: 39.845000000000006 - type: map_at_1000 value: 39.956 - type: map_at_3 value: 35.372 - type: map_at_5 value: 37.204 - type: mrr_at_1 value: 35.034 - type: mrr_at_10 value: 44.041000000000004 - type: mrr_at_100 value: 44.95 - type: mrr_at_1000 value: 44.997 - type: mrr_at_3 value: 41.498000000000005 - type: mrr_at_5 value: 43.077 - type: ndcg_at_1 value: 35.034 - type: ndcg_at_10 value: 44.218 - type: ndcg_at_100 value: 49.958000000000006 - type: ndcg_at_1000 value: 52.019000000000005 - type: ndcg_at_3 value: 39.34 - type: ndcg_at_5 value: 41.892 - type: precision_at_1 value: 35.034 - type: precision_at_10 value: 7.911 - type: precision_at_100 value: 1.26 - type: precision_at_1000 value: 0.16 - type: precision_at_3 value: 18.511 - type: precision_at_5 value: 13.205 - type: recall_at_1 value: 28.549000000000003 - type: recall_at_10 value: 56.035999999999994 - type: recall_at_100 value: 79.701 - type: recall_at_1000 value: 93.149 - type: recall_at_3 value: 42.275 - type: recall_at_5 value: 49.097 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB 
CQADupstackProgrammersRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 29.391000000000002 - type: map_at_10 value: 39.48 - type: map_at_100 value: 40.727000000000004 - type: map_at_1000 value: 40.835 - type: map_at_3 value: 36.234 - type: map_at_5 value: 37.877 - type: mrr_at_1 value: 35.959 - type: mrr_at_10 value: 44.726 - type: mrr_at_100 value: 45.531 - type: mrr_at_1000 value: 45.582 - type: mrr_at_3 value: 42.047000000000004 - type: mrr_at_5 value: 43.611 - type: ndcg_at_1 value: 35.959 - type: ndcg_at_10 value: 45.303 - type: ndcg_at_100 value: 50.683 - type: ndcg_at_1000 value: 52.818 - type: ndcg_at_3 value: 39.987 - type: ndcg_at_5 value: 42.243 - type: precision_at_1 value: 35.959 - type: precision_at_10 value: 8.241999999999999 - type: precision_at_100 value: 1.274 - type: precision_at_1000 value: 0.163 - type: precision_at_3 value: 18.836 - type: precision_at_5 value: 13.196 - type: recall_at_1 value: 29.391000000000002 - type: recall_at_10 value: 57.364000000000004 - type: recall_at_100 value: 80.683 - type: recall_at_1000 value: 94.918 - type: recall_at_3 value: 42.263 - type: recall_at_5 value: 48.634 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 26.791749999999997 - type: map_at_10 value: 35.75541666666667 - type: map_at_100 value: 37.00791666666667 - type: map_at_1000 value: 37.12408333333333 - type: map_at_3 value: 33.02966666666667 - type: map_at_5 value: 34.56866666666667 - type: mrr_at_1 value: 31.744333333333337 - type: mrr_at_10 value: 39.9925 - type: mrr_at_100 value: 40.86458333333333 - type: mrr_at_1000 value: 40.92175000000001 - type: mrr_at_3 value: 37.68183333333334 - type: mrr_at_5 value: 39.028499999999994 - type: ndcg_at_1 value: 31.744333333333337 - type: ndcg_at_10 value: 40.95008333333334 - type: ndcg_at_100 value: 46.25966666666667 - type: ndcg_at_1000 value: 
48.535333333333334 - type: ndcg_at_3 value: 36.43333333333333 - type: ndcg_at_5 value: 38.602333333333334 - type: precision_at_1 value: 31.744333333333337 - type: precision_at_10 value: 7.135166666666666 - type: precision_at_100 value: 1.1535833333333334 - type: precision_at_1000 value: 0.15391666666666665 - type: precision_at_3 value: 16.713 - type: precision_at_5 value: 11.828416666666666 - type: recall_at_1 value: 26.791749999999997 - type: recall_at_10 value: 51.98625 - type: recall_at_100 value: 75.30358333333334 - type: recall_at_1000 value: 91.05433333333333 - type: recall_at_3 value: 39.39583333333333 - type: recall_at_5 value: 45.05925 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackStatsRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 22.219 - type: map_at_10 value: 29.162 - type: map_at_100 value: 30.049999999999997 - type: map_at_1000 value: 30.144 - type: map_at_3 value: 27.204 - type: map_at_5 value: 28.351 - type: mrr_at_1 value: 25.153 - type: mrr_at_10 value: 31.814999999999998 - type: mrr_at_100 value: 32.573 - type: mrr_at_1000 value: 32.645 - type: mrr_at_3 value: 29.934 - type: mrr_at_5 value: 30.946 - type: ndcg_at_1 value: 25.153 - type: ndcg_at_10 value: 33.099000000000004 - type: ndcg_at_100 value: 37.768 - type: ndcg_at_1000 value: 40.331 - type: ndcg_at_3 value: 29.473 - type: ndcg_at_5 value: 31.206 - type: precision_at_1 value: 25.153 - type: precision_at_10 value: 5.183999999999999 - type: precision_at_100 value: 0.8170000000000001 - type: precision_at_1000 value: 0.11100000000000002 - type: precision_at_3 value: 12.831999999999999 - type: precision_at_5 value: 8.895999999999999 - type: recall_at_1 value: 22.219 - type: recall_at_10 value: 42.637 - type: recall_at_100 value: 64.704 - type: recall_at_1000 value: 83.963 - type: recall_at_3 value: 32.444 - type: recall_at_5 value: 36.802 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB 
CQADupstackTexRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 17.427999999999997 - type: map_at_10 value: 24.029 - type: map_at_100 value: 25.119999999999997 - type: map_at_1000 value: 25.257 - type: map_at_3 value: 22.016 - type: map_at_5 value: 23.143 - type: mrr_at_1 value: 21.129 - type: mrr_at_10 value: 27.750000000000004 - type: mrr_at_100 value: 28.666999999999998 - type: mrr_at_1000 value: 28.754999999999995 - type: mrr_at_3 value: 25.849 - type: mrr_at_5 value: 26.939999999999998 - type: ndcg_at_1 value: 21.129 - type: ndcg_at_10 value: 28.203 - type: ndcg_at_100 value: 33.44 - type: ndcg_at_1000 value: 36.61 - type: ndcg_at_3 value: 24.648999999999997 - type: ndcg_at_5 value: 26.316 - type: precision_at_1 value: 21.129 - type: precision_at_10 value: 5.055 - type: precision_at_100 value: 0.909 - type: precision_at_1000 value: 0.13699999999999998 - type: precision_at_3 value: 11.666 - type: precision_at_5 value: 8.3 - type: recall_at_1 value: 17.427999999999997 - type: recall_at_10 value: 36.923 - type: recall_at_100 value: 60.606 - type: recall_at_1000 value: 83.19 - type: recall_at_3 value: 26.845000000000002 - type: recall_at_5 value: 31.247000000000003 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackUnixRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 26.457000000000004 - type: map_at_10 value: 35.228 - type: map_at_100 value: 36.475 - type: map_at_1000 value: 36.585 - type: map_at_3 value: 32.444 - type: map_at_5 value: 34.046 - type: mrr_at_1 value: 30.784 - type: mrr_at_10 value: 39.133 - type: mrr_at_100 value: 40.11 - type: mrr_at_1000 value: 40.169 - type: mrr_at_3 value: 36.692 - type: mrr_at_5 value: 38.17 - type: ndcg_at_1 value: 30.784 - type: ndcg_at_10 value: 40.358 - type: ndcg_at_100 value: 46.119 - type: ndcg_at_1000 value: 48.428 - type: ndcg_at_3 value: 35.504000000000005 - type: ndcg_at_5 value: 37.864 - type: precision_at_1 value: 
30.784 - type: precision_at_10 value: 6.800000000000001 - type: precision_at_100 value: 1.083 - type: precision_at_1000 value: 0.13899999999999998 - type: precision_at_3 value: 15.920000000000002 - type: precision_at_5 value: 11.437 - type: recall_at_1 value: 26.457000000000004 - type: recall_at_10 value: 51.845 - type: recall_at_100 value: 77.046 - type: recall_at_1000 value: 92.892 - type: recall_at_3 value: 38.89 - type: recall_at_5 value: 44.688 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackWebmastersRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 29.378999999999998 - type: map_at_10 value: 37.373 - type: map_at_100 value: 39.107 - type: map_at_1000 value: 39.317 - type: map_at_3 value: 34.563 - type: map_at_5 value: 36.173 - type: mrr_at_1 value: 35.178 - type: mrr_at_10 value: 42.44 - type: mrr_at_100 value: 43.434 - type: mrr_at_1000 value: 43.482 - type: mrr_at_3 value: 39.987 - type: mrr_at_5 value: 41.370000000000005 - type: ndcg_at_1 value: 35.178 - type: ndcg_at_10 value: 42.82 - type: ndcg_at_100 value: 48.935 - type: ndcg_at_1000 value: 51.28 - type: ndcg_at_3 value: 38.562999999999995 - type: ndcg_at_5 value: 40.687 - type: precision_at_1 value: 35.178 - type: precision_at_10 value: 7.945 - type: precision_at_100 value: 1.524 - type: precision_at_1000 value: 0.242 - type: precision_at_3 value: 17.721 - type: precision_at_5 value: 12.925 - type: recall_at_1 value: 29.378999999999998 - type: recall_at_10 value: 52.141999999999996 - type: recall_at_100 value: 79.49000000000001 - type: recall_at_1000 value: 93.782 - type: recall_at_3 value: 39.579 - type: recall_at_5 value: 45.462 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackWordpressRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 19.814999999999998 - type: map_at_10 value: 27.383999999999997 - type: map_at_100 value: 28.483999999999998 - type: map_at_1000 value: 
28.585 - type: map_at_3 value: 24.807000000000002 - type: map_at_5 value: 26.485999999999997 - type: mrr_at_1 value: 21.996 - type: mrr_at_10 value: 29.584 - type: mrr_at_100 value: 30.611 - type: mrr_at_1000 value: 30.684 - type: mrr_at_3 value: 27.11 - type: mrr_at_5 value: 28.746 - type: ndcg_at_1 value: 21.996 - type: ndcg_at_10 value: 32.024 - type: ndcg_at_100 value: 37.528 - type: ndcg_at_1000 value: 40.150999999999996 - type: ndcg_at_3 value: 27.016000000000002 - type: ndcg_at_5 value: 29.927999999999997 - type: precision_at_1 value: 21.996 - type: precision_at_10 value: 5.102 - type: precision_at_100 value: 0.856 - type: precision_at_1000 value: 0.117 - type: precision_at_3 value: 11.583 - type: precision_at_5 value: 8.577 - type: recall_at_1 value: 19.814999999999998 - type: recall_at_10 value: 44.239 - type: recall_at_100 value: 69.269 - type: recall_at_1000 value: 89.216 - type: recall_at_3 value: 31.102999999999998 - type: recall_at_5 value: 38.078 - task: type: Retrieval dataset: type: climate-fever name: MTEB ClimateFEVER config: default split: test revision: None metrics: - type: map_at_1 value: 11.349 - type: map_at_10 value: 19.436 - type: map_at_100 value: 21.282999999999998 - type: map_at_1000 value: 21.479 - type: map_at_3 value: 15.841 - type: map_at_5 value: 17.558 - type: mrr_at_1 value: 25.863000000000003 - type: mrr_at_10 value: 37.218 - type: mrr_at_100 value: 38.198 - type: mrr_at_1000 value: 38.236 - type: mrr_at_3 value: 33.409 - type: mrr_at_5 value: 35.602000000000004 - type: ndcg_at_1 value: 25.863000000000003 - type: ndcg_at_10 value: 27.953 - type: ndcg_at_100 value: 35.327 - type: ndcg_at_1000 value: 38.708999999999996 - type: ndcg_at_3 value: 21.985 - type: ndcg_at_5 value: 23.957 - type: precision_at_1 value: 25.863000000000003 - type: precision_at_10 value: 8.99 - type: precision_at_100 value: 1.6889999999999998 - type: precision_at_1000 value: 0.232 - type: precision_at_3 value: 16.308 - type: precision_at_5 value: 12.912 - 
type: recall_at_1 value: 11.349 - type: recall_at_10 value: 34.581 - type: recall_at_100 value: 60.178 - type: recall_at_1000 value: 78.88199999999999 - type: recall_at_3 value: 20.041999999999998 - type: recall_at_5 value: 25.458 - task: type: Retrieval dataset: type: dbpedia-entity name: MTEB DBPedia config: default split: test revision: None metrics: - type: map_at_1 value: 7.893 - type: map_at_10 value: 15.457 - type: map_at_100 value: 20.905 - type: map_at_1000 value: 22.116 - type: map_at_3 value: 11.593 - type: map_at_5 value: 13.134 - type: mrr_at_1 value: 57.49999999999999 - type: mrr_at_10 value: 65.467 - type: mrr_at_100 value: 66.022 - type: mrr_at_1000 value: 66.039 - type: mrr_at_3 value: 63.458000000000006 - type: mrr_at_5 value: 64.546 - type: ndcg_at_1 value: 45.875 - type: ndcg_at_10 value: 33.344 - type: ndcg_at_100 value: 36.849 - type: ndcg_at_1000 value: 44.03 - type: ndcg_at_3 value: 37.504 - type: ndcg_at_5 value: 34.892 - type: precision_at_1 value: 57.49999999999999 - type: precision_at_10 value: 25.95 - type: precision_at_100 value: 7.89 - type: precision_at_1000 value: 1.669 - type: precision_at_3 value: 40.333000000000006 - type: precision_at_5 value: 33.050000000000004 - type: recall_at_1 value: 7.893 - type: recall_at_10 value: 20.724999999999998 - type: recall_at_100 value: 42.516 - type: recall_at_1000 value: 65.822 - type: recall_at_3 value: 12.615000000000002 - type: recall_at_5 value: 15.482000000000001 - task: type: Classification dataset: type: mteb/emotion name: MTEB EmotionClassification config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 51.760000000000005 - type: f1 value: 45.51690565701713 - task: type: Retrieval dataset: type: fever name: MTEB FEVER config: default split: test revision: None metrics: - type: map_at_1 value: 53.882 - type: map_at_10 value: 65.902 - type: map_at_100 value: 66.33 - type: map_at_1000 value: 66.348 - type: map_at_3 value: 
63.75999999999999 - type: map_at_5 value: 65.181 - type: mrr_at_1 value: 58.041 - type: mrr_at_10 value: 70.133 - type: mrr_at_100 value: 70.463 - type: mrr_at_1000 value: 70.47 - type: mrr_at_3 value: 68.164 - type: mrr_at_5 value: 69.465 - type: ndcg_at_1 value: 58.041 - type: ndcg_at_10 value: 71.84700000000001 - type: ndcg_at_100 value: 73.699 - type: ndcg_at_1000 value: 74.06700000000001 - type: ndcg_at_3 value: 67.855 - type: ndcg_at_5 value: 70.203 - type: precision_at_1 value: 58.041 - type: precision_at_10 value: 9.427000000000001 - type: precision_at_100 value: 1.049 - type: precision_at_1000 value: 0.11 - type: precision_at_3 value: 27.278000000000002 - type: precision_at_5 value: 17.693 - type: recall_at_1 value: 53.882 - type: recall_at_10 value: 85.99 - type: recall_at_100 value: 94.09100000000001 - type: recall_at_1000 value: 96.612 - type: recall_at_3 value: 75.25 - type: recall_at_5 value: 80.997 - task: type: Retrieval dataset: type: fiqa name: MTEB FiQA2018 config: default split: test revision: None metrics: - type: map_at_1 value: 19.165 - type: map_at_10 value: 31.845000000000002 - type: map_at_100 value: 33.678999999999995 - type: map_at_1000 value: 33.878 - type: map_at_3 value: 27.881 - type: map_at_5 value: 30.049999999999997 - type: mrr_at_1 value: 38.272 - type: mrr_at_10 value: 47.04 - type: mrr_at_100 value: 47.923 - type: mrr_at_1000 value: 47.973 - type: mrr_at_3 value: 44.985 - type: mrr_at_5 value: 46.150000000000006 - type: ndcg_at_1 value: 38.272 - type: ndcg_at_10 value: 39.177 - type: ndcg_at_100 value: 45.995000000000005 - type: ndcg_at_1000 value: 49.312 - type: ndcg_at_3 value: 36.135 - type: ndcg_at_5 value: 36.936 - type: precision_at_1 value: 38.272 - type: precision_at_10 value: 10.926 - type: precision_at_100 value: 1.809 - type: precision_at_1000 value: 0.23700000000000002 - type: precision_at_3 value: 24.331 - type: precision_at_5 value: 17.747 - type: recall_at_1 value: 19.165 - type: recall_at_10 value: 45.103 - 
type: recall_at_100 value: 70.295 - type: recall_at_1000 value: 90.592 - type: recall_at_3 value: 32.832 - type: recall_at_5 value: 37.905 - task: type: Retrieval dataset: type: hotpotqa name: MTEB HotpotQA config: default split: test revision: None metrics: - type: map_at_1 value: 32.397 - type: map_at_10 value: 44.83 - type: map_at_100 value: 45.716 - type: map_at_1000 value: 45.797 - type: map_at_3 value: 41.955999999999996 - type: map_at_5 value: 43.736999999999995 - type: mrr_at_1 value: 64.794 - type: mrr_at_10 value: 71.866 - type: mrr_at_100 value: 72.22 - type: mrr_at_1000 value: 72.238 - type: mrr_at_3 value: 70.416 - type: mrr_at_5 value: 71.304 - type: ndcg_at_1 value: 64.794 - type: ndcg_at_10 value: 54.186 - type: ndcg_at_100 value: 57.623000000000005 - type: ndcg_at_1000 value: 59.302 - type: ndcg_at_3 value: 49.703 - type: ndcg_at_5 value: 52.154999999999994 - type: precision_at_1 value: 64.794 - type: precision_at_10 value: 11.219 - type: precision_at_100 value: 1.394 - type: precision_at_1000 value: 0.16199999999999998 - type: precision_at_3 value: 30.767 - type: precision_at_5 value: 20.397000000000002 - type: recall_at_1 value: 32.397 - type: recall_at_10 value: 56.096999999999994 - type: recall_at_100 value: 69.696 - type: recall_at_1000 value: 80.88499999999999 - type: recall_at_3 value: 46.150999999999996 - type: recall_at_5 value: 50.993 - task: type: Classification dataset: type: mteb/imdb name: MTEB ImdbClassification config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 81.1744 - type: ap value: 75.44973697032414 - type: f1 value: 81.09901117955782 - task: type: Retrieval dataset: type: msmarco name: MTEB MSMARCO config: default split: dev revision: None metrics: - type: map_at_1 value: 19.519000000000002 - type: map_at_10 value: 31.025000000000002 - type: map_at_100 value: 32.275999999999996 - type: map_at_1000 value: 32.329 - type: map_at_3 value: 27.132 - type: map_at_5 value: 
29.415999999999997 - type: mrr_at_1 value: 20.115 - type: mrr_at_10 value: 31.569000000000003 - type: mrr_at_100 value: 32.768 - type: mrr_at_1000 value: 32.816 - type: mrr_at_3 value: 27.748 - type: mrr_at_5 value: 29.956 - type: ndcg_at_1 value: 20.115 - type: ndcg_at_10 value: 37.756 - type: ndcg_at_100 value: 43.858000000000004 - type: ndcg_at_1000 value: 45.199 - type: ndcg_at_3 value: 29.818 - type: ndcg_at_5 value: 33.875 - type: precision_at_1 value: 20.115 - type: precision_at_10 value: 6.122 - type: precision_at_100 value: 0.919 - type: precision_at_1000 value: 0.10300000000000001 - type: precision_at_3 value: 12.794 - type: precision_at_5 value: 9.731 - type: recall_at_1 value: 19.519000000000002 - type: recall_at_10 value: 58.62500000000001 - type: recall_at_100 value: 86.99 - type: recall_at_1000 value: 97.268 - type: recall_at_3 value: 37.002 - type: recall_at_5 value: 46.778 - task: type: Classification dataset: type: mteb/mtop_domain name: MTEB MTOPDomainClassification (en) config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 93.71865025079799 - type: f1 value: 93.38906173610519 - task: type: Classification dataset: type: mteb/mtop_intent name: MTEB MTOPIntentClassification (en) config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 70.2576379388965 - type: f1 value: 49.20405830249464 - task: type: Classification dataset: type: mteb/amazon_massive_intent name: MTEB MassiveIntentClassification (en) config: en split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 67.48486886348351 - type: f1 value: 64.92199176095157 - task: type: Classification dataset: type: mteb/amazon_massive_scenario name: MTEB MassiveScenarioClassification (en) config: en split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 72.59246805648958 - type: f1 value: 72.1222026389164 - task: type: 
Clustering dataset: type: mteb/medrxiv-clustering-p2p name: MTEB MedrxivClusteringP2P config: default split: test revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73 metrics: - type: v_measure value: 30.887642595096825 - task: type: Clustering dataset: type: mteb/medrxiv-clustering-s2s name: MTEB MedrxivClusteringS2S config: default split: test revision: 35191c8c0dca72d8ff3efcd72aa802307d469663 metrics: - type: v_measure value: 28.3764418784054 - task: type: Reranking dataset: type: mteb/mind_small name: MTEB MindSmallReranking config: default split: test revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69 metrics: - type: map value: 31.81544126336991 - type: mrr value: 32.82666576268031 - task: type: Retrieval dataset: type: nfcorpus name: MTEB NFCorpus config: default split: test revision: None metrics: - type: map_at_1 value: 5.185 - type: map_at_10 value: 11.158 - type: map_at_100 value: 14.041 - type: map_at_1000 value: 15.360999999999999 - type: map_at_3 value: 8.417 - type: map_at_5 value: 9.378 - type: mrr_at_1 value: 44.582 - type: mrr_at_10 value: 53.083999999999996 - type: mrr_at_100 value: 53.787 - type: mrr_at_1000 value: 53.824000000000005 - type: mrr_at_3 value: 51.187000000000005 - type: mrr_at_5 value: 52.379 - type: ndcg_at_1 value: 42.57 - type: ndcg_at_10 value: 31.593 - type: ndcg_at_100 value: 29.093999999999998 - type: ndcg_at_1000 value: 37.909 - type: ndcg_at_3 value: 37.083 - type: ndcg_at_5 value: 34.397 - type: precision_at_1 value: 43.963 - type: precision_at_10 value: 23.498 - type: precision_at_100 value: 7.6160000000000005 - type: precision_at_1000 value: 2.032 - type: precision_at_3 value: 34.572 - type: precision_at_5 value: 29.412 - type: recall_at_1 value: 5.185 - type: recall_at_10 value: 15.234 - type: recall_at_100 value: 29.49 - type: recall_at_1000 value: 62.273999999999994 - type: recall_at_3 value: 9.55 - type: recall_at_5 value: 11.103 - task: type: Retrieval dataset: type: nq name: MTEB NQ config: default split: test 
revision: None metrics: - type: map_at_1 value: 23.803 - type: map_at_10 value: 38.183 - type: map_at_100 value: 39.421 - type: map_at_1000 value: 39.464 - type: map_at_3 value: 33.835 - type: map_at_5 value: 36.327 - type: mrr_at_1 value: 26.68 - type: mrr_at_10 value: 40.439 - type: mrr_at_100 value: 41.415 - type: mrr_at_1000 value: 41.443999999999996 - type: mrr_at_3 value: 36.612 - type: mrr_at_5 value: 38.877 - type: ndcg_at_1 value: 26.68 - type: ndcg_at_10 value: 45.882 - type: ndcg_at_100 value: 51.227999999999994 - type: ndcg_at_1000 value: 52.207 - type: ndcg_at_3 value: 37.511 - type: ndcg_at_5 value: 41.749 - type: precision_at_1 value: 26.68 - type: precision_at_10 value: 7.9750000000000005 - type: precision_at_100 value: 1.0959999999999999 - type: precision_at_1000 value: 0.11900000000000001 - type: precision_at_3 value: 17.449 - type: precision_at_5 value: 12.897 - type: recall_at_1 value: 23.803 - type: recall_at_10 value: 67.152 - type: recall_at_100 value: 90.522 - type: recall_at_1000 value: 97.743 - type: recall_at_3 value: 45.338 - type: recall_at_5 value: 55.106 - task: type: Retrieval dataset: type: quora name: MTEB QuoraRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 70.473 - type: map_at_10 value: 84.452 - type: map_at_100 value: 85.101 - type: map_at_1000 value: 85.115 - type: map_at_3 value: 81.435 - type: map_at_5 value: 83.338 - type: mrr_at_1 value: 81.19 - type: mrr_at_10 value: 87.324 - type: mrr_at_100 value: 87.434 - type: mrr_at_1000 value: 87.435 - type: mrr_at_3 value: 86.31 - type: mrr_at_5 value: 87.002 - type: ndcg_at_1 value: 81.21000000000001 - type: ndcg_at_10 value: 88.19 - type: ndcg_at_100 value: 89.44 - type: ndcg_at_1000 value: 89.526 - type: ndcg_at_3 value: 85.237 - type: ndcg_at_5 value: 86.892 - type: precision_at_1 value: 81.21000000000001 - type: precision_at_10 value: 13.417000000000002 - type: precision_at_100 value: 1.537 - type: precision_at_1000 value: 0.157 - type: 
precision_at_3 value: 37.31 - type: precision_at_5 value: 24.59 - type: recall_at_1 value: 70.473 - type: recall_at_10 value: 95.367 - type: recall_at_100 value: 99.616 - type: recall_at_1000 value: 99.996 - type: recall_at_3 value: 86.936 - type: recall_at_5 value: 91.557 - task: type: Clustering dataset: type: mteb/reddit-clustering name: MTEB RedditClustering config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: v_measure value: 59.25776525253911 - task: type: Clustering dataset: type: mteb/reddit-clustering-p2p name: MTEB RedditClusteringP2P config: default split: test revision: 282350215ef01743dc01b456c7f5241fa8937f16 metrics: - type: v_measure value: 63.22135271663078 - task: type: Retrieval dataset: type: scidocs name: MTEB SCIDOCS config: default split: test revision: None metrics: - type: map_at_1 value: 4.003 - type: map_at_10 value: 10.062999999999999 - type: map_at_100 value: 11.854000000000001 - type: map_at_1000 value: 12.145999999999999 - type: map_at_3 value: 7.242 - type: map_at_5 value: 8.652999999999999 - type: mrr_at_1 value: 19.7 - type: mrr_at_10 value: 29.721999999999998 - type: mrr_at_100 value: 30.867 - type: mrr_at_1000 value: 30.944 - type: mrr_at_3 value: 26.683 - type: mrr_at_5 value: 28.498 - type: ndcg_at_1 value: 19.7 - type: ndcg_at_10 value: 17.095 - type: ndcg_at_100 value: 24.375 - type: ndcg_at_1000 value: 29.831000000000003 - type: ndcg_at_3 value: 16.305 - type: ndcg_at_5 value: 14.291 - type: precision_at_1 value: 19.7 - type: precision_at_10 value: 8.799999999999999 - type: precision_at_100 value: 1.9349999999999998 - type: precision_at_1000 value: 0.32399999999999995 - type: precision_at_3 value: 15.2 - type: precision_at_5 value: 12.540000000000001 - type: recall_at_1 value: 4.003 - type: recall_at_10 value: 17.877000000000002 - type: recall_at_100 value: 39.217 - type: recall_at_1000 value: 65.862 - type: recall_at_3 value: 9.242 - type: recall_at_5 value: 12.715000000000002 - 
task: type: STS dataset: type: mteb/sickr-sts name: MTEB SICK-R config: default split: test revision: a6ea5a8cab320b040a23452cc28066d9beae2cee metrics: - type: cos_sim_spearman value: 80.25888668589654 - task: type: STS dataset: type: mteb/sts12-sts name: MTEB STS12 config: default split: test revision: a0d554a64d88156834ff5ae9920b964011b16384 metrics: - type: cos_sim_spearman value: 77.02037527837669 - task: type: STS dataset: type: mteb/sts13-sts name: MTEB STS13 config: default split: test revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca metrics: - type: cos_sim_spearman value: 86.58432681008449 - task: type: STS dataset: type: mteb/sts14-sts name: MTEB STS14 config: default split: test revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 metrics: - type: cos_sim_spearman value: 81.31697756099051 - task: type: STS dataset: type: mteb/sts15-sts name: MTEB STS15 config: default split: test revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 metrics: - type: cos_sim_spearman value: 88.18867599667057 - task: type: STS dataset: type: mteb/sts16-sts name: MTEB STS16 config: default split: test revision: 4d8694f8f0e0100860b497b999b3dbed754a0513 metrics: - type: cos_sim_spearman value: 84.87853941747623 - task: type: STS dataset: type: mteb/sts17-crosslingual-sts name: MTEB STS17 (en-en) config: en-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_spearman value: 89.46479925383916 - task: type: STS dataset: type: mteb/sts22-crosslingual-sts name: MTEB STS22 (en) config: en split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_spearman value: 66.45272113649146 - task: type: STS dataset: type: mteb/stsbenchmark-sts name: MTEB STSBenchmark config: default split: test revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 metrics: - type: cos_sim_spearman value: 86.43357313527851 - task: type: Reranking dataset: type: mteb/scidocs-reranking name: MTEB SciDocsRR config: default split: test revision: 
d3c5e1fc0b855ab6097bf1cda04dd73947d7caab metrics: - type: map value: 78.82761687254882 - type: mrr value: 93.46223674655047 - task: type: Retrieval dataset: type: scifact name: MTEB SciFact config: default split: test revision: None metrics: - type: map_at_1 value: 44.583 - type: map_at_10 value: 52.978 - type: map_at_100 value: 53.803 - type: map_at_1000 value: 53.839999999999996 - type: map_at_3 value: 50.03300000000001 - type: map_at_5 value: 51.939 - type: mrr_at_1 value: 47.0 - type: mrr_at_10 value: 54.730000000000004 - type: mrr_at_100 value: 55.31399999999999 - type: mrr_at_1000 value: 55.346 - type: mrr_at_3 value: 52.0 - type: mrr_at_5 value: 53.783 - type: ndcg_at_1 value: 47.0 - type: ndcg_at_10 value: 57.82899999999999 - type: ndcg_at_100 value: 61.49400000000001 - type: ndcg_at_1000 value: 62.676 - type: ndcg_at_3 value: 52.373000000000005 - type: ndcg_at_5 value: 55.481 - type: precision_at_1 value: 47.0 - type: precision_at_10 value: 7.867 - type: precision_at_100 value: 0.997 - type: precision_at_1000 value: 0.11 - type: precision_at_3 value: 20.556 - type: precision_at_5 value: 14.066999999999998 - type: recall_at_1 value: 44.583 - type: recall_at_10 value: 71.172 - type: recall_at_100 value: 87.7 - type: recall_at_1000 value: 97.333 - type: recall_at_3 value: 56.511 - type: recall_at_5 value: 64.206 - task: type: PairClassification dataset: type: mteb/sprintduplicatequestions-pairclassification name: MTEB SprintDuplicateQuestions config: default split: test revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 metrics: - type: cos_sim_accuracy value: 99.66237623762376 - type: cos_sim_ap value: 90.35465126226322 - type: cos_sim_f1 value: 82.44575936883628 - type: cos_sim_precision value: 81.32295719844358 - type: cos_sim_recall value: 83.6 - type: dot_accuracy value: 99.66237623762376 - type: dot_ap value: 90.35464287920453 - type: dot_f1 value: 82.44575936883628 - type: dot_precision value: 81.32295719844358 - type: dot_recall value: 83.6 - type: 
euclidean_accuracy value: 99.66237623762376 - type: euclidean_ap value: 90.3546512622632 - type: euclidean_f1 value: 82.44575936883628 - type: euclidean_precision value: 81.32295719844358 - type: euclidean_recall value: 83.6 - type: manhattan_accuracy value: 99.65940594059406 - type: manhattan_ap value: 90.29220174849843 - type: manhattan_f1 value: 82.4987605354487 - type: manhattan_precision value: 81.80924287118977 - type: manhattan_recall value: 83.2 - type: max_accuracy value: 99.66237623762376 - type: max_ap value: 90.35465126226322 - type: max_f1 value: 82.4987605354487 - task: type: Clustering dataset: type: mteb/stackexchange-clustering name: MTEB StackExchangeClustering config: default split: test revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 metrics: - type: v_measure value: 65.0394225901397 - task: type: Clustering dataset: type: mteb/stackexchange-clustering-p2p name: MTEB StackExchangeClusteringP2P config: default split: test revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: v_measure value: 35.27954189859326 - task: type: Reranking dataset: type: mteb/stackoverflowdupquestions-reranking name: MTEB StackOverflowDupQuestions config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - type: map value: 50.99055979974896 - type: mrr value: 51.82745257193787 - task: type: Summarization dataset: type: mteb/summeval name: MTEB SummEval config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cos_sim_pearson value: 30.21655465344237 - type: cos_sim_spearman value: 29.853205339630172 - type: dot_pearson value: 30.216540628083564 - type: dot_spearman value: 29.868978894753027 - task: type: Retrieval dataset: type: trec-covid name: MTEB TRECCOVID config: default split: test revision: None metrics: - type: map_at_1 value: 0.2 - type: map_at_10 value: 1.398 - type: map_at_100 value: 7.406 - type: map_at_1000 value: 18.401 - type: map_at_3 value: 0.479 - type: map_at_5 value: 
0.772 - type: mrr_at_1 value: 70.0 - type: mrr_at_10 value: 79.25999999999999 - type: mrr_at_100 value: 79.25999999999999 - type: mrr_at_1000 value: 79.25999999999999 - type: mrr_at_3 value: 77.333 - type: mrr_at_5 value: 78.133 - type: ndcg_at_1 value: 63.0 - type: ndcg_at_10 value: 58.548 - type: ndcg_at_100 value: 45.216 - type: ndcg_at_1000 value: 41.149 - type: ndcg_at_3 value: 60.641999999999996 - type: ndcg_at_5 value: 61.135 - type: precision_at_1 value: 70.0 - type: precision_at_10 value: 64.0 - type: precision_at_100 value: 46.92 - type: precision_at_1000 value: 18.642 - type: precision_at_3 value: 64.667 - type: precision_at_5 value: 66.4 - type: recall_at_1 value: 0.2 - type: recall_at_10 value: 1.6729999999999998 - type: recall_at_100 value: 10.856 - type: recall_at_1000 value: 38.964999999999996 - type: recall_at_3 value: 0.504 - type: recall_at_5 value: 0.852 - task: type: Retrieval dataset: type: webis-touche2020 name: MTEB Touche2020 config: default split: test revision: None metrics: - type: map_at_1 value: 1.6629999999999998 - type: map_at_10 value: 8.601 - type: map_at_100 value: 14.354 - type: map_at_1000 value: 15.927 - type: map_at_3 value: 4.1930000000000005 - type: map_at_5 value: 5.655 - type: mrr_at_1 value: 18.367 - type: mrr_at_10 value: 34.466 - type: mrr_at_100 value: 35.235 - type: mrr_at_1000 value: 35.27 - type: mrr_at_3 value: 28.571 - type: mrr_at_5 value: 31.531 - type: ndcg_at_1 value: 14.285999999999998 - type: ndcg_at_10 value: 20.374 - type: ndcg_at_100 value: 33.532000000000004 - type: ndcg_at_1000 value: 45.561 - type: ndcg_at_3 value: 18.442 - type: ndcg_at_5 value: 18.076 - type: precision_at_1 value: 18.367 - type: precision_at_10 value: 20.204 - type: precision_at_100 value: 7.489999999999999 - type: precision_at_1000 value: 1.5630000000000002 - type: precision_at_3 value: 21.769 - type: precision_at_5 value: 20.408 - type: recall_at_1 value: 1.6629999999999998 - type: recall_at_10 value: 15.549 - type: recall_at_100 
value: 47.497 - type: recall_at_1000 value: 84.524 - type: recall_at_3 value: 5.289 - type: recall_at_5 value: 8.035 - task: type: Classification dataset: type: mteb/toxic_conversations_50k name: MTEB ToxicConversationsClassification config: default split: test revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c metrics: - type: accuracy value: 71.8194 - type: ap value: 14.447702451658554 - type: f1 value: 55.13659412856185 - task: type: Classification dataset: type: mteb/tweet_sentiment_extraction name: MTEB TweetSentimentExtractionClassification config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 63.310696095076416 - type: f1 value: 63.360434851097814 - task: type: Clustering dataset: type: mteb/twentynewsgroups-clustering name: MTEB TwentyNewsgroupsClustering config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: v_measure value: 51.30677907335145 - task: type: PairClassification dataset: type: mteb/twittersemeval2015-pairclassification name: MTEB TwitterSemEval2015 config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 86.12386004649221 - type: cos_sim_ap value: 73.99096426215495 - type: cos_sim_f1 value: 68.18416968442834 - type: cos_sim_precision value: 66.86960933536275 - type: cos_sim_recall value: 69.55145118733509 - type: dot_accuracy value: 86.12386004649221 - type: dot_ap value: 73.99096813038672 - type: dot_f1 value: 68.18416968442834 - type: dot_precision value: 66.86960933536275 - type: dot_recall value: 69.55145118733509 - type: euclidean_accuracy value: 86.12386004649221 - type: euclidean_ap value: 73.99095984980165 - type: euclidean_f1 value: 68.18416968442834 - type: euclidean_precision value: 66.86960933536275 - type: euclidean_recall value: 69.55145118733509 - type: manhattan_accuracy value: 86.09405734040651 - type: manhattan_ap value: 73.96825745608601 - type: manhattan_f1 value: 
68.13888179729383 - type: manhattan_precision value: 65.99901088031652 - type: manhattan_recall value: 70.42216358839049 - type: max_accuracy value: 86.12386004649221 - type: max_ap value: 73.99096813038672 - type: max_f1 value: 68.18416968442834 - task: type: PairClassification dataset: type: mteb/twitterurlcorpus-pairclassification name: MTEB TwitterURLCorpus config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 88.99367407924865 - type: cos_sim_ap value: 86.19720829843081 - type: cos_sim_f1 value: 78.39889075384951 - type: cos_sim_precision value: 74.5110278818144 - type: cos_sim_recall value: 82.71481367416075 - type: dot_accuracy value: 88.99367407924865 - type: dot_ap value: 86.19718471454047 - type: dot_f1 value: 78.39889075384951 - type: dot_precision value: 74.5110278818144 - type: dot_recall value: 82.71481367416075 - type: euclidean_accuracy value: 88.99367407924865 - type: euclidean_ap value: 86.1972021422436 - type: euclidean_f1 value: 78.39889075384951 - type: euclidean_precision value: 74.5110278818144 - type: euclidean_recall value: 82.71481367416075 - type: manhattan_accuracy value: 88.95680521597392 - type: manhattan_ap value: 86.16659921351506 - type: manhattan_f1 value: 78.39125971550081 - type: manhattan_precision value: 74.82502799552073 - type: manhattan_recall value: 82.31444410224823 - type: max_accuracy value: 88.99367407924865 - type: max_ap value: 86.19720829843081 - type: max_f1 value: 78.39889075384951
---

# hkunlp/instructor-base

We introduce **Instructor**👨‍🏫, an instruction-finetuned text embedding model that can generate text embeddings tailored to any task (e.g., classification, retrieval, clustering, text evaluation, etc.) and any domain (e.g., science, finance, etc.) ***by simply providing the task instruction, without any finetuning***.

Instructor👨‍🏫 achieves state-of-the-art performance on 70 diverse embedding tasks!
The model is easy to use with **our customized** `sentence-transformer` library. For more details, check out [our paper](https://arxiv.org/abs/2212.09741) and [project page](https://instructor-embedding.github.io/)!

**************************** **Updates** ****************************

* 01/21: We released a new [checkpoint](https://huggingface.co/hkunlp/instructor-base) trained with hard negatives, which gives better performance.
* 12/21: We released our [paper](https://arxiv.org/abs/2212.09741), [code](https://github.com/HKUNLP/instructor-embedding), [checkpoint](https://huggingface.co/hkunlp/instructor-base) and [project page](https://instructor-embedding.github.io/)! Check them out!

## Quick start

<hr />

## Installation

```bash
pip install InstructorEmbedding
```

## Compute your customized embeddings

Then you can use the model like this to calculate domain-specific and task-aware embeddings:

```python
from InstructorEmbedding import INSTRUCTOR

model = INSTRUCTOR('hkunlp/instructor-base')
sentence = "3D ActionSLAM: wearable person tracking in multi-floor environments"
instruction = "Represent the Science title:"
embeddings = model.encode([[instruction, sentence]])
print(embeddings)
```

## Use cases

<hr />

## Calculate embeddings for your customized texts

If you want to calculate customized embeddings for specific sentences, you may follow the unified template to write instructions:

&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;Represent the `domain` `text_type` for `task_objective`:

* `domain` is optional, and it specifies the domain of the text, e.g., science, finance, medicine, etc.
* `text_type` is required, and it specifies the encoding unit, e.g., sentence, document, paragraph, etc.
* `task_objective` is optional, and it specifies the objective of embedding, e.g., retrieve a document, classify the sentence, etc.
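To make the template concrete, here is a small sketch that assembles an instruction from the three slots. The `build_instruction` helper is our own illustration, not part of the `InstructorEmbedding` library:

```python
def build_instruction(text_type, domain=None, task_objective=None):
    """Assemble an instruction from the unified template:
    'Represent the <domain> <text_type> for <task_objective>:'.
    Only text_type is required; domain and task_objective are optional."""
    parts = ["Represent the"]
    if domain:
        parts.append(domain)
    parts.append(text_type)
    if task_objective:
        parts.append(f"for {task_objective}")
    return " ".join(parts) + ":"

# Reproduces the instructions used in the examples on this page:
print(build_instruction("title", domain="Science"))
# Represent the Science title:
print(build_instruction("document", domain="Wikipedia", task_objective="retrieval"))
# Represent the Wikipedia document for retrieval:
```

The resulting string is then paired with the text to encode, e.g. `model.encode([[build_instruction("title", domain="Science"), sentence]])`.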
## Calculate Sentence similarities

You can further use the model to compute similarities between two groups of sentences, with **customized embeddings**.

```python
from sklearn.metrics.pairwise import cosine_similarity

sentences_a = [['Represent the Science sentence: ', 'Parton energy loss in QCD matter'],
               ['Represent the Financial statement: ', 'The Federal Reserve on Wednesday raised its benchmark interest rate.']]
sentences_b = [['Represent the Science sentence: ', 'The Chiral Phase Transition in Dissipative Dynamics'],
               ['Represent the Financial statement: ', 'The funds rose less than 0.5 per cent on Friday']]
embeddings_a = model.encode(sentences_a)
embeddings_b = model.encode(sentences_b)
similarities = cosine_similarity(embeddings_a, embeddings_b)
print(similarities)
```

## Information Retrieval

You can also use **customized embeddings** for information retrieval.

```python
import numpy as np
from sklearn.metrics.pairwise import cosine_similarity

query = [['Represent the Wikipedia question for retrieving supporting documents: ', 'where is the food stored in a yam plant']]
corpus = [['Represent the Wikipedia document for retrieval: ', 'Capitalism has been dominant in the Western world since the end of feudalism, but most feel[who?] that the term "mixed economies" more precisely describes most contemporary economies, due to their containing both private-owned and state-owned enterprises. In capitalism, prices determine the demand-supply scale. For example, higher demand for certain goods and services lead to higher prices and lower demand for certain goods lead to lower prices.'],
          ['Represent the Wikipedia document for retrieval: ', "The disparate impact theory is especially controversial under the Fair Housing Act because the Act regulates many activities relating to housing, insurance, and mortgage loans—and some scholars have argued that the theory's use under the Fair Housing Act, combined with extensions of the Community Reinvestment Act, contributed to rise of sub-prime lending and the crash of the U.S. housing market and ensuing global economic recession"],
          ['Represent the Wikipedia document for retrieval: ', 'Disparate impact in United States labor law refers to practices in employment, housing, and other areas that adversely affect one group of people of a protected characteristic more than another, even though rules applied by employers or landlords are formally neutral. Although the protected classes vary by statute, most federal civil rights laws protect based on race, color, religion, national origin, and sex as protected traits, and some laws include disability status and other traits as well.']]
query_embeddings = model.encode(query)
corpus_embeddings = model.encode(corpus)
similarities = cosine_similarity(query_embeddings, corpus_embeddings)
retrieved_doc_id = np.argmax(similarities)
print(retrieved_doc_id)
```

## Clustering

Use **customized embeddings** for clustering texts in groups.
```python
import sklearn.cluster

sentences = [['Represent the Medicine sentence for clustering: ', 'Dynamical Scalar Degree of Freedom in Horava-Lifshitz Gravity'],
             ['Represent the Medicine sentence for clustering: ', 'Comparison of Atmospheric Neutrino Flux Calculations at Low Energies'],
             ['Represent the Medicine sentence for clustering: ', 'Fermion Bags in the Massive Gross-Neveu Model'],
             ['Represent the Medicine sentence for clustering: ', "QCD corrections to Associated t-tbar-H production at the Tevatron"],
             ['Represent the Medicine sentence for clustering: ', 'A New Analysis of the R Measurements: Resonance Parameters of the Higher, Vector States of Charmonium']]
embeddings = model.encode(sentences)
clustering_model = sklearn.cluster.MiniBatchKMeans(n_clusters=2)
clustering_model.fit(embeddings)
cluster_assignment = clustering_model.labels_
print(cluster_assignment)
```
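If you want to see which sentences ended up together, a small sketch can map the labels back to the input texts. The `group_by_cluster` helper is our own illustration, not part of the library:

```python
from collections import defaultdict

def group_by_cluster(labels, texts):
    """Group texts by their cluster label (e.g. MiniBatchKMeans .labels_)."""
    groups = defaultdict(list)
    for label, text in zip(labels, texts):
        groups[int(label)].append(text)
    return dict(groups)

# With the clustering above you would pass cluster_assignment and the raw sentences:
print(group_by_cluster([0, 1, 0], ["paper A", "paper B", "paper C"]))
# {0: ['paper A', 'paper C'], 1: ['paper B']}
```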
-0.0185394287109375, -0.036407470703125, 0.0308685302734375, 0.016510009765625, 0.007366180419921875, 0.051788330078125, 0.0635986328125, -0.044952392578125, 0.005035400390625, -0.02545166015625, -0.018646240234375, -0.037322998046875, 0.0538330078125, -0.01617431640625, -0.08245849609375, 0.0203094482421875, 0.00260162353515625, 0.006641387939453125, 0.050384521484375, 0.044708251953125, -0.01096343994140625, 0.05999755859375, 0.0587158203125, -0.010833740234375, 0.04412841796875, -0.04290771484375, 0.0223541259765625, -0.058929443359375, -0.03387451171875, -0.039154052734375, -0.028961181640625, -0.06121826171875, -0.040771484375, 0.01105499267578125, 0.0225067138671875, -0.020538330078125, 0.045257568359375, -0.043487548828125, 0.0211944580078125, 0.04998779296875, 0.0124359130859375, -0.0040283203125, 0.0141448974609375, -0.0255889892578125, -0.01099395751953125, -0.062164306640625, -0.042755126953125, 0.09033203125, 0.02862548828125, 0.0584716796875, -0.0158843994140625, 0.07562255859375, 0.019805908203125, 0.003627777099609375, -0.05731201171875, 0.041168212890625, -0.0285491943359375, -0.033843994140625, -0.011383056640625, -0.033935546875, -0.08062744140625, 0.0222930908203125, -0.032562255859375, -0.061004638671875, 0.01209259033203125, -0.0013284683227539062, -0.0206146240234375, 0.035125732421875, -0.039031982421875, 0.0806884765625, -0.007701873779296875, -0.0136871337890625, -0.037384033203125, -0.0308074951171875, 0.002925872802734375, 0.007320404052734375, 0.02056884765625, -0.0039043426513671875, -0.00464630126953125, 0.07830810546875, -0.0252685546875, 0.07489013671875, -0.01313018798828125, 0.022491455078125, 0.0308685302734375, -0.025360107421875, 0.011688232421875, -0.0021533966064453125, -0.004047393798828125, 0.0080108642578125, 0.0269775390625, -0.041046142578125, -0.037506103515625, 0.06988525390625, -0.07171630859375, -0.0322265625, -0.03350830078125, -0.05450439453125, 0.021026611328125, 0.018096923828125, 0.021148681640625, 
0.0243377685546875, -0.0219268798828125, 0.0311737060546875, 0.02813720703125, -0.033355712890625, 0.0218505859375, 0.01248931884765625, -0.007038116455078125, -0.03863525390625, 0.08489990234375, 0.009490966796875, -0.0052947998046875, 0.04046630859375, 0.0231475830078125, -0.018280029296875, -0.018951416015625, -0.007843017578125, 0.027618408203125, -0.049652099609375, -0.01079559326171875, -0.07733154296875, -0.013671875, -0.054443359375, -0.0287628173828125, -0.0118408203125, -0.048431396484375, -0.044403076171875, -0.00922393798828125, 0.02777099609375, 0.07183837890625, -0.01338958740234375, 0.0161895751953125, -0.052001953125, 0.01477813720703125, 0.0007653236389160156, 0.0012683868408203125, 0.00688934326171875, -0.0225677490234375, -0.050384521484375, 0.01025390625, -0.043914794921875, -0.06231689453125, 0.025390625, 0.01509857177734375, 0.0670166015625, 0.03619384765625, 0.00809478759765625, 0.052825927734375, -0.04632568359375, 0.07696533203125, 0.0261993408203125, -0.0712890625, 0.037750244140625, 0.0021152496337890625, -0.01229095458984375, 0.0277099609375, 0.05426025390625, -0.0400390625, -0.037261962890625, -0.05572509765625, -0.0709228515625, 0.0364990234375, 0.0113525390625, 0.02490234375, -0.01279449462890625, 0.037353515625, 0.0164337158203125, 0.01464080810546875, -0.074462890625, -0.042755126953125, -0.0293426513671875, -0.049652099609375, -0.0046844482421875, -0.009246826171875, -0.00200653076171875, -0.03802490234375, 0.04736328125, 0.00499725341796875, 0.037811279296875, 0.022216796875, -0.026702880859375, 0.0171966552734375, 0.0204620361328125, 0.03741455078125, 0.025360107421875, -0.0168914794921875, 0.0181884765625, 0.0274810791015625, -0.049652099609375, 0.00846099853515625, 0.008941650390625, -0.007038116455078125, 0.00815582275390625, 0.028961181640625, 0.0487060546875, 0.0328369140625, -0.04608154296875, 0.045196533203125, 0.0162200927734375, -0.0209503173828125, -0.0297088623046875, 0.018280029296875, 0.00473785400390625, 
0.03326416015625, 0.02154541015625, -0.0010242462158203125, 0.01265716552734375, -0.055938720703125, 0.01641845703125, 0.011322021484375, -0.034942626953125, -0.0272979736328125, 0.045257568359375, 0.00042247772216796875, -0.01470184326171875, 0.0343017578125, -0.01885986328125, -0.06451416015625, 0.055419921875, 0.06494140625, 0.051605224609375, -0.0014810562133789062, 0.0158538818359375, 0.05010986328125, 0.01316070556640625, -0.0090484619140625, 0.025634765625, 0.005035400390625, -0.051177978515625, -0.00341796875, -0.043853759765625, -0.0205230712890625, 0.00013184547424316406, -0.0576171875, 0.0115509033203125, -0.036773681640625, -0.010894775390625, -0.0085601806640625, 0.01412200927734375, -0.058685302734375, 0.00615692138671875, -0.007350921630859375, 0.057891845703125, -0.06964111328125, 0.061798095703125, 0.086669921875, -0.03729248046875, -0.043365478515625, 0.005096435546875, -0.01222991943359375, -0.049102783203125, 0.036346435546875, 0.033721923828125, 0.030975341796875, 0.006839752197265625, -0.043426513671875, -0.05908203125, 0.103759765625, 0.00614166259765625, -0.0282440185546875, -0.0092010498046875, 0.01123809814453125, 0.03472900390625, -0.04608154296875, 0.00640869140625, 0.0286102294921875, 0.03564453125, -0.04193115234375, -0.046173095703125, 0.0288238525390625, -0.0207366943359375, -0.01348876953125, -0.0137939453125, -0.04656982421875, 0.0823974609375, -0.026123046875, -0.0030956268310546875, 0.014404296875, 0.04913330078125, 0.011260986328125, 0.038055419921875, 0.039031982421875, 0.07012939453125, 0.0694580078125, -0.0015516281127929688, 0.08685302734375, -0.036102294921875, 0.05224609375, 0.0640869140625, 0.01282501220703125, 0.075927734375, 0.0279998779296875, -0.02850341796875, 0.065673828125, 0.058258056640625, -0.03277587890625, 0.052947998046875, 0.01837158203125, -0.002407073974609375, -0.00386810302734375, -0.008056640625, -0.0546875, 0.0218658447265625, 0.03076171875, -0.0245819091796875, 0.0025386810302734375, 
0.015045166015625, 0.006618499755859375, 0.0054473876953125, -0.0115203857421875, 0.048431396484375, 0.022918701171875, -0.0212554931640625, 0.02838134765625, 0.01224517822265625, 0.07269287109375, -0.03204345703125, -0.00693511962890625, 0.004375457763671875, 0.020294189453125, -0.0330810546875, -0.05914306640625, 0.007411956787109375, -0.006053924560546875, -0.01554107666015625, 0.0007886886596679688, 0.044708251953125, -0.05126953125, -0.0265350341796875, 0.03131103515625, 0.022003173828125, 0.0290985107421875, 0.011383056640625, -0.0693359375, -0.01195526123046875, 0.011627197265625, -0.0173492431640625, 0.027618408203125, 0.019805908203125, 0.01300048828125, 0.03350830078125, 0.052886962890625, -0.0091705322265625, 0.011016845703125, -0.00878143310546875, 0.0672607421875, -0.065673828125, -0.052520751953125, -0.057220458984375, 0.0364990234375, -0.0038909912109375, -0.0172882080078125, 0.057830810546875, 0.065185546875, 0.07861328125, -0.01465606689453125, 0.06231689453125, -0.01715087890625, 0.023284912109375, -0.036651611328125, 0.06805419921875, -0.065185546875, -0.007518768310546875, -0.0298004150390625, -0.0714111328125, -0.01371002197265625, 0.07916259765625, -0.0191497802734375, -0.007297515869140625, 0.06536865234375, 0.057830810546875, -0.008819580078125, -0.01107025146484375, 0.011474609375, 0.0251007080078125, 0.01255035400390625, 0.039215087890625, 0.05072021484375, -0.050537109375, 0.04571533203125, -0.042755126953125, -0.00386810302734375, -0.035491943359375, -0.0494384765625, -0.07659912109375, -0.05938720703125, -0.04364013671875, -0.030670166015625, 0.001163482666015625, 0.07635498046875, 0.034942626953125, -0.073974609375, -0.0208587646484375, 0.00362396240234375, 0.022430419921875, -0.026153564453125, -0.024017333984375, 0.049072265625, -0.0210113525390625, -0.056610107421875, 0.018524169921875, -0.005809783935546875, 0.01195526123046875, -0.00844573974609375, -0.0126800537109375, -0.03643798828125, -0.00807952880859375, 0.03985595703125, 
0.002643585205078125, -0.06256103515625, -0.025604248046875, -0.0033855438232421875, -0.0269927978515625, 0.0019330978393554688, 0.0474853515625, -0.037322998046875, 0.0095672607421875, 0.0435791015625, 0.053863525390625, 0.037261962890625, -0.0187225341796875, 0.0242156982421875, -0.04437255859375, 0.0128173828125, 0.008148193359375, 0.040679931640625, 0.01438140869140625, -0.03216552734375, 0.040313720703125, 0.028289794921875, -0.04180908203125, -0.0648193359375, -0.00870513916015625, -0.07867431640625, -0.03656005859375, 0.06488037109375, -0.039886474609375, -0.025665283203125, -0.00162506103515625, -0.01800537109375, 0.03863525390625, -0.032318115234375, 0.05804443359375, 0.04754638671875, 0.0000336766242980957, 0.005584716796875, -0.037994384765625, 0.022369384765625, 0.042633056640625, -0.048980712890625, -0.033172607421875, 0.0195465087890625, 0.041656494140625, 0.0257568359375, 0.02508544921875, 0.004444122314453125, 0.01434326171875, 0.0148773193359375, 0.00553131103515625, -0.0067291259765625, -0.00042366981506347656, -0.0126800537109375, 0.00949859619140625, -0.03179931640625, -0.02630615234375 ] ]
tner/roberta-large-ontonotes5
2022-09-26T14:12:05.000Z
[ "transformers", "pytorch", "roberta", "token-classification", "dataset:tner/ontonotes5", "model-index", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
token-classification
tner
null
null
tner/roberta-large-ontonotes5
5
122,733
transformers
2022-08-12T10:33:41
---
datasets:
- tner/ontonotes5
metrics:
- f1
- precision
- recall
model-index:
- name: tner/roberta-large-ontonotes5
  results:
  - task:
      name: Token Classification
      type: token-classification
    dataset:
      name: tner/ontonotes5
      type: tner/ontonotes5
      args: tner/ontonotes5
    metrics:
    - name: F1
      type: f1
      value: 0.908632361399938
    - name: Precision
      type: precision
      value: 0.905148095909732
    - name: Recall
      type: recall
      value: 0.9121435551212579
    - name: F1 (macro)
      type: f1_macro
      value: 0.8265477704565624
    - name: Precision (macro)
      type: precision_macro
      value: 0.8170668848546687
    - name: Recall (macro)
      type: recall_macro
      value: 0.8387672780349001
    - name: F1 (entity span)
      type: f1_entity_span
      value: 0.9284544931640193
    - name: Precision (entity span)
      type: precision_entity_span
      value: 0.9248942172073342
    - name: Recall (entity span)
      type: recall_entity_span
      value: 0.9320422848005685
pipeline_tag: token-classification
widget:
- text: "Jacob Collier is a Grammy awarded artist from England."
  example_title: "NER Example 1"
---

# tner/roberta-large-ontonotes5

This model is a fine-tuned version of [roberta-large](https://huggingface.co/roberta-large) on the [tner/ontonotes5](https://huggingface.co/datasets/tner/ontonotes5) dataset. Model fine-tuning is done via [T-NER](https://github.com/asahi417/tner)'s hyper-parameter search (see the repository for more detail).
It achieves the following results on the test set:
- F1 (micro): 0.908632361399938
- Precision (micro): 0.905148095909732
- Recall (micro): 0.9121435551212579
- F1 (macro): 0.8265477704565624
- Precision (macro): 0.8170668848546687
- Recall (macro): 0.8387672780349001

The per-entity breakdown of the F1 score on the test set is below:
- cardinal_number: 0.8605277329025309
- date: 0.872996300863132
- event: 0.7424242424242424
- facility: 0.7732342007434945
- geopolitical_area: 0.9687148323205043
- group: 0.9470588235294117
- language: 0.7499999999999999
- law: 0.6666666666666666
- location: 0.7593582887700535
- money: 0.901098901098901
- ordinal_number: 0.85785536159601
- organization: 0.9227360841872057
- percent: 0.9171428571428571
- person: 0.9556004036326943
- product: 0.7857142857142858
- quantity: 0.7945205479452055
- time: 0.6870588235294116
- work_of_art: 0.7151515151515151

For F1 scores, the confidence intervals are obtained by bootstrap as below:
- F1 (micro):
    - 90%: [0.9039454247544766, 0.9128956119702822]
    - 95%: [0.9030263216115454, 0.9138350859566045]
- F1 (macro):
    - 90%: [0.9039454247544766, 0.9128956119702822]
    - 95%: [0.9030263216115454, 0.9138350859566045]

Full evaluation can be found at [metric file of NER](https://huggingface.co/tner/roberta-large-ontonotes5/raw/main/eval/metric.json) and [metric file of entity span](https://huggingface.co/tner/roberta-large-ontonotes5/raw/main/eval/metric_span.json).

### Usage
This model can be used through the [tner library](https://github.com/asahi417/tner). Install the library via pip
```shell
pip install tner
```
and activate the model as below.
```python
from tner import TransformersNER
model = TransformersNER("tner/roberta-large-ontonotes5")
model.predict(["Jacob Collier is a Grammy awarded English artist from London"])
```
The model can also be used via the transformers library, but this is not recommended, as the CRF layer is not supported at the moment.
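For readers who nevertheless want a plain-transformers path, the sketch below is an assumption on our part (it is not the recommended usage from the card): loading the checkpoint through the generic token-classification pipeline bypasses the CRF layer used during fine-tuning, so predictions may differ slightly from tner's.

```python
from transformers import pipeline

# Load the checkpoint directly with transformers. Note: this skips the
# CRF layer, so outputs may differ slightly from tner's predictions.
ner = pipeline(
    "token-classification",
    model="tner/roberta-large-ontonotes5",
    aggregation_strategy="simple",  # merge sub-word tokens into entity spans
)
entities = ner("Jacob Collier is a Grammy awarded artist from England.")
for entity in entities:
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```

With `aggregation_strategy="simple"`, each result dict carries an `entity_group`, the surface `word`, a confidence `score`, and character offsets.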
### Training hyperparameters

The following hyperparameters were used during training:
 - dataset: ['tner/ontonotes5']
 - dataset_split: train
 - dataset_name: None
 - local_dataset: None
 - model: roberta-large
 - crf: True
 - max_length: 128
 - epoch: 15
 - batch_size: 64
 - lr: 1e-05
 - random_seed: 42
 - gradient_accumulation_steps: 1
 - weight_decay: None
 - lr_warmup_step_ratio: 0.1
 - max_grad_norm: 10.0

The full configuration can be found at [fine-tuning parameter file](https://huggingface.co/tner/roberta-large-ontonotes5/raw/main/trainer_config.json).

### Reference
If you use any resource from T-NER, please consider citing our [paper](https://aclanthology.org/2021.eacl-demos.7/).

```
@inproceedings{ushio-camacho-collados-2021-ner,
    title = "{T}-{NER}: An All-Round Python Library for Transformer-based Named Entity Recognition",
    author = "Ushio, Asahi and Camacho-Collados, Jose",
    booktitle = "Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: System Demonstrations",
    month = apr,
    year = "2021",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2021.eacl-demos.7",
    doi = "10.18653/v1/2021.eacl-demos.7",
    pages = "53--62",
    abstract = "Language model (LM) pretraining has led to consistent improvements in many NLP downstream tasks, including named entity recognition (NER). In this paper, we present T-NER (Transformer-based Named Entity Recognition), a Python library for NER LM finetuning. In addition to its practical utility, T-NER facilitates the study and investigation of the cross-domain and cross-lingual generalization ability of LMs finetuned on NER. Our library also provides a web app where users can get model predictions interactively for arbitrary text, which facilitates qualitative model evaluation for non-expert programmers.
We show the potential of the library by compiling nine public NER datasets into a unified format and evaluating the cross-domain and cross-lingual performance across the datasets. The results from our initial experiments show that in-domain performance is generally competitive across datasets. However, cross-domain generalization is challenging even with a large pretrained LM, which has nevertheless capacity to learn domain-specific features if fine-tuned on a combined dataset. To facilitate future research, we also release all our LM checkpoints via the Hugging Face model hub.",
}
```
6,027
[ [ -0.037994384765625, -0.0540771484375, 0.0311737060546875, 0.01422119140625, -0.01120758056640625, -0.01241302490234375, -0.04241943359375, -0.0308990478515625, 0.02996826171875, 0.02728271484375, -0.033599853515625, -0.05377197265625, -0.05810546875, 0.0210418701171875, -0.0235443115234375, 0.08221435546875, -0.0032062530517578125, 0.0067138671875, 0.012847900390625, -0.0079193115234375, -0.01395416259765625, -0.0302581787109375, -0.07684326171875, -0.017120361328125, 0.045684814453125, 0.0198822021484375, 0.024078369140625, 0.0394287109375, 0.0460205078125, 0.025848388671875, -0.01123809814453125, 0.0005235671997070312, -0.0302581787109375, -0.0016355514526367188, -0.0026912689208984375, -0.037811279296875, -0.043731689453125, -0.0015163421630859375, 0.06005859375, 0.0304412841796875, 0.00824737548828125, 0.03546142578125, 0.00872802734375, 0.0333251953125, -0.029205322265625, 0.01727294921875, -0.037200927734375, -0.00746917724609375, -0.019012451171875, -0.01438140869140625, -0.0274810791015625, -0.020416259765625, 0.01416015625, -0.05322265625, 0.031463623046875, 0.0218963623046875, 0.10870361328125, 0.0252532958984375, -0.0176544189453125, 0.0031948089599609375, -0.047454833984375, 0.07830810546875, -0.05596923828125, 0.0361328125, 0.017791748046875, 0.013458251953125, -0.01302337646484375, -0.06463623046875, -0.052154541015625, -0.01021575927734375, -0.01154327392578125, -0.0009226799011230469, -0.0256500244140625, -0.0032501220703125, 0.0280914306640625, 0.033599853515625, -0.038818359375, 0.00551605224609375, -0.0246734619140625, -0.01074981689453125, 0.043975830078125, 0.0118408203125, 0.00905609130859375, -0.0208740234375, -0.022979736328125, -0.0318603515625, -0.0308837890625, 0.005535125732421875, 0.00489044189453125, 0.04522705078125, -0.025054931640625, 0.03948974609375, -0.00724029541015625, 0.05352783203125, 0.031829833984375, -0.013275146484375, 0.0406494140625, -0.0184326171875, -0.0241241455078125, -0.00823211669921875, 0.08746337890625, 
0.02569580078125, 0.0330810546875, -0.005031585693359375, -0.020965576171875, -0.01497650146484375, 0.006412506103515625, -0.06878662109375, -0.015380859375, 0.01361846923828125, -0.0269927978515625, -0.02288818359375, 0.00865936279296875, -0.064208984375, -0.0059051513671875, -0.0288238525390625, 0.03521728515625, -0.041961669921875, -0.0172271728515625, 0.0008935928344726562, -0.01229095458984375, 0.0166168212890625, 0.019744873046875, -0.049468994140625, 0.0223541259765625, 0.0391845703125, 0.06414794921875, -0.0006694793701171875, -0.01392364501953125, -0.033233642578125, -0.01210784912109375, -0.009613037109375, 0.05279541015625, -0.03021240234375, -0.01953125, -0.022369384765625, 0.025115966796875, -0.0301361083984375, -0.0248260498046875, 0.051177978515625, -0.034698486328125, 0.026458740234375, -0.006412506103515625, -0.046875, -0.01117706298828125, 0.03363037109375, -0.060821533203125, 0.08758544921875, 0.0177764892578125, -0.06512451171875, 0.03680419921875, -0.050567626953125, -0.0205078125, -0.00974273681640625, -0.0011339187622070312, -0.06329345703125, -0.0046539306640625, 0.0238037109375, 0.04266357421875, -0.0418701171875, 0.025787353515625, -0.028656005859375, -0.030517578125, 0.017486572265625, -0.0253448486328125, 0.06005859375, -0.004238128662109375, -0.042449951171875, 0.003986358642578125, -0.07891845703125, 0.0249481201171875, 0.0159759521484375, -0.0240478515625, -0.023590087890625, -0.040252685546875, 0.040191650390625, 0.0161895751953125, 0.00736236572265625, -0.039093017578125, 0.00830078125, -0.0254974365234375, 0.035186767578125, 0.04156494140625, 0.021148681640625, 0.02447509765625, -0.0200958251953125, 0.03155517578125, 0.0063018798828125, 0.0018291473388671875, 0.015655517578125, -0.0297698974609375, -0.0716552734375, -0.0188140869140625, 0.063720703125, 0.03338623046875, -0.03558349609375, 0.051605224609375, -0.038055419921875, -0.049652099609375, -0.040771484375, -0.012603759765625, 0.0347900390625, 0.04888916015625, 
0.057708740234375, 0.0018453598022460938, -0.054962158203125, -0.059051513671875, -0.0137786865234375, 0.0013265609741210938, -0.00626373291015625, 0.0150146484375, 0.0550537109375, -0.0050506591796875, 0.05023193359375, -0.0341796875, -0.03900146484375, -0.0192718505859375, -0.003353118896484375, 0.0296783447265625, 0.051116943359375, 0.034393310546875, -0.0587158203125, -0.04522705078125, -0.007396697998046875, -0.0440673828125, 0.0198211669921875, -0.010589599609375, 0.00215911865234375, 0.0279388427734375, 0.035430908203125, -0.05340576171875, 0.03277587890625, 0.033447265625, -0.031890869140625, 0.035797119140625, -0.00736236572265625, 0.0112152099609375, -0.1065673828125, 0.025787353515625, 0.040313720703125, -0.004779815673828125, -0.042236328125, -0.0048980712890625, 0.00844573974609375, 0.01654052734375, -0.01407623291015625, 0.060394287109375, -0.037322998046875, 0.01264190673828125, 0.005466461181640625, 0.00829315185546875, 0.006500244140625, 0.036407470703125, 0.006580352783203125, 0.05224609375, 0.036865234375, -0.0445556640625, 0.0031452178955078125, 0.0244903564453125, -0.0302886962890625, 0.038909912109375, -0.06195068359375, -0.001773834228515625, 0.01404571533203125, 0.020843505859375, -0.04931640625, -0.012298583984375, 0.0192108154296875, -0.04736328125, 0.0504150390625, -0.01666259765625, -0.02960205078125, -0.0297088623046875, -0.00992584228515625, 0.01953125, 0.0244598388671875, -0.0302581787109375, 0.05474853515625, 0.0201263427734375, 0.0030460357666015625, -0.050628662109375, -0.052490234375, -0.00194549560546875, -0.031646728515625, -0.041748046875, 0.039825439453125, 0.004894256591796875, -0.0089569091796875, -0.0003707408905029297, -0.0037212371826171875, -0.0025482177734375, -0.00959014892578125, 0.0235443115234375, 0.035736083984375, -0.0310821533203125, 0.0177154541015625, -0.0140838623046875, -0.0288238525390625, -0.00386810302734375, -0.024444580078125, 0.056488037109375, -0.0232391357421875, 0.005268096923828125, 
-0.04974365234375, -0.0142669677734375, 0.0282745361328125, -0.0266265869140625, 0.0635986328125, 0.06695556640625, -0.0284423828125, 0.005252838134765625, -0.0482177734375, -0.01117706298828125, -0.03497314453125, 0.034515380859375, -0.046783447265625, -0.06768798828125, 0.03668212890625, -0.002368927001953125, -0.006195068359375, 0.07177734375, 0.0247344970703125, 0.0015439987182617188, 0.0546875, 0.041412353515625, -0.01293182373046875, 0.0340576171875, -0.050262451171875, 0.016387939453125, -0.07293701171875, -0.0239410400390625, -0.05523681640625, -0.03363037109375, -0.06329345703125, -0.0290069580078125, 0.0211181640625, 0.0037822723388671875, -0.044586181640625, 0.04156494140625, -0.039154052734375, 0.022613525390625, 0.048828125, 0.0128326416015625, 0.01611328125, -0.0037212371826171875, -0.0051422119140625, -0.01313018798828125, -0.0299224853515625, -0.0305633544921875, 0.0919189453125, 0.02410888671875, 0.040191650390625, -0.0141448974609375, 0.0657958984375, -0.00574493408203125, 0.01532745361328125, -0.055389404296875, 0.039276123046875, -0.01094818115234375, -0.049346923828125, -0.04168701171875, -0.04779052734375, -0.07421875, 0.0043487548828125, -0.0382080078125, -0.067626953125, 0.02288818359375, 0.00482177734375, -0.03424072265625, 0.036712646484375, -0.032623291015625, 0.07012939453125, -0.005290985107421875, -0.02154541015625, 0.006927490234375, -0.0438232421875, 0.007305145263671875, -0.0080108642578125, -0.003509521484375, 0.005245208740234375, 0.0019311904907226562, 0.061004638671875, -0.01910400390625, 0.046600341796875, -0.01561737060546875, 0.015960693359375, 0.00983428955078125, -0.0098419189453125, 0.04473876953125, 0.0115814208984375, -0.02203369140625, 0.02783203125, -0.004756927490234375, -0.0228271484375, -0.0435791015625, 0.061126708984375, -0.06427001953125, -0.03155517578125, -0.036224365234375, -0.0399169921875, -0.00815582275390625, 0.0347900390625, 0.037261962890625, 0.03436279296875, -0.0145263671875, 0.0297088623046875, 
0.05224609375, -0.009521484375, 0.0312347412109375, 0.0257415771484375, -0.0016841888427734375, -0.0423583984375, 0.06298828125, 0.00988006591796875, 0.00775909423828125, 0.0433349609375, 0.0032176971435546875, -0.0296478271484375, -0.038360595703125, -0.032257080078125, 0.037506103515625, -0.0266265869140625, -0.0272369384765625, -0.06756591796875, -0.0269622802734375, -0.03082275390625, 0.004119873046875, -0.0279388427734375, -0.04852294921875, -0.03643798828125, -0.01108551025390625, 0.037994384765625, 0.040802001953125, 0.01155853271484375, 0.016632080078125, -0.04742431640625, 0.01024627685546875, -0.01317596435546875, 0.03326416015625, 0.001178741455078125, -0.06573486328125, -0.01525115966796875, -0.0016326904296875, -0.01885986328125, -0.037872314453125, 0.037261962890625, 0.01605224609375, 0.025970458984375, 0.0281982421875, 0.003509521484375, 0.0675048828125, -0.026641845703125, 0.04620361328125, 0.013641357421875, -0.0614013671875, 0.0271759033203125, -0.01392364501953125, 0.01178741455078125, 0.049468994140625, 0.03814697265625, -0.03759765625, -0.031463623046875, -0.07391357421875, -0.0667724609375, 0.054840087890625, 0.01450347900390625, -0.012054443359375, 0.012054443359375, 0.0220947265625, -0.01499176025390625, -0.0018215179443359375, -0.061309814453125, -0.036376953125, -0.006450653076171875, -0.0283050537109375, -0.0301055908203125, 0.0032062530517578125, -0.005214691162109375, -0.0305023193359375, 0.06591796875, 0.0008401870727539062, 0.020477294921875, 0.020660400390625, -0.0032062530517578125, -0.01039886474609375, 0.0216064453125, 0.033355712890625, 0.0255889892578125, -0.0262298583984375, -0.0085906982421875, 0.025848388671875, -0.041717529296875, -0.0001424551010131836, 0.035369873046875, -0.021026611328125, 0.00441741943359375, 0.0296478271484375, 0.06298828125, 0.0219573974609375, -0.021026611328125, 0.0333251953125, -0.019775390625, -0.0257415771484375, -0.04888916015625, 0.011688232421875, 0.0064849853515625, 0.0225372314453125, 
0.02227783203125, 0.02288818359375, 0.0020771026611328125, -0.01178741455078125, 0.015716552734375, 0.0230865478515625, -0.042022705078125, -0.0240020751953125, 0.050628662109375, -0.00970458984375, -0.022003173828125, 0.06024169921875, -0.009857177734375, -0.030059814453125, 0.0550537109375, 0.0467529296875, 0.0623779296875, -0.0120391845703125, 0.00534820556640625, 0.06787109375, 0.018402099609375, -0.0132293701171875, 0.0179901123046875, 0.0192718505859375, -0.04229736328125, -0.0158538818359375, -0.0673828125, -0.0082244873046875, 0.032989501953125, -0.062164306640625, 0.041900634765625, -0.037017822265625, -0.033294677734375, 0.02960205078125, 0.0240325927734375, -0.0743408203125, 0.037017822265625, -0.00609588623046875, 0.074951171875, -0.05010986328125, 0.06005859375, 0.055084228515625, -0.049407958984375, -0.09674072265625, 0.01070404052734375, -0.006259918212890625, -0.04473876953125, 0.051177978515625, 0.021453857421875, 0.0196533203125, 0.00946807861328125, -0.02093505859375, -0.0909423828125, 0.08929443359375, 0.006542205810546875, -0.0517578125, -0.0160675048828125, 0.0169525146484375, 0.05810546875, -0.023834228515625, 0.0457763671875, 0.040863037109375, 0.05511474609375, 0.0029754638671875, -0.079833984375, 0.024871826171875, -0.029876708984375, -0.000766754150390625, 0.035003662109375, -0.0662841796875, 0.07489013671875, -0.0242156982421875, -0.007770538330078125, 0.00653076171875, 0.048095703125, 0.0286712646484375, 0.03143310546875, 0.03546142578125, 0.05914306640625, 0.05859375, -0.0273895263671875, 0.07525634765625, -0.034149169921875, 0.0482177734375, 0.0841064453125, 0.007213592529296875, 0.069580078125, 0.035186767578125, -0.02496337890625, 0.046112060546875, 0.0506591796875, -0.0288238525390625, 0.01751708984375, 0.00983428955078125, 0.0034275054931640625, -0.0182037353515625, 0.007480621337890625, -0.0301666259765625, 0.033233642578125, 0.022216796875, -0.04339599609375, -0.003047943115234375, -0.0158233642578125, 0.0088653564453125, 
-0.0130615234375, -0.0175018310546875, 0.0435791015625, 0.0195770263671875, -0.03985595703125, 0.055206298828125, 0.0004916191101074219, 0.04931640625, -0.029144287109375, 0.00011467933654785156, 0.00021946430206298828, 0.0309600830078125, -0.021148681640625, -0.051177978515625, 0.00472259521484375, -0.0087432861328125, -0.0165252685546875, 0.007434844970703125, 0.047088623046875, -0.0283966064453125, -0.05267333984375, 0.0277557373046875, 0.02685546875, 0.01093292236328125, -0.0127105712890625, -0.06414794921875, -0.006206512451171875, 0.00336456298828125, -0.05224609375, 0.01727294921875, 0.041290283203125, -0.00669097900390625, 0.044647216796875, 0.047515869140625, 0.00751495361328125, -0.006114959716796875, 0.0149383544921875, 0.06591796875, -0.06756591796875, -0.029296875, -0.055419921875, 0.042694091796875, -0.0207672119140625, -0.052642822265625, 0.054351806640625, 0.05474853515625, 0.07208251953125, 0.0022182464599609375, 0.04583740234375, -0.018951416015625, 0.04913330078125, -0.0282440185546875, 0.05633544921875, -0.052978515625, 0.0032367706298828125, -0.0296478271484375, -0.0755615234375, -0.03204345703125, 0.036407470703125, -0.040740966796875, 0.025054931640625, 0.049560546875, 0.06292724609375, -0.01403045654296875, 0.004467010498046875, 0.006072998046875, 0.02532958984375, 0.011260986328125, 0.036865234375, 0.02679443359375, -0.0640869140625, 0.04669189453125, -0.0288238525390625, -0.005779266357421875, -0.01464080810546875, -0.0435791015625, -0.0802001953125, -0.05328369140625, -0.0185546875, -0.04583740234375, -0.0039825439453125, 0.079833984375, 0.042694091796875, -0.07391357421875, -0.00016939640045166016, -0.0031108856201171875, -0.012939453125, -0.01145172119140625, -0.0238037109375, 0.05206298828125, -0.0167694091796875, -0.07220458984375, 0.006072998046875, -0.01035308837890625, 0.0073394775390625, 0.00478363037109375, -0.00800323486328125, -0.0180511474609375, -0.01739501953125, 0.01343536376953125, 0.01416015625, -0.046600341796875, 
-0.01654052734375, 0.01103973388671875, -0.01007843017578125, 0.01201629638671875, 0.034149169921875, -0.0423583984375, 0.007663726806640625, 0.0174407958984375, 0.04083251953125, 0.057342529296875, 0.0002601146697998047, 0.0240325927734375, -0.03741455078125, 0.002582550048828125, 0.01517486572265625, 0.036865234375, 0.0271759033203125, -0.031341552734375, 0.043548583984375, 0.013458251953125, -0.049713134765625, -0.05975341796875, -0.023345947265625, -0.08831787109375, 0.007228851318359375, 0.07952880859375, -0.01540374755859375, -0.03131103515625, 0.0054473876953125, 0.00670623779296875, 0.038726806640625, -0.0487060546875, 0.05706787109375, 0.03631591796875, 0.0006642341613769531, -0.017974853515625, -0.0347900390625, 0.035430908203125, 0.0248260498046875, -0.04730224609375, -0.0196685791015625, 0.00525665283203125, 0.051483154296875, 0.033966064453125, 0.02801513671875, -0.00870513916015625, 0.010345458984375, 0.0005125999450683594, 0.02410888671875, -0.01035308837890625, -0.0191192626953125, -0.0243682861328125, 0.00665283203125, -0.00885009765625, -0.0034770965576171875 ] ]
Salesforce/codegen-350M-mono
2022-10-03T16:18:49.000Z
[ "transformers", "pytorch", "codegen", "text-generation", "arxiv:2203.13474", "license:bsd-3-clause", "endpoints_compatible", "has_space", "region:us" ]
text-generation
Salesforce
null
null
Salesforce/codegen-350M-mono
64
121,675
transformers
2022-04-11T16:18:21
--- license: bsd-3-clause --- # CodeGen (CodeGen-Mono 350M) ## Model description CodeGen is a family of autoregressive language models for **program synthesis** from the paper: [A Conversational Paradigm for Program Synthesis](https://arxiv.org/abs/2203.13474) by Erik Nijkamp, Bo Pang, Hiroaki Hayashi, Lifu Tu, Huan Wang, Yingbo Zhou, Silvio Savarese, Caiming Xiong. The models were originally released in [this repository](https://github.com/salesforce/CodeGen), under 3 pre-training data variants (`NL`, `Multi`, `Mono`) and 4 model size variants (`350M`, `2B`, `6B`, `16B`). The checkpoint included in this repository is denoted as **CodeGen-Mono 350M** in the paper, where "Mono" means the model is initialized with *CodeGen-Multi 350M* and further pre-trained on a Python programming language dataset, and "350M" refers to the number of trainable parameters. ## Training data This checkpoint (CodeGen-Mono 350M) was first initialized with *CodeGen-Multi 350M*, and then pre-trained on the BigPython dataset. The data consists of 71.7B tokens of Python code. See Section 2.1 of the [paper](https://arxiv.org/abs/2203.13474) for more details. ## Training procedure CodeGen was trained using cross-entropy loss to maximize the likelihood of sequential inputs. The family of models is trained on multiple TPU-v4-512 instances by Google, leveraging data and model parallelism. See Section 2.3 of the [paper](https://arxiv.org/abs/2203.13474) for more details. ## Evaluation results We evaluate our models on two code generation benchmarks: HumanEval and MTPB. Please refer to the [paper](https://arxiv.org/abs/2203.13474) for more details. ## Intended Use and Limitations As an autoregressive language model, CodeGen is capable of extracting features from given natural language and programming language texts, and computing their likelihood. 
However, the model is intended for and best at **program synthesis**, that is, generating executable code given English prompts, where the prompts should be in the form of a comment string. The model can complete partially-generated code as well. ## How to use This model can be easily loaded using the `AutoModelForCausalLM` functionality: ```python from transformers import AutoTokenizer, AutoModelForCausalLM tokenizer = AutoTokenizer.from_pretrained("Salesforce/codegen-350M-mono") model = AutoModelForCausalLM.from_pretrained("Salesforce/codegen-350M-mono") text = "def hello_world():" input_ids = tokenizer(text, return_tensors="pt").input_ids generated_ids = model.generate(input_ids, max_length=128) print(tokenizer.decode(generated_ids[0], skip_special_tokens=True)) ``` ## BibTeX entry and citation info ```bibtex @article{Nijkamp2022ACP, title={A Conversational Paradigm for Program Synthesis}, author={Nijkamp, Erik and Pang, Bo and Hayashi, Hiroaki and Tu, Lifu and Wang, Huan and Zhou, Yingbo and Savarese, Silvio and Xiong, Caiming}, journal={arXiv preprint}, year={2022} } ```
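Because prompts are plain comment strings and the model simply continues the text, completions often run past the function of interest. A common post-processing heuristic (used e.g. in HumanEval-style evaluation, and not part of the official CodeGen release) is to truncate the generated continuation at the first top-level stop sequence; a minimal sketch:

```python
# Illustrative post-processing sketch: cut a generated completion at the
# earliest top-level stop sequence so only the synthesized function body
# is kept. The stop sequences below are a common heuristic choice.
STOP_SEQUENCES = ["\ndef ", "\nclass ", "\nif __name__", "\nprint("]

def truncate_completion(completion: str) -> str:
    """Keep only the text before the earliest stop sequence, if any."""
    cut = len(completion)
    for stop in STOP_SEQUENCES:
        idx = completion.find(stop)
        if idx != -1:
            cut = min(cut, idx)
    return completion[:cut]

generated = '    print("Hello World")\n\ndef goodbye():\n    pass\n'
print(truncate_completion(generated))  # keeps only the first function body
```

Applied to the `generate` output above, this keeps the completion of `hello_world` and drops any follow-on functions the model happens to write.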
2,979
[ [ -0.0396728515625, -0.04522705078125, -0.00067138671875, 0.020111083984375, -0.00044918060302734375, 0.022369384765625, -0.0284423828125, -0.0282745361328125, -0.0038051605224609375, 0.0178070068359375, -0.03936767578125, -0.041351318359375, -0.0307159423828125, 0.0095977783203125, -0.0258636474609375, 0.08551025390625, -0.00860595703125, -0.0014743804931640625, -0.0122528076171875, 0.001781463623046875, -0.00960540771484375, -0.056610107421875, -0.0198822021484375, -0.02166748046875, 0.00511932373046875, 0.00899505615234375, 0.035308837890625, 0.065185546875, 0.0232696533203125, 0.021026611328125, -0.0034351348876953125, -0.01515960693359375, -0.0283050537109375, 0.0011320114135742188, 0.01898193359375, -0.035614013671875, -0.04412841796875, 0.00933074951171875, 0.03546142578125, 0.053924560546875, -0.0018768310546875, 0.034698486328125, 0.009552001953125, 0.0282745361328125, -0.027587890625, 0.023284912109375, -0.048675537109375, 0.0131378173828125, 0.006771087646484375, -0.0133209228515625, -0.0301513671875, -0.029998779296875, 0.004039764404296875, -0.0192413330078125, 0.035980224609375, -0.00713348388671875, 0.0721435546875, 0.0123443603515625, -0.00601959228515625, -0.0228118896484375, -0.049560546875, 0.06689453125, -0.081298828125, 0.01415252685546875, 0.0006690025329589844, -0.00525665283203125, -0.018829345703125, -0.057830810546875, -0.030364990234375, -0.019073486328125, 0.006359100341796875, -0.005954742431640625, -0.0216522216796875, 0.0193634033203125, 0.044586181640625, 0.0452880859375, -0.06414794921875, -0.00933074951171875, -0.05572509765625, -0.0345458984375, 0.04638671875, 0.0179290771484375, 0.0224609375, -0.0259246826171875, -0.04473876953125, -0.01007080078125, -0.04998779296875, 0.002361297607421875, 0.0211334228515625, 0.0091705322265625, -0.0305328369140625, 0.0254974365234375, -0.01123046875, 0.07464599609375, -0.0198516845703125, -0.00745391845703125, 0.0396728515625, -0.05316162109375, -0.0307464599609375, -0.006114959716796875, 
0.09112548828125, 0.00295257568359375, 0.0233154296875, -0.0018224716186523438, -0.01023101806640625, 0.01055908203125, -0.01070404052734375, -0.075927734375, -0.0285491943359375, 0.0140838623046875, -0.0574951171875, -0.044708251953125, 0.00620269775390625, -0.0657958984375, 0.01140594482421875, -0.0138397216796875, 0.01534271240234375, -0.034423828125, -0.02264404296875, 0.00966644287109375, -0.01123046875, 0.023895263671875, -0.018035888671875, -0.059417724609375, -0.0008835792541503906, 0.0268707275390625, 0.05224609375, -0.00664520263671875, -0.039459228515625, -0.0151824951171875, -0.00902557373046875, -0.02557373046875, 0.03106689453125, -0.03546142578125, -0.02679443359375, -0.004276275634765625, 0.0016937255859375, -0.01557159423828125, -0.042877197265625, 0.002025604248046875, -0.01461029052734375, 0.023651123046875, 0.01056671142578125, -0.040374755859375, -0.0067596435546875, -0.0030498504638671875, -0.025604248046875, 0.080810546875, 0.01103973388671875, -0.046722412109375, 0.029571533203125, -0.04656982421875, -0.0236358642578125, -0.008209228515625, -0.0219879150390625, -0.016082763671875, 0.003086090087890625, -0.00830841064453125, 0.022125244140625, -0.035552978515625, 0.031402587890625, -0.0304107666015625, -0.0085906982421875, 0.018463134765625, -0.0233612060546875, 0.06842041015625, 0.043060302734375, -0.04095458984375, 0.009368896484375, -0.06884765625, 0.02117919921875, 0.013916015625, -0.03729248046875, -0.005237579345703125, -0.0095367431640625, -0.01557159423828125, 0.041778564453125, 0.028106689453125, -0.032958984375, 0.0296173095703125, -0.0401611328125, 0.048919677734375, 0.0367431640625, -0.0008568763732910156, 0.030975341796875, -0.021759033203125, 0.06549072265625, -0.00237274169921875, 0.0126953125, -0.042022705078125, -0.039459228515625, -0.06756591796875, -0.01447296142578125, 0.0306396484375, 0.042938232421875, -0.04815673828125, 0.051849365234375, -0.01290130615234375, -0.04205322265625, -0.029266357421875, 0.0189208984375, 
0.05364990234375, 0.0234222412109375, 0.010986328125, -0.018829345703125, -0.07196044921875, -0.066162109375, -0.011566162109375, -0.033294677734375, 0.0081787109375, -0.0010290145874023438, 0.0361328125, -0.027008056640625, 0.06854248046875, -0.042144775390625, 0.0122222900390625, -0.0301361083984375, 0.037322998046875, 0.0200347900390625, 0.0562744140625, 0.041717529296875, -0.025970458984375, -0.033447265625, -0.0252532958984375, -0.06072998046875, -0.0171051025390625, -0.0083160400390625, -0.0103607177734375, 0.03662109375, 0.058197021484375, -0.0238189697265625, 0.032989501953125, 0.053741455078125, -0.017730712890625, 0.03741455078125, -0.0142822265625, 0.00023448467254638672, -0.0933837890625, 0.0210113525390625, -0.0111236572265625, -0.00720977783203125, -0.0565185546875, 0.00439453125, -0.00261688232421875, -0.0221710205078125, -0.0266876220703125, 0.03179931640625, -0.047088623046875, 0.0017910003662109375, -0.01526641845703125, -0.0218353271484375, -0.004482269287109375, 0.056365966796875, 0.0005478858947753906, 0.0718994140625, 0.053985595703125, -0.057220458984375, 0.01139068603515625, 0.01039886474609375, -0.0224609375, -0.01410675048828125, -0.058258056640625, 0.01448822021484375, 0.01800537109375, 0.00867462158203125, -0.08148193359375, -0.006359100341796875, 0.01226043701171875, -0.048126220703125, 0.01442718505859375, -0.020721435546875, -0.0523681640625, -0.041900634765625, -0.0080718994140625, 0.044158935546875, 0.06427001953125, -0.017059326171875, 0.027008056640625, 0.032989501953125, -0.0168609619140625, -0.0153045654296875, -0.06951904296875, 0.0087890625, -0.007427215576171875, -0.061492919921875, 0.0130157470703125, -0.03228759765625, -0.0011682510375976562, -0.0140228271484375, 0.0015172958374023438, -0.002056121826171875, -0.010101318359375, 0.00894927978515625, 0.0399169921875, -0.0203094482421875, -0.0017852783203125, -0.014556884765625, -0.0131683349609375, 0.007720947265625, -0.0267486572265625, 0.0665283203125, -0.026123046875, 
0.0024890899658203125, -0.006290435791015625, 0.019775390625, 0.047393798828125, -0.0360107421875, 0.043304443359375, 0.042449951171875, -0.01432037353515625, -0.0235137939453125, -0.029541015625, -0.01126861572265625, -0.03533935546875, 0.0250396728515625, -0.02459716796875, -0.0506591796875, 0.04962158203125, 0.01541900634765625, 0.00734710693359375, 0.03173828125, 0.057281494140625, 0.032470703125, 0.0870361328125, 0.045989990234375, -0.00678253173828125, 0.040557861328125, -0.039093017578125, 0.0167236328125, -0.03753662109375, -0.00890350341796875, -0.04205322265625, 0.0034580230712890625, -0.0517578125, -0.037567138671875, 0.0188446044921875, -0.00811767578125, -0.00865936279296875, 0.053375244140625, -0.052764892578125, 0.00856781005859375, 0.049896240234375, -0.019989013671875, 0.00672149658203125, 0.0036411285400390625, -0.0219268798828125, 0.004604339599609375, -0.07476806640625, -0.038421630859375, 0.0870361328125, 0.026214599609375, 0.0767822265625, -0.0174560546875, 0.07568359375, -0.0019369125366210938, 0.0033969879150390625, -0.042816162109375, 0.05584716796875, -0.004283905029296875, -0.0587158203125, 0.007080078125, -0.05615234375, -0.0655517578125, 0.0006079673767089844, 0.001987457275390625, -0.044036865234375, 0.0243072509765625, 0.02020263671875, -0.0259552001953125, 0.0146484375, -0.072265625, 0.08294677734375, -0.01247406005859375, -0.00734710693359375, 0.004970550537109375, -0.05230712890625, 0.0400390625, -0.0028209686279296875, 0.0229644775390625, -0.005672454833984375, 0.01422882080078125, 0.0631103515625, -0.031768798828125, 0.05279541015625, -0.0094146728515625, -0.01197052001953125, 0.024169921875, 0.0113067626953125, 0.0181884765625, -0.0118255615234375, -0.006855010986328125, 0.02862548828125, 0.0184326171875, -0.0228729248046875, -0.0091400146484375, 0.0479736328125, -0.044525146484375, -0.03497314453125, -0.0287933349609375, -0.049957275390625, -0.001308441162109375, 0.0291595458984375, 0.046051025390625, 0.054229736328125, 
-0.0148468017578125, 0.0159454345703125, 0.050445556640625, -0.048797607421875, 0.04833984375, 0.0284881591796875, -0.031707763671875, -0.031646728515625, 0.07171630859375, 0.0013322830200195312, 0.048492431640625, 0.00539398193359375, 0.001132965087890625, 0.007343292236328125, -0.02325439453125, -0.025604248046875, 0.0106964111328125, -0.047454833984375, -0.017791748046875, -0.0523681640625, -0.034515380859375, -0.053131103515625, 0.003570556640625, -0.0302734375, -0.024810791015625, -0.0173797607421875, 0.007045745849609375, 0.0021457672119140625, 0.031829833984375, -0.011505126953125, -0.006435394287109375, -0.062744140625, 0.034576416015625, 0.00545501708984375, 0.029296875, -0.00995635986328125, -0.04180908203125, -0.0251617431640625, 0.0143280029296875, -0.01222991943359375, -0.058685302734375, 0.038665771484375, 0.0177764892578125, 0.0452880859375, 0.01171112060546875, 0.0030727386474609375, 0.040435791015625, -0.0243072509765625, 0.060821533203125, 0.0131378173828125, -0.07391357421875, 0.04412841796875, -0.0191497802734375, 0.0498046875, 0.03900146484375, 0.002063751220703125, -0.0305328369140625, -0.051971435546875, -0.054534912109375, -0.06365966796875, 0.07415771484375, 0.0222930908203125, 0.02020263671875, -0.00655364990234375, 0.0023326873779296875, 0.006683349609375, 0.0215301513671875, -0.06683349609375, -0.04217529296875, -0.0217437744140625, -0.0195465087890625, -0.0009813308715820312, -0.0238037109375, -0.0011539459228515625, -0.0307159423828125, 0.055206298828125, 0.0013523101806640625, 0.049041748046875, 0.017578125, -0.01450347900390625, -0.0077972412109375, 0.01461029052734375, 0.049957275390625, 0.047454833984375, -0.0236053466796875, -0.0030384063720703125, -0.0166168212890625, -0.0197296142578125, 0.007720947265625, 0.0249786376953125, -0.013214111328125, 0.019195556640625, 0.0184783935546875, 0.0721435546875, 0.002391815185546875, -0.051666259765625, 0.03826904296875, -0.007450103759765625, -0.036956787109375, -0.031585693359375, 
0.0243377685546875, -0.0052947998046875, -0.0002532005310058594, 0.0259552001953125, 0.01404571533203125, 0.006877899169921875, -0.042877197265625, 0.0220947265625, 0.023468017578125, -0.01776123046875, -0.0168609619140625, 0.062164306640625, 0.00904083251953125, -0.01091766357421875, 0.0247039794921875, -0.0301513671875, -0.043243408203125, 0.0673828125, 0.03662109375, 0.08587646484375, -0.0086822509765625, 0.00957489013671875, 0.0487060546875, 0.02947998046875, 0.0020751953125, 0.026611328125, -0.003093719482421875, -0.040557861328125, -0.015533447265625, -0.053924560546875, 0.00696563720703125, 0.01303863525390625, -0.0440673828125, 0.0240478515625, -0.033538818359375, -0.01270294189453125, 0.0017185211181640625, -0.0213165283203125, -0.046142578125, 0.00435638427734375, 0.0289154052734375, 0.073974609375, -0.05841064453125, 0.076171875, 0.05322265625, -0.05322265625, -0.067138671875, 0.0096588134765625, -0.006023406982421875, -0.05157470703125, 0.04669189453125, 0.007110595703125, 0.0073394775390625, 0.0183868408203125, -0.05364990234375, -0.06585693359375, 0.0654296875, 0.0244293212890625, -0.0269317626953125, -0.01473236083984375, 0.01275634765625, 0.03997802734375, -0.039764404296875, 0.0011043548583984375, 0.040374755859375, 0.01387786865234375, -0.0025386810302734375, -0.06866455078125, 0.0088653564453125, -0.04296875, 0.0196685791015625, 0.006374359130859375, -0.0452880859375, 0.07867431640625, -0.008270263671875, -0.009124755859375, 0.02569580078125, 0.042816162109375, 0.026397705078125, 0.00024819374084472656, 0.02874755859375, 0.0189666748046875, 0.0254364013671875, -0.00774383544921875, 0.07135009765625, -0.07568359375, 0.043426513671875, 0.0704345703125, 0.0025463104248046875, 0.0289459228515625, 0.0304412841796875, -0.0132293701171875, 0.0511474609375, 0.0382080078125, -0.0261993408203125, 0.03179931640625, 0.0235443115234375, -0.00812530517578125, -0.0177764892578125, 0.0196533203125, -0.049957275390625, 0.0272216796875, 0.0196685791015625, 
-0.058502197265625, -0.01085662841796875, 0.008087158203125, 0.013031005859375, -0.02178955078125, 0.00952911376953125, 0.056640625, 0.005641937255859375, -0.07855224609375, 0.09112548828125, 0.0080718994140625, 0.055877685546875, -0.056793212890625, 0.0029659271240234375, -0.0298004150390625, 0.015594482421875, -0.007602691650390625, -0.02337646484375, 0.00930023193359375, 0.0140228271484375, -0.0218963623046875, -0.0166473388671875, 0.042083740234375, -0.046356201171875, -0.03875732421875, 0.0216522216796875, 0.0233154296875, 0.010833740234375, 0.022125244140625, -0.06951904296875, 0.01216888427734375, 0.026123046875, -0.0191802978515625, 0.00923919677734375, 0.01544952392578125, 0.0193023681640625, 0.042938232421875, 0.045562744140625, 0.0044708251953125, 0.033203125, -0.007415771484375, 0.059478759765625, -0.051910400390625, -0.035736083984375, -0.06219482421875, 0.03662109375, 0.01019287109375, -0.0287933349609375, 0.062286376953125, 0.061676025390625, 0.06475830078125, -0.0238494873046875, 0.08935546875, -0.046539306640625, 0.0218963623046875, -0.032318115234375, 0.033599853515625, -0.043121337890625, 0.027130126953125, -0.03204345703125, -0.06683349609375, -0.00736236572265625, 0.03668212890625, -0.01451873779296875, 0.020416259765625, 0.0528564453125, 0.08465576171875, 0.0014486312866210938, -0.0170745849609375, 0.021636962890625, 0.025604248046875, 0.0294647216796875, 0.05767822265625, 0.05291748046875, -0.0450439453125, 0.06671142578125, -0.025115966796875, -0.0230560302734375, -0.01092529296875, -0.051971435546875, -0.0689697265625, -0.04510498046875, -0.03521728515625, -0.04638671875, -0.00955963134765625, 0.08380126953125, 0.061431884765625, -0.0635986328125, -0.0435791015625, -0.033905029296875, -0.01360321044921875, 0.002727508544921875, -0.0219268798828125, 0.0024509429931640625, -0.056976318359375, -0.056915283203125, 0.0218353271484375, 0.0106353759765625, 0.00714111328125, -0.0323486328125, -0.01221466064453125, -0.021759033203125, 
0.00992584228515625, 0.021636962890625, 0.0374755859375, -0.04046630859375, 0.0073394775390625, 0.00930023193359375, -0.03594970703125, 0.0192413330078125, 0.0634765625, -0.048248291015625, 0.0479736328125, 0.06109619140625, 0.019378662109375, 0.031646728515625, -0.01338958740234375, 0.06890869140625, -0.052581787109375, 0.0311431884765625, -0.0019588470458984375, 0.035888671875, 0.01009368896484375, -0.005035400390625, 0.031280517578125, 0.0340576171875, -0.04327392578125, -0.06781005859375, 0.017822265625, -0.052947998046875, -0.032562255859375, 0.117919921875, -0.01302337646484375, -0.01419830322265625, 0.001068115234375, -0.01296234130859375, 0.031524658203125, -0.01554107666015625, 0.046844482421875, 0.043701171875, 0.006778717041015625, -0.0067291259765625, -0.04412841796875, 0.048095703125, 0.026824951171875, -0.056610107421875, 0.01535797119140625, 0.036102294921875, 0.0286712646484375, 0.01338958740234375, 0.047210693359375, -0.02423095703125, 0.024749755859375, 0.006816864013671875, 0.055816650390625, -0.00920867919921875, -0.00843048095703125, -0.034271240234375, 0.0012331008911132812, 0.006988525390625, -0.0179443359375 ] ]
dreamlike-art/dreamlike-photoreal-2.0
2023-03-13T01:05:06.000Z
[ "diffusers", "stable-diffusion", "stable-diffusion-diffusers", "text-to-image", "photorealistic", "photoreal", "en", "license:other", "has_space", "diffusers:StableDiffusionPipeline", "region:us" ]
text-to-image
dreamlike-art
null
null
dreamlike-art/dreamlike-photoreal-2.0
1,548
121,377
diffusers
2023-01-04T03:01:40
--- language: - en license: other tags: - stable-diffusion - stable-diffusion-diffusers - text-to-image - photorealistic - photoreal - diffusers inference: false --- # Dreamlike Photoreal 2.0 is a photorealistic model based on Stable Diffusion 1.5, made by [dreamlike.art](https://dreamlike.art/). # If you want to use dreamlike models on your website/app/etc., check the license at the bottom first! Warning: This model is horny! Add "nude, naked" to the negative prompt if you want to avoid NSFW content. You can add **photo** to your prompt to make your gens look more photorealistic. Non-square aspect ratios work better for some prompts. If you want a portrait photo, try using a vertical aspect ratio. If you want a landscape photo, try using a horizontal aspect ratio. This model was trained on 768x768px images, so use 768x768px, 640x896px, 896x640px, etc. It also works pretty well at higher resolutions such as 768x1024px or 1024x768px. ### Examples <img src="https://huggingface.co/dreamlike-art/dreamlike-photoreal-2.0/resolve/main/preview1.jpg" style="max-width: 800px;" width="100%"/> <img src="https://huggingface.co/dreamlike-art/dreamlike-photoreal-2.0/resolve/main/preview2.jpg" style="max-width: 800px;" width="100%"/> <img src="https://huggingface.co/dreamlike-art/dreamlike-photoreal-2.0/resolve/main/preview3.jpg" style="max-width: 800px;" width="100%"/> ### dreamlike.art You can use this model for free on [dreamlike.art](https://dreamlike.art/)! 
<img src="https://huggingface.co/dreamlike-art/dreamlike-photoreal-2.0/resolve/main/dreamlike.jpg" style="max-width: 1000px;" width="100%"/> ### CKPT [Download dreamlike-photoreal-2.0.ckpt (2.13GB)](https://huggingface.co/dreamlike-art/dreamlike-photoreal-2.0/resolve/main/dreamlike-photoreal-2.0.ckpt) ### Safetensors [Download dreamlike-photoreal-2.0.safetensors (2.13GB)](https://huggingface.co/dreamlike-art/dreamlike-photoreal-2.0/resolve/main/dreamlike-photoreal-2.0.safetensors) ### 🧨 Diffusers This model can be used just like any other Stable Diffusion model. For more information, please have a look at the [Stable Diffusion Pipeline](https://huggingface.co/docs/diffusers/api/pipelines/stable_diffusion). ```python from diffusers import StableDiffusionPipeline import torch model_id = "dreamlike-art/dreamlike-photoreal-2.0" pipe = StableDiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.float16) pipe = pipe.to("cuda") prompt = "photo, a church in the middle of a field of crops, bright cinematic lighting, gopro, fisheye lens" image = pipe(prompt).images[0] image.save("./result.jpg") ``` <img src="https://huggingface.co/dreamlike-art/dreamlike-photoreal-2.0/resolve/main/church.jpg" style="max-width: 640px;" width="100%"/> # License This model is licensed under a **modified** CreativeML OpenRAIL-M license. - **You are not allowed to host, finetune, or do inference with the model or its derivatives on websites/apps/etc. If you want to, please email us at contact@dreamlike.art** - **You are free to host the model card and files (Without any actual inference or finetuning) on both commercial and non-commercial websites/apps/etc. 
Please state the full model name (Dreamlike Photoreal 2.0) and include the license as well as a link to the model card (https://huggingface.co/dreamlike-art/dreamlike-photoreal-2.0)** - **You are free to use the outputs (images) of the model for commercial purposes in teams of 10 or less** - You can't use the model to deliberately produce or share illegal or harmful outputs or content - The authors claim no rights over the outputs you generate; you are free to use them and are accountable for their use, which must not go against the provisions set in the license - You may re-distribute the weights. If you do, please be aware you have to include the same use restrictions as the ones in the license and share a copy of the **modified** CreativeML OpenRAIL-M to all your users (please read the license entirely and carefully) Please read the full license here: https://huggingface.co/dreamlike-art/dreamlike-photoreal-2.0/blob/main/LICENSE.md
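The resolution advice earlier in this card (dimensions that are multiples of 64, roughly matching the 768x768 training pixel budget) can be turned into a small helper for picking `width`/`height` arguments. This is an illustrative utility under those assumptions, not part of the model or the diffusers library:

```python
import math

# Illustrative helper: pick a (width, height) pair that matches a desired
# aspect ratio while staying on multiples of 64 and staying close to the
# 768x768 pixel budget the card recommends.
def snap_resolution(aspect_ratio: float, target_pixels: int = 768 * 768, step: int = 64):
    """Return (width, height) rounded to multiples of `step`."""
    height = math.sqrt(target_pixels / aspect_ratio)
    width = height * aspect_ratio
    snap = lambda v: max(step, round(v / step) * step)
    return snap(width), snap(height)

print(snap_resolution(1.0))        # square    -> (768, 768)
print(snap_resolution(896 / 640))  # landscape -> (896, 640)
print(snap_resolution(640 / 896))  # portrait  -> (640, 896)
```

The returned pair can be passed as `pipe(prompt, width=w, height=h)` when using the diffusers example above.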
4,112
[ [ -0.030670166015625, -0.0521240234375, 0.0186309814453125, 0.0305023193359375, -0.03009033203125, -0.0160064697265625, 0.0026340484619140625, -0.0560302734375, 0.0299530029296875, 0.039459228515625, -0.040985107421875, -0.04248046875, -0.0304412841796875, -0.017822265625, -0.0135955810546875, 0.06524658203125, -0.0194549560546875, -0.01171112060546875, -0.026153564453125, 0.002124786376953125, -0.01303863525390625, 0.0023899078369140625, -0.0565185546875, -0.0145416259765625, 0.031585693359375, 0.004974365234375, 0.0604248046875, 0.0190582275390625, 0.0299224853515625, 0.0227508544921875, -0.007106781005859375, -0.01154327392578125, -0.03729248046875, -0.00043773651123046875, -0.00013744831085205078, -0.03961181640625, -0.07293701171875, 0.0256500244140625, 0.0217742919921875, 0.00592041015625, -0.022796630859375, 0.0179595947265625, 0.006206512451171875, 0.05096435546875, -0.00868988037109375, 0.0151824951171875, -0.007678985595703125, 0.01593017578125, -0.01045989990234375, 0.012481689453125, -0.000606536865234375, -0.04046630859375, -0.0013875961303710938, -0.061004638671875, 0.0235443115234375, 0.01117706298828125, 0.0836181640625, 0.000012576580047607422, -0.019683837890625, 0.0141143798828125, -0.044921875, 0.033355712890625, -0.04705810546875, 0.027191162109375, 0.01464080810546875, 0.034515380859375, -0.009246826171875, -0.05810546875, -0.03204345703125, -0.0007810592651367188, 0.0051727294921875, 0.03399658203125, -0.034576416015625, 0.01318359375, 0.0296173095703125, 0.03192138671875, -0.06744384765625, -0.018890380859375, -0.041473388671875, -0.0080108642578125, 0.0592041015625, -0.003864288330078125, 0.039337158203125, -0.0202178955078125, -0.03582763671875, -0.004180908203125, -0.03375244140625, 0.0136566162109375, 0.0294036865234375, 0.006103515625, -0.07037353515625, 0.048095703125, 0.0093994140625, 0.039794921875, 0.0084228515625, -0.00634765625, 0.0204925537109375, 0.002643585205078125, -0.017242431640625, -0.0350341796875, 0.055694580078125, 
0.0792236328125, 0.00798797607421875, -0.0084228515625, -0.0199432373046875, 0.01263427734375, -0.00177764892578125, -0.09197998046875, -0.03533935546875, 0.03741455078125, -0.055694580078125, -0.0360107421875, -0.021453857421875, -0.06317138671875, -0.01433563232421875, -0.0023937225341796875, 0.017822265625, -0.032440185546875, -0.065185546875, 0.00782012939453125, -0.024169921875, 0.01531982421875, 0.0330810546875, -0.038818359375, 0.01708984375, 0.0181427001953125, 0.08160400390625, 0.007061004638671875, 0.0176239013671875, 0.0107421875, 0.00655364990234375, -0.0285491943359375, 0.057525634765625, -0.0141754150390625, -0.039459228515625, -0.01290130615234375, 0.0145416259765625, 0.0038280487060546875, -0.038177490234375, 0.03558349609375, -0.044219970703125, 0.0227508544921875, -0.020599365234375, -0.04864501953125, -0.02166748046875, 0.004322052001953125, -0.04412841796875, 0.0487060546875, 0.0350341796875, -0.055084228515625, 0.02239990234375, -0.04998779296875, 0.005741119384765625, 0.00997161865234375, 0.0020160675048828125, -0.0262298583984375, 0.01290130615234375, -0.0207672119140625, 0.0180816650390625, 0.003353118896484375, -0.0088348388671875, -0.0401611328125, -0.01385498046875, -0.01629638671875, 0.0035915374755859375, 0.08074951171875, 0.03533935546875, -0.00763702392578125, 0.007343292236328125, -0.03289794921875, 0.0044708251953125, 0.04473876953125, 0.0018396377563476562, -0.01220703125, -0.023284912109375, 0.04791259765625, 0.037109375, 0.0081939697265625, -0.0506591796875, 0.0255889892578125, -0.043914794921875, 0.005054473876953125, 0.039581298828125, 0.0013227462768554688, 0.02886962890625, -0.04791259765625, 0.0595703125, 0.017913818359375, 0.022308349609375, 0.0313720703125, -0.04742431640625, -0.046051025390625, -0.0198211669921875, 0.011749267578125, 0.023284912109375, -0.0506591796875, 0.0021305084228515625, 0.00634002685546875, -0.060089111328125, -0.029632568359375, -0.0012760162353515625, 0.0296173095703125, 0.021484375, 
0.007904052734375, -0.032257080078125, -0.03216552734375, -0.070556640625, -0.005519866943359375, 0.001979827880859375, -0.0009474754333496094, 0.0305023193359375, 0.0400390625, -0.0014553070068359375, 0.0504150390625, -0.032745361328125, -0.0248565673828125, -0.0172882080078125, -0.02313232421875, 0.0247802734375, 0.06341552734375, 0.0931396484375, -0.0667724609375, -0.0433349609375, -0.0233154296875, -0.07330322265625, 0.0017347335815429688, 0.01006317138671875, -0.030548095703125, 0.016204833984375, 0.001140594482421875, -0.08056640625, 0.043731689453125, 0.058868408203125, -0.058929443359375, 0.048675537109375, -0.023284912109375, 0.01282501220703125, -0.0972900390625, 0.005207061767578125, 0.034698486328125, -0.039459228515625, -0.032440185546875, 0.03753662109375, -0.0181427001953125, -0.00838470458984375, -0.049530029296875, 0.07073974609375, -0.03369140625, 0.01287078857421875, -0.0306243896484375, 0.00823211669921875, 0.005115509033203125, 0.0287628173828125, 0.005306243896484375, 0.03326416015625, 0.061248779296875, -0.03179931640625, 0.042327880859375, 0.031402587890625, -0.02252197265625, 0.058868408203125, -0.06396484375, 0.009765625, -0.0239105224609375, 0.032928466796875, -0.059356689453125, -0.0357666015625, 0.045196533203125, -0.034881591796875, 0.016357421875, -0.023681640625, -0.0300445556640625, -0.04290771484375, -0.00624847412109375, 0.035003662109375, 0.07440185546875, -0.0204620361328125, 0.04315185546875, 0.02117919921875, 0.01251220703125, -0.006092071533203125, -0.043792724609375, -0.006053924560546875, -0.024322509765625, -0.06451416015625, 0.042022705078125, -0.010467529296875, -0.0248565673828125, 0.0249786376953125, 0.004947662353515625, 0.01190948486328125, -0.02545166015625, 0.04669189453125, 0.0272369384765625, 0.0017986297607421875, -0.02899169921875, 0.0173797607421875, -0.0002262592315673828, -0.0009813308715820312, -0.0224151611328125, 0.034942626953125, -0.0173187255859375, -0.00514984130859375, -0.06512451171875, 
0.025146484375, 0.040985107421875, 0.00803375244140625, 0.06536865234375, 0.042144775390625, -0.056854248046875, -0.01052093505859375, -0.03741455078125, -0.0139923095703125, -0.03594970703125, -0.0005049705505371094, -0.040374755859375, -0.0423583984375, 0.0499267578125, 0.007793426513671875, 0.03326416015625, 0.045379638671875, 0.036163330078125, -0.030120849609375, 0.07659912109375, 0.0540771484375, 0.03448486328125, 0.0293121337890625, -0.06317138671875, -0.0249176025390625, -0.05712890625, -0.044586181640625, 0.0032176971435546875, -0.049835205078125, -0.0215911865234375, -0.05029296875, 0.0196075439453125, 0.0200958251953125, -0.018798828125, 0.041046142578125, -0.02801513671875, 0.021453857421875, 0.0131988525390625, 0.0340576171875, 0.0132598876953125, 0.0084228515625, -0.023345947265625, -0.0126190185546875, -0.032562255859375, -0.02557373046875, 0.055755615234375, 0.019683837890625, 0.04974365234375, 0.015655517578125, 0.0390625, 0.01227569580078125, 0.0252838134765625, -0.04278564453125, 0.045989990234375, 0.0047607421875, -0.06414794921875, 0.02581787109375, -0.021514892578125, -0.052001953125, 0.0178680419921875, -0.0247650146484375, -0.0540771484375, 0.01322174072265625, 0.036651611328125, -0.03302001953125, 0.048248291015625, -0.050079345703125, 0.06396484375, -0.0012655258178710938, -0.043609619140625, -0.0136566162109375, -0.03558349609375, 0.029144287109375, 0.0232696533203125, 0.008514404296875, -0.0248565673828125, -0.0072784423828125, 0.051849365234375, -0.043182373046875, 0.05975341796875, -0.054718017578125, 0.00003600120544433594, 0.029144287109375, 0.029876708984375, 0.0308074951171875, 0.0063323974609375, -0.01096343994140625, 0.01108551025390625, 0.0208282470703125, -0.046173095703125, -0.03924560546875, 0.054595947265625, -0.061431884765625, -0.0350341796875, -0.024200439453125, -0.042144775390625, 0.000019848346710205078, 0.01558685302734375, 0.055938720703125, 0.0305328369140625, -0.0218353271484375, -0.00424957275390625, 
0.04132080078125, -0.00849151611328125, 0.033172607421875, 0.0125885009765625, -0.062286376953125, -0.021240234375, 0.04681396484375, -0.001190185546875, 0.020538330078125, -0.0088348388671875, 0.018768310546875, -0.027191162109375, -0.035400390625, -0.04107666015625, 0.0340576171875, -0.0218048095703125, -0.021636962890625, -0.049652099609375, -0.0222930908203125, -0.0280914306640625, -0.0181121826171875, -0.04254150390625, -0.03778076171875, -0.0298309326171875, -0.0074615478515625, 0.055023193359375, 0.039337158203125, -0.0186767578125, 0.01279449462890625, -0.03912353515625, 0.0225982666015625, 0.01024627685546875, 0.0504150390625, -0.0059661865234375, -0.048919677734375, 0.007183074951171875, 0.010009765625, -0.0145416259765625, -0.06109619140625, 0.0401611328125, 0.0031719207763671875, 0.0178375244140625, 0.03277587890625, -0.0034809112548828125, 0.04742431640625, -0.0274505615234375, 0.053680419921875, 0.040679931640625, -0.033355712890625, 0.05596923828125, -0.05810546875, 0.0015001296997070312, 0.0199737548828125, 0.026824951171875, -0.03729248046875, -0.0246429443359375, -0.062255859375, -0.04046630859375, 0.039276123046875, 0.03900146484375, 0.026885986328125, 0.035003662109375, 0.0633544921875, -0.0008249282836914062, 0.01611328125, -0.06903076171875, -0.0259552001953125, -0.024505615234375, -0.001338958740234375, 0.0111846923828125, -0.01154327392578125, -0.0238189697265625, -0.031036376953125, 0.05810546875, 0.0104522705078125, 0.03936767578125, 0.017181396484375, 0.0234375, -0.0350341796875, -0.0279388427734375, 0.01708984375, 0.034759521484375, -0.025390625, -0.0181884765625, -0.01331329345703125, -0.0211181640625, 0.0028514862060546875, 0.01529693603515625, -0.03076171875, -0.0004973411560058594, -0.01464080810546875, 0.05670166015625, -0.01171875, -0.0272216796875, 0.049041748046875, -0.0055999755859375, -0.039825439453125, -0.046905517578125, 0.01611328125, 0.0268096923828125, 0.05609130859375, -0.01552581787109375, 0.05596923828125, 
0.0165863037109375, -0.01314544677734375, 0.008270263671875, 0.05340576171875, -0.036712646484375, -0.047576904296875, 0.0985107421875, 0.0093536376953125, -0.0194244384765625, 0.029815673828125, -0.036407470703125, -0.00799560546875, 0.049652099609375, 0.058563232421875, 0.059906005859375, -0.020660400390625, 0.038421630859375, 0.041168212890625, 0.001689910888671875, -0.00514984130859375, 0.0509033203125, 0.0234375, -0.0450439453125, 0.0011806488037109375, -0.054107666015625, -0.00658416748046875, 0.00308990478515625, -0.036407470703125, 0.038604736328125, -0.056304931640625, -0.0231170654296875, -0.012115478515625, -0.00022864341735839844, -0.050811767578125, 0.0258941650390625, 0.0264434814453125, 0.09539794921875, -0.05474853515625, 0.0567626953125, 0.048187255859375, -0.041656494140625, -0.06939697265625, -0.01552581787109375, 0.031585693359375, -0.04779052734375, 0.0030460357666015625, 0.0092926025390625, -0.004802703857421875, 0.01232147216796875, -0.05010986328125, -0.052825927734375, 0.083984375, 0.034942626953125, -0.0340576171875, -0.0335693359375, -0.0341796875, 0.046051025390625, -0.03582763671875, 0.016693115234375, 0.011688232421875, 0.0120849609375, 0.0482177734375, -0.050537109375, 0.0174560546875, -0.04119873046875, 0.031524658203125, -0.010162353515625, -0.08636474609375, 0.044281005859375, -0.039642333984375, -0.0166778564453125, 0.042694091796875, 0.06134033203125, 0.025299072265625, 0.0196380615234375, 0.042633056640625, 0.0638427734375, 0.0352783203125, -0.013092041015625, 0.090087890625, -0.0137481689453125, 0.045745849609375, 0.045318603515625, -0.0028896331787109375, 0.052978515625, 0.0100860595703125, -0.0164031982421875, 0.0501708984375, 0.06634521484375, -0.01134490966796875, 0.028350830078125, 0.0216827392578125, -0.03326416015625, -0.0177459716796875, -0.015899658203125, -0.04705810546875, 0.0216522216796875, 0.01177215576171875, -0.0316162109375, -0.01055145263671875, 0.018646240234375, 0.003353118896484375, -0.0132904052734375, 
-0.0117034912109375, 0.0297698974609375, 0.020721435546875, -0.01171875, 0.032623291015625, 0.0012140274047851562, 0.049530029296875, -0.034423828125, 0.0004067420959472656, -0.0254058837890625, 0.0033512115478515625, -0.034088134765625, -0.0496826171875, 0.021636962890625, 0.0021648406982421875, 0.007610321044921875, -0.024993896484375, 0.058624267578125, -0.02667236328125, -0.053375244140625, 0.0295257568359375, 0.038604736328125, 0.031768798828125, -0.00128936767578125, -0.08544921875, 0.0157470703125, -0.00807952880859375, -0.0401611328125, 0.01210784912109375, -0.008544921875, 0.032012939453125, 0.053985595703125, 0.022369384765625, 0.01702880859375, -0.0226287841796875, -0.00392913818359375, 0.06768798828125, -0.03594970703125, -0.03167724609375, -0.034271240234375, 0.05322265625, -0.01349639892578125, -0.01071929931640625, 0.0509033203125, 0.059234619140625, 0.053253173828125, -0.0191497802734375, 0.0562744140625, -0.03936767578125, 0.025238037109375, -0.0188446044921875, 0.07208251953125, -0.084228515625, -0.007579803466796875, -0.0494384765625, -0.08404541015625, -0.0188751220703125, 0.05059814453125, 0.00368499755859375, 0.0258941650390625, 0.0028057098388671875, 0.06927490234375, -0.01922607421875, 0.004955291748046875, 0.016693115234375, 0.01386260986328125, 0.01079559326171875, 0.0390625, 0.049163818359375, -0.0450439453125, 0.0177764892578125, -0.04351806640625, -0.03924560546875, -0.0033740997314453125, -0.053070068359375, -0.0677490234375, -0.042724609375, -0.053985595703125, -0.0667724609375, -0.0006561279296875, 0.08160400390625, 0.07257080078125, -0.046478271484375, -0.01468658447265625, -0.00968170166015625, -0.00948333740234375, -0.0084991455078125, -0.018646240234375, 0.01148223876953125, 0.042205810546875, -0.07891845703125, 0.0014753341674804688, 0.010528564453125, 0.05975341796875, -0.003971099853515625, -0.0048065185546875, 0.00888824462890625, -0.01552581787109375, 0.041015625, 0.022918701171875, -0.05426025390625, -0.016754150390625, 
-0.00893402099609375, -0.0035800933837890625, 0.0003647804260253906, 0.0227813720703125, -0.0404052734375, 0.0264129638671875, 0.0172882080078125, -0.006229400634765625, 0.04302978515625, 0.006435394287109375, 0.023834228515625, -0.035308837890625, 0.0262298583984375, 0.019805908203125, 0.02960205078125, 0.00634765625, -0.0322265625, 0.03790283203125, 0.0538330078125, -0.037567138671875, -0.0357666015625, 0.0164947509765625, -0.10443115234375, -0.0130157470703125, 0.08197021484375, -0.009765625, -0.004180908203125, 0.0221099853515625, -0.0263671875, 0.01129913330078125, -0.0255584716796875, 0.02703857421875, 0.0274505615234375, -0.03851318359375, -0.03759765625, -0.05572509765625, 0.052154541015625, 0.00510406494140625, -0.05108642578125, -0.01149749755859375, 0.0616455078125, 0.0445556640625, 0.00930023193359375, 0.05419921875, -0.032989501953125, 0.01360321044921875, 0.015899658203125, 0.01480865478515625, -0.004638671875, -0.010406494140625, -0.01036834716796875, 0.0122833251953125, -0.017608642578125, -0.0122833251953125 ] ]
biu-nlp/f-coref
2022-11-28T11:35:52.000Z
[ "transformers", "pytorch", "roberta", "fast", "coreference-resolution", "en", "dataset:multi_news", "dataset:ontonotes", "arxiv:2209.04280", "arxiv:2205.12644", "arxiv:1907.10529", "arxiv:2101.00434", "arxiv:2109.04127", "license:mit", "model-index", "endpoints_compatible", "region:us" ]
null
biu-nlp
null
null
biu-nlp/f-coref
6
120,914
transformers
2022-08-19T12:01:10
---
language:
- en
tags:
- fast
- coreference-resolution
license: mit
datasets:
- multi_news
- ontonotes
metrics:
- CoNLL
task_categories:
- coreference-resolution
model-index:
- name: biu-nlp/f-coref
  results:
  - task:
      type: coreference-resolution
      name: coreference-resolution
    dataset:
      name: ontonotes
      type: coreference
    metrics:
    - name: Avg. F1
      type: CoNLL
      value: 78.5
---

## F-Coref: Fast, Accurate and Easy to Use Coreference Resolution

[F-Coref](https://arxiv.org/abs/2209.04280) processes 2.8K OntoNotes documents in 25 seconds on a V100 GPU (compared to 6 minutes for the [LingMess](https://arxiv.org/abs/2205.12644) model, and 12 minutes for the popular AllenNLP coreference model), with only a modest drop in accuracy. The fast speed is achieved through a combination of distillation of a compact model from the LingMess model and an efficient batching implementation using a technique we call leftover batching.

Please check the [official repository](https://github.com/shon-otmazgin/fastcoref) for more details and updates.

#### Experiments

| Model                                                      | Runtime | Memory |
|------------------------------------------------------------|---------|--------|
| [Joshi et al. (2020)](https://arxiv.org/abs/1907.10529)    | 12:06   | 27.4   |
| [Otmazgin et al. (2022)](https://arxiv.org/abs/2205.12644) | 06:43   | 4.6    |
| + Batching                                                 | 06:00   | 6.6    |
| [Kirstain et al. (2021)](https://arxiv.org/abs/2101.00434) | 04:37   | 4.4    |
| [Dobrovolskii (2021)](https://arxiv.org/abs/2109.04127)    | 03:49   | 3.5    |
| [F-Coref](https://arxiv.org/abs/2209.04280)                | 00:45   | 3.3    |
| + Batching                                                 | 00:35   | 4.5    |
| + Leftovers batching                                       | 00:25   | 4.0    |

The inference time (Min:Sec) and memory (GiB) for each model on 2.8K documents, averaged over 3 runs. Hardware: NVIDIA Tesla V100 SXM2.
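The Min:Sec runtimes in the table can be converted into speedups relative to the slowest baseline with a short stdlib-only sketch (figures taken from the table; the helper name is illustrative):

```python
# Convert "MM:SS" runtimes from the experiments table into speedups
# relative to the slowest baseline (Joshi et al., 2020).

def to_seconds(mmss: str) -> int:
    """Parse an 'MM:SS' runtime string into total seconds."""
    minutes, seconds = mmss.split(":")
    return int(minutes) * 60 + int(seconds)

runtimes = {
    "Joshi et al. (2020)": "12:06",
    "LingMess (Otmazgin et al., 2022)": "06:43",
    "F-Coref": "00:45",
    "F-Coref + leftovers batching": "00:25",
}

baseline = to_seconds(runtimes["Joshi et al. (2020)"])
for model, mmss in runtimes.items():
    speedup = baseline / to_seconds(mmss)
    print(f"{model}: {speedup:.1f}x")
```

With leftovers batching, F-Coref comes out roughly 29x faster than the Joshi et al. baseline on the same 2.8K documents.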
### Citation

```
@inproceedings{Otmazgin2022FcorefFA,
  title={F-coref: Fast, Accurate and Easy to Use Coreference Resolution},
  author={Shon Otmazgin and Arie Cattan and Yoav Goldberg},
  booktitle={AACL},
  year={2022}
}
```

[F-coref: Fast, Accurate and Easy to Use Coreference Resolution](https://aclanthology.org/2022.aacl-demo.6) (Otmazgin et al., AACL-IJCNLP 2022)
2,324
[ [ -0.0521240234375, -0.0552978515625, 0.043426513671875, -0.0007958412170410156, -0.0249176025390625, -0.017120361328125, -0.042816162109375, -0.051971435546875, -0.00998687744140625, 0.030487060546875, -0.01397705078125, -0.03204345703125, -0.0450439453125, 0.01165008544921875, -0.0199737548828125, 0.05419921875, 0.00890350341796875, -0.0017538070678710938, -0.015045166015625, -0.016082763671875, 0.016510009765625, -0.033111572265625, -0.036712646484375, -0.043304443359375, 0.0026187896728515625, 0.03765869140625, 0.0467529296875, 0.07476806640625, 0.050567626953125, 0.0213165283203125, -0.0216827392578125, 0.0159759521484375, -0.00812530517578125, -0.0010700225830078125, -0.012054443359375, -0.0017871856689453125, -0.0413818359375, 0.0017414093017578125, 0.051910400390625, 0.038177490234375, 0.01328277587890625, 0.01519775390625, 0.0030231475830078125, 0.053314208984375, -0.03729248046875, 0.01554107666015625, -0.023101806640625, -0.016876220703125, -0.0205230712890625, 0.005649566650390625, -0.0307769775390625, -0.00424957275390625, 0.02435302734375, -0.05487060546875, 0.045440673828125, 0.0292816162109375, 0.0811767578125, 0.00170135498046875, -0.0191497802734375, -0.02618408203125, -0.044525146484375, 0.068359375, -0.061553955078125, 0.0306396484375, 0.02978515625, 0.0071258544921875, 0.01464080810546875, -0.05828857421875, -0.039642333984375, -0.016815185546875, -0.026214599609375, -0.010589599609375, -0.01145172119140625, 0.0192108154296875, 0.054351806640625, 0.056976318359375, -0.035980224609375, -0.0014638900756835938, -0.01415252685546875, -0.0108184814453125, 0.04046630859375, 0.01157379150390625, -0.015899658203125, -0.036285400390625, -0.0228424072265625, -0.0211639404296875, -0.0157318115234375, 0.01065826416015625, -0.003490447998046875, 0.01849365234375, 0.0121002197265625, 0.0212860107421875, -0.0203857421875, 0.050048828125, 0.05462646484375, -0.033477783203125, 0.01299285888671875, -0.0548095703125, -0.010101318359375, 0.006683349609375, 
0.06500244140625, 0.02978515625, -0.007534027099609375, -0.0007328987121582031, -0.005279541015625, -0.016448974609375, 0.0048675537109375, -0.09796142578125, -0.006298065185546875, 0.0194091796875, -0.031951904296875, 0.0029888153076171875, 0.004161834716796875, -0.0418701171875, -0.0161285400390625, -0.0225677490234375, 0.004314422607421875, -0.059783935546875, 0.006099700927734375, -0.0145263671875, -0.01174163818359375, 0.002899169921875, 0.029144287109375, -0.053009033203125, 0.02880859375, 0.04254150390625, 0.07049560546875, 0.0005011558532714844, -0.002330780029296875, -0.03350830078125, -0.00797271728515625, -0.01151275634765625, 0.06707763671875, -0.0166778564453125, -0.0491943359375, -0.037017822265625, -0.0025920867919921875, 0.0031414031982421875, -0.046295166015625, 0.0731201171875, -0.0104827880859375, -0.0082244873046875, -0.038116455078125, -0.051483154296875, -0.0264129638671875, 0.023040771484375, -0.045745849609375, 0.092529296875, -0.0029315948486328125, -0.06561279296875, 0.006198883056640625, -0.034912109375, -0.020294189453125, -0.0112762451171875, 0.0023822784423828125, -0.01474761962890625, -0.0021610260009765625, 0.032073974609375, 0.04608154296875, -0.05621337890625, 0.028167724609375, -0.02197265625, -0.0117950439453125, 0.004108428955078125, 0.00988006591796875, 0.057281494140625, 0.0310211181640625, -0.03424072265625, 0.01049041748046875, -0.05499267578125, -0.00481414794921875, -0.00325775146484375, -0.0047149658203125, -0.037628173828125, -0.01043701171875, 0.00902557373046875, 0.01064300537109375, 0.02142333984375, -0.051483154296875, 0.00658416748046875, -0.026763916015625, 0.039703369140625, 0.031463623046875, -0.01166534423828125, 0.0408935546875, -0.03424072265625, 0.0012683868408203125, 0.0166778564453125, 0.00353240966796875, -0.041412353515625, -0.038970947265625, -0.0469970703125, -0.0239105224609375, 0.0177764892578125, 0.036529541015625, -0.0303955078125, 0.035919189453125, -0.027374267578125, -0.06976318359375, 
-0.01033782958984375, 0.006458282470703125, 0.050048828125, 0.06268310546875, 0.032318115234375, -0.02215576171875, -0.036834716796875, -0.07305908203125, -0.00899505615234375, -0.029632568359375, -0.00201416015625, 0.041046142578125, 0.053924560546875, 0.00829315185546875, 0.06982421875, -0.04632568359375, -0.022247314453125, 0.006252288818359375, 0.019439697265625, 0.0305633544921875, 0.045318603515625, 0.057891845703125, -0.049163818359375, -0.04522705078125, 0.004444122314453125, -0.03619384765625, -0.004352569580078125, 0.0052947998046875, -0.01143646240234375, 0.0294036865234375, 0.025848388671875, -0.041748046875, 0.0108184814453125, 0.0177764892578125, -0.039154052734375, 0.05072021484375, -0.008941650390625, 0.03350830078125, -0.07720947265625, 0.04046630859375, 0.0115509033203125, -0.025299072265625, -0.0287933349609375, 0.00565338134765625, 0.0208740234375, -0.0011081695556640625, -0.03118896484375, 0.061248779296875, -0.0548095703125, -0.00008296966552734375, -0.00527191162109375, 0.0210418701171875, 0.0113983154296875, 0.03466796875, 0.0136871337890625, 0.05364990234375, 0.0222625732421875, -0.051239013671875, 0.01047515869140625, 0.0294952392578125, -0.03765869140625, 0.04888916015625, -0.043670654296875, 0.01021575927734375, 0.0162506103515625, 0.0362548828125, -0.062164306640625, -0.0055999755859375, 0.0005831718444824219, -0.026763916015625, 0.0198516845703125, 0.009185791015625, -0.0311279296875, -0.030120849609375, -0.031890869140625, 0.03106689453125, 0.02392578125, -0.04827880859375, 0.06683349609375, 0.012664794921875, 0.0008802413940429688, -0.054718017578125, -0.04974365234375, -0.02020263671875, 0.00710296630859375, -0.065185546875, 0.06915283203125, -0.0211181640625, -0.0273284912109375, 0.01885986328125, -0.01377105712890625, -0.00460052490234375, -0.0162506103515625, 0.0474853515625, 0.0125732421875, -0.0220794677734375, -0.0038700103759765625, -0.005268096923828125, -0.0125885009765625, -0.0199127197265625, -0.015869140625, 
0.016632080078125, -0.04327392578125, 0.0026378631591796875, -0.0258941650390625, 0.0309295654296875, 0.04681396484375, -0.0033969879150390625, 0.042266845703125, 0.043212890625, -0.04052734375, 0.0017900466918945312, -0.06414794921875, -0.0224609375, -0.03204345703125, 0.00728607177734375, -0.0276641845703125, -0.068603515625, 0.05230712890625, 0.01361083984375, 0.032806396484375, 0.06158447265625, 0.03662109375, 0.017608642578125, 0.0379638671875, 0.0142059326171875, 0.016448974609375, 0.043365478515625, -0.0277252197265625, 0.01837158203125, -0.0595703125, -0.01165008544921875, -0.06927490234375, -0.01377105712890625, -0.00919342041015625, -0.055084228515625, 0.00632476806640625, 0.033782958984375, -0.017547607421875, 0.032806396484375, -0.044586181640625, 0.0220794677734375, 0.054443359375, -0.0009341239929199219, 0.0343017578125, -0.015533447265625, -0.0264129638671875, -0.01025390625, -0.04217529296875, -0.04034423828125, 0.07464599609375, 0.002307891845703125, 0.035308837890625, -0.011383056640625, 0.041046142578125, 0.02557373046875, 0.0411376953125, -0.065185546875, 0.046600341796875, -0.0355224609375, -0.0751953125, -0.00855255126953125, -0.0241851806640625, -0.0293121337890625, 0.01076507568359375, -0.01259613037109375, -0.0498046875, 0.030181884765625, 0.033203125, -0.048828125, 0.046417236328125, -0.043731689453125, 0.05938720703125, 0.004360198974609375, -0.02899169921875, -0.0180511474609375, -0.0343017578125, 0.039825439453125, -0.0017604827880859375, 0.0128631591796875, -0.0260162353515625, 0.009735107421875, 0.08056640625, -0.036712646484375, 0.042572021484375, -0.00807952880859375, 0.0257415771484375, 0.047576904296875, -0.0120849609375, 0.039276123046875, 0.004962921142578125, -0.028533935546875, 0.03509521484375, 0.0204010009765625, -0.0233154296875, -0.0256195068359375, 0.07421875, -0.04620361328125, -0.037933349609375, -0.04986572265625, -0.04132080078125, -0.0005230903625488281, 0.0174713134765625, 0.034271240234375, 0.03375244140625, 
-0.0343017578125, 0.0219268798828125, 0.046112060546875, 0.00807952880859375, 0.0595703125, 0.03619384765625, -0.00988006591796875, -0.04107666015625, 0.03155517578125, 0.037811279296875, 0.002017974853515625, 0.0298309326171875, 0.00856781005859375, -0.0286407470703125, -0.03155517578125, -0.0355224609375, 0.042236328125, -0.04937744140625, -0.0303955078125, -0.07275390625, 0.0019550323486328125, -0.05413818359375, -0.0212249755859375, -0.045501708984375, -0.054412841796875, -0.0413818359375, 0.0007457733154296875, 0.04437255859375, 0.0168914794921875, -0.018707275390625, 0.0305328369140625, -0.07171630859375, 0.0043487548828125, 0.0008826255798339844, 0.0047149658203125, 0.005596160888671875, -0.049774169921875, -0.033843994140625, -0.01885986328125, -0.03192138671875, -0.045318603515625, 0.01898193359375, 0.03753662109375, 0.033538818359375, 0.052825927734375, 0.03790283203125, 0.053466796875, -0.015777587890625, 0.07061767578125, 0.0073089599609375, -0.0888671875, 0.0230865478515625, -0.002536773681640625, 0.0311126708984375, 0.04278564453125, 0.029052734375, -0.034820556640625, -0.003833770751953125, -0.0538330078125, -0.060272216796875, 0.05682373046875, -0.0066070556640625, -0.031494140625, -0.003833770751953125, 0.046295166015625, 0.0037784576416015625, -0.01016998291015625, 0.0026187896728515625, -0.0237579345703125, -0.005245208740234375, 0.004032135009765625, -0.0226287841796875, -0.033172607421875, -0.0028438568115234375, -0.030731201171875, 0.07403564453125, -0.02276611328125, 0.036651611328125, 0.048736572265625, -0.01105499267578125, -0.0005083084106445312, 0.015289306640625, 0.057830810546875, 0.040313720703125, -0.03668212890625, 0.029144287109375, 0.00909423828125, -0.05047607421875, 0.0037097930908203125, 0.025848388671875, -0.01209259033203125, 0.0182037353515625, 0.028533935546875, 0.06256103515625, 0.0209503173828125, -0.05706787109375, 0.01049041748046875, 0.0020542144775390625, -0.0282135009765625, -0.0256195068359375, 0.0017881393432617188, 
0.01141357421875, 0.020843505859375, 0.04010009765625, 0.01885986328125, 0.00821685791015625, -0.0157318115234375, 0.0216522216796875, 0.00644683837890625, -0.02337646484375, -0.0287017822265625, 0.053741455078125, 0.019500732421875, 0.00203704833984375, 0.05419921875, -0.0212860107421875, -0.033538818359375, 0.041778564453125, 0.0165863037109375, 0.062164306640625, -0.021575927734375, -0.0014829635620117188, 0.07861328125, 0.00930023193359375, -0.03729248046875, -0.00023925304412841797, -0.0145416259765625, -0.05560302734375, -0.01543426513671875, -0.070068359375, -0.04034423828125, -0.00948333740234375, -0.0552978515625, -0.0079345703125, -0.060699462890625, -0.05029296875, 0.030670166015625, 0.0364990234375, -0.04541015625, 0.016082763671875, -0.0115966796875, 0.08740234375, -0.048126220703125, 0.0694580078125, 0.05609130859375, -0.01434326171875, -0.047332763671875, -0.035064697265625, 0.02044677734375, -0.0153350830078125, 0.04083251953125, 0.0145416259765625, -0.0037250518798828125, -0.0001569986343383789, -0.01526641845703125, -0.060699462890625, 0.0938720703125, 0.0235595703125, -0.05780029296875, -0.0137176513671875, -0.0267333984375, 0.016998291015625, -0.033843994140625, 0.0251312255859375, 0.0587158203125, 0.058380126953125, 0.004894256591796875, -0.07415771484375, 0.0231170654296875, -0.03155517578125, -0.0051116943359375, 0.00899505615234375, -0.05029296875, 0.08026123046875, -0.0196990966796875, -0.0196380615234375, 0.009185791015625, 0.0701904296875, 0.016937255859375, 0.026611328125, 0.05029296875, 0.0654296875, 0.055816650390625, -0.0268096923828125, 0.07861328125, -0.048004150390625, 0.0377197265625, 0.0799560546875, -0.0063934326171875, 0.056427001953125, 0.04437255859375, -0.00830078125, 0.0161895751953125, 0.059478759765625, 0.0031414031982421875, 0.0130615234375, -0.017333984375, -0.015899658203125, -0.036376953125, -0.0246429443359375, -0.03082275390625, -0.00481414794921875, 0.0238189697265625, -0.038482666015625, -0.0172576904296875, 
-0.00290679931640625, 0.019256591796875, 0.0006451606750488281, -0.016632080078125, 0.023712158203125, 0.01299285888671875, -0.06085205078125, 0.0391845703125, 0.0197601318359375, 0.06201171875, -0.03326416015625, 0.0197296142578125, -0.010833740234375, 0.0400390625, -0.016876220703125, -0.033538818359375, 0.037139892578125, 0.00794219970703125, -0.045440673828125, -0.0223846435546875, 0.04840087890625, -0.0156402587890625, -0.04046630859375, 0.02716064453125, 0.0183563232421875, 0.0120391845703125, -0.018402099609375, -0.044464111328125, 0.01384735107421875, -0.010406494140625, -0.04998779296875, 0.00662994384765625, 0.0080413818359375, -0.0035266876220703125, 0.0224761962890625, 0.042449951171875, -0.0192413330078125, 0.00787353515625, -0.047119140625, 0.051300048828125, -0.058624267578125, -0.03271484375, -0.04278564453125, 0.0125885009765625, -0.007274627685546875, -0.05499267578125, 0.0538330078125, 0.051177978515625, 0.064453125, 0.0017871856689453125, 0.032562255859375, -0.034576416015625, 0.04351806640625, -0.0258941650390625, 0.0357666015625, -0.04022216796875, -0.0038280487060546875, -0.0179290771484375, -0.05926513671875, -0.0257415771484375, 0.045318603515625, -0.0316162109375, 0.0011882781982421875, 0.06707763671875, 0.05682373046875, -0.006076812744140625, 0.01279449462890625, 0.0130615234375, 0.0210113525390625, 0.0273590087890625, 0.031707763671875, 0.005664825439453125, -0.03424072265625, 0.028533935546875, -0.05682373046875, -0.013641357421875, -0.03253173828125, -0.07342529296875, -0.056121826171875, -0.059295654296875, -0.040191650390625, -0.045440673828125, -0.0135345458984375, 0.08319091796875, 0.0606689453125, -0.0732421875, -0.0189056396484375, 0.0240325927734375, -0.00826263427734375, -0.048370361328125, -0.01155853271484375, 0.045989990234375, -0.0008225440979003906, -0.07122802734375, 0.01434326171875, 0.005252838134765625, -0.015960693359375, -0.006103515625, -0.0233154296875, -0.02947998046875, 0.0156402587890625, 0.0345458984375, 
0.0209503173828125, -0.0654296875, -0.016265869140625, 0.00974273681640625, -0.02935791015625, 0.01294708251953125, 0.0831298828125, -0.03155517578125, 0.031524658203125, 0.05828857421875, 0.0377197265625, 0.06927490234375, 0.01215362548828125, 0.00954437255859375, -0.01180267333984375, 0.022918701171875, 0.0165252685546875, 0.0206298828125, 0.023712158203125, -0.0233154296875, 0.036407470703125, 0.0443115234375, -0.03839111328125, -0.0538330078125, -0.00555419921875, -0.10784912109375, -0.018646240234375, 0.05401611328125, -0.039276123046875, -0.044830322265625, 0.007434844970703125, -0.0164337158203125, 0.0233001708984375, -0.05810546875, 0.032806396484375, 0.06549072265625, -0.024200439453125, -0.004047393798828125, -0.023895263671875, 0.040252685546875, 0.0235748291015625, -0.034820556640625, 0.00888824462890625, 0.044921875, 0.0262298583984375, 0.05096435546875, 0.060699462890625, -0.0011272430419921875, 0.015289306640625, 0.0177001953125, 0.01293182373046875, -0.042572021484375, -0.01113128662109375, -0.005306243896484375, 0.028533935546875, -0.0142059326171875, 0.010894775390625 ] ]
finiteautomata/bertweet-base-sentiment-analysis
2023-02-17T02:17:31.000Z
[ "transformers", "pytorch", "tf", "roberta", "text-classification", "sentiment-analysis", "en", "arxiv:2106.09462", "endpoints_compatible", "has_space", "region:us" ]
text-classification
finiteautomata
null
null
finiteautomata/bertweet-base-sentiment-analysis
85
120,883
transformers
2022-03-02T23:29:05
---
language:
- en
tags:
- sentiment-analysis
---

# Sentiment Analysis in English

## bertweet-sentiment-analysis

Repository: [https://github.com/finiteautomata/pysentimiento/](https://github.com/finiteautomata/pysentimiento/)

Model trained with the SemEval 2017 corpus (around 40k tweets). The base model is [BERTweet](https://github.com/VinAIResearch/BERTweet), a RoBERTa model trained on English tweets. It uses `POS`, `NEG`, and `NEU` labels.

## License

`pysentimiento` is an open-source library for non-commercial use and scientific research purposes only. Please be aware that models are trained with third-party datasets and are subject to their respective licenses.

1. [TASS Dataset license](http://tass.sepln.org/tass_data/download.php)
2. [SEMEval 2017 Dataset license]()

## Citation

If you use `pysentimiento` in your work, please cite [this paper](https://arxiv.org/abs/2106.09462):

```
@misc{perez2021pysentimiento,
  title={pysentimiento: A Python Toolkit for Sentiment Analysis and SocialNLP tasks},
  author={Juan Manuel Pérez and Juan Carlos Giudici and Franco Luque},
  year={2021},
  eprint={2106.09462},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
```

Enjoy! 🤗
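The `POS`/`NEG`/`NEU` labels map naturally onto human-readable sentiment strings. A minimal post-processing sketch, assuming prediction dictionaries shaped like Hugging Face pipeline output (`{"label": ..., "score": ...}`); the helper name and example scores are illustrative, not part of the library:

```python
# Map the model's POS / NEG / NEU labels to readable sentiment strings.
# Assumes predictions shaped like {"label": "POS", "score": 0.98}.

LABEL_NAMES = {"POS": "positive", "NEG": "negative", "NEU": "neutral"}

def readable(prediction: dict) -> str:
    """Render one prediction as 'sentiment (confidence)'."""
    name = LABEL_NAMES[prediction["label"]]
    return f"{name} ({prediction['score']:.2f})"

# Hypothetical predictions for illustration:
examples = [
    {"label": "POS", "score": 0.981},
    {"label": "NEU", "score": 0.754},
]
for p in examples:
    print(readable(p))
```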
1,213
[ [ -0.01126861572265625, -0.042327880859375, 0.0200958251953125, 0.07098388671875, -0.0498046875, 0.0009737014770507812, -0.0294342041015625, -0.0184173583984375, 0.033416748046875, -0.00016689300537109375, -0.0250396728515625, -0.06475830078125, -0.05010986328125, -0.0003428459167480469, -0.0022830963134765625, 0.08734130859375, 0.00463104248046875, 0.039215087890625, 0.002117156982421875, -0.0213165283203125, 0.0207672119140625, -0.0279693603515625, -0.062286376953125, -0.0058746337890625, 0.05419921875, 0.004047393798828125, 0.019805908203125, -0.0303497314453125, 0.0234222412109375, 0.01531219482421875, -0.01220703125, -0.003520965576171875, -0.032318115234375, 0.01332855224609375, -0.02313232421875, -0.0080718994140625, -0.034637451171875, -0.01334381103515625, 0.03082275390625, 0.0294342041015625, 0.00782012939453125, 0.0230865478515625, 0.017608642578125, 0.038116455078125, -0.0300750732421875, 0.0243377685546875, -0.041473388671875, -0.01421356201171875, -0.0099029541015625, 0.0064239501953125, -0.0176544189453125, -0.07568359375, 0.0135955810546875, -0.00659942626953125, -0.0007524490356445312, -0.018218994140625, 0.09027099609375, 0.00457763671875, -0.0125579833984375, -0.00954437255859375, -0.021209716796875, 0.072265625, -0.06787109375, 0.021087646484375, 0.00930023193359375, 0.0164947509765625, 0.01322174072265625, -0.038909912109375, -0.0224151611328125, 0.007076263427734375, 0.02398681640625, 0.020477294921875, -0.0157623291015625, -0.0273284912109375, -0.001750946044921875, 0.01334381103515625, -0.02288818359375, -0.0102996826171875, -0.0303192138671875, -0.012969970703125, 0.044158935546875, -0.00548553466796875, -0.00081634521484375, -0.0253448486328125, -0.03125, -0.0135650634765625, -0.0257415771484375, 0.019317626953125, 0.026763916015625, 0.0108184814453125, -0.025543212890625, 0.037261962890625, 0.0066986083984375, 0.010162353515625, -0.023101806640625, 0.0186920166015625, 0.058868408203125, -0.016876220703125, -0.0255126953125, 
-0.0310211181640625, 0.09674072265625, 0.03564453125, 0.033843994140625, -0.00021398067474365234, -0.0185699462890625, -0.007640838623046875, 0.00989532470703125, -0.047454833984375, -0.0162200927734375, 0.033203125, -0.03778076171875, -0.039215087890625, 0.026092529296875, -0.0723876953125, -0.036834716796875, 0.01001739501953125, 0.0260009765625, -0.034759521484375, -0.048431396484375, -0.00042057037353515625, -0.0104522705078125, 0.040008544921875, 0.02423095703125, -0.0274200439453125, 0.0008935928344726562, 0.040191650390625, 0.07550048828125, 0.00550079345703125, -0.0166168212890625, 0.005191802978515625, -0.026336669921875, -0.0205841064453125, 0.061492919921875, -0.0015697479248046875, -0.03021240234375, 0.0194854736328125, 0.024017333984375, -0.009307861328125, -0.0143280029296875, 0.06561279296875, -0.024932861328125, 0.0188140869140625, -0.0291290283203125, -0.017669677734375, -0.01666259765625, 0.0177001953125, -0.048858642578125, 0.0882568359375, 0.018524169921875, -0.07843017578125, 0.0059051513671875, -0.059051513671875, -0.041961669921875, -0.015869140625, 0.01019287109375, -0.050689697265625, 0.0164337158203125, 0.0200347900390625, 0.053192138671875, -0.015380859375, 0.019073486328125, -0.033935546875, 0.0020809173583984375, 0.0278472900390625, -0.0123138427734375, 0.1041259765625, 0.03875732421875, -0.0186004638671875, 0.0015926361083984375, -0.03436279296875, -0.0088043212890625, 0.01061248779296875, -0.0142669677734375, -0.0236968994140625, 0.0046539306640625, 0.02789306640625, 0.0130462646484375, 0.047576904296875, -0.07781982421875, -0.0007824897766113281, -0.045440673828125, 0.0316162109375, 0.052032470703125, 0.0162811279296875, 0.0231475830078125, -0.0176544189453125, 0.03814697265625, 0.00778961181640625, 0.0203704833984375, 0.0106964111328125, -0.04620361328125, -0.041412353515625, -0.017913818359375, 0.02227783203125, 0.046417236328125, -0.04998779296875, 0.044891357421875, -0.019989013671875, -0.0556640625, -0.038238525390625, 
-0.0190887451171875, -0.00025010108947753906, 0.0196990966796875, 0.0249481201171875, 0.01171875, -0.06005859375, -0.046966552734375, -0.019256591796875, -0.0109100341796875, -0.003734588623046875, 0.01206207275390625, 0.06085205078125, -0.004901885986328125, 0.0562744140625, -0.0294342041015625, -0.021728515625, -0.01296234130859375, 0.01213836669921875, 0.044769287109375, 0.0206298828125, 0.055419921875, -0.04925537109375, -0.0623779296875, -0.0004379749298095703, -0.047271728515625, -0.011505126953125, 0.011688232421875, -0.004375457763671875, 0.040252685546875, 0.011474609375, -0.019195556640625, 0.02880859375, 0.0367431640625, -0.024688720703125, 0.0275726318359375, 0.009063720703125, 0.0119476318359375, -0.1129150390625, 0.008575439453125, 0.032562255859375, -0.0153350830078125, -0.0109710693359375, -0.01059722900390625, 0.0015316009521484375, 0.0019330978393554688, -0.061126708984375, 0.04132080078125, -0.02142333984375, 0.019012451171875, 0.014923095703125, -0.025634765625, -0.005001068115234375, 0.032257080078125, 0.002262115478515625, 0.0496826171875, 0.044647216796875, -0.016571044921875, 0.040069580078125, 0.01526641845703125, -0.03619384765625, 0.038360595703125, -0.08465576171875, -0.0027790069580078125, 0.0074920654296875, 0.0022029876708984375, -0.0859375, -0.0080413818359375, 0.035797119140625, -0.0750732421875, -0.007282257080078125, -0.0076904296875, -0.0445556640625, -0.0196075439453125, -0.0650634765625, -0.0097503662109375, 0.048248291015625, -0.034942626953125, 0.03533935546875, 0.024322509765625, -0.011474609375, -0.0521240234375, -0.0640869140625, -0.0030727386474609375, -0.0296630859375, -0.05419921875, -0.005168914794921875, 0.005756378173828125, -0.0260009765625, 0.012451171875, 0.01107025146484375, -0.0227813720703125, 0.000278472900390625, 0.01033782958984375, 0.014434814453125, -0.003047943115234375, 0.01654052734375, 0.003688812255859375, 0.0172882080078125, 0.0106658935546875, -0.0092620849609375, 0.04901123046875, 
-0.029571533203125, 0.011138916015625, -0.0372314453125, 0.0205078125, 0.055267333984375, -0.01503753662109375, 0.07830810546875, 0.06134033203125, -0.024810791015625, -0.0173492431640625, -0.0360107421875, 0.01001739501953125, -0.026092529296875, 0.023406982421875, -0.0171051025390625, -0.046417236328125, 0.05718994140625, 0.0165557861328125, 0.001979827880859375, 0.04901123046875, 0.058258056640625, -0.04083251953125, 0.0535888671875, 0.0167388916015625, -0.03521728515625, 0.045013427734375, -0.03961181640625, 0.01396942138671875, -0.051483154296875, -0.0243377685546875, -0.0618896484375, -0.01824951171875, -0.043609619140625, -0.01947021484375, 0.0205078125, -0.0182037353515625, -0.02923583984375, 0.021270751953125, -0.030670166015625, 0.01690673828125, 0.03802490234375, 0.01526641845703125, -0.0150146484375, 0.020782470703125, -0.0137786865234375, -0.01751708984375, -0.023529052734375, -0.0419921875, 0.08526611328125, 0.051666259765625, 0.06256103515625, 0.01055145263671875, 0.072021484375, 0.01898193359375, -0.00736236572265625, -0.06939697265625, 0.037109375, -0.020294189453125, -0.03302001953125, -0.014617919921875, -0.0272979736328125, -0.07843017578125, 0.0013456344604492188, -0.002567291259765625, -0.05279541015625, 0.022247314453125, -0.0267181396484375, -0.01006317138671875, 0.0264129638671875, -0.049835205078125, 0.061920166015625, -0.010833740234375, -0.0275726318359375, -0.0126953125, -0.031707763671875, -0.004085540771484375, 0.01300811767578125, 0.0225830078125, -0.0142669677734375, -0.022918701171875, 0.060302734375, -0.028472900390625, 0.07220458984375, -0.041351318359375, -0.01100921630859375, 0.0206756591796875, -0.00402069091796875, 0.0268707275390625, 0.0176849365234375, -0.0239410400390625, 0.033233642578125, 0.018096923828125, -0.03759765625, -0.00916290283203125, 0.0726318359375, -0.0782470703125, 0.0034008026123046875, -0.059906005859375, -0.015594482421875, -0.01160430908203125, 0.00081634521484375, 0.036041259765625, 0.02203369140625, 
-0.036956787109375, 0.042205810546875, 0.04461669921875, -0.0230712890625, 0.04736328125, 0.0210113525390625, 0.00579833984375, -0.05133056640625, 0.0660400390625, 0.021026611328125, -0.01416778564453125, 0.0143890380859375, 0.019866943359375, -0.033660888671875, -0.0313720703125, -0.0022830963134765625, 0.041534423828125, -0.03173828125, -0.00917816162109375, -0.05206298828125, -0.008148193359375, -0.05889892578125, -0.0021686553955078125, -0.021636962890625, -0.034576416015625, -0.036956787109375, -0.00908660888671875, 0.036712646484375, 0.035919189453125, -0.0215301513671875, 0.0221405029296875, -0.047149658203125, 0.0005106925964355469, -0.0017957687377929688, 0.0233306884765625, -0.004276275634765625, -0.03759765625, -0.0322265625, 0.00537109375, -0.020263671875, -0.0787353515625, 0.06396484375, 0.00228118896484375, 0.029876708984375, 0.02569580078125, 0.008636474609375, 0.0167083740234375, -0.0036830902099609375, 0.07861328125, 0.011749267578125, -0.05914306640625, 0.05328369140625, -0.054779052734375, 0.0096588134765625, 0.056427001953125, 0.05279541015625, -0.030670166015625, -0.033599853515625, -0.06280517578125, -0.0799560546875, 0.040740966796875, 0.004749298095703125, 0.01910400390625, -0.01010894775390625, -0.0027923583984375, -0.006336212158203125, 0.0262908935546875, -0.07293701171875, -0.005016326904296875, -0.0309906005859375, -0.0389404296875, 0.004657745361328125, -0.028228759765625, -0.01003265380859375, -0.04254150390625, 0.07867431640625, 0.0170440673828125, 0.038177490234375, 0.020782470703125, -0.01702880859375, 0.011016845703125, 0.0173187255859375, 0.031982421875, 0.04376220703125, -0.033294677734375, -0.0088043212890625, 0.0009908676147460938, -0.0261383056640625, -0.0100860595703125, -0.0033664703369140625, -0.01450347900390625, 0.0216217041015625, 0.016571044921875, 0.047149658203125, 0.006458282470703125, -0.014404296875, 0.05914306640625, -0.003696441650390625, -0.033905029296875, -0.05694580078125, -0.0225372314453125, 
0.01531219482421875, 0.0230865478515625, 0.03900146484375, 0.0083770751953125, -0.01038360595703125, -0.0174560546875, -0.0049591064453125, 0.0231781005859375, -0.041351318359375, -0.038970947265625, 0.028228759765625, 0.0253143310546875, -0.00867462158203125, 0.0225830078125, -0.032135009765625, -0.044677734375, 0.0450439453125, 0.00652313232421875, 0.0877685546875, 0.002651214599609375, 0.039215087890625, 0.0538330078125, 0.0304718017578125, -0.0137786865234375, 0.05718994140625, 0.00846099853515625, -0.06842041015625, -0.027252197265625, -0.051239013671875, -0.020111083984375, 0.023651123046875, -0.049835205078125, 0.0205078125, -0.04803466796875, -0.0163116455078125, -0.0214080810546875, 0.01244354248046875, -0.0309906005859375, 0.01708984375, -0.00940704345703125, 0.07427978515625, -0.0816650390625, 0.072021484375, 0.08233642578125, -0.05926513671875, -0.06085205078125, 0.006473541259765625, -0.01056671142578125, -0.0469970703125, 0.033416748046875, -0.0025482177734375, -0.006439208984375, 0.003665924072265625, -0.06280517578125, -0.0261383056640625, 0.06402587890625, 0.0042572021484375, -0.01441192626953125, 0.0135650634765625, -0.024627685546875, 0.05755615234375, -0.021209716796875, 0.01155853271484375, 0.031219482421875, 0.0338134765625, 0.0057525634765625, -0.053070068359375, -0.03570556640625, -0.03570556640625, -0.008056640625, -0.000492095947265625, -0.051116943359375, 0.070068359375, 0.0095062255859375, 0.0010471343994140625, 0.0028934478759765625, 0.0419921875, -0.008056640625, 0.02606201171875, 0.02581787109375, 0.05706787109375, 0.0452880859375, -0.02996826171875, 0.06781005859375, -0.01390838623046875, 0.05853271484375, 0.06890869140625, -0.00322723388671875, 0.07080078125, 0.039215087890625, -0.035736083984375, 0.055938720703125, 0.05511474609375, -0.0108184814453125, 0.040374755859375, -0.01654052734375, -0.022003173828125, 0.005218505859375, -0.008453369140625, -0.035736083984375, 0.01593017578125, 0.02239990234375, -0.033233642578125, 
-0.00933074951171875, 0.001766204833984375, 0.022308349609375, 0.0121917724609375, -0.030548095703125, 0.042938232421875, 0.00983428955078125, -0.0199127197265625, 0.038330078125, -0.013397216796875, 0.07281494140625, -0.039398193359375, 0.03179931640625, -0.0215301513671875, 0.010498046875, -0.031463623046875, -0.07489013671875, 0.0244140625, 0.0174102783203125, -0.002124786376953125, -0.021392822265625, 0.050506591796875, -0.0036029815673828125, -0.05902099609375, 0.0308837890625, 0.038238525390625, 0.017822265625, -0.003559112548828125, -0.06744384765625, 0.005290985107421875, 0.02166748046875, -0.0445556640625, 0.00788116455078125, 0.0391845703125, 0.00334930419921875, 0.034454345703125, 0.033538818359375, 0.04730224609375, -0.0032367706298828125, 0.027130126953125, 0.057342529296875, -0.04803466796875, -0.046661376953125, -0.061248779296875, 0.0491943359375, -0.031707763671875, -0.026031494140625, 0.07586669921875, 0.0411376953125, 0.0562744140625, -0.0240020751953125, 0.07489013671875, -0.0187225341796875, 0.072998046875, -0.0116424560546875, 0.059326171875, -0.04052734375, -0.00348663330078125, -0.0255126953125, -0.06640625, -0.0279388427734375, 0.07476806640625, -0.047393798828125, -0.008575439453125, 0.060638427734375, 0.062744140625, 0.016571044921875, -0.01329803466796875, 0.005115509033203125, 0.04669189453125, 0.004741668701171875, 0.032501220703125, 0.06475830078125, -0.032501220703125, 0.024932861328125, -0.0219879150390625, -0.01264190673828125, -0.018829345703125, -0.06304931640625, -0.06451416015625, -0.044891357421875, -0.0191497802734375, -0.062103271484375, -0.01050567626953125, 0.080078125, 0.0193328857421875, -0.0841064453125, -0.0291290283203125, 0.0078277587890625, 0.00753021240234375, 0.00537109375, -0.0238800048828125, 0.0293426513671875, -0.0318603515625, -0.06524658203125, -0.0018320083618164062, -0.00283050537109375, -0.00841522216796875, -0.01265716552734375, 0.0002624988555908203, -0.028228759765625, -0.00011199712753295898, 
0.038665771484375, 0.021087646484375, -0.03997802734375, -0.0119476318359375, 0.006671905517578125, -0.0181732177734375, 0.01158905029296875, 0.0289306640625, -0.031707763671875, 0.0178070068359375, 0.0697021484375, 0.0273895263671875, 0.026123046875, 0.000006616115570068359, 0.021087646484375, -0.050445556640625, 0.0265960693359375, 0.03704833984375, 0.032928466796875, 0.030487060546875, -0.015655517578125, 0.045440673828125, 0.021575927734375, -0.034942626953125, -0.053192138671875, -0.00870513916015625, -0.08709716796875, -0.0189971923828125, 0.09710693359375, -0.01456451416015625, -0.036651611328125, 0.00855255126953125, -0.0309906005859375, 0.0182952880859375, -0.06988525390625, 0.035003662109375, 0.056854248046875, 0.00960540771484375, -0.0072174072265625, -0.03851318359375, 0.0240936279296875, 0.01702880859375, -0.040618896484375, -0.0227813720703125, 0.0286712646484375, 0.03631591796875, -0.010009765625, 0.05438232421875, -0.022674560546875, 0.022491455078125, -0.027099609375, 0.0227813720703125, 0.0046234130859375, -0.00411224365234375, -0.041534423828125, 0.0151824951171875, 0.003681182861328125, -0.032867431640625 ] ]
cointegrated/rubert-tiny2
2023-10-14T21:23:32.000Z
[ "sentence-transformers", "pytorch", "safetensors", "bert", "russian", "fill-mask", "pretraining", "embeddings", "masked-lm", "tiny", "feature-extraction", "sentence-similarity", "transformers", "ru", "license:mit", "endpoints_compatible", "has_space", "region:us" ]
sentence-similarity
cointegrated
null
null
cointegrated/rubert-tiny2
52
120,728
sentence-transformers
2022-03-02T23:29:05
--- language: - ru pipeline_tag: sentence-similarity tags: - russian - fill-mask - pretraining - embeddings - masked-lm - tiny - feature-extraction - sentence-similarity - sentence-transformers - transformers license: mit widget: - text: Миниатюрная модель для [MASK] разных задач. --- This is an updated version of [cointegrated/rubert-tiny](https://huggingface.co/cointegrated/rubert-tiny): a small Russian BERT-based encoder with high-quality sentence embeddings. This [post in Russian](https://habr.com/ru/post/669674/) gives more details. The differences from the previous version include: - a larger vocabulary: 83828 tokens instead of 29564; - longer supported sequences: 2048 instead of 512; - sentence embeddings approximate LaBSE more closely than before; - meaningful segment embeddings (tuned on the NLI task); - the model is focused only on Russian. The model should be used as is to produce sentence embeddings (e.g. for KNN classification of short texts) or fine-tuned for a downstream task. Sentence embeddings can be produced as follows: ```python # pip install transformers sentencepiece import torch from transformers import AutoTokenizer, AutoModel tokenizer = AutoTokenizer.from_pretrained("cointegrated/rubert-tiny2") model = AutoModel.from_pretrained("cointegrated/rubert-tiny2") # model.cuda() # uncomment it if you have a GPU def embed_bert_cls(text, model, tokenizer): t = tokenizer(text, padding=True, truncation=True, return_tensors='pt') with torch.no_grad(): model_output = model(**{k: v.to(model.device) for k, v in t.items()}) embeddings = model_output.last_hidden_state[:, 0, :] embeddings = torch.nn.functional.normalize(embeddings) return embeddings[0].cpu().numpy() print(embed_bert_cls('привет мир', model, tokenizer).shape) # (312,) ``` Alternatively, you can use the model with `sentence_transformers`: ```Python from sentence_transformers import SentenceTransformer model = SentenceTransformer('cointegrated/rubert-tiny2') sentences = ["привет мир", "hello world", 
"здравствуй вселенная"] embeddings = model.encode(sentences) print(embeddings) ```
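The `embed_bert_cls` helper in the card above returns L2-normalized CLS embeddings, so the cosine similarity of two sentences reduces to a plain dot product. A minimal, model-free sketch of that comparison (plain NumPy; the 2-d example vectors are invented stand-ins for real 312-d rubert-tiny2 outputs):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors of any scale."""
    a = a / np.linalg.norm(a)
    b = b / np.linalg.norm(b)
    return float(np.dot(a, b))

# Toy 2-d stand-ins for the 312-d rubert-tiny2 embeddings.
emb_a = np.array([0.6, 0.8])
emb_b = np.array([0.8, 0.6])

# Because embed_bert_cls already normalizes its output, a bare
# np.dot(emb_a, emb_b) would give the same score for real embeddings.
print(cosine_similarity(emb_a, emb_b))  # ≈ 0.96
```

The same score can be obtained from `model.encode(...)` outputs by normalizing them first.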
2,117
[ [ -0.00923919677734375, -0.0538330078125, 0.0164642333984375, -0.0005803108215332031, -0.0289459228515625, -0.0097198486328125, -0.048736572265625, -0.01476287841796875, 0.0236358642578125, 0.0280303955078125, -0.0282745361328125, -0.01371002197265625, -0.03948974609375, -0.0048065185546875, -0.01403045654296875, 0.09417724609375, 0.0005855560302734375, 0.0205535888671875, -0.02093505859375, 0.006366729736328125, -0.0253448486328125, -0.0188140869140625, -0.0258636474609375, -0.045745849609375, 0.015655517578125, 0.03887939453125, 0.060546875, 0.0162353515625, 0.04583740234375, 0.0233154296875, -0.01190185546875, -0.00583648681640625, -0.009918212890625, -0.0229339599609375, 0.018646240234375, -0.0313720703125, -0.0272216796875, -0.004085540771484375, 0.033233642578125, 0.0286865234375, -0.0152130126953125, 0.01551055908203125, -0.00260162353515625, 0.0200958251953125, -0.0205535888671875, 0.007350921630859375, -0.0273284912109375, 0.00524139404296875, 0.002506256103515625, 0.03094482421875, -0.055084228515625, -0.006610870361328125, 0.0137481689453125, -0.04193115234375, 0.0091705322265625, 0.005947113037109375, 0.09893798828125, 0.01311492919921875, -0.00921630859375, -0.039337158203125, -0.031646728515625, 0.05780029296875, -0.062286376953125, 0.020538330078125, 0.00910186767578125, 0.0186004638671875, -0.0036182403564453125, -0.0850830078125, -0.04229736328125, -0.00782012939453125, -0.0261383056640625, 0.00021207332611083984, -0.023040771484375, -0.00574493408203125, 0.00806427001953125, 0.023101806640625, -0.04217529296875, -0.007266998291015625, -0.0399169921875, -0.025970458984375, 0.0265655517578125, -0.005191802978515625, 0.0118255615234375, -0.04595947265625, -0.02081298828125, -0.0257568359375, -0.031707763671875, -0.004718780517578125, 0.0263214111328125, 0.013458251953125, -0.04315185546875, 0.05194091796875, -0.00989532470703125, 0.045013427734375, 0.0177001953125, 0.0012302398681640625, 0.03863525390625, -0.00801849365234375, -0.0167388916015625, 
0.0078125, 0.065185546875, 0.01293182373046875, 0.0149688720703125, -0.0137176513671875, -0.024566650390625, -0.0081329345703125, 0.0032100677490234375, -0.06744384765625, -0.056854248046875, 0.00881195068359375, -0.05169677734375, -0.0260772705078125, 0.0176544189453125, -0.0440673828125, 0.000026226043701171875, 0.008392333984375, 0.059295654296875, -0.04998779296875, 0.0015926361083984375, 0.0257568359375, -0.011627197265625, 0.0278778076171875, 0.00811767578125, -0.06829833984375, 0.033355712890625, 0.0579833984375, 0.07647705078125, 0.016021728515625, -0.0017480850219726562, -0.044342041015625, -0.024993896484375, -0.00867462158203125, 0.054473876953125, -0.0242919921875, -0.00923919677734375, 0.00753021240234375, 0.003421783447265625, 0.01174163818359375, -0.03204345703125, 0.033660888671875, -0.03948974609375, 0.04901123046875, 0.00046253204345703125, -0.0285491943359375, -0.0006337165832519531, 0.018402099609375, -0.03948974609375, 0.084716796875, 0.0151214599609375, -0.06268310546875, 0.0288543701171875, -0.058074951171875, -0.032379150390625, -0.000016927719116210938, 0.00531005859375, -0.0535888671875, 0.005931854248046875, 0.018035888671875, 0.0211181640625, -0.0062103271484375, 0.00899505615234375, -0.0193328857421875, -0.032806396484375, 0.0416259765625, -0.0164031982421875, 0.0740966796875, 0.026397705078125, -0.0103302001953125, 0.0127716064453125, -0.0560302734375, -0.0002186298370361328, 0.004711151123046875, -0.023681640625, -0.00783538818359375, -0.01549530029296875, 0.038330078125, 0.01629638671875, 0.040496826171875, -0.06744384765625, 0.0187835693359375, -0.050537109375, 0.0684814453125, 0.040008544921875, 0.0026226043701171875, 0.052886962890625, -0.031768798828125, 0.01311492919921875, 0.01042938232421875, 0.02508544921875, -0.0049591064453125, -0.0533447265625, -0.0830078125, -0.0237274169921875, 0.056549072265625, 0.039093017578125, -0.06756591796875, 0.04290771484375, -0.03564453125, -0.0347900390625, -0.05596923828125, 
-0.00399017333984375, -0.0007114410400390625, -0.00609588623046875, 0.01546478271484375, -0.0085296630859375, -0.045074462890625, -0.0712890625, -0.00858306884765625, -0.00922393798828125, -0.00543212890625, -0.0085601806640625, 0.0718994140625, -0.01971435546875, 0.06781005859375, -0.02734375, -0.0198822021484375, -0.02593994140625, 0.0251617431640625, 0.039093017578125, 0.06011962890625, 0.03289794921875, -0.03106689453125, -0.050140380859375, 0.0070037841796875, -0.020355224609375, 0.024169921875, -0.03631591796875, -0.01477813720703125, 0.010467529296875, 0.01441192626953125, -0.07000732421875, 0.0281982421875, 0.02813720703125, -0.04669189453125, 0.02313232421875, -0.046875, -0.00665283203125, -0.0880126953125, -0.00414276123046875, -0.0206298828125, -0.027923583984375, -0.043426513671875, -0.00597381591796875, 0.0273284912109375, -0.0033588409423828125, -0.033599853515625, 0.03887939453125, -0.036102294921875, -0.001880645751953125, -0.012359619140625, 0.0271148681640625, 0.00011360645294189453, 0.021575927734375, -0.01507568359375, 0.05126953125, 0.0421142578125, -0.019500732421875, 0.03985595703125, 0.04705810546875, -0.054779052734375, 0.020599365234375, -0.056549072265625, 0.013031005859375, 0.0002536773681640625, -0.005859375, -0.0660400390625, -0.01535797119140625, 0.033355712890625, -0.06201171875, 0.032470703125, -0.0029697418212890625, -0.06622314453125, -0.030517578125, -0.0311431884765625, 0.00560760498046875, 0.06976318359375, -0.0460205078125, 0.042144775390625, 0.0098114013671875, -0.00482940673828125, -0.0303497314453125, -0.091552734375, 0.0167999267578125, -0.0032196044921875, -0.06341552734375, 0.03192138671875, -0.002689361572265625, -0.0121307373046875, 0.016845703125, 0.018585205078125, -0.0034542083740234375, -0.0130767822265625, -0.0020923614501953125, 0.0307769775390625, -0.0163116455078125, 0.01519012451171875, 0.0052947998046875, 0.0018148422241210938, 0.004913330078125, -0.003971099853515625, 0.047454833984375, -0.01541900634765625, 
0.0095977783203125, -0.039031982421875, 0.007904052734375, 0.018157958984375, -0.01500701904296875, 0.06304931640625, 0.09375, -0.03369140625, -0.01251220703125, -0.0355224609375, -0.0266876220703125, -0.040924072265625, 0.0241546630859375, -0.026763916015625, -0.06884765625, 0.047332763671875, 0.0204620361328125, -0.006969451904296875, 0.043487548828125, 0.042877197265625, -0.0111541748046875, 0.06103515625, 0.049560546875, 0.002712249755859375, 0.050201416015625, -0.04034423828125, 0.027130126953125, -0.0726318359375, -0.0213623046875, -0.024444580078125, -0.023345947265625, -0.051300048828125, -0.049407958984375, 0.00972747802734375, -0.0125274658203125, -0.0167999267578125, 0.020904541015625, -0.04571533203125, 0.0185699462890625, 0.04608154296875, 0.0132293701171875, -0.01267242431640625, 0.00582122802734375, 0.0017108917236328125, -0.02374267578125, -0.07061767578125, -0.041015625, 0.07763671875, 0.023406982421875, 0.0682373046875, 0.0027008056640625, 0.05120849609375, 0.03314208984375, 0.02789306640625, -0.061614990234375, 0.02996826171875, -0.024169921875, -0.0482177734375, 0.006282806396484375, -0.01739501953125, -0.059844970703125, 0.00691986083984375, -0.0374755859375, -0.05389404296875, 0.012298583984375, -0.00885009765625, -0.0374755859375, 0.023040771484375, -0.051727294921875, 0.092529296875, -0.000640869140625, -0.007480621337890625, -0.0119781494140625, -0.03399658203125, 0.01197052001953125, 0.0193328857421875, 0.0118255615234375, 0.00771331787109375, -0.0086669921875, 0.0733642578125, -0.033935546875, 0.0560302734375, -0.0104522705078125, 0.0159149169921875, 0.035430908203125, -0.0091094970703125, 0.05126953125, 0.00844573974609375, 0.0005059242248535156, 0.0018901824951171875, -0.0032901763916015625, -0.025543212890625, -0.026885986328125, 0.06671142578125, -0.0701904296875, -0.028533935546875, -0.04681396484375, -0.0258026123046875, 0.0250244140625, 0.0170745849609375, 0.022308349609375, 0.037567138671875, -0.0186920166015625, 0.03753662109375, 
0.0443115234375, -0.0242919921875, 0.052825927734375, 0.00743865966796875, 0.0038356781005859375, -0.05126953125, 0.04644775390625, -0.015716552734375, 0.005077362060546875, 0.03424072265625, 0.0227203369140625, -0.008209228515625, -0.0177764892578125, -0.03564453125, 0.0401611328125, -0.05303955078125, -0.02691650390625, -0.04400634765625, -0.00844573974609375, -0.04595947265625, -0.0234527587890625, -0.0257415771484375, -0.04705810546875, -0.03985595703125, 0.00350189208984375, 0.05096435546875, 0.050323486328125, -0.03253173828125, 0.04669189453125, -0.04840087890625, 0.01104736328125, 0.0128173828125, 0.011322021484375, -0.0300140380859375, -0.06536865234375, -0.0477294921875, -0.004589080810546875, -0.01300811767578125, -0.06451416015625, 0.040802001953125, 0.023956298828125, 0.02496337890625, 0.020172119140625, 0.00801849365234375, 0.039459228515625, -0.0465087890625, 0.08038330078125, 0.0176239013671875, -0.07501220703125, 0.0335693359375, 0.00074005126953125, 0.030303955078125, 0.0211639404296875, -0.010528564453125, -0.05029296875, -0.011505126953125, -0.050506591796875, -0.044219970703125, 0.07733154296875, 0.0228271484375, 0.026275634765625, -0.01474761962890625, 0.028778076171875, 0.0099945068359375, 0.0005211830139160156, -0.059234619140625, -0.024444580078125, -0.0280609130859375, -0.046478271484375, -0.00902557373046875, -0.031707763671875, 0.006504058837890625, -0.0202484130859375, 0.054412841796875, 0.01169586181640625, 0.058380126953125, 0.0241851806640625, -0.033905029296875, 0.004482269287109375, 0.016571044921875, 0.04058837890625, 0.0135650634765625, -0.025787353515625, 0.007190704345703125, 0.0240478515625, -0.048187255859375, -0.0186767578125, 0.01287841796875, -0.028564453125, 0.03466796875, 0.0092315673828125, 0.06475830078125, 0.025146484375, -0.044891357421875, 0.06256103515625, -0.0020160675048828125, -0.0177764892578125, -0.04620361328125, -0.01552581787109375, -0.005126953125, 0.0241851806640625, 0.01015472412109375, 
0.023223876953125, 0.01678466796875, -0.042755126953125, 0.01922607421875, 0.0271148681640625, -0.03662109375, -0.031951904296875, 0.059661865234375, 0.01273345947265625, -0.03839111328125, 0.050079345703125, -0.0310211181640625, -0.043487548828125, 0.028900146484375, 0.041717529296875, 0.06671142578125, 0.01448822021484375, 0.03094482421875, 0.044921875, 0.057464599609375, -0.005077362060546875, 0.032318115234375, -0.006229400634765625, -0.046875, -0.034881591796875, -0.0308074951171875, -0.0166168212890625, -0.012054443359375, -0.060943603515625, 0.0283050537109375, -0.02777099609375, -0.0215606689453125, -0.00923919677734375, 0.01096343994140625, -0.06146240234375, 0.0116424560546875, 0.01451873779296875, 0.0643310546875, -0.056640625, 0.08563232421875, 0.06915283203125, -0.01183319091796875, -0.016754150390625, -0.0301666259765625, -0.035675048828125, -0.050933837890625, 0.0479736328125, 0.0015010833740234375, 0.0232086181640625, -0.01568603515625, -0.0345458984375, -0.0721435546875, 0.0767822265625, 0.003993988037109375, -0.0184173583984375, 0.0064239501953125, 0.01219940185546875, 0.0294189453125, -0.03289794921875, 0.020751953125, 0.0404052734375, 0.032470703125, -0.01995849609375, -0.059417724609375, 0.007534027099609375, -0.0254669189453125, 0.005596160888671875, 0.0118865966796875, -0.0516357421875, 0.0806884765625, -0.0191650390625, -0.0048065185546875, 0.039337158203125, 0.06671142578125, -0.005828857421875, -0.003688812255859375, 0.0249786376953125, 0.04168701171875, 0.039459228515625, -0.03564453125, 0.05908203125, -0.020751953125, 0.05859375, 0.0712890625, 0.021575927734375, 0.08294677734375, 0.045989990234375, -0.00677490234375, 0.06854248046875, 0.037353515625, -0.0215301513671875, 0.061614990234375, -0.0006589889526367188, -0.000743865966796875, -0.020416259765625, 0.024322509765625, -0.01318359375, 0.0294952392578125, 0.0213775634765625, -0.0264129638671875, 0.0035305023193359375, 0.023223876953125, 0.01541900634765625, -0.0089569091796875, 
-0.002658843994140625, 0.039581298828125, -0.00939178466796875, -0.0242767333984375, 0.021820068359375, 0.02520751953125, 0.08428955078125, -0.033111572265625, 0.0132293701171875, -0.0091094970703125, 0.034881591796875, 0.00839996337890625, -0.047454833984375, 0.01165771484375, 0.01519775390625, 0.001575469970703125, -0.018402099609375, 0.061614990234375, -0.0386962890625, -0.0484619140625, -0.01169586181640625, 0.015411376953125, 0.014923095703125, 0.016082763671875, -0.060577392578125, 0.01629638671875, 0.0005106925964355469, -0.0253143310546875, 0.00811767578125, 0.037353515625, 0.04034423828125, 0.022705078125, 0.0252685546875, 0.006366729736328125, 0.0154571533203125, -0.0032253265380859375, 0.048858642578125, -0.053131103515625, -0.0491943359375, -0.047332763671875, 0.049560546875, -0.022918701171875, -0.033660888671875, 0.05340576171875, 0.05499267578125, 0.0677490234375, -0.034149169921875, 0.03173828125, -0.0274658203125, 0.006816864013671875, -0.035858154296875, 0.06390380859375, -0.040435791015625, -0.003002166748046875, 0.0067291259765625, -0.0513916015625, -0.0227508544921875, 0.0877685546875, -0.0147247314453125, -0.0022411346435546875, 0.056365966796875, 0.042877197265625, -0.00537109375, -0.021331787109375, 0.0200347900390625, 0.027740478515625, 0.018157958984375, 0.037109375, 0.04400634765625, -0.0787353515625, 0.063232421875, -0.04736328125, 0.0005750656127929688, -0.033782958984375, -0.047760009765625, -0.07598876953125, -0.03448486328125, -0.0521240234375, -0.027679443359375, -0.00982666015625, 0.0721435546875, 0.050262451171875, -0.0596923828125, -0.01398468017578125, 0.0196533203125, -0.006450653076171875, -0.007564544677734375, -0.0201416015625, 0.0272979736328125, -0.02581787109375, -0.05621337890625, 0.002246856689453125, 0.01116943359375, 0.004131317138671875, -0.00896453857421875, -0.0134429931640625, -0.01055908203125, 0.00685882568359375, 0.042449951171875, -0.01073455810546875, -0.05889892578125, -0.05816650390625, 0.0130767822265625, 
-0.01197052001953125, 0.01294708251953125, 0.032806396484375, -0.050201416015625, 0.043853759765625, 0.0384521484375, 0.0171661376953125, 0.060791015625, -0.039337158203125, 0.013885498046875, -0.04962158203125, 0.032073974609375, 0.01030731201171875, 0.0433349609375, 0.04962158203125, -0.0118865966796875, 0.013946533203125, 0.007785797119140625, -0.04766845703125, -0.06256103515625, -0.01322174072265625, -0.09423828125, -0.0108642578125, 0.08544921875, -0.00958251953125, -0.050262451171875, 0.016510009765625, -0.0280914306640625, 0.0276947021484375, -0.0289764404296875, 0.05059814453125, 0.061614990234375, 0.0163726806640625, -0.017059326171875, -0.0286865234375, 0.0278778076171875, 0.0248260498046875, -0.0302276611328125, -0.012298583984375, 0.0156402587890625, 0.0290679931640625, 0.0223388671875, 0.0165252685546875, -0.00659942626953125, 0.036285400390625, 0.02777099609375, 0.034271240234375, 0.0007829666137695312, -0.002887725830078125, -0.0283355712890625, -0.0141754150390625, 0.0005164146423339844, -0.0443115234375 ] ]
meta-llama/Llama-2-13b-hf
2023-10-12T16:23:05.000Z
[ "transformers", "pytorch", "safetensors", "llama", "text-generation", "facebook", "meta", "llama-2", "en", "arxiv:2307.09288", "has_space", "text-generation-inference", "region:us" ]
text-generation
meta-llama
null
null
meta-llama/Llama-2-13b-hf
417
120,032
transformers
2023-07-13T15:49:56
--- extra_gated_heading: Access Llama 2 on Hugging Face extra_gated_description: >- This is a form to enable access to Llama 2 on Hugging Face after you have been granted access from Meta. Please visit the [Meta website](https://ai.meta.com/resources/models-and-libraries/llama-downloads) and accept our license terms and acceptable use policy before submitting this form. Requests will be processed in 1-2 days. extra_gated_prompt: "**Your Hugging Face account email address MUST match the email you provide on the Meta website, or your request will not be approved.**" extra_gated_button_content: Submit extra_gated_fields: I agree to share my name, email address and username with Meta and confirm that I have already been granted download access on the Meta website: checkbox language: - en pipeline_tag: text-generation inference: false tags: - facebook - meta - pytorch - llama - llama-2 --- # **Llama 2** Llama 2 is a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. This is the repository for the 13B pretrained model, converted for the Hugging Face Transformers format. Links to other models can be found in the index at the bottom. ## Model Details *Note: Use of this model is governed by the Meta license. In order to download the model weights and tokenizer, please visit the [website](https://ai.meta.com/resources/models-and-libraries/llama-downloads/) and accept our License before requesting access here.* Meta developed and publicly released the Llama 2 family of large language models (LLMs), a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. Our fine-tuned LLMs, called Llama-2-Chat, are optimized for dialogue use cases. 
Llama-2-Chat models outperform open-source chat models on most benchmarks we tested, and in our human evaluations for helpfulness and safety, are on par with some popular closed-source models like ChatGPT and PaLM. **Model Developers** Meta **Variations** Llama 2 comes in a range of parameter sizes (7B, 13B, and 70B) as well as pretrained and fine-tuned variations. **Input** Models input text only. **Output** Models generate text only. **Model Architecture** Llama 2 is an auto-regressive language model that uses an optimized transformer architecture. The tuned versions use supervised fine-tuning (SFT) and reinforcement learning with human feedback (RLHF) to align to human preferences for helpfulness and safety. ||Training Data|Params|Content Length|GQA|Tokens|LR| |---|---|---|---|---|---|---| |Llama 2|*A new mix of publicly available online data*|7B|4k|&#10007;|2.0T|3.0 x 10<sup>-4</sup>| |Llama 2|*A new mix of publicly available online data*|13B|4k|&#10007;|2.0T|3.0 x 10<sup>-4</sup>| |Llama 2|*A new mix of publicly available online data*|70B|4k|&#10004;|2.0T|1.5 x 10<sup>-4</sup>| *Llama 2 family of models.* Token counts refer to pretraining data only. All models are trained with a global batch size of 4M tokens. The biggest model, 70B, uses Grouped-Query Attention (GQA) for improved inference scalability. **Model Dates** Llama 2 was trained between January 2023 and July 2023. **Status** This is a static model trained on an offline dataset. Future versions of the tuned models will be released as we improve model safety with community feedback. **License** A custom commercial license is available at: [https://ai.meta.com/resources/models-and-libraries/llama-downloads/](https://ai.meta.com/resources/models-and-libraries/llama-downloads/) **Research Paper** ["Llama 2: Open Foundation and Fine-Tuned Chat Models"](https://arxiv.org/abs/2307.09288) ## Intended Use **Intended Use Cases** Llama 2 is intended for commercial and research use in English. 
Tuned models are intended for assistant-like chat, whereas pretrained models can be adapted for a variety of natural language generation tasks. To get the expected features and performance for the chat versions, a specific formatting needs to be followed, including the `INST` and `<<SYS>>` tags, `BOS` and `EOS` tokens, and the whitespace and line breaks in between (we recommend calling `strip()` on inputs to avoid double spaces). See our reference code on GitHub for details: [`chat_completion`](https://github.com/facebookresearch/llama/blob/main/llama/generation.py#L212). **Out-of-scope Uses** Use in any manner that violates applicable laws or regulations (including trade compliance laws). Use in languages other than English. Use in any other way that is prohibited by the Acceptable Use Policy and Licensing Agreement for Llama 2. ## Hardware and Software **Training Factors** We used custom training libraries, Meta's Research Super Cluster, and production clusters for pretraining. Fine-tuning, annotation, and evaluation were also performed on third-party cloud compute. **Carbon Footprint** Pretraining utilized a cumulative 3.3M GPU hours of computation on hardware of type A100-80GB (TDP of 350-400W). Estimated total emissions were 539 tCO<sub>2</sub>eq, 100% of which were offset by Meta's sustainability program. ||Time (GPU hours)|Power Consumption (W)|Carbon Emitted (tCO<sub>2</sub>eq)| |---|---|---|---| |Llama 2 7B|184320|400|31.22| |Llama 2 13B|368640|400|62.44| |Llama 2 70B|1720320|400|291.42| |Total|3311616||539.00| **CO<sub>2</sub> emissions during pretraining.** Time: total GPU time required for training each model. Power Consumption: peak power capacity per GPU device for the GPUs used, adjusted for power usage efficiency. 100% of the emissions are directly offset by Meta's sustainability program, and because we are openly releasing these models, the pretraining costs do not need to be incurred by others. 
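The `[INST]`/`<<SYS>>` chat layout described under Intended Use Cases can be sketched as a small string-building helper. This is only an illustration of the tag layout, not the reference implementation (the canonical code is `chat_completion` in the Llama repo linked above), and the `BOS`/`EOS` tokens are deliberately left to the tokenizer:

```python
def build_llama2_prompt(system_prompt: str, user_message: str) -> str:
    """Sketch of a single-turn Llama-2 chat prompt: the system prompt sits
    inside <<SYS>> tags within the first [INST] block, and the user message
    is stripped to avoid the double spaces the card warns about."""
    return (
        f"[INST] <<SYS>>\n{system_prompt}\n<</SYS>>\n\n"
        f"{user_message.strip()} [/INST]"
    )

print(build_llama2_prompt("You are a helpful assistant.", " What is GQA? "))
```

For multi-turn chat, each subsequent exchange is appended as another `[INST] ... [/INST]` block followed by the model's previous reply.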
## Training Data **Overview** Llama 2 was pretrained on 2 trillion tokens of data from publicly available sources. The fine-tuning data includes publicly available instruction datasets, as well as over one million new human-annotated examples. Neither the pretraining nor the fine-tuning datasets include Meta user data. **Data Freshness** The pretraining data has a cutoff of September 2022, but some tuning data is more recent, up to July 2023. ## Evaluation Results In this section, we report the results for the Llama 1 and Llama 2 models on standard academic benchmarks. For all the evaluations, we use our internal evaluations library. |Model|Size|Code|Commonsense Reasoning|World Knowledge|Reading Comprehension|Math|MMLU|BBH|AGI Eval| |---|---|---|---|---|---|---|---|---|---| |Llama 1|7B|14.1|60.8|46.2|58.5|6.95|35.1|30.3|23.9| |Llama 1|13B|18.9|66.1|52.6|62.3|10.9|46.9|37.0|33.9| |Llama 1|33B|26.0|70.0|58.4|67.6|21.4|57.8|39.8|41.7| |Llama 1|65B|30.7|70.7|60.5|68.6|30.8|63.4|43.5|47.6| |Llama 2|7B|16.8|63.9|48.9|61.3|14.6|45.3|32.6|29.3| |Llama 2|13B|24.5|66.9|55.4|65.8|28.7|54.8|39.4|39.1| |Llama 2|70B|**37.5**|**71.9**|**63.6**|**69.4**|**35.2**|**68.9**|**51.2**|**54.2**| **Overall performance on grouped academic benchmarks.** *Code:* We report the average pass@1 scores of our models on HumanEval and MBPP. *Commonsense Reasoning:* We report the average of PIQA, SIQA, HellaSwag, WinoGrande, ARC easy and challenge, OpenBookQA, and CommonsenseQA. We report 7-shot results for CommonSenseQA and 0-shot results for all other benchmarks. *World Knowledge:* We evaluate the 5-shot performance on NaturalQuestions and TriviaQA and report the average. *Reading Comprehension:* For reading comprehension, we report the 0-shot average on SQuAD, QuAC, and BoolQ. *MATH:* We report the average of the GSM8K (8 shot) and MATH (4 shot) benchmarks at top 1. 
|||TruthfulQA|ToxiGen|
|---|---|---|---|
|Llama 1|7B|27.42|23.00|
|Llama 1|13B|41.74|23.08|
|Llama 1|33B|44.19|22.57|
|Llama 1|65B|48.71|21.77|
|Llama 2|7B|33.29|**21.25**|
|Llama 2|13B|41.86|26.10|
|Llama 2|70B|**50.18**|24.60|

**Evaluation of pretrained LLMs on automatic safety benchmarks.** For TruthfulQA, we present the percentage of generations that are both truthful and informative (the higher, the better). For ToxiGen, we present the percentage of toxic generations (the smaller, the better).

|||TruthfulQA|ToxiGen|
|---|---|---|---|
|Llama-2-Chat|7B|57.04|**0.00**|
|Llama-2-Chat|13B|62.18|**0.00**|
|Llama-2-Chat|70B|**64.14**|0.01|

**Evaluation of fine-tuned LLMs on different safety datasets.** Same metric definitions as above.

## Ethical Considerations and Limitations

Llama 2 is a new technology that carries risks with use. Testing conducted to date has been in English and has not covered, nor could it cover, all scenarios. For these reasons, as with all LLMs, Llama 2's potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased, or otherwise objectionable responses to user prompts. Therefore, before deploying any applications of Llama 2, developers should perform safety testing and tuning tailored to their specific applications of the model.
Please see the Responsible Use Guide available at [https://ai.meta.com/llama/responsible-use-guide/](https://ai.meta.com/llama/responsible-use-guide/)

## Reporting Issues

Please report any software "bug" or other problems with the models through one of the following means:

- Reporting issues with the model: [github.com/facebookresearch/llama](http://github.com/facebookresearch/llama)
- Reporting problematic content generated by the model: [developers.facebook.com/llama_output_feedback](http://developers.facebook.com/llama_output_feedback)
- Reporting bugs and security concerns: [facebook.com/whitehat/info](http://facebook.com/whitehat/info)

## Llama Model Index

|Model|Llama2|Llama2-hf|Llama2-chat|Llama2-chat-hf|
|---|---|---|---|---|
|7B| [Link](https://huggingface.co/llamaste/Llama-2-7b) | [Link](https://huggingface.co/llamaste/Llama-2-7b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-7b-chat) | [Link](https://huggingface.co/llamaste/Llama-2-7b-chat-hf)|
|13B| [Link](https://huggingface.co/llamaste/Llama-2-13b) | [Link](https://huggingface.co/llamaste/Llama-2-13b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-13b-chat) | [Link](https://huggingface.co/llamaste/Llama-2-13b-chat-hf)|
|70B| [Link](https://huggingface.co/llamaste/Llama-2-70b) | [Link](https://huggingface.co/llamaste/Llama-2-70b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-70b-chat) | [Link](https://huggingface.co/llamaste/Llama-2-70b-chat-hf)|
10,359
madebyollin/sdxl-vae-fp16-fix
2023-09-25T14:55:46.000Z
[ "diffusers", "stable-diffusion", "stable-diffusion-diffusers", "license:mit", "has_space", "diffusers:AutoencoderKL", "region:us" ]
null
madebyollin
null
null
madebyollin/sdxl-vae-fp16-fix
222
119,976
diffusers
2023-07-11T04:03:50
---
license: mit
tags:
- stable-diffusion
- stable-diffusion-diffusers
inference: false
---

# SDXL-VAE-FP16-Fix

SDXL-VAE-FP16-Fix is the [SDXL VAE](https://huggingface.co/stabilityai/sdxl-vae)*, but modified to run in fp16 precision without generating NaNs.

| VAE | Decoding in `float32` / `bfloat16` precision | Decoding in `float16` precision |
| --------------------- | -------------------------------------------- | ------------------------------- |
| SDXL-VAE | ✅ ![](./images/orig-fp32.png) | ⚠️ ![](./images/orig-fp16.png) |
| SDXL-VAE-FP16-Fix | ✅ ![](./images/fix-fp32.png) | ✅ ![](./images/fix-fp16.png) |

## 🧨 Diffusers Usage

Just load this checkpoint via `AutoencoderKL`:

```py
import torch
from diffusers import DiffusionPipeline, AutoencoderKL

vae = AutoencoderKL.from_pretrained("madebyollin/sdxl-vae-fp16-fix", torch_dtype=torch.float16)
pipe = DiffusionPipeline.from_pretrained("stabilityai/stable-diffusion-xl-base-1.0", vae=vae, torch_dtype=torch.float16, variant="fp16", use_safetensors=True)
pipe.to("cuda")

refiner = DiffusionPipeline.from_pretrained("stabilityai/stable-diffusion-xl-refiner-1.0", vae=vae, torch_dtype=torch.float16, use_safetensors=True, variant="fp16")
refiner.to("cuda")

n_steps = 40
high_noise_frac = 0.7

prompt = "A majestic lion jumping from a big stone at night"

image = pipe(prompt=prompt, num_inference_steps=n_steps, denoising_end=high_noise_frac, output_type="latent").images
image = refiner(prompt=prompt, num_inference_steps=n_steps, denoising_start=high_noise_frac, image=image).images[0]
image
```

![](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/diffusers/lion_refined.png)

## Details

SDXL-VAE generates NaNs in fp16 because the internal activation values are too big:

![](./images/activation-magnitudes.jpg)

SDXL-VAE-FP16-Fix was created by finetuning the SDXL-VAE to:

1. keep the final output the same, but
2. make the internal activation values smaller, by
3. scaling down weights and biases within the network

There are slight discrepancies between the output of SDXL-VAE-FP16-Fix and SDXL-VAE, but the decoded images should be [close enough for most purposes](https://huggingface.co/madebyollin/sdxl-vae-fp16-fix/discussions/7#64c5c0f8e2e5c94bd04eaa80).

---

\* `sdxl-vae-fp16-fix` is specifically based on [SDXL-VAE (0.9)](https://huggingface.co/stabilityai/sdxl-vae/discussions/6#64acea3f7ac35b7de0554490), but it works with SDXL 1.0 too
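The failure mode described above, activations exceeding the fp16 range, can be illustrated without any ML dependencies, since Python's `struct` module can round-trip a float through IEEE 754 half precision. This is a toy illustration of the overflow and of why scaling values down avoids it; it is not the actual VAE finetuning fix, and `to_fp16` is a hypothetical helper:

```python
import struct

def to_fp16(x):
    """Round-trip a float through IEEE 754 half precision; inf on overflow."""
    try:
        return struct.unpack('<e', struct.pack('<e', x))[0]
    except OverflowError:
        # struct raises when the value exceeds the fp16 range (max ~65504)
        return float('inf')

activation = 1.0e5                 # an internal activation too large for fp16
print(to_fp16(activation))         # overflows: fp16 cannot represent it
scale = 0.5                        # scaling values down keeps them in range
print(to_fp16(activation * scale) / scale)  # finite; rescaling recovers the value
```

Scaling weights and biases inside the network plays the same role as `scale` here: intermediate values stay below the fp16 ceiling while the final (rescaled) output is preserved.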
2,513
google/canine-c
2022-08-08T13:44:46.000Z
[ "transformers", "pytorch", "canine", "feature-extraction", "multilingual", "af", "sq", "ar", "an", "hy", "ast", "az", "ba", "eu", "bar", "be", "bn", "inc", "bs", "br", "bg", "my", "ca", "ceb", "ce", "zh", "cv", "hr", "cs", "da", "nl", "en", "et", "fi", "fr", "gl", "ka", "de", "el", "gu", "ht", "he", "hi", "hu", "is", "io", "id", "ga", "it", "ja", "jv", "kn", "kk", "ky", "ko", "la", "lv", "lt", "roa", "nds", "lm", "mk", "mg", "ms", "ml", "mr", "mn", "min", "ne", "new", "nb", "nn", "oc", "fa", "pms", "pl", "pt", "pa", "ro", "ru", "sco", "sr", "scn", "sk", "sl", "aze", "es", "su", "sw", "sv", "tl", "tg", "th", "ta", "tt", "te", "tr", "uk", "ud", "uz", "vi", "vo", "war", "cy", "fry", "pnb", "yo", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2103.06874", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
feature-extraction
google
null
null
google/canine-c
8
119,913
transformers
2022-03-02T23:29:05
--- language: - multilingual - af - sq - ar - an - hy - ast - az - ba - eu - bar - be - bn - inc - bs - br - bg - my - ca - ceb - ce - zh - cv - hr - cs - da - nl - en - et - fi - fr - gl - ka - de - el - gu - ht - he - hi - hu - is - io - id - ga - it - ja - jv - kn - kk - ky - ko - la - lv - lt - roa - nds - lm - mk - mg - ms - ml - mr - mn - min - ne - new - nb - nn - oc - fa - pms - pl - pt - pa - ro - ru - sco - sr - scn - sk - sl - aze - es - su - sw - sv - tl - tg - th - ta - tt - te - tr - uk - ud - uz - vi - vo - war - cy - fry - pnb - yo license: apache-2.0 datasets: - bookcorpus - wikipedia --- # CANINE-c (CANINE pre-trained with autoregressive character loss) Pretrained CANINE model on 104 languages using a masked language modeling (MLM) objective. It was introduced in the paper [CANINE: Pre-training an Efficient Tokenization-Free Encoder for Language Representation](https://arxiv.org/abs/2103.06874) and first released in [this repository](https://github.com/google-research/language/tree/master/language/canine). What's special about CANINE is that it doesn't require an explicit tokenizer (such as WordPiece or SentencePiece), unlike models such as BERT and RoBERTa. Instead, it operates directly at the character level: each character is turned into its [Unicode code point](https://en.wikipedia.org/wiki/Code_point#:~:text=For%20Unicode%2C%20the%20particular%20sequence,forming%20a%20self%2Dsynchronizing%20code.). This means that input processing is trivial and can typically be accomplished as: ``` input_ids = [ord(char) for char in text] ``` The `ord()` function is built into Python and turns each character into its Unicode code point. Disclaimer: The team releasing CANINE did not write a model card for this model, so this model card has been written by the Hugging Face team. ## Model description CANINE is a transformers model pretrained on a large corpus of multilingual data in a self-supervised fashion, similar to BERT. 
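The card's code-point encoding can be sketched as a round trip (a toy illustration only: `chr()` is the inverse of `ord()`, so no vocabulary or tokenizer state is needed to recover the text):

```python
# Character-level "tokenization" as described in the CANINE card:
# every character maps directly to its Unicode code point.
text = "Life is like a box of chocolates."
input_ids = [ord(char) for char in text]

# chr() inverts ord(), so the original text is fully recoverable
# without any vocabulary file or tokenizer state.
decoded = "".join(chr(i) for i in input_ids)
assert decoded == text

print(input_ids[:4])  # code points of "Life" -> [76, 105, 102, 101]
```

The same mapping works for any Unicode text, including non-Latin scripts, which is what makes the scheme language-agnostic.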
This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data), using an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: * Masked language modeling (MLM): one randomly masks part of the inputs, which the model needs to predict. This model (CANINE-c) is trained with an autoregressive character loss. One masks several character spans within each sequence, which the model then autoregressively predicts. * Next sentence prediction (NSP): the model concatenates two sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of multiple languages that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard classifier using the features produced by the CANINE model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=canine) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification, or question answering. For tasks such as text generation, you should look at models like GPT2. 
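The character-span masking objective described above can be illustrated with a toy sketch. This is not the actual training code: the span count, span length, and mask symbol are arbitrary choices made here for demonstration, and the real model predicts the masked characters autoregressively rather than returning them.

```python
import random

def mask_character_spans(text, num_spans=2, span_len=3, mask_char="\u25a1", seed=0):
    """Toy version of CANINE-c-style span masking: replace a few
    character spans with a mask symbol and record the originals,
    which a model would then be trained to predict."""
    rng = random.Random(seed)
    chars = list(text)
    targets = []
    for _ in range(num_spans):
        start = rng.randrange(0, max(1, len(chars) - span_len))
        end = min(start + span_len, len(chars))
        targets.append((start, text[start:end]))
        for i in range(start, end):
            chars[i] = mask_char
    return "".join(chars), targets

masked, targets = mask_character_spans("character-level modeling")
print(masked)   # original string with two 3-character spans masked
print(targets)  # (position, original span) pairs the model would predict
```

Note that masking happens on raw characters, not tokens, which is the point of the card's tokenizer-free design.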
### How to use Here is how to use this model: ```python from transformers import CanineTokenizer, CanineModel model = CanineModel.from_pretrained('google/canine-c') tokenizer = CanineTokenizer.from_pretrained('google/canine-c') inputs = ["Life is like a box of chocolates.", "You never know what you gonna get."] encoding = tokenizer(inputs, padding="longest", truncation=True, return_tensors="pt") outputs = model(**encoding) # forward pass pooled_output = outputs.pooler_output sequence_output = outputs.last_hidden_state ``` ## Training data The CANINE model was pretrained on the multilingual Wikipedia data of [mBERT](https://github.com/google-research/bert/blob/master/multilingual.md), which includes 104 languages. ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2103-06874, author = {Jonathan H. Clark and Dan Garrette and Iulia Turc and John Wieting}, title = {{CANINE:} Pre-training an Efficient Tokenization-Free Encoder for Language Representation}, journal = {CoRR}, volume = {abs/2103.06874}, year = {2021}, url = {https://arxiv.org/abs/2103.06874}, archivePrefix = {arXiv}, eprint = {2103.06874}, timestamp = {Tue, 16 Mar 2021 11:26:59 +0100}, biburl = {https://dblp.org/rec/journals/corr/abs-2103-06874.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ```
5,179
[ [ -0.039276123046875, -0.053497314453125, 0.0038814544677734375, 0.029510498046875, -0.013458251953125, 0.003971099853515625, -0.025482177734375, -0.040008544921875, 0.01433563232421875, 0.0168609619140625, -0.04693603515625, -0.03582763671875, -0.0279541015625, 0.0099334716796875, -0.009368896484375, 0.078369140625, -0.00215911865234375, 0.0254974365234375, -0.0038814544677734375, -0.008026123046875, -0.01715087890625, -0.07073974609375, -0.047027587890625, -0.03411865234375, 0.0303497314453125, -0.0017461776733398438, 0.0287933349609375, 0.0467529296875, 0.024078369140625, 0.022064208984375, 0.019012451171875, -0.00571441650390625, -0.0216217041015625, -0.01557159423828125, 0.0115814208984375, -0.03289794921875, -0.033782958984375, 0.0281829833984375, 0.03350830078125, 0.057220458984375, -0.004634857177734375, 0.0010471343994140625, 0.003780364990234375, 0.03021240234375, -0.035552978515625, 0.021209716796875, -0.043365478515625, 0.004695892333984375, -0.0235443115234375, 0.0034580230712890625, -0.0272979736328125, -0.0194549560546875, -0.0010042190551757812, -0.0465087890625, 0.0207672119140625, -0.0010309219360351562, 0.0914306640625, 0.00443267822265625, -0.0124053955078125, -0.0294189453125, -0.041717529296875, 0.05621337890625, -0.045379638671875, 0.04864501953125, 0.0533447265625, -0.00939178466796875, -0.0115814208984375, -0.0750732421875, -0.051788330078125, -0.033966064453125, -0.01557159423828125, 0.01343536376953125, -0.0160980224609375, 0.007762908935546875, 0.015655517578125, 0.0325927734375, -0.05145263671875, 0.00982666015625, -0.029876708984375, -0.0245361328125, 0.04791259765625, -0.003108978271484375, 0.033294677734375, -0.004871368408203125, -0.00838470458984375, -0.025054931640625, -0.036376953125, 0.037750244140625, 0.0260467529296875, 0.00946044921875, -0.00832366943359375, 0.038665771484375, -0.0081329345703125, 0.04119873046875, 0.039825439453125, -0.008209228515625, 0.043243408203125, -0.021636962890625, -0.017730712890625, 
0.005237579345703125, 0.07928466796875, 0.00711822509765625, 0.0218658447265625, -0.0244598388671875, -0.002986907958984375, 0.00188446044921875, 0.01221466064453125, -0.060791015625, -0.0203704833984375, 0.010711669921875, -0.04461669921875, -0.022430419921875, -0.00335693359375, -0.017578125, 0.0003421306610107422, -0.005615234375, 0.024078369140625, -0.05181884765625, -0.01245880126953125, 0.019195556640625, -0.0082244873046875, 0.03656005859375, -0.0007143020629882812, -0.0841064453125, 0.01276397705078125, 0.0275726318359375, 0.042999267578125, -0.0014638900756835938, -0.0523681640625, -0.0291290283203125, -0.0084075927734375, -0.035919189453125, 0.0355224609375, -0.041656494140625, -0.01003265380859375, -0.01099395751953125, 0.03668212890625, -0.0171051025390625, -0.0328369140625, 0.033111572265625, -0.033447265625, 0.0263214111328125, -0.00850677490234375, -0.035797119140625, -0.0146636962890625, 0.01464080810546875, -0.04345703125, 0.101806640625, 0.0013837814331054688, -0.062103271484375, 0.032196044921875, -0.034912109375, -0.040191650390625, 0.01273345947265625, -0.0006546974182128906, -0.016387939453125, 0.00283050537109375, 0.0300445556640625, 0.0457763671875, -0.0010986328125, 0.03460693359375, -0.0272979736328125, -0.0169830322265625, 0.01064300537109375, -0.00832366943359375, 0.06658935546875, 0.003101348876953125, -0.0271453857421875, 0.0006051063537597656, -0.07232666015625, 0.0104217529296875, 0.0275115966796875, -0.041778564453125, -0.031219482421875, -0.00229644775390625, -0.00794219970703125, 0.032196044921875, 0.0235595703125, -0.040618896484375, 0.00957489013671875, -0.0347900390625, 0.03839111328125, 0.0281829833984375, -0.0179290771484375, 0.0364990234375, -0.021514892578125, 0.037567138671875, -0.0189056396484375, 0.008575439453125, -0.0212554931640625, -0.0247802734375, -0.07403564453125, -0.0257415771484375, 0.03680419921875, 0.05279541015625, -0.04205322265625, 0.060089111328125, -0.005741119384765625, -0.0491943359375, 
-0.0718994140625, -0.00624847412109375, 0.0272216796875, 0.0222625732421875, 0.029449462890625, -0.036773681640625, -0.047210693359375, -0.058990478515625, -0.0024662017822265625, -0.019134521484375, -0.000812530517578125, -0.0008826255798339844, 0.040069580078125, -0.0234222412109375, 0.06201171875, -0.0091705322265625, -0.00653076171875, -0.03411865234375, 0.0193634033203125, -0.0015382766723632812, 0.0504150390625, 0.03424072265625, -0.02496337890625, -0.03717041015625, -0.00885772705078125, -0.0655517578125, 0.0024471282958984375, -0.01526641845703125, -0.0091094970703125, 0.021209716796875, 0.053070068359375, -0.05206298828125, 0.0244293212890625, 0.037750244140625, 0.0068511962890625, 0.044921875, -0.0157470703125, -0.003894805908203125, -0.069580078125, 0.020751953125, -0.003376007080078125, -0.01129913330078125, -0.060882568359375, -0.002796173095703125, 0.01302337646484375, -0.00798797607421875, -0.03253173828125, 0.052886962890625, -0.0391845703125, 0.008331298828125, -0.0035190582275390625, -0.0015621185302734375, -0.0014801025390625, 0.06365966796875, 0.0355224609375, 0.0504150390625, 0.04608154296875, -0.04827880859375, 0.01485443115234375, 0.024993896484375, -0.02001953125, 0.0223388671875, -0.054046630859375, 0.01470947265625, -0.007381439208984375, 0.020233154296875, -0.04461669921875, -0.00943756103515625, 0.021209716796875, -0.056060791015625, 0.031463623046875, -0.0006313323974609375, -0.037384033203125, -0.0400390625, -0.030975341796875, 0.01381683349609375, 0.02996826171875, -0.037811279296875, 0.0249786376953125, 0.0281219482421875, -0.01479339599609375, -0.0533447265625, -0.052947998046875, 0.00545501708984375, -0.00771331787109375, -0.04840087890625, 0.037933349609375, -0.0192718505859375, 0.01422882080078125, 0.00510406494140625, 0.00897979736328125, -0.0037899017333984375, 0.0027103424072265625, 0.0091400146484375, 0.01383209228515625, -0.020965576171875, 0.0177764892578125, -0.0076904296875, -0.002666473388671875, 0.0010461807250976562, 
-0.01433563232421875, 0.06427001953125, -0.0245208740234375, -0.01074981689453125, -0.03656005859375, 0.0292510986328125, 0.0325927734375, -0.044097900390625, 0.062103271484375, 0.0745849609375, -0.03515625, 0.0025196075439453125, -0.0426025390625, -0.00751495361328125, -0.036834716796875, 0.052520751953125, -0.039398193359375, -0.058013916015625, 0.040924072265625, 0.01299285888671875, -0.00884246826171875, 0.042144775390625, 0.043731689453125, -0.015869140625, 0.08636474609375, 0.0638427734375, -0.01378631591796875, 0.0445556640625, -0.01474761962890625, 0.0189208984375, -0.06500244140625, -0.038848876953125, -0.0400390625, 0.00890350341796875, -0.05517578125, -0.01641845703125, -0.01000213623046875, 0.04412841796875, -0.0261688232421875, 0.03204345703125, -0.0295867919921875, 0.02301025390625, 0.0626220703125, -0.01396942138671875, -0.004901885986328125, 0.0130615234375, -0.030975341796875, -0.012542724609375, -0.053070068359375, -0.040496826171875, 0.07537841796875, 0.03643798828125, 0.06298828125, -0.0229949951171875, 0.055633544921875, 0.00165557861328125, 0.006107330322265625, -0.0616455078125, 0.03839111328125, -0.0154571533203125, -0.05035400390625, -0.0228271484375, -0.003284454345703125, -0.08477783203125, 0.016357421875, -0.0218963623046875, -0.06585693359375, 0.01120758056640625, -0.0027256011962890625, -0.003910064697265625, 0.009796142578125, -0.04620361328125, 0.06805419921875, -0.031890869140625, -0.016448974609375, 0.0021648406982421875, -0.05572509765625, 0.0256195068359375, -0.0169525146484375, -0.006465911865234375, 0.00507354736328125, 0.02337646484375, 0.06622314453125, -0.04327392578125, 0.0623779296875, -0.005035400390625, 0.0112152099609375, 0.01055908203125, -0.01568603515625, 0.04742431640625, -0.0152435302734375, 0.00598907470703125, 0.0153045654296875, 0.0012292861938476562, -0.024139404296875, -0.0265960693359375, 0.0260162353515625, -0.0692138671875, -0.034423828125, -0.0382080078125, -0.0190277099609375, -0.017425537109375, 
0.0280303955078125, 0.044921875, 0.028411865234375, -0.0255889892578125, 0.014068603515625, 0.038482666015625, -0.05255126953125, 0.04620361328125, 0.047821044921875, -0.004764556884765625, -0.035064697265625, 0.0635986328125, -0.008331298828125, 0.00997161865234375, 0.0237274169921875, -0.003139495849609375, -0.039825439453125, -0.038055419921875, -0.039306640625, 0.046905517578125, -0.055328369140625, 0.006458282470703125, -0.07623291015625, -0.0439453125, -0.045166015625, -0.018585205078125, -0.03424072265625, -0.0202789306640625, -0.03125, 0.00044989585876464844, -0.0015153884887695312, 0.043792724609375, -0.01413726806640625, 0.0279541015625, -0.056793212890625, 0.0159912109375, 0.02423095703125, 0.019683837890625, -0.0037689208984375, -0.030364990234375, -0.0266571044921875, 0.004302978515625, -0.021575927734375, -0.055145263671875, 0.04327392578125, 0.023834228515625, 0.073486328125, 0.020782470703125, -0.001422882080078125, 0.06640625, -0.0364990234375, 0.060302734375, 0.03082275390625, -0.07275390625, 0.041717529296875, -0.00907135009765625, 0.02001953125, 0.044708251953125, 0.044525146484375, -0.0654296875, -0.04571533203125, -0.0421142578125, -0.059234619140625, 0.07855224609375, 0.011077880859375, 0.0238189697265625, -0.00827789306640625, 0.00861358642578125, 0.00937652587890625, 0.0187835693359375, -0.085693359375, -0.03131103515625, -0.03546142578125, -0.0150146484375, -0.01270294189453125, -0.01812744140625, 0.0207366943359375, -0.037750244140625, 0.070068359375, 0.004730224609375, 0.03717041015625, 0.01299285888671875, -0.041473388671875, 0.01262664794921875, 0.01323699951171875, 0.0428466796875, 0.015655517578125, -0.0362548828125, -0.004268646240234375, 0.006542205810546875, -0.0584716796875, 0.006923675537109375, 0.0290985107421875, -0.004947662353515625, 0.00042128562927246094, 0.038726806640625, 0.083740234375, -0.0158233642578125, -0.04168701171875, 0.046630859375, -0.0147857666015625, -0.027313232421875, -0.0184478759765625, 
-0.0093231201171875, 0.0079498291015625, 0.01033782958984375, 0.025726318359375, -0.0165557861328125, -0.007678985595703125, -0.044830322265625, 0.015838623046875, 0.0291900634765625, -0.032318115234375, -0.032257080078125, 0.054443359375, 0.015350341796875, -0.01387786865234375, 0.0650634765625, -0.037994384765625, -0.045989990234375, 0.04425048828125, 0.0587158203125, 0.0689697265625, -0.02288818359375, 0.0246124267578125, 0.04754638671875, 0.040283203125, -0.0016698837280273438, -0.00502777099609375, -0.0031585693359375, -0.06292724609375, -0.0263519287109375, -0.0675048828125, 0.01007843017578125, 0.055267333984375, -0.047271728515625, 0.026031494140625, -0.019866943359375, -0.0118255615234375, 0.014923095703125, 0.01849365234375, -0.058441162109375, 0.026397705078125, 0.0217742919921875, 0.060546875, -0.054290771484375, 0.08795166015625, 0.05926513671875, -0.052947998046875, -0.06768798828125, -0.00010782480239868164, -0.03057861328125, -0.07659912109375, 0.0567626953125, 0.0217132568359375, 0.004390716552734375, 0.005725860595703125, -0.033782958984375, -0.06304931640625, 0.06475830078125, 0.0213775634765625, -0.038818359375, -0.0139617919921875, 0.014801025390625, 0.04010009765625, -0.03411865234375, 0.0150909423828125, 0.038726806640625, 0.0217742919921875, 0.0014514923095703125, -0.0736083984375, 0.006366729736328125, -0.044158935546875, -0.0030078887939453125, -0.00948333740234375, -0.037841796875, 0.0645751953125, -0.01016998291015625, -0.0195770263671875, -0.0052337646484375, 0.0279998779296875, 0.0183563232421875, 0.005016326904296875, 0.02197265625, 0.0357666015625, 0.0723876953125, -0.0014934539794921875, 0.06866455078125, -0.0273590087890625, 0.0218505859375, 0.0845947265625, 0.0008192062377929688, 0.0538330078125, 0.01116943359375, -0.0178375244140625, 0.034332275390625, 0.07928466796875, -0.0177764892578125, 0.061553955078125, 0.01401519775390625, -0.003711700439453125, -0.0210723876953125, 0.0028018951416015625, -0.0439453125, 0.044158935546875, 
0.0188751220703125, -0.03070068359375, -0.0009741783142089844, 0.017242431640625, 0.0190582275390625, -0.037078857421875, -0.0123443603515625, 0.052825927734375, -0.005374908447265625, -0.054412841796875, 0.06658935546875, 0.027313232421875, 0.0628662109375, -0.0538330078125, 0.0214996337890625, -0.03460693359375, 0.003643035888671875, -0.0039215087890625, -0.0256500244140625, 0.01276397705078125, 0.0030651092529296875, -0.0178070068359375, -0.01081085205078125, 0.058563232421875, -0.047607421875, -0.039337158203125, 0.0245208740234375, 0.030487060546875, 0.0277862548828125, -0.01131439208984375, -0.07196044921875, -0.0053558349609375, 0.0021648406982421875, -0.0210113525390625, 0.0201568603515625, 0.019866943359375, -0.0204925537109375, 0.047637939453125, 0.04498291015625, 0.0008959770202636719, 0.03094482421875, 0.0009527206420898438, 0.058685302734375, -0.057342529296875, -0.039459228515625, -0.052093505859375, 0.0350341796875, 0.004940032958984375, -0.01314544677734375, 0.049224853515625, 0.0465087890625, 0.0833740234375, -0.01395416259765625, 0.0772705078125, -0.01529693603515625, 0.011627197265625, -0.03497314453125, 0.05938720703125, -0.050201416015625, -0.00466156005859375, -0.010528564453125, -0.055389404296875, -0.0211639404296875, 0.057830810546875, -0.0121917724609375, 0.01177978515625, 0.07232666015625, 0.06719970703125, -0.0021953582763671875, -0.0217132568359375, 0.021636962890625, -0.002330780029296875, 0.01393890380859375, 0.03253173828125, 0.06622314453125, -0.03131103515625, 0.049835205078125, -0.0235595703125, -0.0011959075927734375, -0.00849151611328125, -0.045654296875, -0.084228515625, -0.0560302734375, -0.026092529296875, -0.03094482421875, 0.00022101402282714844, 0.0655517578125, 0.0771484375, -0.05804443359375, -0.01177978515625, -0.008514404296875, -0.021026611328125, -0.004192352294921875, -0.0135040283203125, 0.039398193359375, -0.04608154296875, -0.0555419921875, 0.019134521484375, 0.005168914794921875, 0.01995849609375, 
-0.01216888427734375, -0.0264129638671875, -0.01593017578125, 0.01374053955078125, 0.041656494140625, 0.027740478515625, -0.058837890625, 0.0005230903625488281, 0.0154266357421875, -0.031982421875, 0.01427459716796875, 0.050048828125, -0.050933837890625, 0.034576416015625, 0.01111602783203125, 0.04150390625, 0.0709228515625, -0.018218994140625, 0.036376953125, -0.0614013671875, 0.0218963623046875, -0.0007305145263671875, 0.01390838623046875, 0.0292205810546875, -0.014678955078125, 0.0277557373046875, 0.023651123046875, -0.024139404296875, -0.051361083984375, -0.0008215904235839844, -0.0699462890625, -0.024139404296875, 0.07452392578125, -0.0267181396484375, -0.01800537109375, -0.01038360595703125, -0.005657196044921875, 0.034820556640625, -0.0201416015625, 0.0404052734375, 0.07257080078125, 0.002422332763671875, -0.0204620361328125, -0.0380859375, 0.05206298828125, 0.006137847900390625, -0.0667724609375, -0.01334381103515625, 0.0175628662109375, 0.03289794921875, -0.0032978057861328125, 0.081298828125, -0.007068634033203125, 0.0011615753173828125, 0.0088958740234375, 0.0440673828125, -0.0037384033203125, -0.01190948486328125, -0.0291290283203125, -0.00551605224609375, -0.0003559589385986328, -0.040802001953125 ] ]
circulus/Llama-2-13b-llava-v1
2023-08-02T09:21:25.000Z
[ "transformers", "pytorch", "llava", "text-generation", "en", "dataset:Open-Orca/OpenOrca", "license:mit", "endpoints_compatible", "region:us" ]
text-generation
circulus
null
null
circulus/Llama-2-13b-llava-v1
2
119,334
transformers
2023-08-02T03:27:22
--- license: mit datasets: - Open-Orca/OpenOrca language: - en library_name: transformers pipeline_tag: text-generation --- ![img](https://huggingface.co/circulus/Llama-2-13b-llava-v1/resolve/main/llava.jpg) ``` import torch from transformers import AutoTokenizer, AutoModelForCausalLM, BitsAndBytesConfig model_name = "circulus/Llama-2-13b-llava-v1" tokenizer = AutoTokenizer.from_pretrained(model_name, use_fast=False) config = BitsAndBytesConfig(load_in_4bit=True, bnb_4bit_compute_dtype=torch.bfloat16, bnb_4bit_use_double_quant=True) model = AutoModelForCausalLM.from_pretrained(model_name, device_map="auto", quantization_config=config) ```
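The 4-bit loading in the card's snippet exists mainly to shrink the weight footprint. As a rough, back-of-the-envelope sketch (assumed round numbers; real usage adds quantization constants, activations, and KV-cache memory on top), 4-bit weights cut a 13B-parameter model from roughly 26 GB in fp16 to roughly 6.5 GB:

```python
# Back-of-the-envelope weight-memory estimate for a 13B-parameter model.
params = 13e9

fp16_bytes = params * 2    # 16 bits = 2 bytes per weight
int4_bytes = params * 0.5  # 4 bits = 0.5 bytes per weight

print(f"fp16 weights: ~{fp16_bytes / 1e9:.1f} GB")   # ~26.0 GB
print(f"4-bit weights: ~{int4_bytes / 1e9:.1f} GB")  # ~6.5 GB
```

This is why the snippet pairs `load_in_4bit=True` with `device_map="auto"`: the quantized weights fit on far smaller GPUs than the fp16 checkpoint would.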
553
[ [ -0.0287017822265625, -0.044403076171875, 0.0254364013671875, 0.041259765625, -0.075927734375, 0.02166748046875, 0.01519012451171875, -0.0182037353515625, 0.0333251953125, 0.0246429443359375, -0.035064697265625, -0.0301666259765625, -0.056793212890625, -0.00989532470703125, -0.00673675537109375, 0.06549072265625, -0.01403045654296875, -0.00740814208984375, 0.0019197463989257812, -0.005138397216796875, -0.040435791015625, -0.03448486328125, -0.0457763671875, -0.017425537109375, 0.0158233642578125, 0.0257415771484375, 0.034210205078125, 0.035491943359375, 0.049102783203125, 0.029266357421875, 0.00519561767578125, 0.006465911865234375, -0.006591796875, -0.00975799560546875, 0.021209716796875, -0.032623291015625, -0.056182861328125, 0.004302978515625, 0.045989990234375, -0.006885528564453125, -0.01270294189453125, 0.045928955078125, -0.0129547119140625, 0.024932861328125, -0.037353515625, 0.0002694129943847656, -0.04608154296875, -0.016876220703125, -0.0263214111328125, -0.013214111328125, -0.002788543701171875, -0.0299072265625, -0.020050048828125, -0.068603515625, 0.0172271728515625, 0.016448974609375, 0.09820556640625, 0.040435791015625, -0.048065185546875, -0.0234527587890625, -0.0259552001953125, 0.042938232421875, -0.053314208984375, 0.00812530517578125, 0.029266357421875, 0.01194000244140625, -0.01297760009765625, -0.0780029296875, -0.035675048828125, 0.0036792755126953125, -0.003124237060546875, -0.00991058349609375, -0.0243682861328125, -0.003765106201171875, 0.023223876953125, 0.0244903564453125, -0.0321044921875, 0.006282806396484375, -0.053375244140625, -0.0225677490234375, 0.037078857421875, 0.0102081298828125, 0.005970001220703125, 0.007427215576171875, -0.0548095703125, -0.0164642333984375, -0.04595947265625, 0.000980377197265625, 0.038330078125, -0.01499176025390625, -0.041595458984375, 0.037506103515625, -0.021697998046875, 0.049285888671875, 0.038787841796875, -0.0030803680419921875, 0.046630859375, -0.015625, -0.03717041015625, 
0.0027790069580078125, 0.0587158203125, 0.0178070068359375, -0.005359649658203125, 0.01451873779296875, -0.01593017578125, 0.00010472536087036133, 0.00356292724609375, -0.0804443359375, -0.0357666015625, 0.025115966796875, -0.034210205078125, -0.026580810546875, 0.0012311935424804688, -0.036956787109375, 0.004062652587890625, 0.0060577392578125, 0.047576904296875, -0.0244140625, -0.038360595703125, 0.0147705078125, 0.0111846923828125, 0.0283203125, 0.00799560546875, -0.060272216796875, -0.00254058837890625, 0.0290985107421875, 0.057159423828125, 0.0318603515625, -0.0036449432373046875, -0.007904052734375, 0.0105133056640625, -0.011962890625, 0.053436279296875, 0.0036296844482421875, -0.04266357421875, -0.0311279296875, 0.010467529296875, 0.0001590251922607422, -0.0406494140625, 0.046478271484375, -0.0478515625, -0.005023956298828125, -0.0238800048828125, -0.02392578125, -0.03485107421875, 0.0095977783203125, -0.049285888671875, 0.08282470703125, 0.00647735595703125, -0.06524658203125, 0.03228759765625, -0.051055908203125, -0.00794219970703125, 0.01140594482421875, -0.003875732421875, -0.05908203125, 0.023162841796875, -0.00439453125, 0.0287322998046875, -0.0089111328125, 0.01824951171875, -0.031585693359375, -0.032806396484375, 0.037445068359375, -0.015716552734375, 0.0670166015625, 0.028533935546875, -0.033111572265625, 0.01043701171875, -0.0626220703125, 0.00019609928131103516, 0.0280609130859375, -0.01380157470703125, -0.0085296630859375, -0.0223388671875, 0.0011968612670898438, 0.0171051025390625, 0.054168701171875, -0.05010986328125, 0.036895751953125, -0.0018310546875, 0.0386962890625, 0.0555419921875, -0.0016384124755859375, 0.0013284683227539062, -0.007495880126953125, 0.0223236083984375, 0.01502227783203125, 0.03729248046875, 0.032623291015625, -0.01439666748046875, -0.0927734375, -0.04998779296875, 0.0182647705078125, 0.0260467529296875, -0.0469970703125, 0.041107177734375, -0.005401611328125, -0.054840087890625, -0.0352783203125, 0.002132415771484375, 
0.0163421630859375, 0.01297760009765625, 0.01265716552734375, -0.0352783203125, -0.055816650390625, -0.0631103515625, 0.018585205078125, -0.0208892822265625, 0.00534820556640625, -0.005519866943359375, 0.063720703125, -0.0176849365234375, 0.04248046875, -0.0340576171875, -0.0153961181640625, 0.0084381103515625, 0.01061248779296875, 0.041778564453125, 0.0677490234375, 0.044708251953125, -0.033721923828125, -0.0163116455078125, -0.033782958984375, -0.05401611328125, -0.00850677490234375, -0.0147857666015625, -0.042694091796875, 0.0009150505065917969, 0.01556396484375, -0.055511474609375, 0.05511474609375, 0.03582763671875, -0.06756591796875, 0.05426025390625, -0.0294036865234375, 0.017913818359375, -0.0653076171875, 0.01549530029296875, -0.0228271484375, -0.0223846435546875, -0.007228851318359375, 0.00811767578125, 0.0194091796875, 0.03387451171875, -0.045654296875, 0.0404052734375, -0.034881591796875, -0.004787445068359375, -0.0305328369140625, -0.0293426513671875, 0.01416015625, 0.019439697265625, -0.00386810302734375, 0.0596923828125, 0.043975830078125, -0.04449462890625, 0.044891357421875, 0.030548095703125, -0.0263671875, 0.025238037109375, -0.08050537109375, 0.0199127197265625, 0.02490234375, 0.01983642578125, -0.066650390625, -0.017608642578125, 0.031494140625, -0.035247802734375, 0.01245880126953125, -0.006763458251953125, -0.056671142578125, -0.0231781005859375, -0.018798828125, 0.07421875, 0.035064697265625, -0.045623779296875, 0.03094482421875, 0.01373291015625, 0.0238800048828125, -0.0203857421875, -0.055816650390625, -0.012115478515625, -0.037017822265625, -0.03021240234375, 0.040069580078125, -0.01386260986328125, -0.00452423095703125, -0.02093505859375, -0.0003426074981689453, -0.0107269287109375, 0.01407623291015625, 0.036407470703125, 0.06341552734375, -0.00901031494140625, 0.0012683868408203125, 0.001739501953125, -0.0245513916015625, 0.0143280029296875, -0.0018796920776367188, 0.0587158203125, -0.02215576171875, -0.0201263427734375, 
-0.056121826171875, -0.006160736083984375, 0.017120361328125, -0.005260467529296875, 0.050933837890625, 0.0677490234375, -0.01512908935546875, 0.02392578125, -0.0589599609375, -0.008819580078125, -0.0426025390625, 0.0106353759765625, -0.041351318359375, -0.03900146484375, 0.039520263671875, 0.0006937980651855469, 0.0028591156005859375, 0.052764892578125, 0.04901123046875, -0.0261383056640625, 0.04736328125, 0.035919189453125, 0.00511932373046875, 0.041351318359375, -0.0565185546875, -0.00823211669921875, -0.08782958984375, -0.0325927734375, -0.0200347900390625, -0.020782470703125, -0.044219970703125, -0.0269927978515625, 0.01172637939453125, 0.032379150390625, -0.0643310546875, 0.0333251953125, -0.05035400390625, 0.01409149169921875, 0.0484619140625, 0.0286407470703125, 0.0015039443969726562, 0.002864837646484375, -0.006561279296875, 0.0020313262939453125, -0.0296478271484375, -0.038726806640625, 0.0853271484375, 0.0143890380859375, 0.04034423828125, 0.0021839141845703125, 0.057708740234375, -0.0001735687255859375, 0.0178375244140625, -0.01739501953125, 0.02801513671875, 0.0156707763671875, -0.07781982421875, 0.0172119140625, -0.006267547607421875, -0.06304931640625, 0.0246429443359375, -0.01995849609375, -0.033477783203125, 0.022918701171875, 0.0254669189453125, -0.0321044921875, 0.029632568359375, -0.026092529296875, 0.0546875, -0.037353515625, -0.019683837890625, -0.0176239013671875, -0.0296783447265625, 0.0313720703125, 0.0012359619140625, 0.0121002197265625, -0.0273590087890625, -0.018096923828125, 0.07183837890625, -0.04827880859375, 0.051025390625, -0.0225677490234375, 0.01119232177734375, 0.0406494140625, -0.007598876953125, 0.03411865234375, 0.018646240234375, -0.00286102294921875, 0.0259552001953125, 0.008087158203125, -0.058807373046875, -0.01165771484375, 0.03875732421875, -0.07891845703125, -0.033660888671875, -0.053558349609375, -0.0045166015625, 0.016937255859375, -0.007598876953125, 0.053375244140625, 0.0018911361694335938, -0.005619049072265625, 
-0.003505706787109375, 0.036651611328125, -0.00033593177795410156, 0.04656982421875, 0.01245880126953125, -0.0083465576171875, -0.05535888671875, 0.05047607421875, -0.011322021484375, 0.017486572265625, -0.0258331298828125, -0.01406097412109375, -0.02984619140625, -0.040130615234375, -0.02899169921875, 0.042266845703125, -0.036529541015625, -0.044158935546875, -0.01482391357421875, -0.0180206298828125, -0.0205535888671875, -0.00916290283203125, -0.03192138671875, -0.03289794921875, -0.06610107421875, -0.0163116455078125, 0.057342529296875, 0.01531219482421875, -0.02880859375, 0.07958984375, -0.03558349609375, 0.0225067138671875, 0.027069091796875, 0.01088714599609375, -0.00848388671875, -0.0701904296875, -0.01461029052734375, -0.0017919540405273438, -0.0491943359375, -0.05023193359375, 0.04852294921875, 0.01418304443359375, 0.0277557373046875, 0.02880859375, -0.009918212890625, 0.08197021484375, -0.002880096435546875, 0.03717041015625, 0.033447265625, -0.075439453125, 0.043548583984375, 0.00696563720703125, 0.01617431640625, 0.039642333984375, 0.00696563720703125, -0.007038116455078125, -0.01140594482421875, -0.06396484375, -0.08465576171875, 0.02789306640625, 0.0251007080078125, 0.006534576416015625, 0.01837158203125, 0.035797119140625, 0.01554107666015625, 0.017822265625, -0.08026123046875, -0.0309295654296875, -0.03704833984375, -0.010009765625, 0.013946533203125, -0.032989501953125, -0.0031585693359375, -0.03802490234375, 0.07647705078125, 0.01152801513671875, 0.040679931640625, 0.0173492431640625, -0.0254364013671875, -0.007526397705078125, -0.0205841064453125, 0.037017822265625, 0.04254150390625, -0.033477783203125, -0.001148223876953125, 0.0190582275390625, -0.053314208984375, 0.0250396728515625, 0.0028076171875, -0.00771331787109375, 0.007160186767578125, 0.02447509765625, 0.0794677734375, -0.00373077392578125, -0.0011339187622070312, 0.020538330078125, -0.017242431640625, -0.025177001953125, -0.048187255859375, 0.00292205810546875, 0.004535675048828125, 
0.0443115234375, 0.03338623046875, -0.00579833984375, 0.003620147705078125, -0.002613067626953125, -0.00664520263671875, 0.037200927734375, 0.0021457672119140625, -0.028717041015625, 0.06378173828125, 0.0159454345703125, -0.003658294677734375, 0.052642822265625, -0.0208740234375, -0.0202178955078125, 0.07379150390625, 0.037872314453125, 0.062469482421875, -0.002101898193359375, 0.0016164779663085938, 0.0433349609375, 0.025054931640625, 0.0184326171875, 0.047576904296875, -0.01032257080078125, -0.0340576171875, -0.009918212890625, -0.0595703125, 0.002895355224609375, 0.0018939971923828125, -0.044097900390625, 0.029632568359375, -0.0416259765625, -0.041107177734375, -0.00344085693359375, 0.036773681640625, -0.06939697265625, 0.036346435546875, 0.00830841064453125, 0.059967041015625, -0.056365966796875, 0.10455322265625, 0.035736083984375, -0.0257720947265625, -0.057861328125, -0.04046630859375, -0.025054931640625, -0.08038330078125, 0.06097412109375, 0.01019287109375, 0.012451171875, 0.013946533203125, -0.0570068359375, -0.0889892578125, 0.11279296875, 0.0232391357421875, -0.050262451171875, 0.0232391357421875, -0.00901031494140625, 0.0007910728454589844, -0.0264129638671875, 0.044708251953125, 0.0196075439453125, 0.034149169921875, 0.023162841796875, -0.050384521484375, 0.0173187255859375, -0.0092315673828125, 0.0033397674560546875, 0.0192413330078125, -0.07574462890625, 0.0791015625, -0.020294189453125, -0.0033283233642578125, 0.0204925537109375, 0.07904052734375, 0.05535888671875, 0.018310546875, 0.043609619140625, 0.053955078125, 0.032440185546875, -0.01204681396484375, 0.036773681640625, -0.0233154296875, 0.051788330078125, 0.060089111328125, 0.005401611328125, 0.052276611328125, 0.050872802734375, -0.0127410888671875, 0.055908203125, 0.0887451171875, -0.026092529296875, 0.05218505859375, 0.00466156005859375, -0.0210418701171875, -0.0193634033203125, 0.0289306640625, -0.046661376953125, 0.038482666015625, 0.034149169921875, -0.0197906494140625, 
0.0028438568115234375, -0.008056640625, 0.00414276123046875, -0.04290771484375, -0.0214080810546875, 0.034393310546875, -0.0035190582275390625, -0.00855255126953125, 0.062164306640625, -0.0012826919555664062, 0.07720947265625, -0.038421630859375, 0.01047515869140625, -0.01232147216796875, 0.004383087158203125, -0.0223846435546875, -0.0195770263671875, 0.0246429443359375, -0.001514434814453125, 0.00783538818359375, 0.01922607421875, 0.0447998046875, -0.00911712646484375, -0.06829833984375, 0.008453369140625, 0.002197265625, 0.0228118896484375, -0.00045228004455566406, -0.064453125, 0.0281219482421875, 0.00652313232421875, -0.040435791015625, 0.0098876953125, 0.0252685546875, 0.01267242431640625, 0.04498291015625, 0.05743408203125, 0.0008559226989746094, 0.0264129638671875, -0.038299560546875, 0.072265625, -0.062744140625, -0.02838134765625, -0.07696533203125, 0.040924072265625, -0.00815582275390625, -0.05255126953125, 0.035064697265625, 0.042327880859375, 0.050933837890625, -0.00623321533203125, 0.0206146240234375, -0.023193359375, -0.00792694091796875, -0.044708251953125, 0.04779052734375, -0.037872314453125, 0.007213592529296875, 0.017578125, -0.046478271484375, 0.003238677978515625, 0.07275390625, -0.0022640228271484375, 0.0019350051879882812, 0.042144775390625, 0.047637939453125, -0.0171051025390625, -0.0167388916015625, 0.0106964111328125, 0.018096923828125, 0.01374053955078125, 0.0369873046875, 0.0474853515625, -0.0618896484375, 0.02471923828125, -0.03704833984375, -0.0198211669921875, -0.030517578125, -0.0654296875, -0.05316162109375, -0.03399658203125, -0.043701171875, -0.0279541015625, -0.0106048583984375, 0.0670166015625, 0.06494140625, -0.045654296875, -0.0307159423828125, 0.00655364990234375, -0.003368377685546875, 0.017913818359375, -0.01154327392578125, 0.03729248046875, 0.0018291473388671875, -0.0367431640625, -0.01216888427734375, -0.005855560302734375, 0.040191650390625, -0.007740020751953125, -0.00435638427734375, -0.030792236328125, 
0.005237579345703125, 0.01153564453125, 0.0140228271484375, -0.0494384765625, -0.00688934326171875, -0.0258026123046875, -0.0157928466796875, 0.029144287109375, 0.025177001953125, -0.06646728515625, 0.005252838134765625, 0.01270294189453125, -0.01305389404296875, 0.06866455078125, -0.024627685546875, 0.004329681396484375, -0.038482666015625, 0.0587158203125, -0.0011434555053710938, 0.03350830078125, 0.0192108154296875, -0.0306549072265625, 0.03594970703125, 0.01561737060546875, -0.058868408203125, -0.06024169921875, -0.00836181640625, -0.0848388671875, -0.01082611083984375, 0.0611572265625, -0.0178680419921875, -0.045623779296875, 0.019317626953125, -0.02337646484375, 0.05438232421875, -0.019317626953125, 0.0640869140625, 0.03143310546875, 0.0009493827819824219, 0.0016918182373046875, -0.039215087890625, 0.0173797607421875, 0.0240936279296875, -0.050537109375, -0.043487548828125, 0.020599365234375, 0.044677734375, 0.006435394287109375, 0.04327392578125, 0.0014781951904296875, 0.02178955078125, 0.01342010498046875, 0.04302978515625, -0.01512908935546875, 0.01445770263671875, -0.017608642578125, -0.0206298828125, 0.01499176025390625, -0.0732421875 ] ]
ai-forever/sbert_large_nlu_ru
2023-10-28T10:40:17.000Z
[ "transformers", "pytorch", "jax", "bert", "feature-extraction", "PyTorch", "Transformers", "ru", "endpoints_compatible", "region:us" ]
feature-extraction
ai-forever
null
null
ai-forever/sbert_large_nlu_ru
27
118,414
transformers
2022-03-02T23:29:05
--- language: - ru tags: - PyTorch - Transformers --- # BERT large model (uncased) for Sentence Embeddings in Russian language. The model is described [in this article](https://habr.com/ru/company/sberdevices/blog/527576/) For better quality, use mean token embeddings. ## Usage (HuggingFace Models Repository) You can use the model directly from the model repository to compute sentence embeddings: ```python from transformers import AutoTokenizer, AutoModel import torch #Mean Pooling - Take attention mask into account for correct averaging def mean_pooling(model_output, attention_mask): token_embeddings = model_output[0] #First element of model_output contains all token embeddings input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float() sum_embeddings = torch.sum(token_embeddings * input_mask_expanded, 1) sum_mask = torch.clamp(input_mask_expanded.sum(1), min=1e-9) return sum_embeddings / sum_mask #Sentences we want sentence embeddings for sentences = ['Привет! Как твои дела?', 'А правда, что 42 твое любимое число?'] #Load AutoModel from huggingface model repository tokenizer = AutoTokenizer.from_pretrained("ai-forever/sbert_large_nlu_ru") model = AutoModel.from_pretrained("ai-forever/sbert_large_nlu_ru") #Tokenize sentences encoded_input = tokenizer(sentences, padding=True, truncation=True, max_length=24, return_tensors='pt') #Compute token embeddings with torch.no_grad(): model_output = model(**encoded_input) #Perform pooling. In this case, mean pooling sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask']) ``` # Authors + [SberDevices](https://sberdevices.ru/) Team. + Aleksandr Abramov: [HF profile](https://huggingface.co/Andrilko), [Github](https://github.com/Ab1992ao), [Kaggle Competitions Master](https://www.kaggle.com/andrilko); + Denis Antykhov: [Github](https://github.com/gaphex);
1,931
[ [ -0.01229095458984375, -0.054962158203125, 0.0246124267578125, 0.0325927734375, -0.027801513671875, -0.00972747802734375, -0.022247314453125, -0.01788330078125, 0.026397705078125, 0.02056884765625, -0.046356201171875, -0.045684814453125, -0.048187255859375, -0.01007080078125, -0.0183563232421875, 0.0811767578125, -0.01316070556640625, 0.0147247314453125, -0.017303466796875, -0.00394439697265625, -0.01515960693359375, -0.0283660888671875, -0.0197601318359375, -0.031341552734375, 0.00948333740234375, 0.007732391357421875, 0.0404052734375, 0.0282440185546875, 0.0229949951171875, 0.0312042236328125, -0.0032253265380859375, -0.005374908447265625, -0.01406097412109375, 0.0006690025329589844, 0.004642486572265625, -0.0123443603515625, -0.01593017578125, 0.004009246826171875, 0.06683349609375, 0.0171051025390625, -0.01195526123046875, 0.00553131103515625, -0.01399993896484375, 0.0264129638671875, -0.035980224609375, 0.02679443359375, -0.013427734375, 0.00998687744140625, -0.00004762411117553711, 0.0234375, -0.03533935546875, -0.006618499755859375, 0.023651123046875, -0.0252532958984375, -0.011566162109375, 0.01258087158203125, 0.08837890625, 0.01309967041015625, -0.0131378173828125, -0.03265380859375, -0.01218414306640625, 0.07391357421875, -0.0699462890625, 0.02484130859375, 0.02734375, -0.005489349365234375, -0.017364501953125, -0.053497314453125, -0.038360595703125, -0.0157470703125, -0.028106689453125, 0.003139495849609375, -0.03143310546875, 0.01172637939453125, 0.0110626220703125, 0.0199127197265625, -0.06182861328125, -0.0050506591796875, -0.0228271484375, -0.03045654296875, 0.049407958984375, -0.0087432861328125, 0.0267791748046875, -0.03314208984375, -0.0225982666015625, -0.0114288330078125, -0.01512908935546875, -0.007904052734375, 0.029998779296875, 0.006038665771484375, -0.0296783447265625, 0.054718017578125, -0.01287841796875, 0.04901123046875, -0.005878448486328125, 0.00827789306640625, 0.04534912109375, -0.01354217529296875, -0.0293426513671875, 
-0.0113372802734375, 0.07574462890625, 0.00542449951171875, 0.03399658203125, -0.0091094970703125, -0.0167694091796875, -0.007442474365234375, 0.004184722900390625, -0.06658935546875, -0.03759765625, 0.0221405029296875, -0.029571533203125, -0.020355224609375, 0.00792694091796875, -0.055511474609375, -0.007442474365234375, 0.006092071533203125, 0.058349609375, -0.042999267578125, -0.00811767578125, 0.0197906494140625, -0.0097808837890625, 0.0238037109375, -0.006084442138671875, -0.058135986328125, 0.023284912109375, 0.02886962890625, 0.0677490234375, 0.01549530029296875, -0.0322265625, -0.0477294921875, -0.02099609375, 0.0019683837890625, 0.041046142578125, -0.0301055908203125, 0.003551483154296875, 0.02569580078125, 0.005992889404296875, -0.01544189453125, -0.01337432861328125, 0.0234832763671875, -0.026885986328125, 0.04034423828125, 0.006641387939453125, -0.06005859375, -0.001514434814453125, 0.01052093505859375, -0.0216827392578125, 0.07159423828125, 0.0197906494140625, -0.06036376953125, 0.010894775390625, -0.05853271484375, -0.0289306640625, 0.002017974853515625, 0.006618499755859375, -0.06243896484375, 0.00580596923828125, 0.022735595703125, 0.064453125, 0.0164642333984375, 0.017364501953125, -0.01690673828125, -0.0206146240234375, 0.032318115234375, 0.012786865234375, 0.08612060546875, 0.00989532470703125, -0.0287322998046875, 0.0316162109375, -0.03564453125, 0.008636474609375, 0.01354217529296875, -0.0091705322265625, -0.0050506591796875, -0.01934814453125, 0.0400390625, 0.019378662109375, 0.0067291259765625, -0.063232421875, 0.01229095458984375, -0.04364013671875, 0.06903076171875, 0.052825927734375, -0.00632476806640625, 0.032928466796875, -0.016876220703125, 0.0279083251953125, 0.01064300537109375, -0.0003333091735839844, -0.0167999267578125, -0.03546142578125, -0.0616455078125, -0.0462646484375, 0.036163330078125, 0.032012939453125, -0.04986572265625, 0.058502197265625, 0.0008387565612792969, -0.0298919677734375, -0.05889892578125, 0.0166168212890625, 
0.037994384765625, 0.009490966796875, 0.0272216796875, -0.00876617431640625, -0.039459228515625, -0.071533203125, 0.0089263916015625, -0.005008697509765625, 0.007472991943359375, 0.01532745361328125, 0.06988525390625, -0.01397705078125, 0.05426025390625, -0.04656982421875, -0.0298004150390625, -0.021453857421875, 0.0137481689453125, 0.022552490234375, 0.0567626953125, 0.04833984375, -0.04437255859375, -0.042327880859375, -0.03326416015625, -0.04833984375, 0.0130615234375, -0.008880615234375, -0.0240020751953125, 0.039764404296875, 0.0267486572265625, -0.0872802734375, 0.02850341796875, 0.047576904296875, -0.060943603515625, 0.054229736328125, -0.0233001708984375, -0.002033233642578125, -0.09454345703125, 0.01788330078125, 0.01047515869140625, -0.035858154296875, -0.035980224609375, 0.0254974365234375, 0.009918212890625, -0.005992889404296875, -0.035888671875, 0.05133056640625, -0.047271728515625, 0.0029468536376953125, -0.00677490234375, 0.0160980224609375, -0.003681182861328125, 0.044189453125, -0.005367279052734375, 0.042938232421875, 0.05462646484375, -0.0255279541015625, 0.0531005859375, 0.050537109375, -0.0537109375, 0.0171356201171875, -0.06439208984375, 0.0013599395751953125, -0.005279541015625, 0.0301055908203125, -0.079833984375, -0.0309906005859375, 0.025665283203125, -0.0599365234375, 0.035064697265625, -0.0008339881896972656, -0.06640625, -0.044189453125, -0.0284576416015625, 0.01128387451171875, 0.053924560546875, -0.0361328125, 0.0266265869140625, 0.03143310546875, -0.01056671142578125, -0.059814453125, -0.07769775390625, 0.0019512176513671875, -0.01158905029296875, -0.057861328125, 0.024505615234375, -0.0198516845703125, 0.001018524169921875, 0.0166015625, 0.021392822265625, -0.0092010498046875, -0.004878997802734375, 0.00034046173095703125, 0.0275726318359375, -0.00995635986328125, 0.0229949951171875, -0.0008115768432617188, -0.0011615753173828125, 0.005504608154296875, -0.0226287841796875, 0.059326171875, -0.025848388671875, -0.005290985107421875, 
-0.0474853515625, 0.01806640625, 0.032073974609375, -0.0197296142578125, 0.08160400390625, 0.08685302734375, -0.04510498046875, -0.01113128662109375, -0.053863525390625, -0.0137176513671875, -0.03375244140625, 0.033233642578125, -0.007801055908203125, -0.07684326171875, 0.053375244140625, 0.034423828125, 0.0025463104248046875, 0.0293121337890625, 0.05657958984375, -0.012481689453125, 0.06585693359375, 0.048187255859375, -0.0195770263671875, 0.041229248046875, -0.0357666015625, 0.007778167724609375, -0.06109619140625, -0.0185394287109375, -0.0137481689453125, -0.0248870849609375, -0.039825439453125, -0.0213623046875, 0.0061798095703125, -0.0181427001953125, -0.0168304443359375, 0.031280517578125, -0.044708251953125, 0.030029296875, 0.05078125, 0.01396942138671875, -0.017913818359375, -0.0035228729248046875, -0.01580810546875, -0.002605438232421875, -0.042205810546875, -0.01165008544921875, 0.0653076171875, 0.036163330078125, 0.046875, 0.003574371337890625, 0.05535888671875, 0.01194000244140625, 0.0013370513916015625, -0.068115234375, 0.035247802734375, -0.005222320556640625, -0.05615234375, -0.0225067138671875, -0.0322265625, -0.061981201171875, 0.00716400146484375, -0.0302581787109375, -0.0740966796875, -0.004238128662109375, -0.0026111602783203125, -0.0238800048828125, 0.019866943359375, -0.05682373046875, 0.076171875, 0.017669677734375, -0.02752685546875, -0.01432037353515625, -0.057525634765625, 0.018524169921875, 0.02056884765625, -0.0022449493408203125, 0.000005304813385009766, 0.01416778564453125, 0.07012939453125, -0.0203704833984375, 0.060394287109375, -0.0251312255859375, 0.0229339599609375, 0.0243682861328125, -0.01403045654296875, 0.0243072509765625, 0.0070037841796875, 0.001026153564453125, 0.0010547637939453125, -0.0038661956787109375, -0.049468994140625, -0.029571533203125, 0.06427001953125, -0.07318115234375, -0.0279083251953125, -0.053558349609375, -0.04449462890625, 0.006259918212890625, 0.0220489501953125, 0.0235595703125, 0.0302581787109375, 
-0.01311492919921875, 0.03265380859375, 0.05108642578125, -0.03240966796875, 0.0543212890625, 0.0107269287109375, -0.01432037353515625, -0.0445556640625, 0.045562744140625, -0.0014562606811523438, -0.0066986083984375, 0.022186279296875, 0.0171661376953125, -0.0247039794921875, -0.029449462890625, -0.039031982421875, 0.054534912109375, -0.03668212890625, -0.033111572265625, -0.06671142578125, -0.031768798828125, -0.06103515625, -0.01515960693359375, -0.0197601318359375, -0.028167724609375, -0.024810791015625, -0.0129852294921875, 0.03948974609375, 0.04541015625, -0.01055145263671875, 0.03375244140625, -0.0496826171875, 0.0032215118408203125, 0.0019550323486328125, 0.03143310546875, -0.0110626220703125, -0.0455322265625, -0.0433349609375, -0.00554656982421875, -0.018341064453125, -0.054656982421875, 0.05352783203125, 0.0093994140625, 0.042327880859375, 0.0125885009765625, 0.0004398822784423828, 0.038909912109375, -0.03936767578125, 0.0570068359375, 0.01230621337890625, -0.0882568359375, 0.037506103515625, -0.0287628173828125, 0.0355224609375, 0.0253143310546875, 0.0222015380859375, -0.031768798828125, -0.0215301513671875, -0.050048828125, -0.056915283203125, 0.0675048828125, 0.03570556640625, 0.036346435546875, -0.00554656982421875, 0.0308380126953125, -0.01004791259765625, 0.01678466796875, -0.077392578125, -0.020172119140625, -0.0203399658203125, -0.0357666015625, -0.0159759521484375, -0.0309906005859375, 0.0018911361694335938, -0.0196533203125, 0.069091796875, 0.01213836669921875, 0.046630859375, 0.0218353271484375, -0.02880859375, -0.007114410400390625, 0.00405120849609375, 0.049224853515625, 0.0345458984375, -0.020843505859375, -0.011810302734375, 0.0033740997314453125, -0.028533935546875, -0.0095367431640625, 0.02764892578125, -0.0042572021484375, 0.02166748046875, 0.0276947021484375, 0.0614013671875, 0.01812744140625, -0.049896240234375, 0.06158447265625, 0.0013055801391601562, -0.031982421875, -0.04559326171875, -0.0146484375, 0.01580810546875, 
0.0263214111328125, 0.0109100341796875, -0.017242431640625, 0.005283355712890625, -0.03973388671875, 0.0276947021484375, 0.039703369140625, -0.0270843505859375, -0.0198822021484375, 0.031463623046875, 0.00934600830078125, -0.018585205078125, 0.078125, -0.01529693603515625, -0.050262451171875, 0.0338134765625, 0.04638671875, 0.06805419921875, -0.0210418701171875, 0.0303192138671875, 0.037689208984375, 0.0192108154296875, 0.00446319580078125, 0.00911712646484375, 0.0064544677734375, -0.050384521484375, -0.0362548828125, -0.055450439453125, -0.006618499755859375, 0.01421356201171875, -0.04437255859375, 0.0272979736328125, -0.0341796875, -0.006031036376953125, -0.00031828880310058594, 0.016693115234375, -0.04266357421875, 0.011138916015625, 0.007144927978515625, 0.05712890625, -0.07244873046875, 0.058990478515625, 0.0709228515625, -0.0281524658203125, -0.03387451171875, -0.0139617919921875, -0.005893707275390625, -0.068359375, 0.033966064453125, 0.026336669921875, 0.0257720947265625, 0.00435638427734375, -0.048004150390625, -0.06494140625, 0.07666015625, 0.019073486328125, -0.0108642578125, -0.012847900390625, -0.0246124267578125, 0.039031982421875, -0.035797119140625, 0.0189361572265625, 0.0242462158203125, 0.027587890625, 0.01401519775390625, -0.05950927734375, 0.0166473388671875, -0.03466796875, -0.0117645263671875, 0.01197052001953125, -0.053131103515625, 0.058990478515625, -0.019866943359375, -0.0023975372314453125, 0.025543212890625, 0.057403564453125, 0.01505279541015625, 0.0027294158935546875, 0.020751953125, 0.05426025390625, 0.03497314453125, -0.004604339599609375, 0.06781005859375, -0.006580352783203125, 0.04986572265625, 0.056549072265625, 0.02044677734375, 0.06732177734375, 0.03070068359375, -0.006458282470703125, 0.05255126953125, 0.03955078125, -0.0258026123046875, 0.04388427734375, 0.0347900390625, -0.0042572021484375, -0.01055145263671875, 0.0034923553466796875, -0.0142974853515625, 0.041107177734375, 0.022186279296875, -0.0362548828125, 
-0.00024020671844482422, 0.01788330078125, 0.0242767333984375, -0.007293701171875, 0.0027294158935546875, 0.05169677734375, 0.006961822509765625, -0.038177490234375, 0.0341796875, 0.01047515869140625, 0.0625, -0.047637939453125, 0.004802703857421875, 0.002857208251953125, 0.0251312255859375, -0.00940704345703125, -0.058990478515625, 0.0253448486328125, 0.0005445480346679688, 0.0017681121826171875, -0.0205230712890625, 0.0474853515625, -0.051055908203125, -0.046875, 0.0179595947265625, 0.0230255126953125, 0.022186279296875, 0.01198577880859375, -0.07293701171875, 0.01434326171875, 0.00250244140625, -0.0423583984375, 0.024749755859375, 0.01702880859375, 0.031524658203125, 0.052947998046875, 0.0279998779296875, 0.0029544830322265625, -0.015045166015625, -0.0033092498779296875, 0.06658935546875, -0.040191650390625, -0.045440673828125, -0.0748291015625, 0.0582275390625, -0.012298583984375, -0.0223541259765625, 0.043487548828125, 0.034698486328125, 0.054046630859375, -0.019500732421875, 0.0362548828125, -0.01279449462890625, 0.0090789794921875, -0.051849365234375, 0.07989501953125, -0.043731689453125, -0.0266265869140625, -0.0274810791015625, -0.07574462890625, -0.037139892578125, 0.0869140625, -0.0123291015625, 0.015045166015625, 0.051300048828125, 0.042144775390625, -0.005603790283203125, -0.0224761962890625, 0.02130126953125, 0.0426025390625, 0.01849365234375, 0.040435791015625, 0.0272064208984375, -0.04510498046875, 0.037689208984375, -0.025970458984375, -0.0265655517578125, -0.023590087890625, -0.06744384765625, -0.09405517578125, -0.058624267578125, -0.04254150390625, -0.043670654296875, 0.0095367431640625, 0.09173583984375, 0.069091796875, -0.069091796875, -0.01073455810546875, -0.004184722900390625, -0.00458526611328125, 0.0072784423828125, -0.02044677734375, 0.04638671875, -0.0294952392578125, -0.05926513671875, -0.00193023681640625, -0.0031185150146484375, 0.0157928466796875, -0.019256591796875, 0.005031585693359375, -0.047027587890625, 0.00466156005859375, 
0.037445068359375, -0.00035190582275390625, -0.05450439453125, -0.0309906005859375, 0.004497528076171875, -0.01001739501953125, -0.01087188720703125, 0.022125244140625, -0.041412353515625, 0.03155517578125, 0.0322265625, 0.036956787109375, 0.06353759765625, -0.019256591796875, 0.03912353515625, -0.06439208984375, 0.0204315185546875, 0.01409149169921875, 0.0491943359375, 0.037933349609375, -0.0279998779296875, 0.028533935546875, 0.01126861572265625, -0.058685302734375, -0.06390380859375, 0.0026111602783203125, -0.080810546875, -0.016815185546875, 0.0867919921875, -0.039581298828125, -0.028076171875, 0.0164947509765625, -0.00986480712890625, 0.02734375, -0.04052734375, 0.0653076171875, 0.09344482421875, -0.00328826904296875, -0.0225677490234375, -0.0235443115234375, 0.0298004150390625, 0.03643798828125, -0.040740966796875, 0.004978179931640625, 0.0345458984375, 0.023468017578125, 0.018768310546875, 0.0318603515625, -0.0092926025390625, 0.0172271728515625, -0.00351715087890625, 0.0136566162109375, -0.0147857666015625, -0.00714111328125, -0.023529052734375, -0.01412200927734375, -0.0222930908203125, -0.024566650390625 ] ]
WizardLM/WizardCoder-Python-34B-V1.0
2023-09-09T06:44:14.000Z
[ "transformers", "pytorch", "llama", "text-generation", "code", "arxiv:2304.12244", "arxiv:2306.08568", "arxiv:2308.09583", "arxiv:2303.08774", "license:llama2", "model-index", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
text-generation
WizardLM
null
null
WizardLM/WizardCoder-Python-34B-V1.0
653
118,314
transformers
2023-08-26T03:59:07
--- license: llama2 metrics: - code_eval library_name: transformers tags: - code model-index: - name: WizardCoder-Python-34B-V1.0 results: - task: type: text-generation dataset: type: openai_humaneval name: HumanEval metrics: - name: pass@1 type: pass@1 value: 0.732 verified: false --- <p align="center"> 🤗 <a href="https://huggingface.co/WizardLM" target="_blank">HF Repo</a> •🐱 <a href="https://github.com/nlpxucan/WizardLM" target="_blank">Github Repo</a> • 🐦 <a href="https://twitter.com/WizardLM_AI" target="_blank">Twitter</a> • 📃 <a href="https://arxiv.org/abs/2304.12244" target="_blank">[WizardLM]</a> • 📃 <a href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a> • 📃 <a href="https://arxiv.org/abs/2308.09583" target="_blank">[WizardMath]</a> <br> </p> <p align="center"> 👋 Join our <a href="https://discord.gg/VZjjHtWrKs" target="_blank">Discord</a> </p> ## News - 🔥🔥🔥[2023/08/26] We released **WizardCoder-Python-34B-V1.0** , which achieves the **73.2 pass@1** and surpasses **GPT4 (2023/03/15)**, **ChatGPT-3.5**, and **Claude2** on the [HumanEval Benchmarks](https://github.com/openai/human-eval). - [2023/06/16] We released **WizardCoder-15B-V1.0** , which achieves the **57.3 pass@1** and surpasses **Claude-Plus (+6.8)**, **Bard (+15.3)** and **InstructCodeT5+ (+22.3)** on the [HumanEval Benchmarks](https://github.com/openai/human-eval). ❗Note: There are two HumanEval results of GPT4 and ChatGPT-3.5. The 67.0 and 48.1 are reported by the official GPT4 Report (2023/03/15) of [OpenAI](https://arxiv.org/abs/2303.08774). The 82.0 and 72.5 are tested by ourselves with the latest API (2023/08/26). 
| Model | Checkpoint | Paper | HumanEval | MBPP | Demo | License | | ----- |------| ---- |------|-------| ----- | ----- | | WizardCoder-Python-34B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardCoder-Python-34B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a> | 73.2 | 61.2 | [Demo](http://47.103.63.15:50085/) | <a href="https://ai.meta.com/resources/models-and-libraries/llama-downloads/" target="_blank">Llama2</a> | | WizardCoder-15B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardCoder-15B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a> | 59.8 |50.6 | -- | <a href="https://huggingface.co/spaces/bigcode/bigcode-model-license-agreement" target="_blank">OpenRAIL-M</a> | | WizardCoder-Python-13B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardCoder-Python-13B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a> | 64.0 | 55.6 | -- | <a href="https://ai.meta.com/resources/models-and-libraries/llama-downloads/" target="_blank">Llama2</a> | | WizardCoder-3B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardCoder-3B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a> | 34.8 |37.4 | [Demo](http://47.103.63.15:50086/) | <a href="https://huggingface.co/spaces/bigcode/bigcode-model-license-agreement" target="_blank">OpenRAIL-M</a> | | WizardCoder-1B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardCoder-1B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a> | 23.8 |28.6 | -- | <a href="https://huggingface.co/spaces/bigcode/bigcode-model-license-agreement" target="_blank">OpenRAIL-M</a> | - Our **WizardMath-70B-V1.0** model slightly outperforms some closed-source LLMs on the GSM8K, including **ChatGPT 3.5**, **Claude Instant 
1** and **PaLM 2 540B**. - Our **WizardMath-70B-V1.0** model achieves **81.6 pass@1** on the [GSM8k Benchmarks](https://github.com/openai/grade-school-math), which is **24.8** points higher than the SOTA open-source LLM, and achieves **22.7 pass@1** on the [MATH Benchmarks](https://github.com/hendrycks/math), which is **9.2** points higher than the SOTA open-source LLM. <font size=4> | Model | Checkpoint | Paper | GSM8k | MATH |Online Demo| License| | ----- |------| ---- |------|-------| ----- | ----- | | WizardMath-70B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardMath-70B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2308.09583" target="_blank">[WizardMath]</a>| **81.6** | **22.7** |[Demo](http://47.103.63.15:50083/)| <a href="https://ai.meta.com/resources/models-and-libraries/llama-downloads/" target="_blank">Llama 2 </a> | | WizardMath-13B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardMath-13B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2308.09583" target="_blank">[WizardMath]</a>| **63.9** | **14.0** |[Demo](http://47.103.63.15:50082/)| <a href="https://ai.meta.com/resources/models-and-libraries/llama-downloads/" target="_blank">Llama 2 </a> | | WizardMath-7B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardMath-7B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2308.09583" target="_blank">[WizardMath]</a>| **54.9** | **10.7** | [Demo ](http://47.103.63.15:50080/)| <a href="https://ai.meta.com/resources/models-and-libraries/llama-downloads/" target="_blank">Llama 2 </a>| </font> - [08/09/2023] We released **WizardLM-70B-V1.0** model. Here is [Full Model Weight](https://huggingface.co/WizardLM/WizardLM-70B-V1.0). 
<font size=4> | <sup>Model</sup> | <sup>Checkpoint</sup> | <sup>Paper</sup> |<sup>MT-Bench</sup> | <sup>AlpacaEval</sup> | <sup>GSM8k</sup> | <sup>HumanEval</sup> | <sup>License</sup>| | ----- |------| ---- |------|-------| ----- | ----- | ----- | | <sup>**WizardLM-70B-V1.0**</sup> | <sup>🤗 <a href="https://huggingface.co/WizardLM/WizardLM-70B-V1.0" target="_blank">HF Link</a> </sup>|<sup>📃**Coming Soon**</sup>| <sup>**7.78**</sup> | <sup>**92.91%**</sup> |<sup>**77.6%**</sup> | <sup> **50.6**</sup>|<sup> <a href="https://ai.meta.com/resources/models-and-libraries/llama-downloads/" target="_blank">Llama 2 License </a></sup> | | <sup>WizardLM-13B-V1.2</sup> | <sup>🤗 <a href="https://huggingface.co/WizardLM/WizardLM-13B-V1.2" target="_blank">HF Link</a> </sup>| | <sup>7.06</sup> | <sup>89.17%</sup> |<sup>55.3%</sup> | <sup>36.6 </sup>|<sup> <a href="https://ai.meta.com/resources/models-and-libraries/llama-downloads/" target="_blank">Llama 2 License </a></sup> | | <sup>WizardLM-13B-V1.1</sup> |<sup> 🤗 <a href="https://huggingface.co/WizardLM/WizardLM-13B-V1.1" target="_blank">HF Link</a> </sup> | | <sup>6.76</sup> |<sup>86.32%</sup> | | <sup>25.0 </sup>| <sup>Non-commercial</sup>| | <sup>WizardLM-30B-V1.0</sup> | <sup>🤗 <a href="https://huggingface.co/WizardLM/WizardLM-30B-V1.0" target="_blank">HF Link</a></sup> | | <sup>7.01</sup> | | | <sup>37.8 </sup>| <sup>Non-commercial</sup> | | <sup>WizardLM-13B-V1.0</sup> | <sup>🤗 <a href="https://huggingface.co/WizardLM/WizardLM-13B-V1.0" target="_blank">HF Link</a> </sup> | | <sup>6.35</sup> | <sup>75.31%</sup> | | <sup> 24.0 </sup> | <sup>Non-commercial</sup>| | <sup>WizardLM-7B-V1.0 </sup>| <sup>🤗 <a href="https://huggingface.co/WizardLM/WizardLM-7B-V1.0" target="_blank">HF Link</a> </sup> |<sup> 📃 <a href="https://arxiv.org/abs/2304.12244" target="_blank">[WizardLM]</a> </sup>| | | |<sup>19.1 </sup>|<sup> Non-commercial</sup>| </font> ## Comparing WizardCoder-Python-34B-V1.0 with Other LLMs. 
🔥 The following figure shows that our **WizardCoder-Python-34B-V1.0 attains the second position in this benchmark**, surpassing GPT4 (2023/03/15, 73.2 vs. 67.0), ChatGPT-3.5 (73.2 vs. 72.5), and Claude2 (73.2 vs. 71.2). <p align="center" width="100%"> <a ><img src="https://raw.githubusercontent.com/nlpxucan/WizardLM/main/WizardCoder/imgs/compare_sota.png" alt="WizardCoder" style="width: 96%; min-width: 300px; display: block; margin: auto;"></a> </p> ## Prompt Format ``` "Below is an instruction that describes a task. Write a response that appropriately completes the request.\n\n### Instruction:\n{instruction}\n\n### Response:" ``` ## Inference Demo Script We provide the inference demo code [here](https://github.com/nlpxucan/WizardLM/tree/main/demo). ## Citation Please cite this repository if you use its data, methods, or code. ``` @article{luo2023wizardcoder, title={WizardCoder: Empowering Code Large Language Models with Evol-Instruct}, author={Luo, Ziyang and Xu, Can and Zhao, Pu and Sun, Qingfeng and Geng, Xiubo and Hu, Wenxiang and Tao, Chongyang and Ma, Jing and Lin, Qingwei and Jiang, Daxin}, journal={arXiv preprint arXiv:2306.08568}, year={2023} } ```
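The prompt format above can be reproduced with a small helper. The template string is taken verbatim from the card; the helper name and the example instruction are illustrative, not part of the original:

```python
# WizardCoder prompt template, copied verbatim from the model card.
PROMPT_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n### Response:"
)


def build_prompt(instruction: str) -> str:
    """Fill the template with a user instruction; the model's completion
    is expected to continue after the trailing '### Response:' marker."""
    return PROMPT_TEMPLATE.format(instruction=instruction.strip())


prompt = build_prompt("Write a Python function that reverses a string.")
print(prompt)
```

The generated string is what you would pass to the model (for example via the inference demo scripts linked above).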
8,776
[ [ … embedding vector (raw float values) omitted … ] ]
plasmo/vox2
2023-05-05T11:26:46.000Z
[ "diffusers", "text-to-image", "stable-diffusion", "license:creativeml-openrail-m", "endpoints_compatible", "has_space", "diffusers:StableDiffusionPipeline", "region:us" ]
text-to-image
plasmo
null
null
plasmo/vox2
43
118,257
diffusers
2022-11-25T19:20:36
--- license: creativeml-openrail-m tags: - text-to-image - stable-diffusion widget: - text: "voxel-ish " --- ### Jak's Voxel-ish Image Pack v1.2 for Stable Diffusion Version 1.2 of the Voxel-ish Image Pack, trained on 184 images over 8,000 training steps (20% training text), crafted by Jak_TheAI_Artist. Version history: v1.2 - fine-tuned for better faces. Include the prompt trigger "voxel-ish" to activate the style. Tip: add "intricate detail" to the prompt to make a semi-realistic image. Sample pictures of this concept: ![voxel-ish 0](https://huggingface.co/plasmo/vox2/resolve/main/concept_images/tyson.jpg) ![voxel-ish 1](https://huggingface.co/plasmo/vox2/resolve/main/concept_images/depp.jpg) ![voxel-ish 2](https://huggingface.co/plasmo/vox2/resolve/main/concept_images/pitt.jpg) ![voxel-ish 3](https://huggingface.co/plasmo/vox2/resolve/main/concept_images/ww.jpg) ![voxel-ish 4](https://huggingface.co/plasmo/vox2/resolve/main/concept_images/sm.jpg) ![voxel-ish 5](https://huggingface.co/plasmo/vox2/resolve/main/concept_images/theron.jpg) ![voxel-ish 6](https://huggingface.co/plasmo/vox2/resolve/main/concept_images/watson.jpg) ![voxel-ish 7](https://huggingface.co/plasmo/vox2/resolve/main/concept_images/watson2.jpg) ![voxel-ish 8](https://huggingface.co/plasmo/vox2/resolve/main/concept_images/jc.jpg) ![voxel-ish 9](https://huggingface.co/plasmo/vox2/resolve/main/concept_images/obama.jpg) ![voxel-ish 10](https://huggingface.co/plasmo/vox2/resolve/main/concept_images/musk.jpg) ![voxel-ish 11](https://huggingface.co/plasmo/vox2/resolve/main/concept_images/monroe.jpg)
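The usage tips in the card (the "voxel-ish" trigger word plus the optional "intricate detail" keyword) can be sketched with the diffusers library. The model id `plasmo/vox2` comes from this repository; the helper names, the lazy import, and the CUDA device choice are assumptions for illustration only:

```python
def build_voxel_prompt(subject: str, semi_realistic: bool = False) -> str:
    """Prepend the 'voxel-ish' trigger word; optionally append the
    'intricate detail' tip from the card for a semi-realistic look."""
    prompt = f"voxel-ish {subject}"
    if semi_realistic:
        prompt += ", intricate detail"
    return prompt


def generate(subject: str):
    """Illustrative generation call; requires diffusers, torch, and a GPU."""
    # Imported lazily so build_voxel_prompt stays dependency-free.
    from diffusers import StableDiffusionPipeline

    pipe = StableDiffusionPipeline.from_pretrained("plasmo/vox2")
    pipe = pipe.to("cuda")
    return pipe(build_voxel_prompt(subject, semi_realistic=True)).images[0]
```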
1,817
[ [ … embedding vector (raw float values) omitted … ] ]
klue/bert-base
2023-06-12T12:30:04.000Z
[ "transformers", "pytorch", "safetensors", "bert", "fill-mask", "korean", "klue", "ko", "arxiv:2105.09680", "arxiv:1910.09700", "license:cc-by-sa-4.0", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
fill-mask
klue
null
null
klue/bert-base
23
118,208
transformers
2022-03-02T23:29:05
--- language: ko license: cc-by-sa-4.0 tags: - korean - klue mask_token: "[MASK]" widget: - text: 대한민국의 수도는 [MASK] 입니다. --- # KLUE BERT base ## Table of Contents - [Model Details](#model-details) - [How to Get Started With the Model](#how-to-get-started-with-the-model) - [Uses](#uses) - [Risks, Limitations and Biases](#risks-limitations-and-biases) - [Training](#training) - [Evaluation](#evaluation) - [Environmental Impact](#environmental-impact) - [Technical Specifications](#technical-specifications) - [Citation Information](#citation-information) - [Model Card Authors](#model-card-authors) ## Model Details **Model Description:** KLUE BERT base is a BERT model pre-trained on Korean text. It was developed as part of the [Korean Language Understanding Evaluation (KLUE) Benchmark](https://arxiv.org/pdf/2105.09680.pdf). - **Developed by:** See the [GitHub Repo](https://github.com/KLUE-benchmark/KLUE) for model developers - **Model Type:** Transformer-based language model - **Language(s):** Korean - **License:** cc-by-sa-4.0 - **Parent Model:** See the [BERT base uncased model](https://huggingface.co/bert-base-uncased) for more information about the BERT base model. - **Resources for more information:** - [Research Paper](https://arxiv.org/abs/2105.09680) - [GitHub Repo](https://github.com/KLUE-benchmark/KLUE) ## How to Get Started With the Model ```python from transformers import AutoModel, AutoTokenizer model = AutoModel.from_pretrained("klue/bert-base") tokenizer = AutoTokenizer.from_pretrained("klue/bert-base") ``` ## Uses #### Direct Use The model can be used for tasks including topic classification, semantic textual similarity, natural language inference, named entity recognition, and other tasks outlined in the [KLUE Benchmark](https://github.com/KLUE-benchmark/KLUE). 
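As a minimal usage sketch (an illustrative addition, not part of the original card), the widget sentence from the metadata above can be completed with a fill-mask pipeline. This assumes `transformers` is installed; the weights are downloaded from the Hub on first use:

```python
from transformers import pipeline

def top_predictions(text: str, k: int = 5):
    """Return the top-k (token, score) fill-mask candidates for `text`."""
    fill = pipeline("fill-mask", model="klue/bert-base")
    return [(p["token_str"], p["score"]) for p in fill(text, top_k=k)]

if __name__ == "__main__":
    # Widget sentence from the card: "The capital of South Korea is [MASK]."
    for token, score in top_predictions("대한민국의 수도는 [MASK] 입니다."):
        print(f"{token}\t{score:.3f}")
```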
#### Misuse and Out-of-scope Use The model should not be used to intentionally create hostile or alienating environments for people. In addition, the model was not trained to be factual or true representations of people or events, and therefore using the model to generate such content is out-of-scope for the abilities of this model. ## Risks, Limitations and Biases Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)). The model developers discuss several ethical considerations related to the model in the [paper](https://arxiv.org/pdf/2105.09680.pdf), including: - Bias issues with the publicly available data used in the pretraining corpora (and considerations related to filtering) - PII in the data used in the pretraining corpora (and efforts to pseudonymize the data) For ethical considerations related to the KLUE Benchmark, also see the [paper](https://arxiv.org/pdf/2105.09680.pdf). ## Training #### Training Data The authors use the following pretraining corpora for the model, described in the [associated paper](https://arxiv.org/pdf/2105.09680.pdf): > We gather the following five publicly available Korean corpora from diverse sources to cover a broad set of topics and many different styles. We combine these corpora to build the final pretraining corpus of size approximately 62GB. > > - **MODU:** [Modu Corpus](https://corpus.korean.go.kr) is a collection of Korean corpora distributed by [National Institute of Korean Languages](https://corpus.korean.go.kr/). It includes both formal articles (news and books) and colloquial text (dialogues). > - **CC-100-Kor:** [CC-100](https://data.statmt.org/cc-100/) is the large-scale multilingual web crawled corpora by using CC-Net ([Wenzek et al., 2020](https://www.aclweb.org/anthology/2020.lrec-1.494)). 
This is used for training XLM-R ([Conneau et al., 2020](https://aclanthology.org/2020.acl-main.747/)). We use the Korean portion from this corpus. > - **NAMUWIKI:** NAMUWIKI is a Korean web-based encyclopedia, similar to Wikipedia, but known to be less formal. Specifically, we download [the dump](http://dump.thewiki.kr) created on March 2nd, 2020. > - **NEWSCRAWL:** NEWSCRAWL consists of 12,800,000 news articles published from 2011 to 2020, collected from a news aggregation platform. > - **PETITION:** Petition is a collection of public petitions posted to the Blue House asking for administrative actions on social issues. We use the articles in the [Blue House National Petition](https://www1.president.go.kr/petitions) published from [August 2017 to March 2019](https://ko-nlp.github.io/Korpora/en-docs/corpuslist/korean_petitions.html). The authors also describe ethical considerations related to the pretraining corpora in the [associated paper](https://arxiv.org/pdf/2105.09680.pdf). #### Training Procedure ##### Preprocessing The authors describe their preprocessing procedure in the [associated paper](https://arxiv.org/pdf/2105.09680.pdf): > We filter noisy text and non-Korean text using the same methods from Section 2.3 (of the paper). Each document in the corpus is split into sentences using a C++ implementation (v1.3.1.) of the rule-based [Korean Sentence Splitter (KSS)](https://github.com/likejazz/korean-sentence-splitter). For CC-100-Kor and NEWSCRAWL, we keep sentences of length greater than or equal to 200 characters, as a heuristic to keep well-formed sentences. We then remove sentences included in our benchmark task datasets, using BM25 as a sentence similarity metric ([reference](https://www.microsoft.com/en-us/research/publication/okapi-at-trec-3/)). 
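The 200-character length heuristic quoted above reduces to a one-line filter; here is a toy sketch of just that step (the KSS sentence splitting and BM25 deduplication steps are omitted):

```python
def filter_well_formed(sentences, min_chars=200):
    """Keep sentences with at least `min_chars` characters — the
    well-formedness heuristic applied to CC-100-Kor and NEWSCRAWL."""
    return [s for s in sentences if len(s) >= min_chars]

# A 199-character string is dropped; a 200-character string is kept.
assert filter_well_formed(["가" * 199, "나" * 200]) == ["나" * 200]
```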
###### Tokenization The authors describe their tokenization procedure in the [associated paper](https://arxiv.org/pdf/2105.09680.pdf): > We design and use a new tokenization method, morpheme-based subword tokenization. When building a vocabulary, we pre-tokenize a raw text into morphemes using a morphological analyzer, and then we apply byte pair encoding (BPE) ([Sennrich et al., 2016](https://aclanthology.org/P16-1162/)) to get the final vocabulary. For morpheme segmentation, we use [Mecab-ko](https://bitbucket.org/eunjeon/mecab-ko), MeCab ([Kudo, 2006](https://taku910.github.io/mecab/)) adapted for Korean, and for BPE segmentation, we use the wordpiece tokenizer from [Huggingface Tokenizers library](https://github.com/huggingface/tokenizers). We set the vocabulary size to 32k. After building the vocabulary, we only use the BPE model during inference, which allows us to tokenize a word sequence by reflecting morphemes without a morphological analyzer. This improves both usability and speed. The training configurations are further described in the [paper](https://arxiv.org/pdf/2105.09680.pdf). ## Evaluation #### Testing Data, Factors and Metrics The model was evaluated on the [KLUE Benchmark](https://github.com/KLUE-benchmark/KLUE). The tasks and metrics from the KLUE Benchmark that were used to evaluate this model are described briefly below. For more information about the KLUE Benchmark, see the [data card](https://huggingface.co/datasets/klue), [Github Repository](https://github.com/KLUE-benchmark/KLUE), and [associated paper](https://arxiv.org/pdf/2105.09680.pdf). - **Task:** Topic Classification (TC) - Yonhap News Agency Topic Classification (YNAT), **Metrics:** Macro F1 score, defined as the mean of topic-wise F1 scores, giving the same importance to each topic. 
- **Task:** Semantic Textual Similarity (STS), **Metrics:** Pearson's correlation coefficient (Pearson's r) and F1 score - **Task:** Natural Language Inference (NLI), **Metrics:** Accuracy - **Task:** Named Entity Recognition (NER), **Metrics:** Entity-level macro F1 (Entity F1) and character-level macro F1 (Char F1) scores - **Task:** Relation Extraction (RE), **Metrics:** Micro F1 score on relation existing cases and area under the precision-recall curve (AUPRC) on all classes - **Task:** Dependency Parsing (DP), **Metrics:** Unlabeled attachment score (UAS) and labeled attachment score (LAS) - **Task:** Machine Reading Comprehension (MRC), **Metrics:** Exact match (EM) and character-level ROUGE-W (ROUGE), which can be viewed as a longest common consecutive subsequence (LCCS)-based F1 score. - **Task:** Dialogue State Tracking (DST), **Metrics:** Joint goal accuracy (JGA) and slot micro F1 score (Slot F1) #### Results | Task | TC | STS | | NLI | NER | | RE | | DP | | MRC | | DST | | | :---: |:---: | :---: | :---: |:---:| :---: | :---: |:---:| :---:| :---: |:---: | :---: | :---:| :---: | :---: | | Metric | F1 | Pearson's r| F1 | ACC | Entity F1 | Char F1 | F1 | AUPRC| UAS | LAS | EM | ROUGE| JGA |Slot F1 | | | 85.73| 90.85 | 82.84 |81.63| 83.97 | 91.39 |66.44| 66.17| 89.96 |88.05 | 62.32 | 68.51| 46.64 | 91.61 | ## Environmental Impact Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). We present the hardware type based on the [associated paper](https://arxiv.org/pdf/2105.09680.pdf). - **Hardware Type:** TPU v3-8 - **Hours used:** Unknown - **Cloud Provider:** Unknown - **Compute Region:** Unknown - **Carbon Emitted:** Unknown ## Technical Specifications See the [associated paper](https://arxiv.org/pdf/2105.09680.pdf) for details on the modeling architecture (BERT), objective, compute infrastructure, and training details. 
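The macro F1 used for topic classification above is the unweighted mean of per-class F1 scores; a minimal sketch of that computation (an illustrative implementation, not the official KLUE evaluation script):

```python
from collections import defaultdict

def macro_f1(y_true, y_pred):
    """Mean of per-class F1 scores, each class weighted equally."""
    tp, fp, fn = defaultdict(int), defaultdict(int), defaultdict(int)
    for t, p in zip(y_true, y_pred):
        if t == p:
            tp[t] += 1          # correct prediction for class t
        else:
            fp[p] += 1          # p was predicted but wrong
            fn[t] += 1          # t was missed
    f1s = []
    for c in set(y_true) | set(y_pred):
        prec = tp[c] / (tp[c] + fp[c]) if tp[c] + fp[c] else 0.0
        rec = tp[c] / (tp[c] + fn[c]) if tp[c] + fn[c] else 0.0
        f1s.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    return sum(f1s) / len(f1s)
```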
## Citation Information ```bibtex @misc{park2021klue, title={KLUE: Korean Language Understanding Evaluation}, author={Sungjoon Park and Jihyung Moon and Sungdong Kim and Won Ik Cho and Jiyoon Han and Jangwon Park and Chisung Song and Junseong Kim and Yongsook Song and Taehwan Oh and Joohong Lee and Juhyun Oh and Sungwon Lyu and Younghoon Jeong and Inkwon Lee and Sangwoo Seo and Dongjun Lee and Hyunwoo Kim and Myeonghwa Lee and Seongbo Jang and Seungwon Do and Sunkyoung Kim and Kyungtae Lim and Jongwon Lee and Kyumin Park and Jamin Shin and Seonghyun Kim and Lucy Park and Alice Oh and Jungwoo Ha and Kyunghyun Cho}, year={2021}, eprint={2105.09680}, archivePrefix={arXiv}, primaryClass={cs.CL} } ```
10,415
[ [ -0.03167724609375, -0.043121337890625, 0.020538330078125, 0.0186767578125, -0.031494140625, -0.0092926025390625, -0.033477783203125, -0.0313720703125, 0.010223388671875, 0.0312347412109375, -0.0306243896484375, -0.050201416015625, -0.047088623046875, 0.003368377685546875, 0.0022029876708984375, 0.07818603515625, -0.0131683349609375, 0.0113372802734375, -0.0030803680419921875, -0.008575439453125, -0.02740478515625, -0.0467529296875, -0.03143310546875, -0.019927978515625, 0.0287322998046875, 0.01430511474609375, 0.04486083984375, 0.051239013671875, 0.022552490234375, 0.0198974609375, -0.0072784423828125, -0.01016998291015625, -0.0382080078125, -0.009552001953125, -0.0008831024169921875, -0.03204345703125, -0.03460693359375, 0.019134521484375, 0.03582763671875, 0.06243896484375, 0.0138092041015625, 0.022979736328125, -0.00580596923828125, 0.05902099609375, -0.0489501953125, 0.017822265625, -0.0404052734375, 0.0009264945983886719, -0.019195556640625, 0.018035888671875, -0.032470703125, -0.02239990234375, 0.01483154296875, -0.049560546875, 0.007904052734375, -0.00823974609375, 0.10205078125, 0.012481689453125, -0.029205322265625, -0.036468505859375, -0.0232696533203125, 0.0587158203125, -0.06536865234375, 0.057769775390625, 0.03955078125, 0.0019273757934570312, -0.008209228515625, -0.06732177734375, -0.041534423828125, -0.031280517578125, -0.019287109375, 0.0261077880859375, 0.007171630859375, 0.00830841064453125, 0.021148681640625, 0.0285491943359375, -0.04644775390625, 0.00026726722717285156, -0.036651611328125, -0.025970458984375, 0.0548095703125, -0.0110626220703125, 0.03106689453125, -0.052520751953125, -0.02392578125, -0.02032470703125, -0.033477783203125, 0.0157928466796875, 0.04132080078125, 0.0214385986328125, -0.009552001953125, 0.04638671875, -0.007808685302734375, 0.029205322265625, 0.0174560546875, -0.029754638671875, 0.047454833984375, -0.05120849609375, -0.01418304443359375, 0.00159454345703125, 0.06866455078125, 0.0257415771484375, 
0.0140838623046875, -0.0161590576171875, -0.001361846923828125, 0.00949859619140625, 0.0193023681640625, -0.06304931640625, -0.03289794921875, 0.0257568359375, -0.06640625, -0.01242828369140625, 0.0003643035888671875, -0.062042236328125, 0.0014629364013671875, -0.03204345703125, 0.0255279541015625, -0.0439453125, -0.02947998046875, 0.005031585693359375, -0.0026149749755859375, 0.006481170654296875, -0.00286865234375, -0.040740966796875, 0.01494598388671875, 0.0298004150390625, 0.052215576171875, -0.0192718505859375, -0.02911376953125, -0.011016845703125, -0.0174560546875, -0.016998291015625, 0.0293731689453125, -0.0122833251953125, -0.0280609130859375, -0.0135498046875, 0.0274200439453125, -0.0284576416015625, -0.034820556640625, 0.055877685546875, -0.03485107421875, 0.02972412109375, -0.011932373046875, -0.06561279296875, -0.01324462890625, 0.00281524658203125, -0.044525146484375, 0.09149169921875, 0.02203369140625, -0.058074951171875, 0.027740478515625, -0.06024169921875, -0.023162841796875, 0.0001304149627685547, -0.0003712177276611328, -0.027679443359375, -0.0157928466796875, 0.01113128662109375, 0.03472900390625, 0.004741668701171875, 0.032806396484375, -0.01085662841796875, -0.01092529296875, -0.00641632080078125, -0.007228851318359375, 0.0830078125, 0.0278167724609375, -0.0275726318359375, -0.0009393692016601562, -0.06365966796875, 0.01715087890625, 0.01348876953125, -0.037109375, -0.02777099609375, -0.0246429443359375, 0.0094146728515625, 0.033111572265625, 0.032562255859375, -0.043243408203125, 0.0002493858337402344, -0.03973388671875, 0.012298583984375, 0.03717041015625, -0.015350341796875, 0.049102783203125, -0.01065826416015625, 0.0404052734375, 0.0036907196044921875, 0.01081085205078125, -0.0220184326171875, -0.03271484375, -0.047821044921875, -0.0268096923828125, 0.06298828125, 0.0499267578125, -0.042938232421875, 0.052947998046875, -0.02911376953125, -0.06182861328125, -0.0692138671875, -0.0003376007080078125, 0.04534912109375, 0.03662109375, 
0.03192138671875, -0.00666046142578125, -0.0572509765625, -0.058868408203125, -0.01248931884765625, -0.01236724853515625, -0.0006556510925292969, 0.033050537109375, 0.052459716796875, -0.017547607421875, 0.062744140625, -0.03271484375, -0.0015201568603515625, -0.0171966552734375, 0.009765625, 0.01788330078125, 0.039093017578125, 0.044403076171875, -0.07330322265625, -0.058502197265625, -0.00664520263671875, -0.05059814453125, -0.0244140625, 0.0053253173828125, -0.01080322265625, 0.040374755859375, 0.061859130859375, -0.059783935546875, 0.027008056640625, 0.04632568359375, -0.034820556640625, 0.0692138671875, 0.0019121170043945312, 0.0061492919921875, -0.09613037109375, 0.02459716796875, -0.0136260986328125, 0.0005116462707519531, -0.05950927734375, 0.0085296630859375, 0.01255035400390625, -0.00806427001953125, -0.035308837890625, 0.0604248046875, -0.03619384765625, 0.00923919677734375, -0.0233154296875, 0.0196533203125, -0.00385284423828125, 0.051483154296875, 0.0046844482421875, 0.0467529296875, 0.020263671875, -0.055694580078125, 0.00030803680419921875, 0.017974853515625, -0.0313720703125, 0.018707275390625, -0.04443359375, 0.004062652587890625, -0.01153564453125, 0.021240234375, -0.06134033203125, -0.00885772705078125, 0.033966064453125, -0.04443359375, 0.0283660888671875, -0.0179595947265625, -0.0312347412109375, -0.031524658203125, -0.031646728515625, 0.02117919921875, 0.037078857421875, -0.0294189453125, 0.0445556640625, 0.0355224609375, -0.015777587890625, -0.0582275390625, -0.035980224609375, -0.00496673583984375, -0.00965118408203125, -0.049560546875, 0.051666259765625, -0.00925445556640625, -0.00948333740234375, 0.027099609375, -0.00475311279296875, -0.0022029876708984375, 0.0007505416870117188, 0.0169830322265625, 0.02813720703125, -0.02197265625, 0.01050567626953125, -0.0022525787353515625, -0.002323150634765625, -0.005504608154296875, -0.01261138916015625, 0.062744140625, -0.00763702392578125, -0.006320953369140625, -0.024017333984375, 0.0135498046875, 
0.039337158203125, -0.004619598388671875, 0.054229736328125, 0.06280517578125, -0.0283203125, 0.025787353515625, -0.04351806640625, -0.00962066650390625, -0.032379150390625, 0.05316162109375, -0.0305023193359375, -0.063720703125, 0.035369873046875, -0.0023097991943359375, 0.0036163330078125, 0.04681396484375, 0.042022705078125, -0.00022661685943603516, 0.060302734375, 0.04010009765625, -0.0233154296875, 0.0291900634765625, -0.017303466796875, 0.031768798828125, -0.06414794921875, -0.016845703125, -0.04034423828125, -0.0126190185546875, -0.0692138671875, -0.0231475830078125, -0.0013704299926757812, 0.0333251953125, -0.01445770263671875, 0.04443359375, -0.0233154296875, 0.016082763671875, 0.040130615234375, -0.0095672607421875, -0.001201629638671875, -0.0125732421875, -0.022430419921875, -0.019073486328125, -0.05657958984375, -0.05230712890625, 0.0904541015625, 0.035888671875, 0.022125244140625, -0.032135009765625, 0.056732177734375, 0.002582550048828125, 0.0107269287109375, -0.054168701171875, 0.046417236328125, -0.0216522216796875, -0.046600341796875, -0.0300750732421875, -0.037567138671875, -0.0888671875, 0.032928466796875, -0.00794219970703125, -0.05426025390625, 0.0157470703125, -0.00363922119140625, -0.01284027099609375, 0.017425537109375, -0.06561279296875, 0.088134765625, -0.01287078857421875, 0.0086517333984375, -0.0026531219482421875, -0.0552978515625, 0.0223388671875, -0.005916595458984375, 0.01549530029296875, -0.010009765625, -0.00128173828125, 0.06500244140625, -0.032196044921875, 0.050750732421875, -0.02642822265625, 0.0044708251953125, 0.0194091796875, -0.0235137939453125, 0.0294342041015625, -0.006473541259765625, 0.00017189979553222656, 0.03216552734375, 0.006244659423828125, -0.0297393798828125, -0.032073974609375, 0.039093017578125, -0.0628662109375, -0.0210418701171875, -0.022125244140625, -0.04681396484375, -0.0018854141235351562, 0.03021240234375, 0.040679931640625, 0.004673004150390625, -0.00691986083984375, 0.0182952880859375, 
0.037567138671875, -0.036712646484375, 0.02105712890625, 0.036956787109375, -0.0240631103515625, -0.0391845703125, 0.06561279296875, 0.028961181640625, 0.016693115234375, 0.01274871826171875, 0.00775146484375, -0.02947998046875, -0.0304412841796875, -0.023773193359375, 0.031982421875, -0.054718017578125, 0.0006113052368164062, -0.07208251953125, -0.037628173828125, -0.048309326171875, 0.000640869140625, -0.034881591796875, -0.0295257568359375, -0.0160369873046875, -0.006244659423828125, 0.00199127197265625, 0.0202789306640625, -0.00025582313537597656, 0.020050048828125, -0.050048828125, 0.0292510986328125, -0.0020122528076171875, 0.0178680419921875, -0.002269744873046875, -0.048370361328125, -0.023223876953125, 0.01007080078125, -0.016998291015625, -0.052001953125, 0.0252838134765625, -0.0030574798583984375, 0.053558349609375, 0.00846099853515625, 0.0056915283203125, 0.045501708984375, -0.03955078125, 0.07623291015625, 0.0024662017822265625, -0.060760498046875, 0.03826904296875, -0.01076507568359375, 0.0379638671875, 0.04974365234375, 0.051666259765625, -0.050933837890625, -0.034454345703125, -0.06390380859375, -0.08587646484375, 0.0615234375, 0.02203369140625, 0.0103912353515625, -0.004512786865234375, 0.03173828125, 0.008331298828125, 0.019775390625, -0.07513427734375, -0.040679931640625, -0.02960205078125, -0.023406982421875, 0.003887176513671875, -0.0270843505859375, 0.0179443359375, -0.0291290283203125, 0.0770263671875, 0.006435394287109375, 0.0273284912109375, 0.01959228515625, -0.03216552734375, 0.00806427001953125, 0.0161285400390625, 0.043701171875, 0.04107666015625, -0.01358795166015625, -0.0037555694580078125, 0.01554107666015625, -0.06689453125, -0.000885009765625, 0.0256805419921875, -0.0256195068359375, 0.0225677490234375, 0.0198974609375, 0.069580078125, 0.0215301513671875, -0.053436279296875, 0.033294677734375, 0.0039825439453125, -0.0350341796875, -0.01439666748046875, -0.016845703125, 0.0175628662109375, 0.007568359375, 0.015777587890625, 
-0.011322021484375, -0.0089874267578125, -0.0280609130859375, 0.0158233642578125, 0.0081024169921875, -0.015228271484375, -0.0275726318359375, 0.037811279296875, 0.0002815723419189453, -0.01023101806640625, 0.040771484375, -0.04107666015625, -0.054595947265625, 0.04876708984375, 0.035919189453125, 0.059478759765625, -0.020965576171875, 0.025421142578125, 0.053558349609375, 0.026031494140625, -0.00412750244140625, 0.0248870849609375, 0.01403045654296875, -0.057769775390625, -0.039764404296875, -0.05615234375, 0.0033130645751953125, 0.045135498046875, -0.037506103515625, 0.0202178955078125, -0.0154571533203125, -0.0120391845703125, 0.00228118896484375, 0.0262603759765625, -0.042938232421875, 0.01885986328125, 0.007232666015625, 0.06878662109375, -0.057037353515625, 0.057861328125, 0.05841064453125, -0.03631591796875, -0.06353759765625, 0.007389068603515625, -0.023193359375, -0.0455322265625, 0.057708740234375, 0.0169525146484375, 0.02734375, -0.01235198974609375, -0.0286407470703125, -0.0745849609375, 0.0823974609375, 0.02655029296875, -0.03729248046875, -0.000021517276763916016, 0.0141448974609375, 0.046905517578125, -0.0258636474609375, 0.0015659332275390625, 0.040740966796875, 0.0421142578125, -0.0254364013671875, -0.08148193359375, 0.01181793212890625, -0.034149169921875, -0.0013427734375, 0.00992584228515625, -0.044525146484375, 0.07135009765625, 0.006618499755859375, -0.0205078125, 0.00008940696716308594, 0.045440673828125, 0.024627685546875, 0.03326416015625, 0.05206298828125, 0.052947998046875, 0.07366943359375, 0.006397247314453125, 0.074462890625, -0.04595947265625, 0.018646240234375, 0.08978271484375, -0.003665924072265625, 0.05078125, 0.0219268798828125, -0.019073486328125, 0.033966064453125, 0.0516357421875, -0.01345062255859375, 0.046295166015625, 0.005611419677734375, 0.0021877288818359375, 0.0043792724609375, -0.01317596435546875, -0.030792236328125, 0.0345458984375, 0.0140228271484375, -0.03741455078125, -0.012298583984375, 0.009857177734375, 
0.031829833984375, 0.0012369155883789062, -0.0262908935546875, 0.03857421875, 0.00930023193359375, -0.057586669921875, 0.0400390625, 0.007251739501953125, 0.059661865234375, -0.05133056640625, 0.0206451416015625, -0.0003247261047363281, 0.0027408599853515625, -0.01291656494140625, -0.042266845703125, 0.0170135498046875, -0.005245208740234375, -0.028594970703125, -0.00948333740234375, 0.076171875, -0.045501708984375, -0.03131103515625, 0.032501220703125, 0.0421142578125, 0.0259246826171875, -0.005451202392578125, -0.07073974609375, -0.00543975830078125, 0.0018987655639648438, -0.036590576171875, 0.0404052734375, 0.0272216796875, -0.01561737060546875, 0.035125732421875, 0.052947998046875, 0.00188446044921875, 0.0185089111328125, 0.012237548828125, 0.055084228515625, -0.0374755859375, -0.032928466796875, -0.0596923828125, 0.031494140625, -0.015655517578125, -0.0321044921875, 0.0665283203125, 0.05426025390625, 0.09326171875, -0.0184173583984375, 0.06524658203125, -0.0208282470703125, 0.041961669921875, -0.03466796875, 0.06292724609375, -0.03607177734375, -0.01143646240234375, -0.033355712890625, -0.06304931640625, -0.0010938644409179688, 0.042510986328125, -0.02362060546875, 0.0142059326171875, 0.058197021484375, 0.057098388671875, 0.0014629364013671875, -0.0036373138427734375, 0.01593017578125, 0.0221710205078125, -0.000030875205993652344, 0.02313232421875, 0.03302001953125, -0.05584716796875, 0.055999755859375, -0.0400390625, 0.00022029876708984375, -0.008148193359375, -0.061126708984375, -0.0693359375, -0.05523681640625, -0.0295867919921875, -0.0248870849609375, 0.0032634735107421875, 0.07855224609375, 0.043609619140625, -0.06787109375, -0.0209808349609375, -0.00099945068359375, -0.0005769729614257812, -0.0131072998046875, -0.0208282470703125, 0.055511474609375, -0.042999267578125, -0.0560302734375, 0.017974853515625, 0.00258636474609375, 0.001129150390625, -0.003047943115234375, -0.0223388671875, -0.041351318359375, -0.009368896484375, 0.057220458984375, 
0.0171051025390625, -0.05230712890625, -0.00312042236328125, 0.0216217041015625, -0.0197906494140625, 0.0006265640258789062, 0.04449462890625, -0.044036865234375, 0.02130126953125, 0.032501220703125, 0.05108642578125, 0.03857421875, -0.00910186767578125, 0.0279388427734375, -0.0478515625, 0.008575439453125, 0.01244354248046875, 0.02752685546875, 0.01534271240234375, -0.02899169921875, 0.056640625, 0.031829833984375, -0.04498291015625, -0.07037353515625, 0.00012445449829101562, -0.06610107421875, -0.0343017578125, 0.09033203125, -0.0287628173828125, -0.0242462158203125, -0.0265655517578125, -0.0312347412109375, 0.03765869140625, -0.01617431640625, 0.058746337890625, 0.07916259765625, 0.01064300537109375, -0.0003523826599121094, -0.057769775390625, 0.04632568359375, 0.02984619140625, -0.05389404296875, 0.0011014938354492188, 0.01255035400390625, 0.0287322998046875, 0.026947021484375, 0.06329345703125, -0.0218963623046875, 0.0249176025390625, -0.003597259521484375, 0.0203399658203125, 0.00513458251953125, -0.0057373046875, -0.0092315673828125, 0.002166748046875, -0.026580810546875, -0.0137481689453125 ] ]
SG161222/Realistic_Vision_V3.0_VAE
2023-06-27T06:16:14.000Z
[ "diffusers", "license:creativeml-openrail-m", "endpoints_compatible", "has_space", "diffusers:StableDiffusionPipeline", "region:us" ]
null
SG161222
null
null
SG161222/Realistic_Vision_V3.0_VAE
71
117,781
diffusers
2023-06-13T12:46:41
--- license: creativeml-openrail-m --- <b>Please read this!</b><br> The necessary VAE is already baked into the model.<br> <hr/> <b>The recommended negative prompt:</b><br> (deformed iris, deformed pupils, semi-realistic, cgi, 3d, render, sketch, cartoon, drawing, anime:1.4), text, close up, cropped, out of frame, worst quality, low quality, jpeg artifacts, ugly, duplicate, morbid, mutilated, extra fingers, mutated hands, poorly drawn hands, poorly drawn face, mutation, deformed, blurry, dehydrated, bad anatomy, bad proportions, extra limbs, cloned face, disfigured, gross proportions, malformed limbs, missing arms, missing legs, extra arms, extra legs, fused fingers, too many fingers, long neck<br> <b>OR</b><br> (deformed iris, deformed pupils, semi-realistic, cgi, 3d, render, sketch, cartoon, drawing, anime, mutated hands and fingers:1.4), (deformed, distorted, disfigured:1.3), poorly drawn, bad anatomy, wrong anatomy, extra limb, missing limb, floating limbs, disconnected limbs, mutation, mutated, ugly, disgusting, amputation <b>Recommended parameters for generation:</b><br> Euler A or DPM++ SDE Karras<br> CFG Scale 3.5 - 7<br> Hires. fix with 4x-UltraSharp upscaler<br> 0 Hires steps and Denoising strength 0.25-0.45<br> Upscale by 1.1-2.0
1,268
[ [ -0.045501708984375, -0.058074951171875, 0.047607421875, 0.0167388916015625, -0.03155517578125, -0.01812744140625, 0.03662109375, -0.0162200927734375, 0.026824951171875, 0.063232421875, -0.0657958984375, -0.032012939453125, -0.03289794921875, -0.0007181167602539062, -0.033782958984375, 0.05364990234375, 0.01334381103515625, 0.01311492919921875, 0.0005173683166503906, 0.0421142578125, -0.055694580078125, -0.0122528076171875, -0.03460693359375, -0.01074981689453125, 0.0180816650390625, 0.052032470703125, 0.037567138671875, 0.0499267578125, -0.002948760986328125, 0.02008056640625, -0.0014200210571289062, 0.0022487640380859375, -0.04290771484375, -0.000041604042053222656, -0.0003509521484375, -0.02764892578125, -0.04827880859375, -0.00698089599609375, 0.032501220703125, 0.0255584716796875, 0.0018310546875, 0.022552490234375, -0.00377655029296875, 0.040863037109375, -0.050201416015625, -0.0153961181640625, -0.0111236572265625, 0.00868988037109375, -0.018890380859375, -0.0197906494140625, -0.031005859375, -0.0189971923828125, -0.03668212890625, -0.060577392578125, 0.0419921875, -0.0009207725524902344, 0.08544921875, 0.002094268798828125, -0.02496337890625, 0.022857666015625, -0.0650634765625, 0.039215087890625, -0.049407958984375, 0.006870269775390625, 0.01983642578125, 0.02685546875, -0.01526641845703125, -0.0604248046875, -0.05816650390625, -0.0079345703125, 0.017425537109375, 0.01468658447265625, -0.025115966796875, 0.0096588134765625, 0.055877685546875, 0.0276031494140625, -0.045257568359375, -0.00745391845703125, -0.035369873046875, -0.0172882080078125, 0.054107666015625, 0.036102294921875, 0.046051025390625, -0.0150299072265625, -0.049835205078125, -0.03521728515625, -0.053985595703125, -0.0095672607421875, 0.04644775390625, 0.005962371826171875, -0.020111083984375, 0.0523681640625, 0.00611114501953125, 0.0491943359375, 0.04010009765625, 0.0005655288696289062, 0.020111083984375, -0.02716064453125, -0.016845703125, -0.0178680419921875, 0.048553466796875, 
0.0599365234375, 0.01409912109375, 0.0257110595703125, 0.005298614501953125, 0.00855255126953125, 0.044342041015625, -0.082275390625, -0.01898193359375, 0.01416015625, -0.0296173095703125, -0.0211029052734375, -0.0230865478515625, -0.1002197265625, -0.018646240234375, -0.0203704833984375, 0.044525146484375, -0.0274810791015625, -0.0157318115234375, 0.00978851318359375, -0.0224609375, 0.03179931640625, 0.032073974609375, -0.076416015625, -0.00019824504852294922, 0.006473541259765625, 0.03192138671875, 0.040740966796875, -0.002681732177734375, -0.006916046142578125, 0.01727294921875, -0.03863525390625, 0.060302734375, -0.033111572265625, -0.053375244140625, -0.0164642333984375, 0.02349853515625, 0.007572174072265625, -0.0203857421875, 0.059478759765625, -0.0171051025390625, 0.019317626953125, -0.0264129638671875, -0.032470703125, -0.0279083251953125, -0.01537322998046875, -0.050689697265625, 0.0401611328125, 0.03485107421875, -0.05023193359375, 0.034912109375, -0.024627685546875, -0.01324462890625, 0.0208892822265625, -0.01320648193359375, -0.041473388671875, 0.016326904296875, 0.0260772705078125, 0.03515625, -0.0177154541015625, -0.0016231536865234375, -0.0306549072265625, -0.028167724609375, -0.0187835693359375, -0.044403076171875, 0.06488037109375, 0.0062103271484375, -0.03985595703125, 0.005229949951171875, -0.08184814453125, 0.0011892318725585938, 0.0318603515625, 0.01081085205078125, 0.0064544677734375, -0.01398468017578125, 0.022735595703125, 0.015960693359375, 0.020660400390625, -0.050445556640625, 0.01549530029296875, -0.031494140625, 0.001049041748046875, 0.0650634765625, 0.03387451171875, 0.01398468017578125, -0.0310821533203125, 0.06732177734375, 0.00411224365234375, 0.02392578125, -0.01290130615234375, -0.0401611328125, -0.0787353515625, -0.015625, -0.01186370849609375, 0.0281982421875, -0.06329345703125, 0.0274200439453125, 0.032745361328125, -0.0491943359375, -0.0212249755859375, -0.006717681884765625, 0.034912109375, 0.0440673828125, 0.02899169921875, 
-0.0268096923828125, -0.0455322265625, -0.0775146484375, 0.0165557861328125, -0.00652313232421875, -0.0168914794921875, 0.01110076904296875, 0.030364990234375, -0.004364013671875, 0.0302734375, -0.043121337890625, -0.010986328125, -0.0010089874267578125, -0.00164794921875, 0.01837158203125, 0.041534423828125, 0.053192138671875, -0.047088623046875, -0.0204620361328125, -0.013092041015625, -0.04315185546875, -0.003620147705078125, -0.021881103515625, -0.022857666015625, -0.016357421875, 0.018829345703125, -0.046234130859375, 0.0452880859375, 0.02996826171875, -0.06048583984375, 0.06634521484375, -0.0227508544921875, 0.0281982421875, -0.0887451171875, 0.0110015869140625, 0.007274627685546875, -0.0130462646484375, -0.04498291015625, 0.026092529296875, 0.0029659271240234375, -0.0279693603515625, -0.0511474609375, 0.049652099609375, -0.043426513671875, 0.0180816650390625, -0.01161956787109375, -0.0066375732421875, 0.0193328857421875, 0.0257720947265625, -0.01352691650390625, 0.0533447265625, 0.050567626953125, -0.057403564453125, 0.04864501953125, 0.02227783203125, -0.004787445068359375, 0.04901123046875, -0.06072998046875, 0.00431060791015625, -0.017547607421875, -0.00420379638671875, -0.060577392578125, -0.0389404296875, 0.0369873046875, -0.04364013671875, 0.036834716796875, 0.01947021484375, -0.0158538818359375, -0.049560546875, -0.02691650390625, 0.0298309326171875, 0.056854248046875, -0.037750244140625, 0.0484619140625, 0.00972747802734375, 0.004161834716796875, -0.01297760009765625, -0.0474853515625, 0.001926422119140625, -0.0212249755859375, -0.04461669921875, 0.033355712890625, -0.0187530517578125, 0.0115966796875, -0.0034332275390625, 0.01025390625, -0.01190948486328125, -0.01462554931640625, 0.0179443359375, 0.01788330078125, -0.0295562744140625, -0.04022216796875, 0.019989013671875, -0.01522064208984375, 0.0120086669921875, 0.024627685546875, 0.0439453125, 0.0109710693359375, -0.037384033203125, -0.0469970703125, 0.0377197265625, 0.06304931640625, 
0.009033203125, 0.0164337158203125, 0.061614990234375, -0.04736328125, 0.0005288124084472656, -0.033782958984375, -0.0171051025390625, -0.035064697265625, 0.0161895751953125, -0.0274200439453125, -0.0309600830078125, 0.046783447265625, 0.0110015869140625, -0.0159149169921875, 0.06427001953125, 0.05267333984375, -0.0107421875, 0.11279296875, 0.05474853515625, 0.02728271484375, 0.03369140625, -0.036529541015625, 0.003910064697265625, -0.0654296875, -0.0288543701171875, -0.0218505859375, -0.0188140869140625, -0.04644775390625, -0.03076171875, 0.03125, 0.0223388671875, -0.035797119140625, 0.04132080078125, -0.040496826171875, 0.0379638671875, 0.04302978515625, 0.0248260498046875, -0.00946807861328125, 0.0067291259765625, -0.005184173583984375, -0.01146697998046875, -0.037139892578125, -0.046234130859375, 0.054595947265625, 0.015716552734375, 0.04791259765625, 0.0006093978881835938, 0.052734375, -0.00733184814453125, -0.0025539398193359375, -0.027679443359375, 0.053375244140625, -0.027374267578125, -0.0716552734375, -0.010650634765625, -0.00624847412109375, -0.07220458984375, 0.010162353515625, -0.03375244140625, -0.07684326171875, 0.048370361328125, 0.0233917236328125, -0.03985595703125, 0.026275634765625, -0.058868408203125, 0.061859130859375, -0.01425933837890625, -0.042205810546875, -0.00670623779296875, -0.028961181640625, 0.046142578125, 0.0009303092956542969, -0.01023101806640625, 0.005886077880859375, 0.005615234375, 0.0419921875, -0.03900146484375, 0.06396484375, -0.0259552001953125, 0.0135650634765625, 0.03936767578125, 0.00885772705078125, 0.0145111083984375, 0.0195770263671875, 0.00412750244140625, -0.00765228271484375, 0.007965087890625, -0.041534423828125, -0.033477783203125, 0.043548583984375, -0.052093505859375, -0.05755615234375, -0.04022216796875, -0.0131683349609375, 0.002696990966796875, 0.01837158203125, 0.061004638671875, 0.035552978515625, -0.0122528076171875, -0.002750396728515625, 0.042266845703125, -0.029327392578125, 0.0343017578125, 
0.025909423828125, -0.0251617431640625, -0.032470703125, 0.069091796875, 0.004970550537109375, 0.025054931640625, -0.0025272369384765625, 0.0108642578125, -0.0028705596923828125, -0.0130462646484375, -0.070068359375, 0.026275634765625, -0.034576416015625, -0.0222015380859375, -0.029083251953125, -0.02349853515625, -0.0251312255859375, -0.02923583984375, -0.0309295654296875, 0.0025005340576171875, -0.0723876953125, -0.0026569366455078125, 0.038055419921875, 0.048095703125, -0.01445770263671875, 0.0166778564453125, -0.0625, 0.04998779296875, 0.0182342529296875, 0.00849151611328125, 0.01082611083984375, -0.03662109375, 0.0015773773193359375, 0.00902557373046875, -0.0546875, -0.07965087890625, 0.042755126953125, -0.0198516845703125, 0.0421142578125, 0.039794921875, -0.005741119384765625, 0.07794189453125, -0.037353515625, 0.0894775390625, 0.043304443359375, -0.052276611328125, 0.04541015625, -0.044830322265625, 0.02215576171875, 0.0270538330078125, 0.0206756591796875, -0.01885986328125, -0.0216522216796875, -0.0828857421875, -0.0780029296875, 0.03948974609375, 0.00902557373046875, 0.04913330078125, -0.011199951171875, 0.0286407470703125, 0.007610321044921875, 0.002704620361328125, -0.053192138671875, -0.02593994140625, -0.0197601318359375, 0.0190277099609375, 0.00995635986328125, -0.04351806640625, 0.0095062255859375, -0.045989990234375, 0.06683349609375, 0.01264190673828125, 0.041534423828125, 0.006374359130859375, 0.040374755859375, -0.037200927734375, -0.0013675689697265625, 0.051116943359375, 0.0411376953125, -0.0245361328125, -0.017578125, -0.0008544921875, -0.0386962890625, 0.0250244140625, -0.01226806640625, -0.029327392578125, 0.01435089111328125, 0.02923583984375, 0.07501220703125, -0.005107879638671875, -0.02020263671875, 0.0406494140625, -0.0131683349609375, -0.0213165283203125, -0.034210205078125, 0.027191162109375, -0.002162933349609375, 0.01056671142578125, 0.0107879638671875, 0.0178070068359375, 0.01435089111328125, -0.0308685302734375, 0.01318359375, 
0.00514984130859375, -0.0173187255859375, -0.0222320556640625, 0.061767578125, 0.0147552490234375, -0.0253753662109375, 0.036102294921875, -0.0258026123046875, -0.0217742919921875, 0.06402587890625, 0.055938720703125, 0.06927490234375, -0.03924560546875, 0.034271240234375, 0.0687255859375, 0.0162506103515625, -0.00200653076171875, 0.049835205078125, 0.0198974609375, -0.0274658203125, -0.01922607421875, -0.046051025390625, -0.02117919921875, 0.043914794921875, -0.056427001953125, 0.057830810546875, -0.049072265625, -0.002010345458984375, -0.004055023193359375, -0.021942138671875, -0.052825927734375, 0.055877685546875, 0.0146636962890625, 0.044189453125, -0.08319091796875, 0.0276641845703125, 0.03411865234375, -0.0675048828125, -0.06915283203125, -0.0011415481567382812, 0.005489349365234375, -0.055511474609375, 0.00809478759765625, 0.00902557373046875, 0.002719879150390625, 0.01398468017578125, -0.043701171875, -0.06292724609375, 0.062225341796875, 0.046539306640625, -0.06134033203125, -0.01457977294921875, -0.0275421142578125, 0.048797607421875, -0.0194244384765625, 0.0282745361328125, 0.0196075439453125, 0.03570556640625, 0.01091766357421875, -0.0318603515625, 0.01018524169921875, -0.0325927734375, 0.027374267578125, 0.0147552490234375, -0.045867919921875, 0.07220458984375, -0.02911376953125, -0.03607177734375, 0.0241851806640625, 0.044036865234375, 0.0140533447265625, 0.0250701904296875, 0.036407470703125, 0.04559326171875, 0.025299072265625, 0.01983642578125, 0.0870361328125, -0.0220794677734375, 0.0133209228515625, 0.05804443359375, 0.0102996826171875, 0.0401611328125, 0.02667236328125, -0.01468658447265625, 0.04119873046875, 0.072265625, -0.03558349609375, 0.0312042236328125, 0.0219879150390625, -0.015625, -0.0265045166015625, 0.0023708343505859375, -0.05377197265625, 0.037139892578125, 0.0271453857421875, -0.03497314453125, -0.021148681640625, 0.0245361328125, -0.01116943359375, 0.022796630859375, -0.0251007080078125, 0.039459228515625, -0.0017957687377929688, 
-0.0257720947265625, 0.05023193359375, -0.0023040771484375, 0.0211639404296875, -0.0269927978515625, -0.0362548828125, -0.014434814453125, -0.002841949462890625, -0.0222625732421875, -0.0518798828125, 0.01557159423828125, 0.0003788471221923828, -0.035552978515625, -0.010284423828125, 0.045989990234375, -0.00849151611328125, -0.07598876953125, 0.0246429443359375, 0.0142974853515625, 0.025909423828125, 0.0133514404296875, -0.051483154296875, -0.006591796875, 0.0152435302734375, -0.03411865234375, 0.006557464599609375, 0.0138702392578125, 0.01003265380859375, 0.0230560302734375, 0.032257080078125, 0.013458251953125, -0.012725830078125, 0.0224609375, 0.0699462890625, -0.044921875, -0.0260467529296875, -0.055328369140625, 0.06646728515625, 0.004283905029296875, -0.037506103515625, 0.049560546875, 0.024322509765625, 0.0634765625, -0.035125732421875, 0.03277587890625, -0.01137542724609375, 0.0170135498046875, -0.052978515625, 0.045501708984375, -0.058319091796875, -0.01027679443359375, -0.04266357421875, -0.090087890625, -0.007503509521484375, 0.06396484375, -0.0002130270004272461, 0.03778076171875, 0.054901123046875, 0.06982421875, -0.005031585693359375, -0.006450653076171875, 0.04315185546875, 0.01531219482421875, 0.00902557373046875, 0.045257568359375, 0.03863525390625, -0.038665771484375, 0.006649017333984375, -0.038116455078125, -0.0239105224609375, -0.01555633544921875, -0.058380126953125, -0.0421142578125, -0.05242919921875, -0.034637451171875, -0.044525146484375, -0.005344390869140625, 0.037689208984375, 0.057464599609375, -0.047821044921875, 0.0054931640625, -0.023040771484375, -0.0147857666015625, -0.007389068603515625, -0.022308349609375, 0.00002777576446533203, 0.027008056640625, -0.059539794921875, 0.01629638671875, 0.00727081298828125, 0.0266876220703125, -0.033782958984375, 0.01380157470703125, -0.0157318115234375, 0.005519866943359375, 0.0257568359375, 0.0243988037109375, -0.0489501953125, -0.041778564453125, -0.025299072265625, 0.00030517578125, 
-0.004486083984375, 0.028778076171875, -0.03277587890625, 0.06451416015625, 0.0345458984375, 0.0068206787109375, 0.055938720703125, 0.00171661376953125, 0.0465087890625, -0.046905517578125, 0.0042877197265625, 0.0298309326171875, 0.024139404296875, 0.0183258056640625, -0.06414794921875, 0.01113128662109375, 0.0232391357421875, -0.03643798828125, -0.0584716796875, 0.035247802734375, -0.07501220703125, -0.0196533203125, 0.08074951171875, 0.0184326171875, -0.033050537109375, 0.0196075439453125, -0.04510498046875, 0.03729248046875, -0.016937255859375, 0.038055419921875, 0.04351806640625, -0.0195159912109375, -0.0196533203125, -0.04498291015625, 0.0190582275390625, 0.014129638671875, -0.049560546875, -0.00611114501953125, 0.0543212890625, 0.014495849609375, 0.025238037109375, 0.04718017578125, -0.0186920166015625, 0.04388427734375, 0.045196533203125, 0.042724609375, -0.0122528076171875, -0.0199737548828125, -0.027374267578125, -0.00008553266525268555, -0.0045928955078125, -0.0283355712890625 ] ]
suno/bark-small
2023-10-04T14:17:59.000Z
[ "transformers", "pytorch", "bark", "text-to-audio", "audio", "text-to-speech", "en", "de", "es", "fr", "hi", "it", "ja", "ko", "pl", "pt", "ru", "tr", "zh", "license:mit", "endpoints_compatible", "has_space", "region:us" ]
text-to-speech
suno
null
null
suno/bark-small
51
117,367
transformers
2023-07-18T13:50:46
---
language:
- en
- de
- es
- fr
- hi
- it
- ja
- ko
- pl
- pt
- ru
- tr
- zh
thumbnail: >-
  https://user-images.githubusercontent.com/5068315/230698495-cbb1ced9-c911-4c9a-941d-a1a4a1286ac6.png
library: bark
license: mit
tags:
- bark
- audio
- text-to-speech
duplicated_from: ylacombe/bark-small
pipeline_tag: text-to-speech
---

# Bark

Bark is a transformer-based text-to-audio model created by [Suno](https://www.suno.ai). Bark can generate highly realistic, multilingual speech as well as other audio - including music, background noise and simple sound effects. The model can also produce nonverbal communications like laughing, sighing and crying. To support the research community, we are providing access to pretrained model checkpoints ready for inference.

The original github repo and model card can be found [here](https://github.com/suno-ai/bark).

This model is meant for research purposes only. The model output is not censored and the authors do not endorse the opinions in the generated content. Use at your own risk.

Two checkpoints are released:
- [**small** (this checkpoint)](https://huggingface.co/suno/bark-small)
- [large](https://huggingface.co/suno/bark)

## Example

Try out Bark yourself!

* Bark Colab: <a target="_blank" href="https://colab.research.google.com/drive/1eJfA2XUa-mXwdMy7DoYKVYHI1iTd9Vkt?usp=sharing"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/></a>
* Hugging Face Colab: <a target="_blank" href="https://colab.research.google.com/drive/1dWWkZzvu7L9Bunq9zvD-W02RFUXoW-Pd?usp=sharing"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/></a>
* Hugging Face Demo: <a target="_blank" href="https://huggingface.co/spaces/suno/bark"><img src="https://huggingface.co/datasets/huggingface/badges/raw/main/open-in-hf-spaces-sm.svg" alt="Open in HuggingFace"/></a>

You can run Bark locally with the 🤗 Transformers library from version 4.31.0 onwards.

1. First install the 🤗 [Transformers library](https://github.com/huggingface/transformers) and scipy:

```
pip install --upgrade pip
pip install --upgrade transformers scipy
```

2. Run inference via the `Text-to-Speech` (TTS) pipeline. You can infer the Bark model via the TTS pipeline in just a few lines of code!

```python
from transformers import pipeline
import scipy

synthesiser = pipeline("text-to-speech", "suno/bark-small")

speech = synthesiser("Hello, my dog is cooler than you!", forward_params={"do_sample": True})

scipy.io.wavfile.write("bark_out.wav", rate=speech["sampling_rate"], data=speech["audio"])
```

3. Run inference via the Transformers modelling code. You can use the processor + generate code to convert text into a mono 24 kHz speech waveform for more fine-grained control.

```python
from transformers import AutoProcessor, AutoModel

processor = AutoProcessor.from_pretrained("suno/bark-small")
model = AutoModel.from_pretrained("suno/bark-small")

inputs = processor(
    text=["Hello, my name is Suno. And, uh — and I like pizza. [laughs] But I also have other interests such as playing tic tac toe."],
    return_tensors="pt",
)

speech_values = model.generate(**inputs, do_sample=True)
```

4. Listen to the speech samples either in an ipynb notebook:

```python
from IPython.display import Audio

sampling_rate = model.generation_config.sample_rate
Audio(speech_values.cpu().numpy().squeeze(), rate=sampling_rate)
```

Or save them as a `.wav` file using a third-party library, e.g. `scipy`:

```python
import scipy

sampling_rate = model.generation_config.sample_rate
scipy.io.wavfile.write("bark_out.wav", rate=sampling_rate, data=speech_values.cpu().numpy().squeeze())
```

For more details on using the Bark model for inference using the 🤗 Transformers library, refer to the [Bark docs](https://huggingface.co/docs/transformers/model_doc/bark).

## Suno Usage

You can also run Bark locally through the original [Bark library](https://github.com/suno-ai/bark):

1. First install the [`bark` library](https://github.com/suno-ai/bark).

2. Run the following Python code:

```python
from bark import SAMPLE_RATE, generate_audio, preload_models
from IPython.display import Audio

# download and load all models
preload_models()

# generate audio from text
text_prompt = """
     Hello, my name is Suno. And, uh — and I like pizza. [laughs]
     But I also have other interests such as playing tic tac toe.
"""
speech_array = generate_audio(text_prompt)

# play text in notebook
Audio(speech_array, rate=SAMPLE_RATE)
```

[pizza.webm](https://user-images.githubusercontent.com/5068315/230490503-417e688d-5115-4eee-9550-b46a2b465ee3.webm)

To save `speech_array` as a WAV file:

```python
from scipy.io.wavfile import write as write_wav

write_wav("/path/to/audio.wav", SAMPLE_RATE, speech_array)
```

## Model Details

The following is additional information about the models released here.

Bark is a series of three transformer models that turn text into audio.

### Text to semantic tokens

- Input: text, tokenized with [BERT tokenizer from Hugging Face](https://huggingface.co/docs/transformers/model_doc/bert#transformers.BertTokenizer)
- Output: semantic tokens that encode the audio to be generated

### Semantic to coarse tokens

- Input: semantic tokens
- Output: tokens from the first two codebooks of the [EnCodec Codec](https://github.com/facebookresearch/encodec) from facebook

### Coarse to fine tokens

- Input: the first two codebooks from EnCodec
- Output: 8 codebooks from EnCodec

### Architecture

| Model                     | Parameters | Attention  | Output Vocab size |
|:-------------------------:|:----------:|:----------:|:-----------------:|
| Text to semantic tokens   | 80/300 M   | Causal     | 10,000            |
| Semantic to coarse tokens | 80/300 M   | Causal     | 2x 1,024          |
| Coarse to fine tokens     | 80/300 M   | Non-causal | 6x 1,024          |

### Release date

April 2023

## Broader Implications

We anticipate that this model's text to audio capabilities can be used to improve accessibility tools in a variety of languages. While we hope that this release will enable users to express their creativity and build applications that are a force for good, we acknowledge that any text to audio model has the potential for dual use. While it is not straightforward to voice clone known people with Bark, it can still be used for nefarious purposes. To further reduce the chances of unintended use of Bark, we also release a simple classifier to detect Bark-generated audio with high accuracy (see notebooks section of the main repository).

## License

Bark is licensed under the [MIT License](https://github.com/suno-ai/bark/blob/main/LICENSE), meaning it's available for commercial use.
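As a rough, purely illustrative sketch of the three-stage cascade described in the Model Details section, the snippet below uses random stand-in tokens whose shapes follow the vocabulary sizes listed in the Architecture table (10,000 semantic tokens; 2 coarse and 8 total EnCodec codebooks of 1,024 entries each). The frame count `T` is made up; real token counts depend on the audio length, and the real models predict these tokens autoregressively.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 100  # illustrative number of audio frames (made up for this sketch)

# Stage 1: text -> semantic tokens (vocabulary of 10,000)
semantic_tokens = rng.integers(0, 10_000, size=T)

# Stage 2: semantic -> coarse tokens (first 2 EnCodec codebooks, 1,024 entries each)
coarse_tokens = rng.integers(0, 1_024, size=(2, T))

# Stage 3: coarse -> fine tokens (all 8 EnCodec codebooks, 1,024 entries each);
# the 8 codebooks are then decoded to a waveform by EnCodec
fine_tokens = rng.integers(0, 1_024, size=(8, T))
```

Only the shapes and vocabulary sizes here come from the table above; everything else is placeholder data.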
6,789
[ [ -0.0206451416015625, -0.051239013671875, 0.014892578125, 0.0364990234375, -0.01125335693359375, -0.00440216064453125, -0.0235595703125, -0.064208984375, 0.0142974853515625, 0.0143890380859375, -0.046905517578125, -0.0526123046875, -0.027099609375, -0.0029735565185546875, -0.0177001953125, 0.09063720703125, 0.041473388671875, 0.0009012222290039062, -0.0023345947265625, -0.0016155242919921875, -0.0203094482421875, -0.032745361328125, -0.0604248046875, -0.0509033203125, 0.02264404296875, 0.00313568115234375, 0.0321044921875, 0.02642822265625, -0.0010557174682617188, 0.019683837890625, -0.026031494140625, -0.0220184326171875, -0.001415252685546875, -0.0181732177734375, 0.01416015625, -0.05072021484375, -0.049346923828125, 0.020050048828125, 0.0279388427734375, 0.0155029296875, -0.0293426513671875, 0.0174102783203125, -0.0097503662109375, 0.0169830322265625, -0.00647735595703125, 0.0273590087890625, -0.046539306640625, -0.0105743408203125, 0.0037384033203125, -0.0006923675537109375, -0.03802490234375, -0.033172607421875, 0.006378173828125, -0.0555419921875, 0.01493072509765625, -0.0002803802490234375, 0.084716796875, 0.021575927734375, -0.0138702392578125, -0.034759521484375, -0.043914794921875, 0.0616455078125, -0.08258056640625, 0.0102081298828125, 0.060211181640625, 0.0157318115234375, -0.01219940185546875, -0.06463623046875, -0.03741455078125, -0.02685546875, -0.0053558349609375, 0.0237274169921875, -0.0379638671875, -0.0012063980102539062, 0.010498046875, 0.026214599609375, -0.018585205078125, -0.003131866455078125, -0.0189056396484375, -0.01373291015625, 0.0460205078125, 0.0002593994140625, 0.04248046875, -0.0283203125, -0.004917144775390625, -0.043121337890625, -0.01082611083984375, 0.039215087890625, 0.0193939208984375, 0.023162841796875, -0.03643798828125, 0.043060302734375, 0.02001953125, 0.0338134765625, 0.0205078125, -0.03802490234375, 0.035430908203125, -0.0207366943359375, -0.01334381103515625, 0.03411865234375, 0.08258056640625, 0.01204681396484375, 
-0.0102081298828125, -0.0009565353393554688, 0.0024814605712890625, 0.00858306884765625, 0.00092315673828125, -0.051544189453125, -0.0198211669921875, 0.04052734375, -0.027252197265625, -0.031585693359375, -0.0210113525390625, -0.042877197265625, 0.0004489421844482422, -0.0018835067749023438, 0.033172607421875, -0.07220458984375, -0.02923583984375, 0.015655517578125, -0.0394287109375, 0.02301025390625, 0.005741119384765625, -0.0728759765625, 0.02392578125, 0.034088134765625, 0.05181884765625, 0.0208740234375, -0.0286102294921875, -0.020538330078125, 0.01580810546875, -0.0245208740234375, 0.041290283203125, -0.029205322265625, -0.0282135009765625, -0.036102294921875, 0.006999969482421875, 0.00046324729919433594, -0.04949951171875, 0.0570068359375, -0.003910064697265625, 0.0272369384765625, 0.01103973388671875, -0.0203399658203125, -0.0288848876953125, -0.004428863525390625, -0.03363037109375, 0.11083984375, 0.00662994384765625, -0.0693359375, 0.00847625732421875, -0.053436279296875, -0.044525146484375, -0.0242462158203125, 0.0103759765625, -0.043121337890625, 0.0035953521728515625, 0.0271453857421875, 0.0169830322265625, -0.032379150390625, 0.03076171875, -0.0137176513671875, -0.0153045654296875, 0.037933349609375, -0.01212310791015625, 0.08709716796875, 0.0145111083984375, -0.04180908203125, 0.0193634033203125, -0.06512451171875, 0.01056671142578125, 0.0287628173828125, -0.033477783203125, -0.0269012451171875, 0.007328033447265625, 0.0204620361328125, 0.0072174072265625, 0.01558685302734375, -0.04815673828125, 0.001239776611328125, -0.043121337890625, 0.05938720703125, 0.0352783203125, -0.01490020751953125, 0.0099639892578125, -0.054290771484375, 0.023284912109375, -0.0156097412109375, -0.00458526611328125, -0.016448974609375, -0.042510986328125, -0.032012939453125, -0.047943115234375, 0.01629638671875, 0.03955078125, -0.00678253173828125, 0.062744140625, 0.007678985595703125, -0.07025146484375, -0.07647705078125, -0.038421630859375, 0.01605224609375, 
0.023956298828125, 0.032867431640625, -0.004459381103515625, -0.0455322265625, -0.048187255859375, -0.00569915771484375, -0.033294677734375, -0.013397216796875, 0.047332763671875, 0.0213470458984375, -0.0222320556640625, 0.08404541015625, -0.027679443359375, -0.0264434814453125, -0.0168304443359375, 0.0308685302734375, 0.030059814453125, 0.04901123046875, 0.041107177734375, -0.0364990234375, -0.0167236328125, 0.00011736154556274414, -0.05120849609375, -0.0190277099609375, -0.00827789306640625, 0.00945281982421875, 0.0072021484375, 0.0239715576171875, -0.047332763671875, 0.01512908935546875, 0.03948974609375, -0.010711669921875, 0.05718994140625, -0.0098876953125, 0.005237579345703125, -0.0877685546875, 0.012420654296875, -0.00524139404296875, -0.01430511474609375, -0.04010009765625, -0.023895263671875, -0.0252227783203125, -0.0204925537109375, -0.02923583984375, 0.031402587890625, -0.021942138671875, -0.0115203857421875, -0.011444091796875, 0.006183624267578125, -0.0030364990234375, 0.04150390625, 0.00872802734375, 0.04254150390625, 0.06689453125, -0.044677734375, 0.021453857421875, 0.0301971435546875, -0.01419830322265625, 0.0271759033203125, -0.06378173828125, 0.0205535888671875, 0.01348876953125, 0.0321044921875, -0.06781005859375, -0.01259613037109375, 0.0229644775390625, -0.064697265625, 0.01218414306640625, -0.0035991668701171875, -0.042022705078125, -0.0258636474609375, -0.0174102783203125, 0.0309600830078125, 0.0576171875, -0.048492431640625, 0.051849365234375, 0.04388427734375, -0.005649566650390625, -0.029937744140625, -0.05743408203125, -0.0002760887145996094, -0.035736083984375, -0.046112060546875, 0.034088134765625, -0.00426483154296875, 0.0017948150634765625, 0.0153350830078125, -0.007171630859375, -0.0023193359375, -0.00963592529296875, 0.0341796875, 0.004795074462890625, -0.01617431640625, 0.002437591552734375, 0.0012950897216796875, -0.00548553466796875, 0.0160675048828125, -0.015960693359375, 0.051025390625, -0.0379638671875, 
-0.004085540771484375, -0.057586669921875, 0.00650787353515625, 0.049468994140625, -0.0260467529296875, 0.0218963623046875, 0.0615234375, -0.025299072265625, -0.0128326416015625, -0.032318115234375, -0.0211181640625, -0.03399658203125, 0.02606201171875, -0.0294036865234375, -0.0389404296875, 0.038055419921875, -0.01456451416015625, -0.004291534423828125, 0.0347900390625, 0.032684326171875, -0.01410675048828125, 0.0787353515625, 0.060028076171875, -0.009552001953125, 0.0384521484375, -0.021759033203125, 0.005054473876953125, -0.07080078125, -0.0335693359375, -0.049835205078125, 0.0004673004150390625, -0.03680419921875, -0.0244903564453125, 0.023193359375, 0.0119476318359375, -0.0028362274169921875, 0.04290771484375, -0.0623779296875, 0.008087158203125, 0.057769775390625, 0.004207611083984375, 0.01715087890625, 0.004093170166015625, -0.01220703125, -0.0103607177734375, -0.05181884765625, -0.037445068359375, 0.053985595703125, 0.043121337890625, 0.0648193359375, -0.0093536376953125, 0.054168701171875, 0.0007982254028320312, 0.0040435791015625, -0.07586669921875, 0.044647216796875, -0.00452423095703125, -0.057861328125, -0.0269622802734375, -0.01271820068359375, -0.07891845703125, 0.01070404052734375, -0.01580810546875, -0.0748291015625, -0.00579833984375, -0.0064849853515625, -0.0008959770202636719, 0.020965576171875, -0.042816162109375, 0.060699462890625, -0.0162811279296875, -0.004085540771484375, -0.02764892578125, -0.039581298828125, 0.02801513671875, 0.0007805824279785156, 0.0216827392578125, -0.02923583984375, 0.021209716796875, 0.07574462890625, -0.023773193359375, 0.08306884765625, -0.0015382766723632812, 0.00313568115234375, 0.04583740234375, -0.0132598876953125, 0.01324462890625, -0.00876617431640625, -0.01273345947265625, 0.0173187255859375, 0.039154052734375, -0.009765625, -0.02789306640625, 0.0235748291015625, -0.0665283203125, -0.022552490234375, -0.040252685546875, -0.044464111328125, -0.0035457611083984375, 0.0154266357421875, 0.0338134765625, 
0.024505615234375, -0.01983642578125, 0.003208160400390625, 0.004695892333984375, -0.0489501953125, 0.03985595703125, 0.0433349609375, -0.0253448486328125, -0.046905517578125, 0.0640869140625, -0.0194244384765625, 0.00853729248046875, 0.0017995834350585938, 0.049102783203125, -0.0296173095703125, -0.01526641845703125, -0.0192718505859375, 0.0433349609375, -0.032440185546875, 0.0008373260498046875, -0.03936767578125, -0.021453857421875, -0.03857421875, 0.00009810924530029297, -0.046539306640625, -0.02020263671875, -0.0243988037109375, 0.01004791259765625, 0.05029296875, 0.0491943359375, -0.0287628173828125, 0.0260772705078125, -0.0491943359375, 0.032318115234375, 0.01125335693359375, 0.0102691650390625, 0.0030269622802734375, -0.046539306640625, -0.00830078125, 0.01053619384765625, -0.01438140869140625, -0.06549072265625, 0.0404052734375, 0.010589599609375, 0.043365478515625, 0.0015878677368164062, 0.019683837890625, 0.050079345703125, -0.033782958984375, 0.05682373046875, 0.037811279296875, -0.081787109375, 0.06793212890625, -0.0236968994140625, 0.01493072509765625, 0.0191650390625, 0.02044677734375, -0.04327392578125, -0.05694580078125, -0.061737060546875, -0.0638427734375, 0.09381103515625, 0.021240234375, 0.0033550262451171875, 0.002685546875, -0.004039764404296875, 0.0018901824951171875, 0.0030841827392578125, -0.07696533203125, -0.0316162109375, -0.036712646484375, -0.01337432861328125, -0.002025604248046875, 0.0023212432861328125, -0.016876220703125, -0.043792724609375, 0.07421875, 0.0012950897216796875, 0.03668212890625, 0.02532958984375, 0.01413726806640625, -0.0024547576904296875, 0.0279541015625, 0.024444580078125, 0.0014829635620117188, -0.04473876953125, 0.006134033203125, 0.01227569580078125, -0.04388427734375, 0.023345947265625, 0.006984710693359375, -0.005584716796875, 0.0154266357421875, 0.01479339599609375, 0.075927734375, 0.017425537109375, -0.05169677734375, 0.029022216796875, -0.01212310791015625, -0.015106201171875, -0.0345458984375, 
-0.00034618377685546875, 0.0325927734375, 0.019805908203125, 0.02288818359375, -0.0034999847412109375, -0.005641937255859375, -0.053466796875, 0.023406982421875, 0.0290679931640625, -0.0261383056640625, -0.02899169921875, 0.07086181640625, -0.012054443359375, -0.049102783203125, 0.033050537109375, -0.004482269287109375, -0.0261077880859375, 0.061431884765625, 0.08612060546875, 0.075927734375, -0.01032257080078125, 0.018341064453125, 0.050872802734375, 0.019683837890625, 0.0027484893798828125, -0.002216339111328125, -0.018402099609375, -0.038116455078125, -0.01702880859375, -0.055267333984375, -0.030029296875, 0.03021240234375, -0.057037353515625, 0.0248260498046875, -0.0295562744140625, -0.03363037109375, 0.0174713134765625, -0.01349639892578125, -0.005428314208984375, 0.0188140869140625, 0.005260467529296875, 0.052703857421875, -0.05865478515625, 0.08917236328125, 0.0447998046875, -0.0413818359375, -0.08447265625, 0.0107421875, -0.01495361328125, -0.04901123046875, 0.03424072265625, 0.023101806640625, -0.03228759765625, 0.0072479248046875, -0.05206298828125, -0.046539306640625, 0.072998046875, 0.027984619140625, -0.0106353759765625, -0.0013704299926757812, 0.0005812644958496094, 0.0482177734375, -0.0193023681640625, 0.0283966064453125, 0.052001953125, 0.03466796875, 0.020050048828125, -0.08612060546875, 0.0063323974609375, -0.0264739990234375, -0.0264129638671875, -0.020751953125, -0.042236328125, 0.0526123046875, -0.0241851806640625, -0.0229949951171875, 0.004047393798828125, 0.040985107421875, 0.040313720703125, 0.040985107421875, 0.0433349609375, 0.043731689453125, 0.059814453125, -0.00824737548828125, 0.0594482421875, -0.0215606689453125, 0.02459716796875, 0.08447265625, 0.01117706298828125, 0.06707763671875, 0.020965576171875, -0.034759521484375, 0.036102294921875, 0.05145263671875, -0.013641357421875, 0.037841796875, 0.01004791259765625, -0.019195556640625, 0.003925323486328125, -0.0220184326171875, -0.03619384765625, 0.043853759765625, 0.0189666748046875, 
-0.0024204254150390625, -0.0029621124267578125, 0.011077880859375, -0.0039520263671875, -0.006046295166015625, 0.0033740997314453125, 0.060333251953125, 0.0166778564453125, -0.041656494140625, 0.076416015625, 0.011962890625, 0.06793212890625, -0.05438232421875, 0.0089111328125, 0.013397216796875, 0.005435943603515625, -0.0245361328125, -0.0484619140625, 0.0269012451171875, 0.00005078315734863281, -0.0052642822265625, -0.002689361572265625, 0.028839111328125, -0.036834716796875, -0.022674560546875, 0.05804443359375, 0.0034942626953125, 0.042022705078125, -0.00223541259765625, -0.0672607421875, 0.007526397705078125, 0.004852294921875, -0.00708770751953125, 0.01076507568359375, 0.02593994140625, 0.01439666748046875, 0.044158935546875, 0.050628662109375, 0.0088653564453125, 0.0168914794921875, 0.006610870361328125, 0.04766845703125, -0.05950927734375, -0.04547119140625, -0.047637939453125, 0.037445068359375, 0.0158538818359375, -0.01331329345703125, 0.04736328125, 0.05517578125, 0.0411376953125, -0.00965118408203125, 0.057373046875, -0.0270843505859375, 0.0303955078125, -0.0307464599609375, 0.054656982421875, -0.0555419921875, 0.013641357421875, -0.036102294921875, -0.043670654296875, -0.002605438232421875, 0.05517578125, -0.01148223876953125, -0.00740814208984375, 0.05023193359375, 0.07000732421875, -0.006526947021484375, 0.009613037109375, 0.0106353759765625, 0.0201416015625, 0.02825927734375, 0.0452880859375, 0.06341552734375, -0.045501708984375, 0.06280517578125, -0.039276123046875, -0.0172882080078125, 0.0006151199340820312, -0.05084228515625, -0.0697021484375, -0.061737060546875, -0.0246734619140625, -0.0426025390625, -0.024078369140625, 0.058258056640625, 0.068359375, -0.048980712890625, -0.044189453125, -0.004154205322265625, -0.00012767314910888672, -0.035369873046875, -0.0203094482421875, 0.03680419921875, -0.0164794921875, -0.06591796875, 0.04486083984375, 0.0032138824462890625, 0.023895263671875, 0.02178955078125, -0.0173187255859375, -0.0205535888671875, 
0.0173187255859375, 0.02728271484375, 0.033966064453125, -0.07470703125, -0.00421905517578125, -0.00628662109375, -0.0203094482421875, 0.039154052734375, 0.0322265625, -0.0516357421875, 0.02728271484375, 0.0200347900390625, 0.025115966796875, 0.088134765625, -0.0005207061767578125, 0.0203094482421875, -0.041656494140625, 0.0308074951171875, 0.020751953125, 0.00730133056640625, 0.024993896484375, -0.01306915283203125, 0.02581787109375, 0.0157928466796875, -0.032745361328125, -0.066650390625, -0.0048675537109375, -0.10443115234375, -0.038848876953125, 0.0762939453125, 0.006103515625, -0.039642333984375, 0.0106353759765625, -0.048980712890625, 0.0501708984375, -0.041473388671875, 0.0443115234375, 0.04144287109375, -0.0256195068359375, -0.00403594970703125, -0.035369873046875, 0.0426025390625, 0.038665771484375, -0.05767822265625, 0.00852203369140625, 0.01971435546875, 0.040435791015625, 0.0235748291015625, 0.06756591796875, -0.017425537109375, 0.015594482421875, 0.028228759765625, 0.041717529296875, -0.0052032470703125, -0.006561279296875, -0.036407470703125, -0.0026264190673828125, 0.00778961181640625, -0.0333251953125 ] ]
facebook/dinov2-small
2023-09-06T11:24:10.000Z
[ "transformers", "pytorch", "safetensors", "dinov2", "feature-extraction", "dino", "vision", "arxiv:2304.07193", "license:apache-2.0", "endpoints_compatible", "region:us" ]
feature-extraction
facebook
null
null
facebook/dinov2-small
2
117,118
transformers
2023-07-31T16:53:09
--- license: apache-2.0 tags: - dino - vision --- # Vision Transformer (small-sized model) trained using DINOv2 Vision Transformer (ViT) model trained using the DINOv2 method. It was introduced in the paper [DINOv2: Learning Robust Visual Features without Supervision](https://arxiv.org/abs/2304.07193) by Oquab et al. and first released in [this repository](https://github.com/facebookresearch/dinov2). Disclaimer: The team releasing DINOv2 did not write a model card for this model so this model card has been written by the Hugging Face team. ## Model description The Vision Transformer (ViT) is a transformer encoder model (BERT-like) pretrained on a large collection of images in a self-supervised fashion. Images are presented to the model as a sequence of fixed-size patches, which are linearly embedded. One also adds a [CLS] token to the beginning of a sequence to use it for classification tasks. One also adds absolute position embeddings before feeding the sequence to the layers of the Transformer encoder. Note that this model does not include any fine-tuned heads. By pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder. One typically places a linear layer on top of the [CLS] token, as the last hidden state of this token can be seen as a representation of an entire image. ## Intended uses & limitations You can use the raw model for feature extraction. See the [model hub](https://huggingface.co/models?search=facebook/dinov2) to look for fine-tuned versions on a task that interests you. 
### How to use Here is how to use this model: ```python from transformers import AutoImageProcessor, AutoModel from PIL import Image import requests url = 'http://images.cocodataset.org/val2017/000000039769.jpg' image = Image.open(requests.get(url, stream=True).raw) processor = AutoImageProcessor.from_pretrained('facebook/dinov2-small') model = AutoModel.from_pretrained('facebook/dinov2-small') inputs = processor(images=image, return_tensors="pt") outputs = model(**inputs) last_hidden_states = outputs.last_hidden_state ``` ### BibTeX entry and citation info ```bibtex @misc{oquab2023dinov2, title={DINOv2: Learning Robust Visual Features without Supervision}, author={Maxime Oquab and Timothée Darcet and Théo Moutakanni and Huy Vo and Marc Szafraniec and Vasil Khalidov and Pierre Fernandez and Daniel Haziza and Francisco Massa and Alaaeldin El-Nouby and Mahmoud Assran and Nicolas Ballas and Wojciech Galuba and Russell Howes and Po-Yao Huang and Shang-Wen Li and Ishan Misra and Michael Rabbat and Vasu Sharma and Gabriel Synnaeve and Hu Xu and Hervé Jegou and Julien Mairal and Patrick Labatut and Armand Joulin and Piotr Bojanowski}, year={2023}, eprint={2304.07193}, archivePrefix={arXiv}, primaryClass={cs.CV} } ```
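The DINOv2 card above says the typical downstream use is to place a linear layer on top of the frozen encoder's [CLS] token. A minimal sketch of that linear-probe setup — names and sizes here are illustrative assumptions (hidden size 384 for dinov2-small, 10 classes), and a random tensor stands in for the frozen backbone's `last_hidden_state[:, 0]` output:

```python
import torch

# Linear probe sketch: only the head is trained; the encoder stays frozen.
# Assumptions: hidden_size=384 (dinov2-small), num_classes=10.
hidden_size, num_classes = 384, 10
head = torch.nn.Linear(hidden_size, num_classes)
optimizer = torch.optim.SGD(head.parameters(), lr=1e-2)

# Stand-in for backbone(pixel_values).last_hidden_state[:, 0] on 8 images;
# in practice these [CLS] features would be precomputed with the frozen model.
cls_features = torch.randn(8, hidden_size)
labels = torch.randint(0, num_classes, (8,))

logits = head(cls_features)  # (8, num_classes) class scores
loss = torch.nn.functional.cross_entropy(logits, labels)
loss.backward()
optimizer.step()
```

Because the backbone is frozen, features can be extracted once and the probe trained cheaply on top of them.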
3,030
[ [ -0.037628173828125, -0.0308837890625, 0.007656097412109375, -0.01047515869140625, -0.036376953125, -0.00469970703125, 0.00795745849609375, -0.030853271484375, 0.0207977294921875, 0.0367431640625, -0.035797119140625, -0.0162353515625, -0.050689697265625, -0.01287078857421875, -0.03411865234375, 0.06622314453125, -0.0009002685546875, -0.00603485107421875, -0.020538330078125, -0.0017852783203125, -0.0177459716796875, -0.033416748046875, -0.0391845703125, -0.02899169921875, 0.0259246826171875, 0.0077972412109375, 0.05377197265625, 0.0758056640625, 0.03411865234375, 0.0305328369140625, -0.0091400146484375, 0.0011777877807617188, -0.041046142578125, -0.0174560546875, -0.0190887451171875, -0.03948974609375, -0.0229034423828125, 0.0089569091796875, 0.0406494140625, 0.028564453125, 0.0192108154296875, 0.026092529296875, 0.0102386474609375, 0.00989532470703125, -0.0421142578125, 0.035369873046875, -0.03436279296875, 0.0266571044921875, -0.005817413330078125, -0.0002562999725341797, -0.0214080810546875, -0.029144287109375, 0.01934814453125, -0.035430908203125, 0.01293182373046875, -0.0037670135498046875, 0.099609375, 0.0239105224609375, -0.035888671875, -0.00374603271484375, -0.044677734375, 0.059722900390625, -0.018951416015625, 0.0257568359375, 0.01313018798828125, 0.0276031494140625, 0.0054473876953125, -0.0859375, -0.04949951171875, 0.0033969879150390625, -0.01522064208984375, -0.0015058517456054688, -0.0190887451171875, -0.002071380615234375, 0.024658203125, 0.0273590087890625, -0.0113067626953125, 0.01189422607421875, -0.03863525390625, -0.03643798828125, 0.0279693603515625, -0.00731658935546875, 0.01256561279296875, -0.03167724609375, -0.04949951171875, -0.0330810546875, -0.028350830078125, 0.034088134765625, 0.0124053955078125, 0.0086212158203125, -0.0139617919921875, 0.04730224609375, 0.006443023681640625, 0.040069580078125, 0.023651123046875, -0.011871337890625, 0.040802001953125, -0.02166748046875, -0.020782470703125, -0.0160675048828125, 0.0615234375, 
0.0191497802734375, 0.023223876953125, 0.00345611572265625, -0.0257720947265625, 0.00618743896484375, 0.0209503173828125, -0.07049560546875, -0.0237884521484375, -0.01074981689453125, -0.047576904296875, -0.04156494140625, 0.020477294921875, -0.054168701171875, -0.0119171142578125, -0.0208740234375, 0.053497314453125, -0.020355224609375, -0.0249481201171875, -0.034759521484375, -0.0028247833251953125, 0.05413818359375, 0.00848388671875, -0.07073974609375, 0.0264739990234375, 0.037506103515625, 0.06805419921875, -0.005374908447265625, -0.0130462646484375, -0.0240478515625, -0.01192474365234375, -0.037139892578125, 0.052947998046875, -0.025054931640625, -0.0161285400390625, 0.01332855224609375, 0.03973388671875, 0.002017974853515625, -0.035552978515625, 0.0297393798828125, -0.0255889892578125, 0.01554107666015625, -0.027008056640625, -0.019775390625, -0.0232391357421875, 0.00945281982421875, -0.0498046875, 0.08697509765625, 0.0272674560546875, -0.05792236328125, 0.043212890625, -0.037353515625, -0.0154266357421875, 0.001983642578125, -0.012237548828125, -0.052398681640625, -0.005901336669921875, 0.034027099609375, 0.037445068359375, 0.00937652587890625, -0.01544189453125, -0.0285491943359375, -0.036468505859375, 0.0208740234375, -0.007015228271484375, 0.064208984375, 0.0136566162109375, -0.024139404296875, 0.01279449462890625, -0.048583984375, -0.00106048583984375, 0.0186767578125, -0.024139404296875, -0.00724029541015625, -0.01502227783203125, 0.01284027099609375, 0.0260162353515625, 0.026275634765625, -0.048248291015625, 0.014892578125, -0.0268402099609375, 0.04644775390625, 0.061798095703125, -0.0034084320068359375, 0.044097900390625, -0.0107421875, 0.0279083251953125, 0.00966644287109375, 0.038330078125, -0.0290069580078125, -0.043670654296875, -0.05828857421875, -0.023040771484375, 0.0254364013671875, 0.0360107421875, -0.069580078125, 0.039520263671875, -0.01332855224609375, -0.022369384765625, -0.03643798828125, 0.0171356201171875, 0.03375244140625, 
0.044097900390625, 0.0246429443359375, -0.04156494140625, -0.039703369140625, -0.06756591796875, 0.016021728515625, -0.0011930465698242188, 0.0009899139404296875, 0.0218353271484375, 0.052093505859375, -0.0220184326171875, 0.07659912109375, -0.01258087158203125, -0.018707275390625, -0.00605010986328125, 0.0009703636169433594, 0.014190673828125, 0.05242919921875, 0.057098388671875, -0.0675048828125, -0.0218353271484375, -0.00557708740234375, -0.06439208984375, 0.01486968994140625, 0.0053558349609375, -0.01340484619140625, 0.0008077621459960938, 0.0236358642578125, -0.056884765625, 0.05694580078125, 0.011962890625, -0.01326751708984375, 0.0123748779296875, -0.005863189697265625, -0.00238037109375, -0.0845947265625, -0.0007214546203613281, -0.003795623779296875, -0.034027099609375, -0.0386962890625, 0.011871337890625, 0.012969970703125, -0.01361083984375, -0.038482666015625, 0.029693603515625, -0.03717041015625, -0.0323486328125, -0.019500732421875, -0.01371002197265625, -0.00025653839111328125, 0.03955078125, -0.0026264190673828125, 0.03131103515625, 0.06231689453125, -0.0301971435546875, 0.053131103515625, 0.033599853515625, -0.031524658203125, 0.03228759765625, -0.049713134765625, 0.028350830078125, -0.01245880126953125, 0.0076141357421875, -0.072021484375, -0.0328369140625, 0.0309600830078125, -0.03558349609375, 0.042633056640625, -0.0272674560546875, -0.035614013671875, -0.06365966796875, -0.0228424072265625, 0.0237884521484375, 0.060699462890625, -0.059661865234375, 0.04168701171875, 0.026824951171875, 0.0175933837890625, -0.060760498046875, -0.07403564453125, -0.01134490966796875, -0.00940704345703125, -0.03265380859375, 0.02545166015625, 0.02239990234375, 0.0205078125, 0.0276641845703125, -0.00848388671875, -0.01904296875, -0.016937255859375, 0.04339599609375, 0.0207672119140625, -0.0265350341796875, -0.000545501708984375, -0.008514404296875, -0.012054443359375, 0.0026149749755859375, -0.034454345703125, 0.041412353515625, -0.0206146240234375, 
-0.0245513916015625, -0.05792236328125, 0.00533294677734375, 0.0435791015625, -0.022430419921875, 0.040313720703125, 0.07305908203125, -0.0523681640625, -0.01078033447265625, -0.02349853515625, -0.0145263671875, -0.0401611328125, 0.0297393798828125, -0.02923583984375, -0.04754638671875, 0.060516357421875, -0.00450897216796875, -0.021148681640625, 0.03485107421875, 0.03973388671875, -0.014251708984375, 0.0654296875, 0.068359375, 0.001163482666015625, 0.055755615234375, -0.0567626953125, 0.0068359375, -0.05255126953125, -0.050201416015625, -0.003631591796875, -0.02935791015625, -0.031494140625, -0.03509521484375, 0.00702667236328125, 0.0281982421875, -0.01532745361328125, 0.047454833984375, -0.0501708984375, 0.0302886962890625, 0.0599365234375, 0.039337158203125, -0.02490234375, 0.01016998291015625, -0.01904296875, 0.00004506111145019531, -0.046356201171875, -0.0110626220703125, 0.07562255859375, 0.042449951171875, 0.061614990234375, -0.014129638671875, 0.047882080078125, 0.0111236572265625, 0.0011425018310546875, -0.0712890625, 0.03887939453125, -0.007843017578125, -0.03912353515625, -0.01372528076171875, -0.011871337890625, -0.06689453125, -0.0035552978515625, -0.033660888671875, -0.05975341796875, 0.050262451171875, 0.0216064453125, -0.03448486328125, 0.02410888671875, -0.045989990234375, 0.07379150390625, -0.0146942138671875, -0.0210723876953125, 0.009246826171875, -0.045562744140625, 0.0144805908203125, -0.00963592529296875, -0.01318359375, 0.0212249755859375, 0.01540374755859375, 0.0499267578125, -0.045928955078125, 0.07769775390625, -0.03173828125, 0.0260467529296875, 0.04302978515625, -0.01169586181640625, 0.0310821533203125, -0.006237030029296875, 0.032135009765625, 0.0153656005859375, -0.00047850608825683594, -0.03717041015625, -0.041748046875, 0.0345458984375, -0.07635498046875, -0.0283050537109375, -0.02679443359375, -0.020538330078125, 0.022491455078125, 0.03009033203125, 0.04888916015625, 0.045928955078125, 0.01186370849609375, 0.0316162109375, 
0.04571533203125, -0.024658203125, 0.045654296875, -0.0164031982421875, -0.0255889892578125, -0.02734375, 0.06158447265625, 0.0230865478515625, 0.01038360595703125, 0.0203094482421875, 0.01161956787109375, -0.027313232421875, -0.028350830078125, -0.0261383056640625, 0.005077362060546875, -0.074462890625, -0.0218505859375, -0.033935546875, -0.047210693359375, -0.03973388671875, -0.01195526123046875, -0.042022705078125, -0.0291900634765625, -0.038604736328125, -0.019073486328125, 0.0214996337890625, 0.06292724609375, -0.0256805419921875, 0.04266357421875, -0.028961181640625, 0.0228271484375, 0.06158447265625, 0.0130462646484375, -0.00968170166015625, -0.046783447265625, -0.0196990966796875, -0.001495361328125, -0.01287078857421875, -0.047119140625, 0.03497314453125, 0.0252227783203125, 0.0621337890625, 0.060455322265625, -0.027008056640625, 0.0587158203125, -0.02203369140625, 0.055206298828125, 0.0254364013671875, -0.06427001953125, 0.0489501953125, -0.01026153564453125, 0.010223388671875, 0.0133056640625, 0.034912109375, 0.002285003662109375, 0.0159149169921875, -0.0372314453125, -0.0460205078125, 0.0567626953125, 0.0112762451171875, 0.022003173828125, 0.00820159912109375, 0.048919677734375, -0.005466461181640625, 0.006046295166015625, -0.0694580078125, -0.0135650634765625, -0.07159423828125, -0.0099639892578125, 0.016448974609375, -0.0274505615234375, -0.006534576416015625, -0.040618896484375, 0.01300048828125, -0.01001739501953125, 0.057098388671875, 0.0145721435546875, -0.0198516845703125, -0.0157470703125, -0.032623291015625, 0.01383209228515625, 0.0399169921875, -0.029510498046875, 0.0144195556640625, 0.007274627685546875, -0.041717529296875, -0.006805419921875, 0.0089569091796875, -0.0166168212890625, -0.005771636962890625, 0.036895751953125, 0.0693359375, 0.0143280029296875, -0.001239776611328125, 0.07183837890625, 0.01265716552734375, -0.015380859375, -0.036376953125, 0.0099639892578125, -0.01263427734375, 0.04132080078125, 0.0278778076171875, 
0.0285491943359375, -0.004703521728515625, -0.051666259765625, 0.039794921875, 0.0231781005859375, -0.04974365234375, -0.04022216796875, 0.06329345703125, -0.006114959716796875, -0.01561737060546875, 0.046356201171875, -0.014495849609375, -0.048583984375, 0.0615234375, 0.04620361328125, 0.050445556640625, -0.0263671875, 0.01837158203125, 0.037445068359375, 0.02471923828125, -0.004024505615234375, 0.0186767578125, -0.01398468017578125, -0.06768798828125, -0.030181884765625, -0.04913330078125, -0.004810333251953125, 0.01256561279296875, -0.06195068359375, 0.028961181640625, -0.052703857421875, -0.028289794921875, 0.016632080078125, -0.0146942138671875, -0.08209228515625, 0.019256591796875, 0.037750244140625, 0.050079345703125, -0.06256103515625, 0.08489990234375, 0.051483154296875, -0.044189453125, -0.05511474609375, -0.0205078125, 0.00128173828125, -0.0794677734375, 0.0634765625, 0.0266571044921875, 0.005130767822265625, 0.00370025634765625, -0.06719970703125, -0.0794677734375, 0.0902099609375, 0.022918701171875, -0.01526641845703125, -0.007701873779296875, 0.00708770751953125, 0.030853271484375, -0.04486083984375, 0.02142333984375, 0.003566741943359375, 0.00681304931640625, 0.0352783203125, -0.0543212890625, -0.001071929931640625, -0.0239410400390625, 0.024810791015625, -0.0136566162109375, -0.054443359375, 0.086669921875, -0.01605224609375, -0.015899658203125, 0.00957489013671875, 0.0487060546875, -0.0216827392578125, -0.0016431808471679688, 0.046844482421875, 0.044281005859375, 0.043182373046875, -0.017364501953125, 0.07061767578125, -0.0029144287109375, 0.04766845703125, 0.05780029296875, 0.01232147216796875, 0.051483154296875, 0.02056884765625, -0.0050201416015625, 0.047119140625, 0.06640625, -0.04119873046875, 0.07049560546875, -0.0038700103759765625, 0.01322174072265625, -0.019073486328125, 0.00528717041015625, -0.02545166015625, 0.051239013671875, 0.0318603515625, -0.04840087890625, -0.005107879638671875, 0.0224761962890625, -0.0122833251953125, 
-0.023681640625, -0.033203125, 0.047088623046875, 0.01016998291015625, -0.025360107421875, 0.04888916015625, -0.01983642578125, 0.036895751953125, -0.030792236328125, -0.01202392578125, -0.01258087158203125, 0.0218353271484375, -0.02508544921875, -0.061004638671875, 0.01282501220703125, -0.00963592529296875, -0.0048370361328125, -0.00534820556640625, 0.06793212890625, -0.017242431640625, -0.04559326171875, 0.0270843505859375, 0.012542724609375, 0.0174560546875, 0.0162353515625, -0.060394287109375, -0.01763916015625, -0.005626678466796875, -0.03375244140625, 0.01287078857421875, 0.0301055908203125, -0.0008301734924316406, 0.046356201171875, 0.051849365234375, -0.0096282958984375, 0.03076171875, 0.0013141632080078125, 0.088134765625, -0.03948974609375, -0.034912109375, -0.051055908203125, 0.04522705078125, -0.0210418701171875, -0.0204010009765625, 0.045013427734375, 0.0241241455078125, 0.0723876953125, -0.00698089599609375, 0.035980224609375, -0.0125885009765625, 0.015655517578125, -0.0253143310546875, 0.0504150390625, -0.0276947021484375, -0.01261138916015625, -0.01297760009765625, -0.0806884765625, -0.018768310546875, 0.06854248046875, -0.0020732879638671875, 0.0006122589111328125, 0.0367431640625, 0.059234619140625, -0.02392578125, -0.0229949951171875, 0.0203857421875, 0.0300140380859375, 0.005157470703125, 0.02703857421875, 0.061798095703125, -0.040802001953125, 0.045379638671875, -0.048004150390625, -0.0270843505859375, -0.00893402099609375, -0.04962158203125, -0.09564208984375, -0.043853759765625, -0.0219268798828125, -0.03533935546875, -0.004634857177734375, 0.05462646484375, 0.0853271484375, -0.073974609375, 0.01323699951171875, 0.0005583763122558594, -0.00415802001953125, -0.0175323486328125, -0.01226806640625, 0.04302978515625, -0.0047454833984375, -0.051055908203125, 0.005847930908203125, 0.007526397705078125, 0.0180511474609375, -0.0252227783203125, 0.0046234130859375, -0.00140380859375, -0.009490966796875, 0.039398193359375, 0.0276641845703125, 
-0.056610107421875, -0.052093505859375, -0.005039215087890625, 0.0008940696716308594, 0.02545166015625, 0.035491943359375, -0.07122802734375, 0.04949951171875, 0.03253173828125, 0.03472900390625, 0.06829833984375, 0.0020961761474609375, 0.0186767578125, -0.0589599609375, 0.032379150390625, -0.0020732879638671875, 0.0438232421875, 0.0268402099609375, -0.0280914306640625, 0.029693603515625, 0.0379638671875, -0.0330810546875, -0.058319091796875, 0.014312744140625, -0.0888671875, -0.00870513916015625, 0.06524658203125, -0.037506103515625, -0.042449951171875, 0.00806427001953125, -0.0026493072509765625, 0.041717529296875, -0.00308990478515625, 0.04150390625, 0.0225067138671875, 0.0031185150146484375, -0.04754638671875, -0.0301971435546875, 0.0369873046875, -0.0167236328125, -0.0295867919921875, -0.046722412109375, -0.0014104843139648438, 0.0300140380859375, 0.029144287109375, 0.01367950439453125, -0.02880859375, 0.01247406005859375, 0.0266571044921875, 0.017669677734375, -0.0184326171875, -0.0253143310546875, -0.0251312255859375, 0.007568359375, -0.0213470458984375, -0.05462646484375 ] ]
huggyllama/llama-7b
2023-04-07T15:50:47.000Z
[ "transformers", "pytorch", "safetensors", "llama", "text-generation", "license:other", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
text-generation
huggyllama
null
null
huggyllama/llama-7b
215
116,961
transformers
2023-04-03T23:16:48
--- license: other --- This contains the weights for the LLaMA-7b model. This model is under a non-commercial license (see the LICENSE file). You should only use this repository if you have been granted access to the model by filling out [this form](https://docs.google.com/forms/d/e/1FAIpQLSfqNECQnMkycAp2jP4Z9TFX0cGR4uf7b_fBxjY_OjhJILlKGA/viewform?usp=send_form) but either lost your copy of the weights or ran into trouble converting them to the Transformers format.
472
[ [ -0.0021648406982421875, -0.00476837158203125, 0.032470703125, 0.04791259765625, -0.047149658203125, -0.0055694580078125, 0.030364990234375, -0.0243377685546875, 0.030181884765625, 0.061676025390625, -0.042877197265625, -0.02777099609375, -0.06317138671875, 0.0167236328125, -0.04052734375, 0.07171630859375, -0.00836181640625, 0.0200042724609375, -0.0202484130859375, -0.01239013671875, -0.017486572265625, -0.0223388671875, -0.01409912109375, -0.04705810546875, 0.0533447265625, 0.027069091796875, 0.042694091796875, 0.038055419921875, 0.05450439453125, 0.0178680419921875, -0.024322509765625, -0.04339599609375, -0.041107177734375, -0.0220794677734375, -0.00026345252990722656, -0.061279296875, -0.06915283203125, 0.01308441162109375, 0.048309326171875, 0.03790283203125, -0.048309326171875, 0.040252685546875, -0.0258331298828125, 0.03717041015625, -0.032501220703125, 0.036376953125, -0.035491943359375, -0.00421142578125, -0.0247802734375, -0.0018768310546875, -0.025970458984375, -0.0232391357421875, -0.023651123046875, -0.06298828125, 0.00447845458984375, 0.0148162841796875, 0.073974609375, 0.03265380859375, -0.042022705078125, -0.0208282470703125, -0.01605224609375, 0.05267333984375, -0.061676025390625, 0.01103973388671875, 0.0299224853515625, 0.050506591796875, -0.0206298828125, -0.054962158203125, -0.0276336669921875, -0.0298614501953125, 0.00902557373046875, 0.0045318603515625, -0.02081298828125, 0.006465911865234375, 0.0267486572265625, 0.045928955078125, -0.0274505615234375, -0.00439453125, -0.087158203125, -0.010467529296875, 0.0743408203125, 0.002544403076171875, 0.035064697265625, -0.0269927978515625, -0.06329345703125, -0.0263519287109375, -0.058319091796875, 0.00044918060302734375, 0.05609130859375, 0.0107269287109375, -0.057708740234375, 0.09320068359375, -0.006587982177734375, 0.0308990478515625, 0.006305694580078125, -0.024017333984375, 0.04949951171875, 0.026153564453125, -0.0288848876953125, 0.015106201171875, 0.0244598388671875, 0.04132080078125, 
0.0125274658203125, -0.01361083984375, -0.017425537109375, 0.002960205078125, 0.031707763671875, -0.048095703125, -0.00487518310546875, -0.006336212158203125, -0.04730224609375, -0.0205230712890625, 0.0006933212280273438, -0.0177154541015625, -0.03948974609375, -0.02105712890625, 0.03582763671875, 0.00982666015625, -0.0290374755859375, 0.032684326171875, -0.02301025390625, 0.031890869140625, 0.01580810546875, -0.04443359375, 0.0290679931640625, -0.003170013427734375, 0.04046630859375, 0.0153656005859375, -0.011199951171875, -0.00324249267578125, 0.026702880859375, -0.010833740234375, 0.05096435546875, -0.0123291015625, -0.06646728515625, -0.01190948486328125, 0.042236328125, 0.0023479461669921875, -0.047698974609375, 0.04052734375, -0.064453125, -0.0100555419921875, -0.023406982421875, -0.023651123046875, -0.04827880859375, -0.00475311279296875, -0.07916259765625, 0.0697021484375, 0.03509521484375, -0.046356201171875, 0.025146484375, -0.035614013671875, -0.007038116455078125, 0.016571044921875, 0.00655364990234375, -0.036865234375, 0.0251617431640625, -0.01026153564453125, 0.0197296142578125, -0.025360107421875, 0.02294921875, -0.043609619140625, -0.028106689453125, 0.00374603271484375, 0.0017194747924804688, 0.08184814453125, 0.01538848876953125, -0.002468109130859375, 0.005950927734375, -0.08416748046875, -0.04058837890625, 0.03851318359375, -0.051116943359375, -0.0008244514465332031, -0.0245819091796875, 0.006198883056640625, 0.005550384521484375, 0.044677734375, -0.044677734375, 0.043365478515625, 0.0020046234130859375, 0.004932403564453125, 0.061248779296875, -0.00579833984375, 0.0411376953125, -0.034027099609375, 0.0599365234375, 0.0006089210510253906, 0.031280517578125, 0.041168212890625, -0.047149658203125, -0.0452880859375, -0.00775146484375, -0.005527496337890625, 0.017547607421875, -0.040924072265625, 0.0170745849609375, -0.01226043701171875, -0.0677490234375, -0.0301055908203125, 0.008819580078125, 0.014404296875, 0.035675048828125, 0.0218353271484375, 
-0.018585205078125, -0.030487060546875, -0.09228515625, 0.020477294921875, -0.00913238525390625, -0.0209808349609375, 0.029052734375, 0.06884765625, -0.012542724609375, 0.037750244140625, -0.034332275390625, -0.02392578125, -0.0286865234375, -0.009063720703125, 0.02252197265625, 0.02093505859375, 0.074462890625, -0.046875, -0.002536773681640625, -0.02349853515625, -0.0648193359375, -0.0401611328125, 0.0047607421875, -0.01459503173828125, -0.00592041015625, -0.00917816162109375, -0.04541015625, 0.039398193359375, 0.0711669921875, -0.0200347900390625, 0.031982421875, -0.0156097412109375, -0.01519012451171875, -0.0787353515625, 0.0262298583984375, 0.0006690025329589844, -0.015106201171875, 0.005290985107421875, 0.0210113525390625, 0.00628662109375, 0.00446319580078125, -0.0555419921875, 0.037506103515625, -0.03277587890625, -0.0302581787109375, -0.01319122314453125, 0.0007290840148925781, 0.002643585205078125, 0.020599365234375, -0.0263824462890625, 0.06146240234375, 0.027191162109375, -0.035247802734375, 0.036651611328125, 0.034515380859375, -0.026092529296875, 0.0228271484375, -0.06304931640625, 0.0005402565002441406, -0.0247955322265625, 0.031524658203125, -0.0274810791015625, -0.0274505615234375, 0.039154052734375, -0.0223388671875, 0.0088043212890625, -0.01947021484375, -0.01605224609375, -0.015625, -0.0187225341796875, 0.027618408203125, 0.0295257568359375, -0.0364990234375, 0.0665283203125, 0.0400390625, 0.0091552734375, -0.050933837890625, -0.07489013671875, -0.0258331298828125, -0.028228759765625, -0.030487060546875, 0.024658203125, -0.004547119140625, -0.014556884765625, 0.0143890380859375, -0.0162353515625, -0.00641632080078125, 0.004913330078125, 0.035858154296875, 0.0268096923828125, -0.0012578964233398438, -0.01806640625, 0.0169525146484375, -0.00868988037109375, 0.01116943359375, 0.01187896728515625, 0.039093017578125, 0.005725860595703125, -0.01708984375, -0.035614013671875, -0.002193450927734375, 0.036651611328125, 0.0087738037109375, 0.0572509765625, 
0.0149078369140625, -0.035125732421875, -0.0010194778442382812, -0.036956787109375, 0.00789642333984375, -0.033477783203125, 0.0149078369140625, -0.0166015625, -0.0311431884765625, 0.05572509765625, 0.01056671142578125, -0.0015163421630859375, 0.0703125, 0.061187744140625, 0.01390838623046875, 0.044830322265625, 0.064453125, -0.008270263671875, 0.027099609375, -0.040313720703125, -0.010040283203125, -0.08880615234375, -0.063720703125, -0.039825439453125, -0.04730224609375, -0.013275146484375, -0.01194000244140625, 0.01036834716796875, 0.0224456787109375, -0.054290771484375, 0.05511474609375, -0.029388427734375, 0.0132293701171875, 0.0428466796875, 0.020782470703125, 0.039154052734375, 0.003971099853515625, -0.00702667236328125, 0.0118865966796875, -0.0231170654296875, -0.044403076171875, 0.08770751953125, 0.01059722900390625, 0.08587646484375, 0.0274810791015625, 0.045867919921875, 0.0254058837890625, 0.044830322265625, -0.05023193359375, 0.037261962890625, 0.0157012939453125, -0.06707763671875, 0.01139068603515625, -0.033111572265625, -0.07366943359375, 0.018646240234375, -0.0030975341796875, -0.061920166015625, 0.015167236328125, -0.0006899833679199219, 0.008575439453125, 0.031402587890625, -0.054962158203125, 0.038177490234375, -0.0093536376953125, 0.020538330078125, -0.0237884521484375, -0.0301513671875, 0.051116943359375, 0.00777435302734375, 0.01447296142578125, -0.02362060546875, -0.004150390625, 0.0648193359375, -0.0345458984375, 0.0718994140625, -0.0299530029296875, -0.0111846923828125, 0.040924072265625, -0.006927490234375, 0.029815673828125, 0.018341064453125, -0.006961822509765625, 0.0235137939453125, -0.00015103816986083984, -0.0234527587890625, -0.0153656005859375, 0.03948974609375, -0.09075927734375, -0.033966064453125, -0.0372314453125, -0.041900634765625, 0.0372314453125, 0.017608642578125, 0.0086212158203125, -0.00510406494140625, 0.03387451171875, 0.0411376953125, 0.0203399658203125, -0.0228729248046875, 0.0167694091796875, 0.038909912109375, 
-0.0203399658203125, -0.0279693603515625, 0.040130615234375, 0.007266998291015625, 0.01873779296875, 0.01397705078125, 0.0272369384765625, -0.041351318359375, -0.024017333984375, -0.040679931640625, 0.03253173828125, -0.06671142578125, -0.035400390625, -0.0270538330078125, -0.00817108154296875, -0.0238800048828125, -0.0124664306640625, -0.01415252685546875, -0.04852294921875, -0.0345458984375, -0.018524169921875, 0.042083740234375, 0.07757568359375, -0.005458831787109375, 0.07562255859375, -0.0606689453125, 0.023162841796875, 0.00930023193359375, 0.0194091796875, 0.01338958740234375, -0.05035400390625, -0.005222320556640625, -0.018463134765625, -0.0477294921875, -0.06878662109375, 0.0279693603515625, -0.01727294921875, 0.048095703125, 0.0107421875, -0.009796142578125, 0.0295867919921875, -0.01357269287109375, 0.074951171875, 0.029296875, -0.053741455078125, 0.00501251220703125, -0.017303466796875, 0.0015726089477539062, 0.0194549560546875, 0.0298614501953125, -0.01136016845703125, 0.01250457763671875, -0.05340576171875, -0.0606689453125, 0.050506591796875, 0.01219940185546875, 0.0016050338745117188, 0.03521728515625, 0.02484130859375, -0.00029587745666503906, 0.039031982421875, -0.08953857421875, -0.012054443359375, -0.0273284912109375, -0.023345947265625, 0.029693603515625, -0.017120361328125, -0.0257415771484375, -0.0036487579345703125, 0.05712890625, 0.01224517822265625, -0.0015401840209960938, 0.00016033649444580078, -0.027679443359375, -0.022857666015625, 0.0008759498596191406, 0.04083251953125, 0.03521728515625, -0.0301055908203125, -0.0206756591796875, 0.023162841796875, -0.06494140625, 0.01070404052734375, 0.0179290771484375, -0.00267791748046875, -0.0311431884765625, 0.0203094482421875, 0.036346435546875, 0.030731201171875, -0.03411865234375, 0.0413818359375, 0.002391815185546875, -0.031463623046875, -0.0309906005859375, -0.0092315673828125, 0.0173492431640625, 0.04425048828125, 0.0244598388671875, -0.00933074951171875, 0.0231170654296875, 
-0.018646240234375, -0.0144500732421875, 0.0103912353515625, 0.00759124755859375, -0.035430908203125, 0.07501220703125, 0.000835418701171875, -0.0005650520324707031, 0.039642333984375, -0.0052032470703125, -0.004253387451171875, 0.059539794921875, 0.044769287109375, 0.053558349609375, -0.00688934326171875, -0.003032684326171875, 0.015655517578125, 0.02813720703125, -0.01605224609375, 0.0535888671875, 0.01065826416015625, -0.0251922607421875, -0.0292816162109375, -0.07733154296875, -0.03875732421875, 0.00403594970703125, -0.05059814453125, 0.0391845703125, -0.04180908203125, -0.0198516845703125, -0.026123046875, -0.0005321502685546875, -0.0394287109375, 0.0269012451171875, 0.0251922607421875, 0.06964111328125, -0.048126220703125, 0.0462646484375, 0.054443359375, -0.07843017578125, -0.09307861328125, -0.057830810546875, 0.01052093505859375, -0.0965576171875, 0.0504150390625, -0.0005788803100585938, -0.0150909423828125, -0.009368896484375, -0.0660400390625, -0.0928955078125, 0.10406494140625, 0.047515869140625, -0.037933349609375, -0.03753662109375, 0.007415771484375, -0.0034427642822265625, -0.00148773193359375, 0.0157623291015625, 0.0003745555877685547, 0.041351318359375, 0.02850341796875, -0.0643310546875, 0.003810882568359375, -0.004344940185546875, 0.0018157958984375, -0.00466156005859375, -0.06585693359375, 0.08050537109375, -0.01053619384765625, -0.006618499755859375, 0.0183258056640625, 0.036376953125, 0.042144775390625, 0.0010843276977539062, 0.0308074951171875, 0.059112548828125, 0.046478271484375, 0.00833892822265625, 0.08941650390625, -0.0163726806640625, 0.05322265625, 0.04364013671875, -0.0253143310546875, 0.0435791015625, 0.035064697265625, -0.029296875, 0.049591064453125, 0.038665771484375, -0.02862548828125, 0.0196075439453125, 0.04412841796875, -0.00272369384765625, -0.002147674560546875, -0.01538848876953125, -0.04974365234375, 0.021759033203125, 0.01093292236328125, -0.03265380859375, -0.0274200439453125, -0.02996826171875, -0.0007653236389160156, 
-0.0252532958984375, -0.031829833984375, 0.02459716796875, 0.0400390625, 0.01334381103515625, 0.02923583984375, 0.0021495819091796875, 0.038543701171875, -0.06097412109375, -0.0088653564453125, 0.00519561767578125, 0.01194000244140625, -0.0369873046875, -0.041290283203125, 0.0263671875, 0.0032100677490234375, -0.01265716552734375, -0.0017271041870117188, 0.038818359375, 0.01557159423828125, -0.0584716796875, 0.033599853515625, 0.00550079345703125, 0.0166015625, 0.011077880859375, -0.04840087890625, 0.024139404296875, -0.0247802734375, -0.03387451171875, 0.0215606689453125, -0.01104736328125, -0.0129241943359375, 0.07171630859375, 0.047332763671875, 0.008636474609375, 0.0265960693359375, 0.0253143310546875, 0.0765380859375, -0.03961181640625, -0.02215576171875, -0.02825927734375, 0.053436279296875, 0.005954742431640625, -0.039306640625, 0.04058837890625, 0.049835205078125, 0.07537841796875, -0.035369873046875, 0.03826904296875, -0.0194244384765625, 0.002140045166015625, -0.04052734375, 0.077392578125, -0.069091796875, -0.006435394287109375, -0.01016998291015625, -0.07452392578125, -0.0323486328125, 0.05438232421875, -0.0017652511596679688, -0.0054779052734375, 0.0313720703125, 0.038787841796875, 0.0017919540405273438, -0.00496673583984375, 0.01317596435546875, 0.00907135009765625, 0.031585693359375, 0.031402587890625, 0.04400634765625, -0.0462646484375, 0.04815673828125, -0.010955810546875, -0.0196075439453125, -0.005901336669921875, -0.06756591796875, -0.06097412109375, -0.014984130859375, 0.004314422607421875, -0.01493072509765625, -0.0306396484375, 0.06939697265625, 0.0401611328125, -0.02783203125, -0.034698486328125, 0.0117034912109375, 0.0010137557983398438, 0.00839996337890625, -0.01059722900390625, 0.0233917236328125, 0.0153656005859375, -0.0570068359375, 0.0194854736328125, 0.0006961822509765625, 0.039215087890625, -0.02716064453125, 0.0034503936767578125, -0.00154876708984375, -0.0114898681640625, 0.0263824462890625, 0.00644683837890625, -0.058380126953125, 
-0.0048370361328125, -0.0019044876098632812, 0.0001977682113647461, 0.0008234977722167969, 0.0302581787109375, -0.041595458984375, 0.01015472412109375, 0.033477783203125, 0.0308990478515625, 0.03741455078125, 0.00099945068359375, 0.0304412841796875, -0.034820556640625, 0.03839111328125, 0.0112457275390625, 0.0640869140625, 0.023681640625, -0.020263671875, 0.034393310546875, 0.0249176025390625, -0.04815673828125, -0.045867919921875, -0.0003638267517089844, -0.1131591796875, 0.01317596435546875, 0.06695556640625, -0.001445770263671875, -0.047760009765625, 0.036529541015625, -0.03485107421875, 0.02545166015625, -0.0413818359375, 0.05450439453125, 0.043060302734375, 0.0137939453125, -0.022491455078125, -0.04840087890625, 0.00811767578125, -0.009002685546875, -0.043304443359375, -0.03192138671875, 0.029388427734375, 0.03216552734375, 0.01458740234375, 0.0242767333984375, -0.044525146484375, -0.006328582763671875, 0.004756927490234375, 0.03558349609375, -0.0042266845703125, -0.01523590087890625, -0.01538848876953125, -0.010833740234375, 0.0066680908203125, -0.01953125 ] ]
sonoisa/sentence-bert-base-ja-mean-tokens-v2
2022-12-04T07:04:25.000Z
[ "sentence-transformers", "pytorch", "bert", "sentence-bert", "feature-extraction", "sentence-similarity", "ja", "license:cc-by-sa-4.0", "endpoints_compatible", "has_space", "region:us" ]
feature-extraction
sonoisa
null
null
sonoisa/sentence-bert-base-ja-mean-tokens-v2
19
116,438
sentence-transformers
2022-03-02T23:29:05
---
language: ja
license: cc-by-sa-4.0
tags:
- sentence-transformers
- sentence-bert
- feature-extraction
- sentence-similarity
---

This is a Japanese Sentence-BERT model (version 2).

It is an improved version of [version 1](https://huggingface.co/sonoisa/sentence-bert-base-ja-mean-tokens), trained with a better loss function, [MultipleNegativesRankingLoss](https://www.sbert.net/docs/package_reference/losses.html#multiplenegativesrankingloss). On a private in-house dataset, it was about 1.5 to 2 points more accurate than version 1.

It uses [cl-tohoku/bert-base-japanese-whole-word-masking](https://huggingface.co/cl-tohoku/bert-base-japanese-whole-word-masking) as the pre-trained model, so running inference requires fugashi and ipadic (pip install fugashi ipadic).

# Notes on the previous version

https://qiita.com/sonoisa/items/1df94d0a98cd4f209051

If you replace the model name there with "sonoisa/sentence-bert-base-ja-mean-tokens-v2", the code will use this model instead.

# Usage

```python
from transformers import BertJapaneseTokenizer, BertModel
import torch


class SentenceBertJapanese:
    def __init__(self, model_name_or_path, device=None):
        self.tokenizer = BertJapaneseTokenizer.from_pretrained(model_name_or_path)
        self.model = BertModel.from_pretrained(model_name_or_path)
        self.model.eval()

        if device is None:
            device = "cuda" if torch.cuda.is_available() else "cpu"
        self.device = torch.device(device)
        self.model.to(device)

    def _mean_pooling(self, model_output, attention_mask):
        token_embeddings = model_output[0]  # First element of model_output contains all token embeddings
        input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
        return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)

    @torch.no_grad()
    def encode(self, sentences, batch_size=8):
        all_embeddings = []
        iterator = range(0, len(sentences), batch_size)
        for batch_idx in iterator:
            batch = sentences[batch_idx:batch_idx + batch_size]

            encoded_input = self.tokenizer.batch_encode_plus(
                batch, padding="longest", truncation=True, return_tensors="pt"
            ).to(self.device)
            model_output = self.model(**encoded_input)
            sentence_embeddings = self._mean_pooling(model_output, encoded_input["attention_mask"]).to('cpu')

            all_embeddings.extend(sentence_embeddings)

        # return torch.stack(all_embeddings).numpy()
        return torch.stack(all_embeddings)


MODEL_NAME = "sonoisa/sentence-bert-base-ja-mean-tokens-v2"  # <- This is v2.
model = SentenceBertJapanese(MODEL_NAME)

sentences = ["暴走したAI", "暴走した人工知能"]
sentence_embeddings = model.encode(sentences, batch_size=8)

print("Sentence embeddings:", sentence_embeddings)
```
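Mean-pooled sentence embeddings like the ones returned by `encode` in the card above are typically compared with cosine similarity. Below is a minimal, dependency-free sketch of that comparison step; the vectors are made-up toy values, not real model output:

```python
import math


def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


# Toy 4-dimensional "sentence embeddings" (illustrative values only).
emb_runaway_ai = [0.12, -0.34, 0.56, 0.07]    # e.g. "暴走したAI"
emb_runaway_agi = [0.10, -0.30, 0.60, 0.05]   # e.g. "暴走した人工知能"
emb_unrelated = [-0.50, 0.40, -0.10, 0.20]    # an unrelated sentence

sim_close = cosine_similarity(emb_runaway_ai, emb_runaway_agi)
sim_far = cosine_similarity(emb_runaway_ai, emb_unrelated)

print(f"similar pair:   {sim_close:.3f}")
print(f"unrelated pair: {sim_far:.3f}")
```

With real embeddings from the model, the paraphrase pair should score noticeably higher than the unrelated pair, which is the property MultipleNegativesRankingLoss optimizes for.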
2,758
[ [ -0.017852783203125, -0.069580078125, 0.02001953125, 0.030364990234375, -0.03521728515625, -0.02825927734375, -0.0284423828125, -0.011138916015625, 0.0225677490234375, 0.03167724609375, -0.050048828125, -0.031280517578125, -0.05889892578125, -0.002643585205078125, -0.01137542724609375, 0.0701904296875, -0.018035888671875, 0.01195526123046875, 0.01070404052734375, -0.002593994140625, -0.04150390625, -0.0323486328125, -0.047576904296875, -0.032135009765625, 0.0172119140625, 0.01102447509765625, 0.026397705078125, 0.018890380859375, 0.025665283203125, 0.0266265869140625, 0.01067352294921875, 0.003570556640625, -0.022857666015625, -0.006061553955078125, 0.0164337158203125, -0.040771484375, -0.00037479400634765625, 0.00946044921875, 0.042694091796875, 0.02874755859375, 0.00791168212890625, 0.007518768310546875, -0.0005865097045898438, 0.032012939453125, -0.02978515625, 0.0213623046875, -0.03759765625, 0.001590728759765625, -0.005290985107421875, -0.005596160888671875, -0.04248046875, -0.02142333984375, -0.004459381103515625, -0.038543701171875, 0.0202484130859375, -0.00128936767578125, 0.09539794921875, 0.01389312744140625, -0.01335906982421875, -0.030303955078125, -0.00989532470703125, 0.07110595703125, -0.07403564453125, 0.0240020751953125, 0.02410888671875, -0.011566162109375, -0.0105133056640625, -0.0552978515625, -0.056304931640625, 0.0010404586791992188, -0.0175933837890625, 0.020904541015625, -0.009735107421875, 0.0025844573974609375, 0.01464080810546875, 0.00469207763671875, -0.04693603515625, -0.005123138427734375, -0.038055419921875, -0.019500732421875, 0.04150390625, -0.008392333984375, 0.035614013671875, -0.0467529296875, -0.03021240234375, -0.0193939208984375, -0.0240325927734375, 0.0087738037109375, 0.0244293212890625, 0.024566650390625, -0.0240936279296875, 0.053558349609375, -0.0002994537353515625, 0.03485107421875, 0.0014638900756835938, -0.01898193359375, 0.0364990234375, -0.0155029296875, -0.02459716796875, 0.01128387451171875, 0.06597900390625, 
0.03521728515625, 0.025054931640625, -0.004627227783203125, -0.0028476715087890625, 0.00962066650390625, -0.010589599609375, -0.0634765625, -0.0194854736328125, 0.0157012939453125, -0.045074462890625, -0.0203857421875, 0.01316070556640625, -0.050323486328125, -0.005580902099609375, 0.01367950439453125, 0.061676025390625, -0.0499267578125, -0.0038928985595703125, 0.02783203125, -0.025390625, 0.0191192626953125, -0.01212310791015625, -0.064453125, 0.0060272216796875, 0.02874755859375, 0.06256103515625, 0.0197906494140625, -0.02734375, -0.0103607177734375, -0.0004730224609375, -0.0107574462890625, 0.0179290771484375, -0.0259246826171875, -0.02532958984375, -0.014129638671875, 0.01122283935546875, -0.040130615234375, -0.0155181884765625, 0.043609619140625, -0.028594970703125, 0.05126953125, -0.00829315185546875, -0.05242919921875, -0.0300140380859375, 0.01357269287109375, -0.033355712890625, 0.07379150390625, 0.001102447509765625, -0.0770263671875, 0.00965118408203125, -0.04632568359375, -0.034515380859375, -0.0164794921875, -0.002471923828125, -0.051849365234375, 0.0025081634521484375, 0.0369873046875, 0.050079345703125, -0.00342559814453125, 0.0165252685546875, -0.01351165771484375, -0.0275421142578125, 0.0296478271484375, -0.0261688232421875, 0.1009521484375, 0.01049041748046875, -0.040802001953125, 0.005657196044921875, -0.05169677734375, 0.0003361701965332031, 0.023529052734375, -0.0200653076171875, -0.0195159912109375, -0.01001739501953125, 0.01580810546875, 0.02154541015625, 0.0225067138671875, -0.062744140625, 0.00901031494140625, -0.033172607421875, 0.060516357421875, 0.06964111328125, 0.0179443359375, 0.0084381103515625, -0.0304718017578125, 0.01357269287109375, 0.018218994140625, -0.001148223876953125, -0.01015472412109375, -0.0423583984375, -0.08416748046875, -0.0309600830078125, 0.0160980224609375, 0.03668212890625, -0.064697265625, 0.047943115234375, -0.017974853515625, -0.03399658203125, -0.054229736328125, -0.006290435791015625, 0.017852783203125, 
0.0333251953125, 0.024749755859375, -0.0123748779296875, -0.0562744140625, -0.07135009765625, -0.005931854248046875, -0.0128173828125, 0.006679534912109375, 0.0223846435546875, 0.052734375, -0.0208282470703125, 0.0438232421875, -0.041839599609375, -0.025543212890625, -0.0290985107421875, 0.02313232421875, 0.04718017578125, 0.05682373046875, 0.0283203125, -0.043304443359375, -0.0250244140625, -0.0269622802734375, -0.054534912109375, 0.0042724609375, -0.037017822265625, -0.02532958984375, 0.020782470703125, 0.019775390625, -0.06365966796875, 0.0263214111328125, 0.0231170654296875, -0.0380859375, 0.028778076171875, -0.013916015625, 0.0138092041015625, -0.09979248046875, 0.0062713623046875, -0.004711151123046875, -0.0033130645751953125, -0.036041259765625, 0.0155792236328125, 0.002620697021484375, 0.0030460357666015625, -0.033050537109375, 0.037750244140625, -0.037689208984375, 0.011566162109375, 0.004016876220703125, 0.0185546875, 0.00836944580078125, 0.0648193359375, -0.004108428955078125, 0.059173583984375, 0.0443115234375, -0.0438232421875, 0.036102294921875, 0.025421142578125, -0.0300140380859375, 0.00759124755859375, -0.062408447265625, -0.0034351348876953125, -0.01068115234375, 0.0117645263671875, -0.08154296875, -0.00995635986328125, 0.030670166015625, -0.05914306640625, 0.01224517822265625, 0.0261688232421875, -0.050445556640625, -0.043609619140625, -0.04119873046875, 0.010986328125, 0.036590576171875, -0.03802490234375, 0.035400390625, 0.016021728515625, 0.0007243156433105469, -0.0501708984375, -0.08270263671875, -0.004852294921875, -0.014495849609375, -0.050628662109375, 0.0283203125, -0.0052337646484375, 0.0164337158203125, -0.00209808349609375, 0.0086669921875, -0.006256103515625, 0.00826263427734375, 0.002605438232421875, 0.03179931640625, -0.018798828125, 0.007350921630859375, -0.004192352294921875, 0.00710296630859375, 0.013519287109375, -0.00273895263671875, 0.060577392578125, -0.005275726318359375, -0.01374053955078125, -0.04278564453125, 
0.023712158203125, 0.024078369140625, -0.0042572021484375, 0.0645751953125, 0.07666015625, -0.039886474609375, 0.0010509490966796875, -0.043060302734375, -0.0213623046875, -0.0350341796875, 0.04486083984375, -0.040771484375, -0.050384521484375, 0.04962158203125, 0.034027099609375, 0.004367828369140625, 0.053070068359375, 0.0521240234375, -0.005855560302734375, 0.07281494140625, 0.038543701171875, -0.007747650146484375, 0.038482666015625, -0.04608154296875, 0.035003662109375, -0.07159423828125, -0.01788330078125, -0.017974853515625, -0.0255279541015625, -0.054962158203125, -0.0211639404296875, 0.0251007080078125, 0.0011959075927734375, -0.0219268798828125, 0.037109375, -0.04888916015625, 0.018463134765625, 0.06005859375, 0.038238525390625, -0.007045745849609375, -0.00936126708984375, -0.01374053955078125, -0.014251708984375, -0.047882080078125, -0.0303192138671875, 0.06805419921875, 0.0338134765625, 0.048431396484375, 0.00762939453125, 0.0633544921875, -0.004001617431640625, -0.00298309326171875, -0.052398681640625, 0.054046630859375, -0.01255035400390625, -0.0552978515625, -0.019775390625, -0.0252685546875, -0.0704345703125, 0.025115966796875, -0.006305694580078125, -0.07183837890625, 0.00595855712890625, -0.031768798828125, -0.0210723876953125, 0.028778076171875, -0.0577392578125, 0.07012939453125, -0.0059661865234375, -0.017852783203125, -0.01300811767578125, -0.05230712890625, 0.035308837890625, 0.0182037353515625, 0.00417327880859375, -0.0116119384765625, 0.00447845458984375, 0.0960693359375, -0.02001953125, 0.067626953125, -0.01183319091796875, 0.017059326171875, 0.02325439453125, -0.0164031982421875, 0.0243988037109375, -0.00249481201171875, -0.0012493133544921875, -0.00634765625, -0.0046539306640625, -0.0352783203125, -0.02740478515625, 0.053558349609375, -0.08428955078125, -0.03558349609375, -0.03997802734375, -0.040985107421875, 0.0018444061279296875, 0.034912109375, 0.045989990234375, 0.031341552734375, 0.007442474365234375, 0.0252685546875, 
0.049163818359375, -0.0292816162109375, 0.058929443359375, 0.01082611083984375, -0.0009984970092773438, -0.034637451171875, 0.06195068359375, 0.0144195556640625, 0.0015172958374023438, 0.029083251953125, 0.00951385498046875, -0.021148681640625, -0.01084136962890625, -0.0267486572265625, 0.0443115234375, -0.0533447265625, -0.009857177734375, -0.066162109375, -0.0301513671875, -0.053680419921875, -0.017364501953125, -0.00971221923828125, -0.0311737060546875, -0.040283203125, -0.012908935546875, 0.0239105224609375, 0.0250396728515625, -0.0009431838989257812, 0.0276336669921875, -0.0445556640625, 0.031280517578125, 0.01873779296875, 0.01251983642578125, 0.002941131591796875, -0.04840087890625, -0.0226593017578125, 0.00347900390625, -0.0340576171875, -0.0751953125, 0.04449462890625, 0.0065460205078125, 0.047943115234375, 0.017669677734375, 0.01146697998046875, 0.05322265625, -0.0195159912109375, 0.07415771484375, 0.02984619140625, -0.08489990234375, 0.042449951171875, -0.006237030029296875, 0.0275421142578125, 0.0210418701171875, 0.0169677734375, -0.0509033203125, -0.035797119140625, -0.061492919921875, -0.07354736328125, 0.0655517578125, 0.0175323486328125, 0.035308837890625, -0.01390838623046875, 0.00308990478515625, -0.0020771026611328125, 0.0113067626953125, -0.0850830078125, -0.03192138671875, -0.0290374755859375, -0.036865234375, -0.0034656524658203125, -0.0205535888671875, 0.001667022705078125, -0.0235443115234375, 0.0899658203125, 0.0201416015625, 0.058624267578125, 0.0302581787109375, -0.023590087890625, 0.00984954833984375, 0.01358795166015625, 0.051116943359375, 0.018035888671875, -0.0229949951171875, 0.016387939453125, 0.01103973388671875, -0.030364990234375, 0.0001493692398071289, 0.013671875, -0.0161285400390625, 0.030120849609375, 0.04217529296875, 0.08221435546875, 0.01776123046875, -0.03485107421875, 0.04901123046875, 0.0011758804321289062, -0.017303466796875, -0.0158843994140625, -0.005901336669921875, 0.00876617431640625, 0.008575439453125, 
0.0243682861328125, -0.01088714599609375, 0.00870513916015625, -0.039031982421875, 0.0197296142578125, 0.02301025390625, -0.0067901611328125, -0.00011795759201049805, 0.055084228515625, 0.00399017333984375, -0.0008230209350585938, 0.052490234375, -0.0003714561462402344, -0.0655517578125, 0.04986572265625, 0.050384521484375, 0.07647705078125, 0.004863739013671875, 0.01096343994140625, 0.053466796875, 0.032379150390625, 0.00908660888671875, 0.0239410400390625, 0.004596710205078125, -0.06292724609375, -0.0086517333984375, -0.03515625, 0.0002046823501586914, 0.006816864013671875, -0.046600341796875, 0.020263671875, -0.03326416015625, -0.016204833984375, 0.005939483642578125, 0.0153961181640625, -0.03851318359375, 0.020538330078125, 0.00960540771484375, 0.0516357421875, -0.07305908203125, 0.06695556640625, 0.038482666015625, -0.0487060546875, -0.06781005859375, -0.01001739501953125, -0.0311279296875, -0.08270263671875, 0.0477294921875, 0.03472900390625, 0.00684356689453125, 0.0034618377685546875, -0.039398193359375, -0.064453125, 0.0699462890625, 0.01166534423828125, -0.018218994140625, -0.00868988037109375, -0.00846099853515625, 0.0267181396484375, -0.016815185546875, 0.0212860107421875, 0.033203125, 0.032196044921875, -0.004962921142578125, -0.05255126953125, 0.019134521484375, -0.0306243896484375, 0.016387939453125, -0.00246429443359375, -0.057403564453125, 0.087158203125, -0.0188446044921875, -0.0161285400390625, 0.0034275054931640625, 0.05767822265625, 0.0265655517578125, 0.00244903564453125, 0.0190887451171875, 0.047149658203125, 0.04217529296875, -0.0120697021484375, 0.063720703125, -0.0232086181640625, 0.05279541015625, 0.060516357421875, 0.022552490234375, 0.060577392578125, 0.045562744140625, -0.0189056396484375, 0.04266357421875, 0.051910400390625, -0.0146942138671875, 0.05859375, 0.008636474609375, -0.006488800048828125, -0.0012187957763671875, 0.004627227783203125, -0.02978515625, 0.0318603515625, 0.01023101806640625, -0.043609619140625, 
0.00023162364959716797, 0.0210418701171875, 0.0294647216796875, -0.0153961181640625, -0.01068115234375, 0.04766845703125, 0.003101348876953125, -0.051300048828125, 0.05474853515625, 0.0265045166015625, 0.08026123046875, -0.037078857421875, 0.0199432373046875, -0.0035419464111328125, 0.02117919921875, -0.00534820556640625, -0.060302734375, 0.0160675048828125, -0.003833770751953125, -0.0152587890625, -0.008392333984375, 0.03564453125, -0.04620361328125, -0.06060791015625, 0.0162353515625, 0.0104522705078125, 0.0186614990234375, 0.0265960693359375, -0.0701904296875, 0.01397705078125, 0.01971435546875, -0.039520263671875, -0.004638671875, 0.0191650390625, 0.0184173583984375, 0.0372314453125, 0.032623291015625, -0.0016155242919921875, 0.0289306640625, 0.0053253173828125, 0.057891845703125, -0.044525146484375, -0.04510498046875, -0.07696533203125, 0.035186767578125, -0.00948333740234375, -0.033172607421875, 0.058074951171875, 0.038421630859375, 0.0648193359375, -0.0184478759765625, 0.05682373046875, -0.0215911865234375, 0.022247314453125, -0.0516357421875, 0.045074462890625, -0.037261962890625, -0.005397796630859375, -0.0176544189453125, -0.053436279296875, -0.0236053466796875, 0.07940673828125, -0.02020263671875, 0.00716400146484375, 0.05987548828125, 0.0594482421875, 0.006694793701171875, -0.00745391845703125, 0.01215362548828125, 0.0287322998046875, 0.0184173583984375, 0.0662841796875, 0.018096923828125, -0.0712890625, 0.0295867919921875, -0.045562744140625, -0.0008993148803710938, -0.0187835693359375, -0.04388427734375, -0.06854248046875, -0.055908203125, -0.031097412109375, -0.033447265625, -0.00481414794921875, 0.0775146484375, 0.0513916015625, -0.07183837890625, -0.0124053955078125, -0.009490966796875, 0.01183319091796875, -0.01500701904296875, -0.024505615234375, 0.04486083984375, -0.0285797119140625, -0.062164306640625, -0.00341796875, -0.0137481689453125, 0.0097503662109375, -0.008392333984375, 0.0009522438049316406, -0.0430908203125, 0.0217132568359375, 
0.03424072265625, -0.01143646240234375, -0.06805419921875, -0.0255889892578125, 0.0025577545166015625, -0.032073974609375, -0.0001169443130493164, 0.027130126953125, -0.035430908203125, 0.039886474609375, 0.04583740234375, 0.033050537109375, 0.0523681640625, -0.02252197265625, 0.023040771484375, -0.070556640625, 0.02313232421875, 0.00954437255859375, 0.048736572265625, 0.029144287109375, -0.033447265625, 0.037841796875, 0.028656005859375, -0.02325439453125, -0.061370849609375, -0.0124664306640625, -0.08111572265625, -0.033966064453125, 0.07904052734375, -0.017242431640625, -0.027008056640625, 0.0261993408203125, -0.029815673828125, 0.05279541015625, -0.021942138671875, 0.058074951171875, 0.07745361328125, -0.006256103515625, -0.0305023193359375, -0.026123046875, -0.0011129379272460938, 0.038482666015625, -0.03179931640625, -0.010009765625, 0.0114288330078125, 0.0272216796875, 0.0216522216796875, 0.048492431640625, 0.003818511962890625, 0.0194091796875, 0.0126495361328125, 0.0242156982421875, -0.00571441650390625, 0.004215240478515625, -0.013214111328125, 0.00005453824996948242, -0.0202789306640625, -0.05963134765625 ] ]
bennyguo/zero123-xl-diffusers
2023-08-25T04:43:32.000Z
[ "diffusers", "arxiv:2303.11328", "license:mit", "has_space", "diffusers:Zero123Pipeline", "region:us" ]
null
bennyguo
null
null
bennyguo/zero123-xl-diffusers
5
116,375
diffusers
2023-08-23T13:37:15
---
license: mit
---

# Uses

_Note: This section is originally taken from the [Stable Diffusion v2 model card](https://huggingface.co/stabilityai/stable-diffusion-2), but applies in the same way to Zero-1-to-3._

## Direct Use
The model is intended for research purposes only. Possible research areas and tasks include:

- Safe deployment of large-scale models.
- Probing and understanding the limitations and biases of generative models.
- Generation of artworks and use in design and other artistic processes.
- Applications in educational or creative tools.
- Research on generative models.

Excluded uses are described below.

### Misuse, Malicious Use, and Out-of-Scope Use
The model should not be used to intentionally create or disseminate images that create hostile or alienating environments for people. This includes generating images that people would foreseeably find disturbing, distressing, or offensive; or content that propagates historical or current stereotypes.

#### Out-of-Scope Use
The model was not trained to be factual or true representations of people or events, and therefore using the model to generate such content is out-of-scope for the abilities of this model.

#### Misuse and Malicious Use
Using the model to generate content that is cruel to individuals is a misuse of this model. This includes, but is not limited to:

- Generating demeaning, dehumanizing, or otherwise harmful representations of people or their environments, cultures, religions, etc.
- Intentionally promoting or propagating discriminatory content or harmful stereotypes.
- Impersonating individuals without their consent.
- Sexual content without consent of the people who might see it.
- Mis- and disinformation
- Representations of egregious violence and gore
- Sharing of copyrighted or licensed material in violation of its terms of use.
- Sharing content that is an alteration of copyrighted or licensed material in violation of its terms of use.

## Limitations and Bias

### Limitations
- The model does not achieve perfect photorealism.
- The model cannot render legible text.
- Faces and people in general may not be parsed or generated properly.
- The autoencoding part of the model is lossy.
- Stable Diffusion was trained on a subset of the large-scale dataset [LAION-5B](https://laion.ai/blog/laion-5b/), which contains adult, violent and sexual content. To partially mitigate this, Stability AI has filtered the dataset using LAION's NSFW detector.
- Zero-1-to-3 was subsequently finetuned on a subset of the large-scale dataset [Objaverse](https://objaverse.allenai.org/), which might also potentially contain inappropriate content. To partially mitigate this, our demo applies a safety check to every uploaded image.

### Bias
While the capabilities of image generation models are impressive, they can also reinforce or exacerbate social biases. Stable Diffusion was primarily trained on subsets of [LAION-2B(en)](https://laion.ai/blog/laion-5b/), which consists of images that are limited to English descriptions. Images and concepts from communities and cultures that use other languages are likely to be insufficiently accounted for. This affects the overall output of the model, as Western cultures are often overrepresented. Stable Diffusion mirrors and exacerbates biases to such a degree that viewer discretion must be advised irrespective of the input or its intent.

### Safety Module
The intended use of this model is with the [Safety Checker](https://github.com/huggingface/diffusers/blob/main/src/diffusers/pipelines/stable_diffusion/safety_checker.py) in Diffusers. This checker works by checking model inputs against known hard-coded NSFW concepts. Specifically, the checker compares the class probability of harmful concepts in the embedding space of the uploaded input images. The concepts are passed into the model with the image and compared to a hand-engineered weight for each NSFW concept.

## Citation
```
@misc{liu2023zero1to3,
    title={Zero-1-to-3: Zero-shot One Image to 3D Object},
    author={Ruoshi Liu and Rundi Wu and Basile Van Hoorick and Pavel Tokmakov and Sergey Zakharov and Carl Vondrick},
    year={2023},
    eprint={2303.11328},
    archivePrefix={arXiv},
    primaryClass={cs.CV}
}
```
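The safety-module mechanism described in the card above (comparing an image's embedding against hand-tuned, per-concept thresholds) can be sketched in plain Python. The concept vectors and thresholds below are made-up placeholders; the real diffusers checker uses CLIP embeddings of hard-coded NSFW concepts:

```python
import math


def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))


# Hypothetical concept embeddings paired with hand-engineered thresholds
# (placeholder values; illustrative of the mechanism only).
CONCEPTS = {
    "concept_a": ([0.9, 0.1, 0.0], 0.8),
    "concept_b": ([0.0, 1.0, 0.0], 0.7),
}


def is_flagged(image_embedding):
    """Flag an image if its similarity to any concept exceeds that concept's threshold."""
    return any(
        cosine(image_embedding, vec) > threshold
        for vec, threshold in CONCEPTS.values()
    )


print(is_flagged([0.95, 0.05, 0.0]))  # near concept_a, above its threshold
print(is_flagged([0.0, 0.0, 1.0]))    # far from every concept
```

The per-concept thresholds are what the card calls "a hand-engineered weight for each NSFW concept": raising a threshold makes the checker more permissive for that concept, lowering it makes it stricter.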
4,272
[ [ -0.026092529296875, -0.07232666015625, 0.026824951171875, 0.0210723876953125, -0.02294921875, -0.047607421875, 0.0246734619140625, -0.040008544921875, -0.01380157470703125, 0.04388427734375, -0.0222625732421875, -0.036346435546875, -0.050323486328125, -0.00677490234375, -0.0516357421875, 0.07159423828125, -0.007476806640625, -0.006389617919921875, -0.017181396484375, -0.00820159912109375, -0.020416259765625, -0.01253509521484375, -0.0609130859375, -0.01641845703125, 0.03228759765625, 0.01256561279296875, 0.0626220703125, 0.042449951171875, 0.03155517578125, 0.018157958984375, -0.017333984375, -0.0224456787109375, -0.06707763671875, 0.00007456541061401367, -0.0121612548828125, -0.01525115966796875, -0.039520263671875, 0.014739990234375, 0.044158935546875, 0.026611328125, -0.0130462646484375, 0.00720977783203125, -0.007656097412109375, 0.04229736328125, -0.052398681640625, -0.0230560302734375, -0.0198211669921875, 0.0263214111328125, -0.02923583984375, 0.0234527587890625, -0.0262603759765625, -0.0211181640625, 0.0015354156494140625, -0.050323486328125, 0.0215911865234375, -0.0269927978515625, 0.07989501953125, 0.020477294921875, -0.033294677734375, 0.00649261474609375, -0.0609130859375, 0.044891357421875, -0.04632568359375, 0.0218353271484375, 0.047607421875, -0.001354217529296875, -0.00010311603546142578, -0.053009033203125, -0.045623779296875, -0.00080108642578125, 0.0060882568359375, 0.02294921875, -0.0276947021484375, -0.01030731201171875, 0.038543701171875, 0.006984710693359375, -0.045654296875, -0.002590179443359375, -0.050262451171875, -0.0029296875, 0.051544189453125, -0.000545501708984375, 0.031707763671875, -0.0167999267578125, -0.04144287109375, -0.00251007080078125, -0.047576904296875, -0.00970458984375, 0.031494140625, -0.03021240234375, -0.024810791015625, 0.043731689453125, -0.0023746490478515625, 0.0391845703125, 0.00701141357421875, -0.01450347900390625, 0.0265350341796875, -0.017181396484375, -0.01513671875, -0.032501220703125, 0.06292724609375, 
0.06268310546875, 0.00907135009765625, 0.004383087158203125, -0.001201629638671875, 0.0078582763671875, 0.0279388427734375, -0.08636474609375, -0.01332855224609375, 0.00991058349609375, -0.048095703125, -0.047943115234375, -0.00437164306640625, -0.0684814453125, -0.024566650390625, 0.02178955078125, 0.03082275390625, -0.0237579345703125, -0.05255126953125, 0.00862884521484375, -0.033447265625, 0.0113677978515625, 0.0244293212890625, -0.03839111328125, 0.01483154296875, 0.01061248779296875, 0.0797119140625, -0.0193634033203125, -0.0054473876953125, 0.0057830810546875, 0.0162811279296875, -0.03265380859375, 0.042449951171875, -0.0154266357421875, -0.0443115234375, -0.004680633544921875, 0.022430419921875, 0.0291290283203125, -0.03570556640625, 0.046875, -0.033935546875, 0.01551055908203125, -0.0038299560546875, -0.023193359375, -0.025787353515625, 0.0047760009765625, -0.053863525390625, 0.070068359375, 0.0186920166015625, -0.0679931640625, 0.0169830322265625, -0.048187255859375, -0.0274658203125, -0.003902435302734375, 0.00792694091796875, -0.0496826171875, -0.01416778564453125, -0.018646240234375, 0.037689208984375, 0.004917144775390625, 0.0193023681640625, -0.041351318359375, -0.0229949951171875, -0.0005965232849121094, -0.042205810546875, 0.08203125, 0.032501220703125, -0.032684326171875, -0.0002865791320800781, -0.040985107421875, -0.030029296875, 0.041168212890625, -0.0054473876953125, -0.0307464599609375, -0.0004458427429199219, 0.0179290771484375, 0.0195770263671875, -0.003604888916015625, -0.039642333984375, -0.01555633544921875, -0.00807952880859375, 0.023040771484375, 0.052398681640625, 0.0211181640625, 0.04766845703125, -0.0391845703125, 0.042877197265625, 0.02337646484375, 0.03302001953125, -0.0019931793212890625, -0.06640625, -0.0440673828125, -0.00640869140625, -0.00240325927734375, 0.036285400390625, -0.059295654296875, 0.010650634765625, 0.015350341796875, -0.042388916015625, -0.00753021240234375, 0.0031948089599609375, 0.0240478515625, 
0.04718017578125, 0.0204620361328125, -0.0265655517578125, -0.019195556640625, -0.05572509765625, 0.0281219482421875, -0.00896453857421875, 0.002880096435546875, 0.015106201171875, 0.044830322265625, -0.0244293212890625, 0.041748046875, -0.03570556640625, -0.02032470703125, 0.0169830322265625, 0.01143646240234375, 0.0006260871887207031, 0.05352783203125, 0.066650390625, -0.08349609375, -0.036651611328125, -0.0173187255859375, -0.0870361328125, 0.0036220550537109375, 0.0017719268798828125, -0.034912109375, 0.0193328857421875, 0.026458740234375, -0.0487060546875, 0.05291748046875, 0.039093017578125, -0.032562255859375, 0.029876708984375, -0.0140228271484375, -0.00347137451171875, -0.06988525390625, 0.007144927978515625, 0.03887939453125, -0.02642822265625, -0.06219482421875, 0.034088134765625, -0.0006971359252929688, -0.0104827880859375, -0.06451416015625, 0.052581787109375, -0.021636962890625, 0.039154052734375, -0.02313232421875, 0.01297760009765625, 0.007556915283203125, 0.00701141357421875, 0.0227508544921875, 0.0401611328125, 0.06005859375, -0.0487060546875, 0.00048542022705078125, 0.01303863525390625, -0.01201629638671875, 0.04803466796875, -0.059906005859375, 0.01297760009765625, -0.0374755859375, 0.016265869140625, -0.05877685546875, -0.0182037353515625, 0.04962158203125, -0.021270751953125, 0.02508544921875, -0.01213836669921875, -0.024169921875, -0.03045654296875, -0.035369873046875, 0.03851318359375, 0.061004638671875, -0.022979736328125, 0.0232086181640625, 0.0428466796875, 0.0179901123046875, -0.049346923828125, -0.053314208984375, -0.01776123046875, -0.036102294921875, -0.062042236328125, 0.018890380859375, -0.01308441162109375, -0.0223846435546875, 0.010528564453125, -0.00513458251953125, -0.005336761474609375, 0.015716552734375, 0.034820556640625, 0.0216827392578125, 0.006755828857421875, -0.018768310546875, 0.0113525390625, -0.0018205642700195312, 0.010406494140625, -0.003444671630859375, 0.01861572265625, 0.010589599609375, -0.01456451416015625, 
-0.03662109375, 0.03955078125, 0.04156494140625, 0.01708984375, 0.057373046875, 0.07574462890625, -0.05194091796875, -0.003353118896484375, -0.0310211181640625, -0.0087890625, -0.0360107421875, 0.028656005859375, -0.00760650634765625, -0.050262451171875, 0.043975830078125, 0.007625579833984375, -0.00624847412109375, 0.0438232421875, 0.044891357421875, -0.01258087158203125, 0.08099365234375, 0.050872802734375, 0.01519775390625, 0.04046630859375, -0.050567626953125, 0.00030922889709472656, -0.072021484375, -0.0299072265625, -0.024017333984375, -0.0137481689453125, -0.039581298828125, -0.044525146484375, 0.023681640625, 0.031494140625, -0.0274658203125, 0.0164794921875, -0.036590576171875, 0.037200927734375, 0.01293182373046875, 0.02581787109375, 0.0021800994873046875, 0.00395965576171875, 0.00881195068359375, -0.000046372413635253906, -0.049041748046875, -0.03875732421875, 0.0745849609375, 0.047210693359375, 0.07666015625, 0.0261688232421875, 0.03363037109375, 0.0355224609375, 0.05126953125, -0.044952392578125, 0.043121337890625, -0.03271484375, -0.07489013671875, -0.015380859375, -0.0268096923828125, -0.059600830078125, 0.01776123046875, -0.0128936767578125, -0.050872802734375, 0.043121337890625, 0.00824737548828125, -0.0090484619140625, 0.0283660888671875, -0.045501708984375, 0.06329345703125, -0.004299163818359375, -0.062347412109375, -0.00843048095703125, -0.046966552734375, 0.0478515625, -0.01128387451171875, 0.023681640625, -0.01410675048828125, 0.00634765625, 0.05950927734375, -0.0177764892578125, 0.0806884765625, -0.0277099609375, -0.0076446533203125, 0.0284576416015625, 0.0161285400390625, 0.0138702392578125, 0.0240478515625, -0.0223236083984375, 0.052337646484375, 0.00531768798828125, -0.0248260498046875, -0.006809234619140625, 0.0543212890625, -0.07916259765625, -0.045501708984375, -0.037384033203125, -0.0198211669921875, 0.038848876953125, 0.0292510986328125, 0.049041748046875, 0.018157958984375, -0.0222015380859375, 0.0169830322265625, 0.060455322265625, 
-0.0195159912109375, 0.021820068359375, 0.030029296875, -0.037841796875, -0.023834228515625, 0.05218505859375, 0.00785064697265625, 0.040313720703125, -0.01285552978515625, 0.02093505859375, -0.0161285400390625, -0.0306396484375, -0.04150390625, 0.01328277587890625, -0.0672607421875, -0.020538330078125, -0.051361083984375, -0.0297088623046875, -0.030181884765625, -0.014190673828125, -0.00982666015625, -0.016448974609375, -0.054443359375, -0.01352691650390625, 0.01727294921875, 0.04937744140625, -0.035491943359375, 0.0156402587890625, -0.0277252197265625, 0.041656494140625, 0.01568603515625, 0.032073974609375, -0.000423431396484375, -0.040313720703125, -0.0007381439208984375, 0.01067352294921875, -0.04229736328125, -0.07708740234375, 0.013427734375, 0.00341033935546875, 0.042694091796875, 0.065185546875, 0.004474639892578125, 0.032440185546875, -0.0291290283203125, 0.0810546875, 0.02215576171875, -0.0506591796875, 0.04705810546875, -0.047119140625, 0.0092315673828125, 0.0230865478515625, 0.060577392578125, -0.0260467529296875, -0.0275726318359375, -0.051727294921875, -0.067138671875, 0.059173583984375, 0.034027099609375, 0.035369873046875, -0.0194549560546875, 0.050628662109375, -0.0033168792724609375, -0.011077880859375, -0.08807373046875, -0.042694091796875, -0.040313720703125, 0.01934814453125, 0.0146942138671875, -0.038238525390625, -0.0276031494140625, -0.027069091796875, 0.07073974609375, 0.006542205810546875, 0.0274200439453125, 0.01678466796875, 0.0070953369140625, -0.0482177734375, -0.0227508544921875, 0.04327392578125, 0.03387451171875, -0.0295562744140625, 0.001728057861328125, -0.0026493072509765625, -0.04681396484375, 0.0186614990234375, -0.0017385482788085938, -0.0584716796875, -0.005764007568359375, -0.007503509521484375, 0.059814453125, -0.01605224609375, -0.024688720703125, 0.047943115234375, -0.015960693359375, -0.031707763671875, -0.034088134765625, 0.0205535888671875, -0.0056304931640625, 0.00928497314453125, 0.00470733642578125, 0.0396728515625, 
0.024444580078125, -0.03582763671875, 0.01285552978515625, 0.040130615234375, -0.027130126953125, -0.0202178955078125, 0.08489990234375, 0.007114410400390625, -0.0275421142578125, 0.040985107421875, -0.034942626953125, -0.01104736328125, 0.05035400390625, 0.06475830078125, 0.06005859375, -0.0201416015625, 0.053863525390625, 0.05291748046875, 0.026824951171875, -0.033660888671875, 0.01015472412109375, 0.006198883056640625, -0.07550048828125, -0.006107330322265625, -0.036407470703125, -0.007808685302734375, 0.0269012451171875, -0.05206298828125, 0.02734375, -0.04498291015625, -0.037933349609375, -0.0031261444091796875, -0.040069580078125, -0.038909912109375, 0.00971221923828125, 0.0307464599609375, 0.0552978515625, -0.08294677734375, 0.0552978515625, 0.05364990234375, -0.04840087890625, -0.03717041015625, 0.007335662841796875, 0.0120086669921875, -0.026214599609375, 0.0296478271484375, 0.007587432861328125, -0.01145172119140625, -0.009002685546875, -0.0694580078125, -0.072265625, 0.080078125, 0.0175628662109375, -0.01404571533203125, 0.007625579833984375, -0.0084381103515625, 0.044036865234375, -0.0113677978515625, 0.01641845703125, 0.0160064697265625, 0.0303497314453125, 0.0287322998046875, -0.028533935546875, 0.018463134765625, -0.035064697265625, 0.031890869140625, -0.00872039794921875, -0.06182861328125, 0.06768798828125, -0.017547607421875, -0.0276641845703125, 0.0179443359375, 0.034027099609375, 0.019439697265625, 0.024200439453125, 0.039154052734375, 0.058319091796875, 0.032501220703125, -0.005870819091796875, 0.08624267578125, 0.001377105712890625, 0.033355712890625, 0.059600830078125, -0.0211029052734375, 0.048980712890625, 0.0223388671875, -0.00665283203125, 0.03875732421875, 0.0379638671875, -0.0203094482421875, 0.057830810546875, -0.000002682209014892578, -0.015533447265625, -0.006984710693359375, -0.01336669921875, -0.0396728515625, 0.01136016845703125, 0.031768798828125, -0.017974853515625, -0.0185699462890625, 0.0164794921875, 0.0164947509765625, 
-0.01861572265625, 0.0045166015625, 0.04498291015625, 0.01190185546875, -0.01788330078125, 0.04083251953125, 0.01284027099609375, 0.058685302734375, -0.03594970703125, -0.0126190185546875, -0.01678466796875, 0.00720977783203125, -0.025634765625, -0.0648193359375, 0.0377197265625, 0.00762939453125, -0.00627899169921875, -0.02520751953125, 0.075439453125, -0.01120758056640625, -0.046478271484375, 0.036468505859375, 0.0207366943359375, 0.025054931640625, 0.01154327392578125, -0.06781005859375, 0.0067901611328125, -0.00395965576171875, -0.0251617431640625, 0.02117919921875, 0.016937255859375, -0.001979827880859375, 0.045135498046875, 0.047943115234375, -0.0037975311279296875, -0.006801605224609375, 0.01111602783203125, 0.05596923828125, -0.0192718505859375, -0.0187835693359375, -0.054168701171875, 0.061004638671875, -0.00738525390625, -0.018585205078125, 0.06317138671875, 0.03143310546875, 0.07318115234375, -0.0035114288330078125, 0.054290771484375, -0.0260467529296875, -0.003887176513671875, -0.026397705078125, 0.07861328125, -0.0594482421875, 0.003635406494140625, -0.03717041015625, -0.0653076171875, -0.03167724609375, 0.08514404296875, -0.0032939910888671875, 0.012786865234375, 0.0312347412109375, 0.061370849609375, -0.004070281982421875, -0.00910186767578125, 0.034332275390625, 0.0285186767578125, 0.02227783203125, -0.0045318603515625, 0.06982421875, -0.044403076171875, 0.0118865966796875, -0.0543212890625, -0.02508544921875, -0.001148223876953125, -0.07025146484375, -0.06988525390625, -0.040924072265625, -0.052154541015625, -0.05938720703125, -0.0040435791015625, 0.019622802734375, 0.08013916015625, -0.046722412109375, -0.00583648681640625, -0.00794219970703125, 0.0185089111328125, -0.00933074951171875, -0.0228271484375, 0.002513885498046875, 0.028045654296875, -0.04705810546875, 0.0003540515899658203, 0.0122222900390625, 0.032562255859375, -0.05322265625, -0.0074462890625, -0.02191162109375, -0.00742340087890625, 0.050933837890625, 0.012054443359375, 
-0.043426513671875, -0.0166168212890625, -0.005390167236328125, 0.007537841796875, -0.00760650634765625, 0.01122283935546875, -0.034332275390625, 0.0352783203125, 0.03985595703125, 0.00012058019638061523, 0.0648193359375, 0.00669097900390625, 0.00878143310546875, -0.040924072265625, 0.0195465087890625, 0.0036640167236328125, 0.0307159423828125, 0.020111083984375, -0.0543212890625, 0.048187255859375, 0.034332275390625, -0.044830322265625, -0.052642822265625, 0.015289306640625, -0.07403564453125, -0.0226593017578125, 0.1002197265625, -0.0174560546875, -0.0199127197265625, -0.00623321533203125, -0.005458831787109375, 0.014617919921875, -0.026763916015625, 0.05731201171875, 0.04052734375, -0.002979278564453125, -0.0278472900390625, -0.058990478515625, 0.029632568359375, -0.0013141632080078125, -0.04705810546875, -0.0240478515625, 0.046478271484375, 0.05877685546875, 0.026458740234375, 0.06439208984375, -0.0312347412109375, 0.018218994140625, -0.0018167495727539062, 0.004428863525390625, 0.00733184814453125, -0.021759033203125, -0.0362548828125, 0.01264190673828125, -0.0174560546875, 0.005859375 ] ]
compressed-llm/vicuna-13b-v1.3_gptq
2023-09-03T14:39:47.000Z
[ "transformers", "llama", "text-generation", "license:mit", "endpoints_compatible", "text-generation-inference", "region:us" ]
text-generation
compressed-llm
null
null
compressed-llm/vicuna-13b-v1.3_gptq
0
114,899
transformers
2023-09-02T05:21:23
---
license: mit
---

# Compressed LLM Model Zone

The models are prepared by [Visual Informatics Group @ University of Texas at Austin (VITA-group)](https://vita-group.github.io/). Credits to Ajay Jaiswal, Zhenyu Zhang, Zhangheng Li, Lu Yin, Shiwei Liu and Junyuan Hong.

License: [MIT License](https://opensource.org/license/mit/)

Setup environment
```shell
pip install torch==2.0.0+cu117 torchvision==0.15.1+cu117 torchaudio==2.0.1 --index-url https://download.pytorch.org/whl/cu117
pip install transformers==4.31.0
pip install accelerate
pip install auto-gptq  # for gptq
pip install sentencepiece
```

How to use pruned models
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

base_model = 'llama-2-7b'
comp_method = 'magnitude_unstructured'
comp_degree = 0.2
model_path = f'vita-group/{base_model}_{comp_method}'
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    revision=f's{comp_degree}',
    torch_dtype=torch.float16,
    low_cpu_mem_usage=True,
    device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained('meta-llama/Llama-2-7b-hf')
input_ids = tokenizer('Hello! I am a VITA-compressed-LLM chatbot!', return_tensors='pt').input_ids.cuda()
outputs = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(outputs[0]))
```

How to use wanda+gptq models
```python
from transformers import AutoTokenizer
from auto_gptq import AutoGPTQForCausalLM

model_path = 'vita-group/llama-2-7b_wanda_2_4_gptq_4bit_128g'
tokenizer_path = 'meta-llama/Llama-2-7b-hf'
model = AutoGPTQForCausalLM.from_quantized(
    model_path,
    # inject_fused_attention=False,  # or disable_exllama=True
    device_map='auto',
)
tokenizer = AutoTokenizer.from_pretrained(tokenizer_path, trust_remote_code=True)
input_ids = tokenizer('Hello! I am a VITA-compressed-LLM chatbot!', return_tensors='pt').input_ids.to('cuda')
outputs = model.generate(input_ids=input_ids, max_length=128)
print(tokenizer.decode(outputs[0]))
```

How to use gptq models
```python
from transformers import AutoTokenizer
from auto_gptq import AutoGPTQForCausalLM

# model_path = 'vita-group/llama-2-7b_wanda_2_4_gptq_4bit_128g'
# tokenizer_path = 'meta-llama/Llama-2-7b-hf'
model_path = 'vita-group/vicuna-7b-v1.3_gptq'
tokenizer_path = 'lmsys/vicuna-7b-v1.3'
model = AutoGPTQForCausalLM.from_quantized(
    model_path,
    # inject_fused_attention=False,  # or disable_exllama=True
    device_map='auto',
    revision='2bit_128g',
)
tokenizer = AutoTokenizer.from_pretrained(tokenizer_path, trust_remote_code=True)
input_ids = tokenizer('Hello! I am a VITA-compressed-LLM chatbot!', return_tensors='pt').input_ids.to('cuda')
outputs = model.generate(input_ids=input_ids, max_length=128)
print(tokenizer.decode(outputs[0]))
```

|    | Base Model   | Model Size | Compression Method | Compression Degree |
|---:|:-------------|:-----------|:-------------------|:-------------------|
| 0 | Llama-2 | 7b | [magnitude_unstructured](https://huggingface.co/vita-group/llama-2-7b_magnitude_unstructured) | [s0.1](https://huggingface.co/vita-group/llama-2-7b_magnitude_unstructured/tree/s0.1) |
| 1 | Llama-2 | 7b | [magnitude_unstructured](https://huggingface.co/vita-group/llama-2-7b_magnitude_unstructured) | [s0.2](https://huggingface.co/vita-group/llama-2-7b_magnitude_unstructured/tree/s0.2) |
| 2 | Llama-2 | 7b | [magnitude_unstructured](https://huggingface.co/vita-group/llama-2-7b_magnitude_unstructured) | [s0.3](https://huggingface.co/vita-group/llama-2-7b_magnitude_unstructured/tree/s0.3) |
| 3 | Llama-2 | 7b | [magnitude_unstructured](https://huggingface.co/vita-group/llama-2-7b_magnitude_unstructured) | [s0.5](https://huggingface.co/vita-group/llama-2-7b_magnitude_unstructured/tree/s0.5) |
| 4 | Llama-2 | 7b | [magnitude_unstructured](https://huggingface.co/vita-group/llama-2-7b_magnitude_unstructured) | [s0.6](https://huggingface.co/vita-group/llama-2-7b_magnitude_unstructured/tree/s0.6) |
| 5 | Llama-2 | 7b | [sparsegpt_unstructured](https://huggingface.co/vita-group/llama-2-7b_sparsegpt_unstructured) | [s0.1](https://huggingface.co/vita-group/llama-2-7b_sparsegpt_unstructured/tree/s0.1) |
| 6 | Llama-2 | 7b | [sparsegpt_unstructured](https://huggingface.co/vita-group/llama-2-7b_sparsegpt_unstructured) | [s0.2](https://huggingface.co/vita-group/llama-2-7b_sparsegpt_unstructured/tree/s0.2) |
| 7 | Llama-2 | 7b | [sparsegpt_unstructured](https://huggingface.co/vita-group/llama-2-7b_sparsegpt_unstructured) | [s0.3](https://huggingface.co/vita-group/llama-2-7b_sparsegpt_unstructured/tree/s0.3) |
| 8 | Llama-2 | 7b | [sparsegpt_unstructured](https://huggingface.co/vita-group/llama-2-7b_sparsegpt_unstructured) | [s0.5](https://huggingface.co/vita-group/llama-2-7b_sparsegpt_unstructured/tree/s0.5) |
| 9 | Llama-2 | 7b | [sparsegpt_unstructured](https://huggingface.co/vita-group/llama-2-7b_sparsegpt_unstructured) | [s0.6](https://huggingface.co/vita-group/llama-2-7b_sparsegpt_unstructured/tree/s0.6) |
| 10 | Llama-2 | 7b | [wanda_gptq](https://huggingface.co/vita-group/llama-2-7b_wanda_2_4_gptq_4bit_128g) | 4bit_128g |
| 11 | Llama-2 | 7b | [wanda_unstructured](https://huggingface.co/vita-group/llama-2-7b_wanda_unstructured) | [s0.1](https://huggingface.co/vita-group/llama-2-7b_wanda_unstructured/tree/s0.1) |
| 12 | Llama-2 | 7b | [wanda_unstructured](https://huggingface.co/vita-group/llama-2-7b_wanda_unstructured) | [s0.2](https://huggingface.co/vita-group/llama-2-7b_wanda_unstructured/tree/s0.2) |
| 13 | Llama-2 | 7b | [wanda_unstructured](https://huggingface.co/vita-group/llama-2-7b_wanda_unstructured) | [s0.3](https://huggingface.co/vita-group/llama-2-7b_wanda_unstructured/tree/s0.3) |
| 14 | Llama-2 | 7b | [wanda_unstructured](https://huggingface.co/vita-group/llama-2-7b_wanda_unstructured) | [s0.5](https://huggingface.co/vita-group/llama-2-7b_wanda_unstructured/tree/s0.5) |
| 15 | Llama-2 | 7b | [wanda_unstructured](https://huggingface.co/vita-group/llama-2-7b_wanda_unstructured) | [s0.6](https://huggingface.co/vita-group/llama-2-7b_wanda_unstructured/tree/s0.6) |
| 16 | vicuna-v1.3 | 13b | [gptq](https://huggingface.co/vita-group/vicuna-13b-v1.3_gptq) | [10bit_128g](https://huggingface.co/vita-group/vicuna-13b-v1.3_gptq/tree/10bit_128g) |
| 17 | vicuna-v1.3 | 13b | [gptq](https://huggingface.co/vita-group/vicuna-13b-v1.3_gptq) | [12bit_128g](https://huggingface.co/vita-group/vicuna-13b-v1.3_gptq/tree/12bit_128g) |
| 18 | vicuna-v1.3 | 13b | [gptq](https://huggingface.co/vita-group/vicuna-13b-v1.3_gptq) | [14bit_128g](https://huggingface.co/vita-group/vicuna-13b-v1.3_gptq/tree/14bit_128g) |
| 19 | vicuna-v1.3 | 13b | [gptq](https://huggingface.co/vita-group/vicuna-13b-v1.3_gptq) | [2bit_128g](https://huggingface.co/vita-group/vicuna-13b-v1.3_gptq/tree/2bit_128g) |
| 20 | vicuna-v1.3 | 13b | [gptq](https://huggingface.co/vita-group/vicuna-13b-v1.3_gptq) | [3bit_128g](https://huggingface.co/vita-group/vicuna-13b-v1.3_gptq/tree/3bit_128g) |
| 21 | vicuna-v1.3 | 13b | [gptq](https://huggingface.co/vita-group/vicuna-13b-v1.3_gptq) | [4bit_128g](https://huggingface.co/vita-group/vicuna-13b-v1.3_gptq/tree/4bit_128g) |
| 22 | vicuna-v1.3 | 13b | [gptq](https://huggingface.co/vita-group/vicuna-13b-v1.3_gptq) | [6bit_128g](https://huggingface.co/vita-group/vicuna-13b-v1.3_gptq/tree/6bit_128g) |
| 23 | vicuna-v1.3 | 13b | [gptq](https://huggingface.co/vita-group/vicuna-13b-v1.3_gptq) | [8bit_128g](https://huggingface.co/vita-group/vicuna-13b-v1.3_gptq/tree/8bit_128g) |
| 24 | vicuna-v1.3 | 7b | [gptq](https://huggingface.co/vita-group/vicuna-7b-v1.3_gptq) | [10bit_128g](https://huggingface.co/vita-group/vicuna-7b-v1.3_gptq/tree/10bit_128g) |
| 25 | vicuna-v1.3 | 7b | [gptq](https://huggingface.co/vita-group/vicuna-7b-v1.3_gptq) | [12bit_128g](https://huggingface.co/vita-group/vicuna-7b-v1.3_gptq/tree/12bit_128g) |
| 26 | vicuna-v1.3 | 7b | [gptq](https://huggingface.co/vita-group/vicuna-7b-v1.3_gptq) | [14bit_128g](https://huggingface.co/vita-group/vicuna-7b-v1.3_gptq/tree/14bit_128g) |
| 27 | vicuna-v1.3 | 7b | [gptq](https://huggingface.co/vita-group/vicuna-7b-v1.3_gptq) | [2bit_128g](https://huggingface.co/vita-group/vicuna-7b-v1.3_gptq/tree/2bit_128g) |
| 28 | vicuna-v1.3 | 7b | [gptq](https://huggingface.co/vita-group/vicuna-7b-v1.3_gptq) | [3bit_128g](https://huggingface.co/vita-group/vicuna-7b-v1.3_gptq/tree/3bit_128g) |
| 29 | vicuna-v1.3 | 7b | [gptq](https://huggingface.co/vita-group/vicuna-7b-v1.3_gptq) | [4bit_128g](https://huggingface.co/vita-group/vicuna-7b-v1.3_gptq/tree/4bit_128g) |
| 30 | vicuna-v1.3 | 7b | [gptq](https://huggingface.co/vita-group/vicuna-7b-v1.3_gptq) | [6bit_128g](https://huggingface.co/vita-group/vicuna-7b-v1.3_gptq/tree/6bit_128g) |
| 31 | vicuna-v1.3 | 7b | [gptq](https://huggingface.co/vita-group/vicuna-7b-v1.3_gptq) | [8bit_128g](https://huggingface.co/vita-group/vicuna-7b-v1.3_gptq/tree/8bit_128g) |
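The table rows map mechanically onto Hub repository names and branch revisions, mirroring the two loading paths shown earlier (pruned checkpoints via `AutoModelForCausalLM` with sparsity revisions `s<degree>`, GPTQ checkpoints via `AutoGPTQForCausalLM` with bit-width revisions). A small illustrative helper sketching that mapping — `load_spec` is a hypothetical name, not part of the released tooling, and row 10 (`wanda_gptq`) uses a one-off repo name that this simple pattern does not cover:

```python
# Sketch: turn a table row into the repo id, revision, and loader class name.
# Patterns are taken from the table above; the helper itself is illustrative.

def load_spec(base_model, comp_method, degree):
    """Return the Hub location and loader for a regular row of the table."""
    repo_id = f'vita-group/{base_model}_{comp_method}'
    # GPTQ checkpoints are loaded with auto_gptq; pruned ones with transformers.
    loader = 'AutoGPTQForCausalLM' if 'gptq' in comp_method else 'AutoModelForCausalLM'
    return {'repo_id': repo_id, 'revision': degree, 'loader': loader}

# Row 1: Llama-2 7b, magnitude_unstructured, s0.2
print(load_spec('llama-2-7b', 'magnitude_unstructured', 's0.2'))
# Row 21: vicuna-13b-v1.3, gptq, 4bit_128g
print(load_spec('vicuna-13b-v1.3', 'gptq', '4bit_128g'))
```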
10,376
[ [ -0.0250701904296875, -0.035369873046875, 0.0101776123046875, 0.039093017578125, -0.04156494140625, 0.0099639892578125, 0.018585205078125, -0.0264129638671875, 0.050048828125, 0.0024471282958984375, -0.046783447265625, -0.042205810546875, -0.05548095703125, 0.031646728515625, -0.01143646240234375, 0.061431884765625, 0.01160430908203125, -0.01251220703125, 0.020111083984375, -0.0154571533203125, -0.02008056640625, -0.035186767578125, -0.03045654296875, -0.003711700439453125, 0.0214691162109375, 0.0108642578125, 0.052032470703125, 0.046783447265625, 0.01515960693359375, 0.0286712646484375, -0.0293121337890625, 0.018310546875, -0.0286712646484375, -0.016021728515625, 0.0235137939453125, -0.0364990234375, -0.060333251953125, -0.01580810546875, 0.03765869140625, 0.028045654296875, -0.0156097412109375, 0.030487060546875, 0.01525115966796875, 0.0303497314453125, -0.00495147705078125, 0.022064208984375, -0.04974365234375, 0.01251220703125, -0.017974853515625, -0.02593994140625, -0.0007753372192382812, -0.03497314453125, -0.0137481689453125, -0.054962158203125, 0.01110076904296875, -0.0033111572265625, 0.10845947265625, 0.0272674560546875, -0.0235748291015625, 0.0001327991485595703, 0.0012340545654296875, 0.03369140625, -0.0626220703125, -0.0101318359375, 0.032257080078125, -0.004978179931640625, -0.0343017578125, -0.037261962890625, -0.04132080078125, 0.0146484375, -0.0254364013671875, 0.0228118896484375, -0.02850341796875, -0.0124053955078125, 0.0258331298828125, 0.04071044921875, -0.0260467529296875, -0.00153350830078125, -0.032745361328125, -0.0061798095703125, 0.049224853515625, 0.01081085205078125, 0.033599853515625, -0.046966552734375, -0.03192138671875, 0.00013816356658935547, -0.046112060546875, 0.0193023681640625, 0.0173187255859375, 0.00931549072265625, -0.026885986328125, 0.055267333984375, -0.0157928466796875, 0.03387451171875, 0.01739501953125, -0.038482666015625, 0.03302001953125, -0.045806884765625, -0.03704833984375, -0.0183868408203125, 0.0833740234375, 
0.04541015625, -0.0004477500915527344, 0.0166473388671875, -0.005405426025390625, 0.0027217864990234375, -0.02117919921875, -0.0592041015625, -0.0206298828125, 0.020751953125, -0.0482177734375, -0.0308074951171875, -0.01155853271484375, -0.062225341796875, -0.007110595703125, 0.0105133056640625, 0.0020236968994140625, -0.027496337890625, -0.032958984375, 0.0084686279296875, -0.0004532337188720703, 0.02679443359375, 0.02587890625, -0.043212890625, 0.0194091796875, 0.019439697265625, 0.0693359375, 0.00966644287109375, -0.0057373046875, -0.00782012939453125, 0.014312744140625, -0.0219879150390625, 0.033203125, -0.01435089111328125, -0.041778564453125, -0.0330810546875, 0.0352783203125, -0.0226593017578125, -0.018646240234375, 0.060638427734375, 0.0018205642700195312, 0.0168914794921875, -0.0256500244140625, -0.0207061767578125, -0.0338134765625, 0.020660400390625, -0.04400634765625, 0.09124755859375, 0.037506103515625, -0.0836181640625, 0.0234832763671875, -0.0232391357421875, 0.006107330322265625, -0.01300811767578125, -0.00435638427734375, -0.04931640625, -0.01041412353515625, 0.03131103515625, 0.038055419921875, -0.03582763671875, 0.0144500732421875, -0.01268768310546875, -0.01395416259765625, 0.003711700439453125, -0.018096923828125, 0.10247802734375, 0.012420654296875, -0.026885986328125, 0.0110931396484375, -0.048828125, 0.007091522216796875, 0.04949951171875, -0.0214080810546875, 0.0093841552734375, -0.0202789306640625, -0.00710296630859375, 0.0173492431640625, 0.0270233154296875, -0.0313720703125, 0.05548095703125, -0.0160675048828125, 0.044647216796875, 0.051422119140625, 0.01317596435546875, -0.0017042160034179688, -0.0361328125, 0.04345703125, 0.01511383056640625, 0.0357666015625, 0.0092315673828125, -0.06170654296875, -0.06243896484375, -0.038299560546875, 0.016632080078125, 0.04107666015625, -0.034942626953125, 0.0709228515625, -0.0253753662109375, -0.049163818359375, -0.05548095703125, 0.0157318115234375, 0.0177154541015625, 0.0296783447265625, 
0.01654052734375, -0.045135498046875, -0.03997802734375, -0.05487060546875, 0.026611328125, -0.013916015625, 0.0083160400390625, 0.041595458984375, 0.051666259765625, -0.0225677490234375, 0.0667724609375, -0.0408935546875, -0.03271484375, -0.006744384765625, -0.03045654296875, 0.059661865234375, 0.032989501953125, 0.06695556640625, -0.034454345703125, -0.03790283203125, -0.0023136138916015625, -0.0574951171875, -0.00689697265625, -0.001651763916015625, -0.0076751708984375, 0.024658203125, 0.0018768310546875, -0.070068359375, 0.057647705078125, 0.038543701171875, -0.043853759765625, 0.032440185546875, -0.0183563232421875, 0.018524169921875, -0.0880126953125, 0.004337310791015625, -0.028350830078125, -0.0192108154296875, -0.01491546630859375, -0.0061187744140625, -0.01210784912109375, 0.024627685546875, -0.03997802734375, 0.026458740234375, -0.04840087890625, -0.0159454345703125, -0.00376129150390625, 0.002361297607421875, 0.0083465576171875, 0.0621337890625, -0.01378631591796875, 0.051361083984375, 0.0570068359375, -0.0162506103515625, 0.0279541015625, 0.0282745361328125, -0.0204620361328125, 0.01934814453125, -0.059906005859375, 0.006771087646484375, -0.00910186767578125, 0.050872802734375, -0.06787109375, -0.0187835693359375, 0.051849365234375, -0.045166015625, 0.00567626953125, -0.0295257568359375, -0.043609619140625, -0.03790283203125, -0.04156494140625, 0.036468505859375, 0.0321044921875, -0.049346923828125, 0.022705078125, 0.032745361328125, 0.00963592529296875, -0.038604736328125, -0.032196044921875, -0.004772186279296875, -0.042510986328125, -0.043792724609375, 0.0279693603515625, -0.006195068359375, 0.0049591064453125, 0.0014638900756835938, -0.0024318695068359375, 0.0119781494140625, 0.00577545166015625, 0.04718017578125, 0.0259552001953125, -0.01143646240234375, -0.027496337890625, -0.00989532470703125, 0.00896453857421875, -0.0109405517578125, -0.003284454345703125, 0.06427001953125, -0.032135009765625, -0.01739501953125, -0.04541015625, 
0.008270263671875, 0.0367431640625, -0.00038552284240722656, 0.0633544921875, 0.04571533203125, -0.02655029296875, 0.02459716796875, -0.02972412109375, 0.0013942718505859375, -0.029296875, -0.009002685546875, -0.051544189453125, -0.06201171875, 0.045867919921875, 0.012115478515625, 0.0021457672119140625, 0.051666259765625, 0.04840087890625, 0.0174560546875, 0.07342529296875, 0.033599853515625, -0.03302001953125, 0.0286712646484375, -0.0616455078125, 0.00372314453125, -0.06597900390625, -0.046234130859375, -0.02410888671875, -0.01190948486328125, -0.0445556640625, -0.056549072265625, 0.02374267578125, 0.04522705078125, -0.02398681640625, 0.049041748046875, -0.07525634765625, 0.0203857421875, 0.0301971435546875, 0.031524658203125, 0.01959228515625, -0.00341033935546875, -0.0146636962890625, 0.01081085205078125, -0.052032470703125, -0.03936767578125, 0.06756591796875, 0.03936767578125, 0.01739501953125, 0.016876220703125, 0.04119873046875, -0.00331878662109375, 0.02972412109375, -0.04296875, 0.054656982421875, 0.0232086181640625, -0.033203125, -0.0160675048828125, -0.01438140869140625, -0.08782958984375, 0.032440185546875, -0.0048980712890625, -0.080810546875, 0.01555633544921875, -0.0037784576416015625, -0.0223388671875, 0.03155517578125, -0.041259765625, 0.04205322265625, -0.020477294921875, -0.021636962890625, -0.0020771026611328125, -0.055908203125, 0.0465087890625, 0.007965087890625, 0.004177093505859375, -0.036346435546875, -0.01415252685546875, 0.056854248046875, -0.060089111328125, 0.0638427734375, -0.01180267333984375, -0.001781463623046875, 0.05291748046875, 0.005100250244140625, 0.040557861328125, 0.0238189697265625, -0.0072479248046875, 0.01328277587890625, 0.005329132080078125, -0.036834716796875, -0.01226043701171875, 0.042572021484375, -0.07464599609375, -0.0662841796875, -0.056121826171875, -0.01280975341796875, 0.007442474365234375, 0.005859375, 0.033935546875, -0.00803375244140625, 0.0026645660400390625, 0.0020236968994140625, 0.0175018310546875, 
-0.0207672119140625, 0.038360595703125, 0.006244659423828125, -0.0200042724609375, -0.0518798828125, 0.05487060546875, 0.0006971359252929688, 0.00005537271499633789, 0.0086669921875, 0.005893707275390625, -0.01690673828125, -0.035247802734375, -0.02410888671875, 0.0380859375, -0.031829833984375, -0.046051025390625, -0.05438232421875, -0.0296173095703125, -0.0296478271484375, 0.01206207275390625, -0.0110626220703125, -0.04351806640625, -0.053131103515625, -0.0263519287109375, 0.06475830078125, 0.0447998046875, 0.0019550323486328125, 0.0322265625, -0.058624267578125, 0.0172882080078125, 0.00959014892578125, 0.006587982177734375, -0.0037994384765625, -0.036712646484375, 0.0213470458984375, 0.0017824172973632812, -0.034454345703125, -0.099365234375, 0.08184814453125, 0.008575439453125, 0.02166748046875, 0.0215301513671875, -0.0168914794921875, 0.07525634765625, -0.00620269775390625, 0.05987548828125, 0.02850341796875, -0.06195068359375, 0.03765869140625, -0.0305023193359375, 0.0014410018920898438, 0.0139923095703125, 0.0271759033203125, -0.01898193359375, -0.0036945343017578125, -0.0635986328125, -0.06927490234375, 0.044952392578125, 0.04571533203125, -0.012115478515625, 0.005367279052734375, 0.0176849365234375, -0.0028209686279296875, 0.00881195068359375, -0.07940673828125, -0.03076171875, -0.028533935546875, -0.007091522216796875, -0.0026950836181640625, -0.01947021484375, -0.0167694091796875, -0.051849365234375, 0.045318603515625, -0.007720947265625, 0.026763916015625, 0.0285186767578125, 0.00820159912109375, -0.018890380859375, -0.0157928466796875, 0.03143310546875, 0.016326904296875, -0.0167388916015625, -0.0199737548828125, 0.024810791015625, -0.0244598388671875, 0.0157318115234375, -0.0034999847412109375, -0.015869140625, -0.0005249977111816406, 0.01238250732421875, 0.0484619140625, 0.0026454925537109375, -0.0240020751953125, 0.02972412109375, -0.0149688720703125, -0.0164642333984375, -0.0244293212890625, -0.001262664794921875, 0.0213165283203125, 
0.01462554931640625, 0.0193023681640625, -0.006771087646484375, -0.002532958984375, -0.0474853515625, 0.0066375732421875, 0.040679931640625, 0.0013971328735351562, -0.0203857421875, 0.05828857421875, -0.001983642578125, -0.01192474365234375, 0.0360107421875, -0.01389312744140625, -0.057525634765625, 0.0645751953125, 0.032196044921875, 0.04400634765625, -0.0214691162109375, 0.0167388916015625, 0.055816650390625, 0.0238189697265625, 0.01020050048828125, 0.0173187255859375, -0.005950927734375, -0.0223541259765625, -0.0012025833129882812, -0.06317138671875, 0.0021991729736328125, 0.018829345703125, -0.0197296142578125, 0.022064208984375, -0.03607177734375, -0.0318603515625, -0.0192108154296875, 0.0386962890625, -0.045562744140625, 0.01317596435546875, -0.01200103759765625, 0.070068359375, -0.05999755859375, 0.0804443359375, 0.021484375, -0.0292510986328125, -0.0748291015625, -0.022186279296875, 0.006679534912109375, -0.069091796875, 0.021240234375, 0.004657745361328125, 0.0168914794921875, -0.02001953125, -0.046783447265625, -0.08990478515625, 0.12548828125, 0.024810791015625, -0.03582763671875, 0.006664276123046875, -0.006999969482421875, 0.0360107421875, -0.0162506103515625, 0.054718017578125, 0.04827880859375, 0.025390625, 0.031829833984375, -0.07550048828125, 0.0284423828125, -0.042236328125, 0.00017452239990234375, 0.00412750244140625, -0.098876953125, 0.0703125, -0.0030651092529296875, -0.025634765625, 0.020843505859375, 0.056365966796875, 0.045928955078125, -0.0168914794921875, 0.038818359375, 0.05877685546875, 0.053009033203125, -0.01508331298828125, 0.064697265625, -0.0190582275390625, 0.035064697265625, 0.041473388671875, 0.0096435546875, 0.034210205078125, 0.0258331298828125, -0.038055419921875, 0.017242431640625, 0.09539794921875, -0.013214111328125, 0.018951416015625, 0.0248565673828125, -0.0303955078125, -0.00225830078125, -0.0245208740234375, -0.05230712890625, 0.00902557373046875, 0.04034423828125, -0.0321044921875, 0.002239227294921875, 
-0.03338623046875, 0.02667236328125, -0.020233154296875, -0.0181884765625, 0.0308074951171875, 0.0141754150390625, -0.047271728515625, 0.06036376953125, -0.00933837890625, 0.069091796875, -0.0154571533203125, -0.0031986236572265625, -0.024627685546875, -0.0009112358093261719, -0.042877197265625, -0.061309814453125, 0.01497650146484375, -0.0038394927978515625, 0.009552001953125, -0.0012178421020507812, 0.041015625, -0.0152130126953125, -0.041168212890625, 0.02423095703125, 0.02862548828125, 0.040130615234375, 0.0176239013671875, -0.0673828125, 0.01849365234375, 0.0262298583984375, -0.0440673828125, 0.02972412109375, 0.0347900390625, -0.00574493408203125, 0.05877685546875, 0.047210693359375, 0.00890350341796875, 0.021759033203125, -0.0109710693359375, 0.0787353515625, -0.043731689453125, 0.00726318359375, -0.06646728515625, 0.035186767578125, -0.01097869873046875, -0.0311279296875, 0.06646728515625, 0.05255126953125, 0.028533935546875, 0.003009796142578125, 0.030853271484375, -0.0213165283203125, 0.0208740234375, -0.022674560546875, 0.040740966796875, -0.041259765625, 0.00939178466796875, -0.0243682861328125, -0.050872802734375, -0.0223236083984375, 0.055694580078125, 0.0004668235778808594, -0.00634765625, 0.0285797119140625, 0.058258056640625, 0.005115509033203125, -0.011474609375, -0.00574493408203125, 0.022552490234375, 0.0257568359375, 0.079833984375, 0.0389404296875, -0.04608154296875, 0.03619384765625, -0.0316162109375, -0.0144195556640625, -0.020721435546875, -0.06640625, -0.06573486328125, -0.043060302734375, -0.027374267578125, -0.02984619140625, -0.0152740478515625, 0.0732421875, 0.05029296875, -0.041473388671875, -0.0293121337890625, -0.01226043701171875, 0.022705078125, -0.002288818359375, -0.0152740478515625, 0.0531005859375, 0.0259552001953125, -0.058868408203125, 0.020172119140625, 0.01375579833984375, 0.0408935546875, 0.01091766357421875, -0.0167388916015625, -0.0239105224609375, -0.008636474609375, 0.0390625, 0.037261962890625, -0.06109619140625, 
-0.00894927978515625, 0.0018320083618164062, -0.0095977783203125, 0.029876708984375, 0.01145172119140625, -0.055511474609375, -0.0018968582153320312, 0.039825439453125, 0.00010591745376586914, 0.057159423828125, 0.021026611328125, 0.00965118408203125, -0.032196044921875, 0.027191162109375, 0.01139068603515625, 0.050323486328125, 0.01788330078125, -0.02764892578125, 0.0701904296875, 0.03033447265625, -0.0369873046875, -0.0704345703125, -0.004032135009765625, -0.10931396484375, -0.0126495361328125, 0.07940673828125, -0.021026611328125, -0.0513916015625, 0.00884246826171875, -0.05010986328125, 0.03271484375, -0.03253173828125, 0.0384521484375, 0.0258331298828125, -0.021942138671875, -0.0244598388671875, -0.019439697265625, 0.04071044921875, 0.032867431640625, -0.06707763671875, -0.01253509521484375, 0.0157623291015625, 0.030853271484375, 0.024139404296875, 0.068603515625, -0.0160675048828125, -0.0037288665771484375, -0.008758544921875, 0.002086639404296875, 0.00734710693359375, 0.0094146728515625, -0.01291656494140625, 0.0025501251220703125, -0.00563812255859375, -0.032562255859375 ] ]
cointegrated/rubert-tiny-toxicity
2023-03-17T10:23:09.000Z
[ "transformers", "pytorch", "safetensors", "bert", "text-classification", "russian", "classification", "toxicity", "multilabel", "ru", "arxiv:2103.05345", "endpoints_compatible", "has_space", "region:us" ]
text-classification
cointegrated
null
null
cointegrated/rubert-tiny-toxicity
23
114,752
transformers
2022-03-02T23:29:05
---
language: ["ru"]
tags:
- russian
- classification
- toxicity
- multilabel
widget:
- text: "Иди ты нафиг!"
---

This is the [cointegrated/rubert-tiny](https://huggingface.co/cointegrated/rubert-tiny) model fine-tuned for classification of toxicity and inappropriateness of short informal Russian texts, such as comments on social networks.

The problem is formulated as multilabel classification with the following classes:
- `non-toxic`: the text does NOT contain insults, obscenities, and threats, in the sense of the [OK ML Cup](https://cups.mail.ru/ru/tasks/1048) competition.
- `insult`
- `obscenity`
- `threat`
- `dangerous`: the text is inappropriate, in the sense of [Babakov et al.](https://arxiv.org/abs/2103.05345), i.e. it can harm the reputation of the speaker.

A text can be considered safe if it is BOTH `non-toxic` and NOT `dangerous`.

## Usage

The function below estimates the probability that the text is either toxic OR dangerous:

```python
# !pip install transformers sentencepiece --quiet
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_checkpoint = 'cointegrated/rubert-tiny-toxicity'
tokenizer = AutoTokenizer.from_pretrained(model_checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(model_checkpoint)
if torch.cuda.is_available():
    model.cuda()

def text2toxicity(text, aggregate=True):
    """Calculate toxicity of a text (if aggregate=True) or a vector of toxicity aspects (if aggregate=False)."""
    with torch.no_grad():
        inputs = tokenizer(text, return_tensors='pt', truncation=True, padding=True).to(model.device)
        proba = torch.sigmoid(model(**inputs).logits).cpu().numpy()
    if isinstance(text, str):
        proba = proba[0]
    if aggregate:
        return 1 - proba.T[0] * (1 - proba.T[-1])
    return proba

print(text2toxicity('я люблю нигеров', True))
# 0.9350118728093193

print(text2toxicity('я люблю нигеров', False))
# [0.9715758  0.0180863  0.0045551  0.00189755 0.9331106 ]

print(text2toxicity(['я люблю нигеров', 'я люблю африканцев'], True))
# [0.93501186 0.04156357]

print(text2toxicity(['я люблю нигеров', 'я люблю африканцев'], False))
# [[9.7157580e-01 1.8086294e-02 4.5550885e-03 1.8975559e-03 9.3311059e-01]
#  [9.9979788e-01 1.9048342e-04 1.5297388e-04 1.7452303e-04 4.1369814e-02]]
```

## Training

The model has been trained on the joint dataset of [OK ML Cup](https://cups.mail.ru/ru/tasks/1048) and [Babakov et al.](https://arxiv.org/abs/2103.05345) with the `Adam` optimizer, a learning rate of `1e-5`, and a batch size of `64` for `15` epochs. A text was considered inappropriate if its inappropriateness score was higher than 0.8, and appropriate if it was lower than 0.2. The per-label ROC AUC on the dev set is:

```
non-toxic : 0.9937
insult    : 0.9912
obscenity : 0.9881
threat    : 0.9910
dangerous : 0.8295
```
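The aggregate score returned by `text2toxicity` uses only the first (`non-toxic`) and last (`dangerous`) labels: treating the two aspects as independent, it is the probability that the text is either toxic or dangerous. A minimal pure-Python restatement of that formula (the function name is illustrative; the per-label probabilities below are copied from the example output above):

```python
# Restates the aggregation used in text2toxicity:
#   P(toxic or dangerous) = 1 - P(non-toxic) * P(not dangerous)
# Probability vectors follow the label order
# [non-toxic, insult, obscenity, threat, dangerous].

def aggregate_toxicity(proba):
    p_non_toxic = proba[0]
    p_dangerous = proba[-1]
    return 1 - p_non_toxic * (1 - p_dangerous)

# Per-label probabilities from the example above for 'я люблю нигеров':
proba = [0.9715758, 0.0180863, 0.0045551, 0.00189755, 0.9331106]
print(round(aggregate_toxicity(proba), 4))  # prints 0.935
```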
2,880
[embedding vector omitted]
intfloat/multilingual-e5-large
2023-09-23T13:04:50.000Z
[ "sentence-transformers", "pytorch", "onnx", "safetensors", "xlm-roberta", "mteb", "Sentence Transformers", "sentence-similarity", "feature-extraction", "multilingual", "af", "am", "ar", "as", "az", "be", "bg", "bn", "br", "bs", "ca", "cs", "cy", "da", "de", "el", "en", "eo", "es", "et", "eu", "fa", "fi", "fr", "fy", "ga", "gd", "gl", "gu", "ha", "he", "hi", "hr", "hu", "hy", "id", "is", "it", "ja", "jv", "ka", "kk", "km", "kn", "ko", "ku", "ky", "la", "lo", "lt", "lv", "mg", "mk", "ml", "mn", "mr", "ms", "my", "ne", "nl", "no", "om", "or", "pa", "pl", "ps", "pt", "ro", "ru", "sa", "sd", "si", "sk", "sl", "so", "sq", "sr", "su", "sv", "sw", "ta", "te", "th", "tl", "tr", "ug", "uk", "ur", "uz", "vi", "xh", "yi", "zh", "arxiv:2212.03533", "arxiv:2108.08787", "arxiv:2104.08663", "arxiv:2210.07316", "license:mit", "model-index", "endpoints_compatible", "has_space", "region:us" ]
feature-extraction
intfloat
null
null
intfloat/multilingual-e5-large
252
114,463
sentence-transformers
2023-06-30T07:38:19
--- tags: - mteb - Sentence Transformers - sentence-similarity - feature-extraction - sentence-transformers model-index: - name: multilingual-e5-large results: - task: type: Classification dataset: type: mteb/amazon_counterfactual name: MTEB AmazonCounterfactualClassification (en) config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 79.05970149253731 - type: ap value: 43.486574390835635 - type: f1 value: 73.32700092140148 - task: type: Classification dataset: type: mteb/amazon_counterfactual name: MTEB AmazonCounterfactualClassification (de) config: de split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 71.22055674518201 - type: ap value: 81.55756710830498 - type: f1 value: 69.28271787752661 - task: type: Classification dataset: type: mteb/amazon_counterfactual name: MTEB AmazonCounterfactualClassification (en-ext) config: en-ext split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 80.41979010494754 - type: ap value: 29.34879922376344 - type: f1 value: 67.62475449011278 - task: type: Classification dataset: type: mteb/amazon_counterfactual name: MTEB AmazonCounterfactualClassification (ja) config: ja split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 77.8372591006424 - type: ap value: 26.557560591210738 - type: f1 value: 64.96619417368707 - task: type: Classification dataset: type: mteb/amazon_polarity name: MTEB AmazonPolarityClassification config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 93.489875 - type: ap value: 90.98758636917603 - type: f1 value: 93.48554819717332 - task: type: Classification dataset: type: mteb/amazon_reviews_multi name: MTEB AmazonReviewsClassification (en) config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 47.564 - type: f1 value: 
46.75122173518047 - task: type: Classification dataset: type: mteb/amazon_reviews_multi name: MTEB AmazonReviewsClassification (de) config: de split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 45.400000000000006 - type: f1 value: 44.17195682400632 - task: type: Classification dataset: type: mteb/amazon_reviews_multi name: MTEB AmazonReviewsClassification (es) config: es split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 43.068 - type: f1 value: 42.38155696855596 - task: type: Classification dataset: type: mteb/amazon_reviews_multi name: MTEB AmazonReviewsClassification (fr) config: fr split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 41.89 - type: f1 value: 40.84407321682663 - task: type: Classification dataset: type: mteb/amazon_reviews_multi name: MTEB AmazonReviewsClassification (ja) config: ja split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 40.120000000000005 - type: f1 value: 39.522976223819114 - task: type: Classification dataset: type: mteb/amazon_reviews_multi name: MTEB AmazonReviewsClassification (zh) config: zh split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 38.832 - type: f1 value: 38.0392533394713 - task: type: Retrieval dataset: type: arguana name: MTEB ArguAna config: default split: test revision: None metrics: - type: map_at_1 value: 30.725 - type: map_at_10 value: 46.055 - type: map_at_100 value: 46.900999999999996 - type: map_at_1000 value: 46.911 - type: map_at_3 value: 41.548 - type: map_at_5 value: 44.297 - type: mrr_at_1 value: 31.152 - type: mrr_at_10 value: 46.231 - type: mrr_at_100 value: 47.07 - type: mrr_at_1000 value: 47.08 - type: mrr_at_3 value: 41.738 - type: mrr_at_5 value: 44.468999999999994 - type: ndcg_at_1 value: 30.725 - type: ndcg_at_10 value: 54.379999999999995 - type: ndcg_at_100 value: 58.138 - 
type: ndcg_at_1000 value: 58.389 - type: ndcg_at_3 value: 45.156 - type: ndcg_at_5 value: 50.123 - type: precision_at_1 value: 30.725 - type: precision_at_10 value: 8.087 - type: precision_at_100 value: 0.9769999999999999 - type: precision_at_1000 value: 0.1 - type: precision_at_3 value: 18.54 - type: precision_at_5 value: 13.542000000000002 - type: recall_at_1 value: 30.725 - type: recall_at_10 value: 80.868 - type: recall_at_100 value: 97.653 - type: recall_at_1000 value: 99.57300000000001 - type: recall_at_3 value: 55.619 - type: recall_at_5 value: 67.71000000000001 - task: type: Clustering dataset: type: mteb/arxiv-clustering-p2p name: MTEB ArxivClusteringP2P config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: v_measure value: 44.30960650674069 - task: type: Clustering dataset: type: mteb/arxiv-clustering-s2s name: MTEB ArxivClusteringS2S config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: v_measure value: 38.427074197498996 - task: type: Reranking dataset: type: mteb/askubuntudupquestions-reranking name: MTEB AskUbuntuDupQuestions config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: map value: 60.28270056031872 - type: mrr value: 74.38332673789738 - task: type: STS dataset: type: mteb/biosses-sts name: MTEB BIOSSES config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cos_sim_pearson value: 84.05942144105269 - type: cos_sim_spearman value: 82.51212105850809 - type: euclidean_pearson value: 81.95639829909122 - type: euclidean_spearman value: 82.3717564144213 - type: manhattan_pearson value: 81.79273425468256 - type: manhattan_spearman value: 82.20066817871039 - task: type: BitextMining dataset: type: mteb/bucc-bitext-mining name: MTEB BUCC (de-en) config: de-en split: test revision: d51519689f32196a32af33b075a01d0e7c51e252 metrics: - type: accuracy value: 99.46764091858039 - type: f1 value: 
99.37717466945023 - type: precision value: 99.33194154488518 - type: recall value: 99.46764091858039 - task: type: BitextMining dataset: type: mteb/bucc-bitext-mining name: MTEB BUCC (fr-en) config: fr-en split: test revision: d51519689f32196a32af33b075a01d0e7c51e252 metrics: - type: accuracy value: 98.29407880255337 - type: f1 value: 98.11248073959938 - type: precision value: 98.02443319392472 - type: recall value: 98.29407880255337 - task: type: BitextMining dataset: type: mteb/bucc-bitext-mining name: MTEB BUCC (ru-en) config: ru-en split: test revision: d51519689f32196a32af33b075a01d0e7c51e252 metrics: - type: accuracy value: 97.79009352268791 - type: f1 value: 97.5176076665512 - type: precision value: 97.38136473848286 - type: recall value: 97.79009352268791 - task: type: BitextMining dataset: type: mteb/bucc-bitext-mining name: MTEB BUCC (zh-en) config: zh-en split: test revision: d51519689f32196a32af33b075a01d0e7c51e252 metrics: - type: accuracy value: 99.26276987888363 - type: f1 value: 99.20133403545726 - type: precision value: 99.17500438827453 - type: recall value: 99.26276987888363 - task: type: Classification dataset: type: mteb/banking77 name: MTEB Banking77Classification config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 84.72727272727273 - type: f1 value: 84.67672206031433 - task: type: Clustering dataset: type: mteb/biorxiv-clustering-p2p name: MTEB BiorxivClusteringP2P config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - type: v_measure value: 35.34220182511161 - task: type: Clustering dataset: type: mteb/biorxiv-clustering-s2s name: MTEB BiorxivClusteringS2S config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: v_measure value: 33.4987096128766 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 
25.558249999999997 - type: map_at_10 value: 34.44425000000001 - type: map_at_100 value: 35.59833333333333 - type: map_at_1000 value: 35.706916666666665 - type: map_at_3 value: 31.691749999999995 - type: map_at_5 value: 33.252916666666664 - type: mrr_at_1 value: 30.252666666666666 - type: mrr_at_10 value: 38.60675 - type: mrr_at_100 value: 39.42666666666666 - type: mrr_at_1000 value: 39.48408333333334 - type: mrr_at_3 value: 36.17441666666665 - type: mrr_at_5 value: 37.56275 - type: ndcg_at_1 value: 30.252666666666666 - type: ndcg_at_10 value: 39.683 - type: ndcg_at_100 value: 44.68541666666667 - type: ndcg_at_1000 value: 46.94316666666668 - type: ndcg_at_3 value: 34.961749999999995 - type: ndcg_at_5 value: 37.215666666666664 - type: precision_at_1 value: 30.252666666666666 - type: precision_at_10 value: 6.904166666666667 - type: precision_at_100 value: 1.0989999999999995 - type: precision_at_1000 value: 0.14733333333333334 - type: precision_at_3 value: 16.037666666666667 - type: precision_at_5 value: 11.413583333333333 - type: recall_at_1 value: 25.558249999999997 - type: recall_at_10 value: 51.13341666666666 - type: recall_at_100 value: 73.08366666666667 - type: recall_at_1000 value: 88.79483333333334 - type: recall_at_3 value: 37.989083333333326 - type: recall_at_5 value: 43.787833333333325 - task: type: Retrieval dataset: type: climate-fever name: MTEB ClimateFEVER config: default split: test revision: None metrics: - type: map_at_1 value: 10.338 - type: map_at_10 value: 18.360000000000003 - type: map_at_100 value: 19.942 - type: map_at_1000 value: 20.134 - type: map_at_3 value: 15.174000000000001 - type: map_at_5 value: 16.830000000000002 - type: mrr_at_1 value: 23.257 - type: mrr_at_10 value: 33.768 - type: mrr_at_100 value: 34.707 - type: mrr_at_1000 value: 34.766000000000005 - type: mrr_at_3 value: 30.977 - type: mrr_at_5 value: 32.528 - type: ndcg_at_1 value: 23.257 - type: ndcg_at_10 value: 25.733 - type: ndcg_at_100 value: 32.288 - type: ndcg_at_1000 
value: 35.992000000000004 - type: ndcg_at_3 value: 20.866 - type: ndcg_at_5 value: 22.612 - type: precision_at_1 value: 23.257 - type: precision_at_10 value: 8.124 - type: precision_at_100 value: 1.518 - type: precision_at_1000 value: 0.219 - type: precision_at_3 value: 15.679000000000002 - type: precision_at_5 value: 12.117 - type: recall_at_1 value: 10.338 - type: recall_at_10 value: 31.154 - type: recall_at_100 value: 54.161 - type: recall_at_1000 value: 75.21900000000001 - type: recall_at_3 value: 19.427 - type: recall_at_5 value: 24.214 - task: type: Retrieval dataset: type: dbpedia-entity name: MTEB DBPedia config: default split: test revision: None metrics: - type: map_at_1 value: 8.498 - type: map_at_10 value: 19.103 - type: map_at_100 value: 27.375 - type: map_at_1000 value: 28.981 - type: map_at_3 value: 13.764999999999999 - type: map_at_5 value: 15.950000000000001 - type: mrr_at_1 value: 65.5 - type: mrr_at_10 value: 74.53800000000001 - type: mrr_at_100 value: 74.71799999999999 - type: mrr_at_1000 value: 74.725 - type: mrr_at_3 value: 72.792 - type: mrr_at_5 value: 73.554 - type: ndcg_at_1 value: 53.37499999999999 - type: ndcg_at_10 value: 41.286 - type: ndcg_at_100 value: 45.972 - type: ndcg_at_1000 value: 53.123 - type: ndcg_at_3 value: 46.172999999999995 - type: ndcg_at_5 value: 43.033 - type: precision_at_1 value: 65.5 - type: precision_at_10 value: 32.725 - type: precision_at_100 value: 10.683 - type: precision_at_1000 value: 1.978 - type: precision_at_3 value: 50 - type: precision_at_5 value: 41.349999999999994 - type: recall_at_1 value: 8.498 - type: recall_at_10 value: 25.070999999999998 - type: recall_at_100 value: 52.383 - type: recall_at_1000 value: 74.91499999999999 - type: recall_at_3 value: 15.207999999999998 - type: recall_at_5 value: 18.563 - task: type: Classification dataset: type: mteb/emotion name: MTEB EmotionClassification config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 
46.5 - type: f1 value: 41.93833713984145 - task: type: Retrieval dataset: type: fever name: MTEB FEVER config: default split: test revision: None metrics: - type: map_at_1 value: 67.914 - type: map_at_10 value: 78.10000000000001 - type: map_at_100 value: 78.333 - type: map_at_1000 value: 78.346 - type: map_at_3 value: 76.626 - type: map_at_5 value: 77.627 - type: mrr_at_1 value: 72.74199999999999 - type: mrr_at_10 value: 82.414 - type: mrr_at_100 value: 82.511 - type: mrr_at_1000 value: 82.513 - type: mrr_at_3 value: 81.231 - type: mrr_at_5 value: 82.065 - type: ndcg_at_1 value: 72.74199999999999 - type: ndcg_at_10 value: 82.806 - type: ndcg_at_100 value: 83.677 - type: ndcg_at_1000 value: 83.917 - type: ndcg_at_3 value: 80.305 - type: ndcg_at_5 value: 81.843 - type: precision_at_1 value: 72.74199999999999 - type: precision_at_10 value: 10.24 - type: precision_at_100 value: 1.089 - type: precision_at_1000 value: 0.11299999999999999 - type: precision_at_3 value: 31.268 - type: precision_at_5 value: 19.706000000000003 - type: recall_at_1 value: 67.914 - type: recall_at_10 value: 92.889 - type: recall_at_100 value: 96.42699999999999 - type: recall_at_1000 value: 97.92 - type: recall_at_3 value: 86.21 - type: recall_at_5 value: 90.036 - task: type: Retrieval dataset: type: fiqa name: MTEB FiQA2018 config: default split: test revision: None metrics: - type: map_at_1 value: 22.166 - type: map_at_10 value: 35.57 - type: map_at_100 value: 37.405 - type: map_at_1000 value: 37.564 - type: map_at_3 value: 30.379 - type: map_at_5 value: 33.324 - type: mrr_at_1 value: 43.519000000000005 - type: mrr_at_10 value: 51.556000000000004 - type: mrr_at_100 value: 52.344 - type: mrr_at_1000 value: 52.373999999999995 - type: mrr_at_3 value: 48.868 - type: mrr_at_5 value: 50.319 - type: ndcg_at_1 value: 43.519000000000005 - type: ndcg_at_10 value: 43.803 - type: ndcg_at_100 value: 50.468999999999994 - type: ndcg_at_1000 value: 53.111 - type: ndcg_at_3 value: 38.893 - type: ndcg_at_5 
      value: 40.653
    - type: precision_at_1
      value: 43.519000000000005
    - type: precision_at_10
      value: 12.253
    - type: precision_at_100
      value: 1.931
    - type: precision_at_1000
      value: 0.242
    - type: precision_at_3
      value: 25.617
    - type: precision_at_5
      value: 19.383
    - type: recall_at_1
      value: 22.166
    - type: recall_at_10
      value: 51.6
    - type: recall_at_100
      value: 76.574
    - type: recall_at_1000
      value: 92.192
    - type: recall_at_3
      value: 34.477999999999994
    - type: recall_at_5
      value: 41.835
  - task:
      type: Retrieval
    dataset:
      type: hotpotqa
      name: MTEB HotpotQA
      config: default
      split: test
      revision: None
    metrics:
    - type: map_at_1
      value: 39.041
    - type: map_at_10
      value: 62.961999999999996
    - type: map_at_100
      value: 63.79899999999999
    - type: map_at_1000
      value: 63.854
    - type: map_at_3
      value: 59.399
    - type: map_at_5
      value: 61.669
    - type: mrr_at_1
      value: 78.082
    - type: mrr_at_10
      value: 84.321
    - type: mrr_at_100
      value: 84.49600000000001
    - type: mrr_at_1000
      value: 84.502
    - type: mrr_at_3
      value: 83.421
    - type: mrr_at_5
      value: 83.977
    - type: ndcg_at_1
      value: 78.082
    - type: ndcg_at_10
      value: 71.229
    - type: ndcg_at_100
      value: 74.10900000000001
    - type: ndcg_at_1000
      value: 75.169
    - type: ndcg_at_3
      value: 66.28699999999999
    - type: ndcg_at_5
      value: 69.084
    - type: precision_at_1
      value: 78.082
    - type: precision_at_10
      value: 14.993
    - type: precision_at_100
      value: 1.7239999999999998
    - type: precision_at_1000
      value: 0.186
    - type: precision_at_3
      value: 42.737
    - type: precision_at_5
      value: 27.843
    - type: recall_at_1
      value: 39.041
    - type: recall_at_10
      value: 74.96300000000001
    - type: recall_at_100
      value: 86.199
    - type: recall_at_1000
      value: 93.228
    - type: recall_at_3
      value: 64.105
    - type: recall_at_5
      value: 69.608
  - task:
      type: Classification
    dataset:
      type: mteb/imdb
      name: MTEB ImdbClassification
      config: default
      split: test
      revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
    metrics:
    - type: accuracy
      value: 90.23160000000001
    - type: ap
      value: 85.5674856808308
    - type: f1
      value: 90.18033354786317
  - task:
      type: Retrieval
    dataset:
      type: msmarco
      name: MTEB MSMARCO
      config: default
      split: dev
      revision: None
    metrics:
    - type: map_at_1
      value: 24.091
    - type: map_at_10
      value: 36.753
    - type: map_at_100
      value: 37.913000000000004
    - type: map_at_1000
      value: 37.958999999999996
    - type: map_at_3
      value: 32.818999999999996
    - type: map_at_5
      value: 35.171
    - type: mrr_at_1
      value: 24.742
    - type: mrr_at_10
      value: 37.285000000000004
    - type: mrr_at_100
      value: 38.391999999999996
    - type: mrr_at_1000
      value: 38.431
    - type: mrr_at_3
      value: 33.440999999999995
    - type: mrr_at_5
      value: 35.75
    - type: ndcg_at_1
      value: 24.742
    - type: ndcg_at_10
      value: 43.698
    - type: ndcg_at_100
      value: 49.145
    - type: ndcg_at_1000
      value: 50.23800000000001
    - type: ndcg_at_3
      value: 35.769
    - type: ndcg_at_5
      value: 39.961999999999996
    - type: precision_at_1
      value: 24.742
    - type: precision_at_10
      value: 6.7989999999999995
    - type: precision_at_100
      value: 0.95
    - type: precision_at_1000
      value: 0.104
    - type: precision_at_3
      value: 15.096000000000002
    - type: precision_at_5
      value: 11.183
    - type: recall_at_1
      value: 24.091
    - type: recall_at_10
      value: 65.068
    - type: recall_at_100
      value: 89.899
    - type: recall_at_1000
      value: 98.16
    - type: recall_at_3
      value: 43.68
    - type: recall_at_5
      value: 53.754999999999995
  - task:
      type: Classification
    dataset:
      type: mteb/mtop_domain
      name: MTEB MTOPDomainClassification (en)
      config: en
      split: test
      revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
    metrics:
    - type: accuracy
      value: 93.66621067031465
    - type: f1
      value: 93.49622853272142
  - task:
      type: Classification
    dataset:
      type: mteb/mtop_domain
      name: MTEB MTOPDomainClassification (de)
      config: de
      split: test
      revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
    metrics:
    - type: accuracy
      value: 91.94702733164272
    - type: f1
      value: 91.17043441745282
  - task:
      type: Classification
    dataset:
      type: mteb/mtop_domain
      name: MTEB MTOPDomainClassification (es)
      config: es
      split: test
      revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
    metrics:
    - type: accuracy
      value: 92.20146764509674
    - type: f1
      value: 91.98359080555608
  - task:
      type: Classification
    dataset:
      type: mteb/mtop_domain
      name: MTEB MTOPDomainClassification (fr)
      config: fr
      split: test
      revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
    metrics:
    - type: accuracy
      value: 88.99780770435328
    - type: f1
      value: 89.19746342724068
  - task:
      type: Classification
    dataset:
      type: mteb/mtop_domain
      name: MTEB MTOPDomainClassification (hi)
      config: hi
      split: test
      revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
    metrics:
    - type: accuracy
      value: 89.78486912871998
    - type: f1
      value: 89.24578823628642
  - task:
      type: Classification
    dataset:
      type: mteb/mtop_domain
      name: MTEB MTOPDomainClassification (th)
      config: th
      split: test
      revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
    metrics:
    - type: accuracy
      value: 88.74502712477394
    - type: f1
      value: 89.00297573881542
  - task:
      type: Classification
    dataset:
      type: mteb/mtop_intent
      name: MTEB MTOPIntentClassification (en)
      config: en
      split: test
      revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
    metrics:
    - type: accuracy
      value: 77.9046967624259
    - type: f1
      value: 59.36787125785957
  - task:
      type: Classification
    dataset:
      type: mteb/mtop_intent
      name: MTEB MTOPIntentClassification (de)
      config: de
      split: test
      revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
    metrics:
    - type: accuracy
      value: 74.5280360664976
    - type: f1
      value: 57.17723440888718
  - task:
      type: Classification
    dataset:
      type: mteb/mtop_intent
      name: MTEB MTOPIntentClassification (es)
      config: es
      split: test
      revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
    metrics:
    - type: accuracy
      value: 75.44029352901934
    - type: f1
      value: 54.052855531072964
  - task:
      type: Classification
    dataset:
      type: mteb/mtop_intent
      name: MTEB MTOPIntentClassification (fr)
      config: fr
      split: test
      revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
    metrics:
    - type: accuracy
      value: 70.5606013153774
    - type: f1
      value: 52.62215934386531
  - task:
      type: Classification
    dataset:
      type: mteb/mtop_intent
      name: MTEB MTOPIntentClassification (hi)
      config: hi
      split: test
      revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
    metrics:
    - type: accuracy
      value: 73.11581211903908
    - type: f1
      value: 52.341291845645465
  - task:
      type: Classification
    dataset:
      type: mteb/mtop_intent
      name: MTEB MTOPIntentClassification (th)
      config: th
      split: test
      revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
    metrics:
    - type: accuracy
      value: 74.28933092224233
    - type: f1
      value: 57.07918745504911
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (af)
      config: af
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 62.38063214525892
    - type: f1
      value: 59.46463723443009
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (am)
      config: am
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 56.06926698049766
    - type: f1
      value: 52.49084283283562
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (ar)
      config: ar
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 60.74983187626093
    - type: f1
      value: 56.960640620165904
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (az)
      config: az
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 64.86550100874243
    - type: f1
      value: 62.47370548140688
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (bn)
      config: bn
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 63.971082716879636
    - type: f1
      value: 61.03812421957381
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (cy)
      config: cy
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 54.98318762609282
    - type: f1
      value: 51.51207916008392
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (da)
      config: da
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 69.45527908540686
    - type: f1
      value: 66.16631905400318
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (de)
      config: de
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 69.32750504371216
    - type: f1
      value: 66.16755288646591
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (el)
      config: el
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 69.09213180901143
    - type: f1
      value: 66.95654394661507
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (en)
      config: en
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 73.75588433086752
    - type: f1
      value: 71.79973779656923
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (es)
      config: es
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 70.49428379287154
    - type: f1
      value: 68.37494379215734
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (fa)
      config: fa
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 69.90921318090115
    - type: f1
      value: 66.79517376481645
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (fi)
      config: fi
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 70.12104909213181
    - type: f1
      value: 67.29448842879584
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (fr)
      config: fr
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 69.34095494283793
    - type: f1
      value: 67.01134288992947
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (he)
      config: he
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 67.61264290517822
    - type: f1
      value: 64.68730512660757
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (hi)
      config: hi
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 67.79757901815738
    - type: f1
      value: 65.24938539425598
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (hu)
      config: hu
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 69.68728984532616
    - type: f1
      value: 67.0487169762553
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (hy)
      config: hy
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 62.07464694014795
    - type: f1
      value: 59.183532276789286
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (id)
      config: id
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 70.04707464694015
    - type: f1
      value: 67.66829629003848
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (is)
      config: is
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 62.42434431741762
    - type: f1
      value: 59.01617226544757
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (it)
      config: it
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 70.53127101546738
    - type: f1
      value: 68.10033760906255
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (ja)
      config: ja
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 72.50504371217215
    - type: f1
      value: 69.74931103158923
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (jv)
      config: jv
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 57.91190316072628
    - type: f1
      value: 54.05551136648796
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (ka)
      config: ka
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 51.78211163416275
    - type: f1
      value: 49.874888544058535
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (km)
      config: km
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 47.017484868863484
    - type: f1
      value: 44.53364263352014
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (kn)
      config: kn
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 62.16207128446537
    - type: f1
      value: 59.01185692320829
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (ko)
      config: ko
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 69.42501681237391
    - type: f1
      value: 67.13169450166086
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (lv)
      config: lv
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 67.0780094149294
    - type: f1
      value: 64.41720167850707
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (ml)
      config: ml
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 65.57162071284466
    - type: f1
      value: 62.414138683804424
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (mn)
      config: mn
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 61.71149966375252
    - type: f1
      value: 58.594805125087234
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (ms)
      config: ms
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 66.03900470746471
    - type: f1
      value: 63.87937257883887
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (my)
      config: my
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 60.8776059179556
    - type: f1
      value: 57.48587618059131
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (nb)
      config: nb
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 69.87895090786819
    - type: f1
      value: 66.8141299430347
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (nl)
      config: nl
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 70.45057162071285
    - type: f1
      value: 67.46444039673516
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (pl)
      config: pl
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 71.546738399462
    - type: f1
      value: 68.63640876702655
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (pt)
      config: pt
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 70.72965702757229
    - type: f1
      value: 68.54119560379115
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (ro)
      config: ro
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 68.35574983187625
    - type: f1
      value: 65.88844917691927
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (ru)
      config: ru
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 71.70477471418964
    - type: f1
      value: 69.19665697061978
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (sl)
      config: sl
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 67.0880968392737
    - type: f1
      value: 64.76962317666086
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (sq)
      config: sq
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 65.18493611297916
    - type: f1
      value: 62.49984559035371
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (sv)
      config: sv
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 71.75857431069265
    - type: f1
      value: 69.20053687623418
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (sw)
      config: sw
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 58.500336247478145
    - type: f1
      value: 55.2972398687929
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (ta)
      config: ta
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 62.68997982515132
    - type: f1
      value: 59.36848202755348
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (te)
      config: te
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 63.01950235373235
    - type: f1
      value: 60.09351954625423
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (th)
      config: th
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 68.29186281102892
    - type: f1
      value: 67.57860496703447
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (tl)
      config: tl
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 64.77471418964357
    - type: f1
      value: 61.913983147713836
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (tr)
      config: tr
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 69.87222595830532
    - type: f1
      value: 66.03679033708141
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (ur)
      config: ur
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 64.04505716207127
    - type: f1
      value: 61.28569169817908
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (vi)
      config: vi
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 69.38466711499663
    - type: f1
      value: 67.20532357036844
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (zh-CN)
      config: zh-CN
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 71.12306657700067
    - type: f1
      value: 68.91251226588182
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_intent
      name: MTEB MassiveIntentClassification (zh-TW)
      config: zh-TW
      split: test
      revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
    metrics:
    - type: accuracy
      value: 66.20040349697378
    - type: f1
      value: 66.02657347714175
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (af)
      config: af
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 68.73907195696032
    - type: f1
      value: 66.98484521791418
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (am)
      config: am
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 60.58843308675185
    - type: f1
      value: 58.95591723092005
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (ar)
      config: ar
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 66.22730329522528
    - type: f1
      value: 66.0894499712115
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (az)
      config: az
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 66.48285137861465
    - type: f1
      value: 65.21963176785157
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (bn)
      config: bn
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 67.74714189643578
    - type: f1
      value: 66.8212192745412
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (cy)
      config: cy
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 59.09213180901143
    - type: f1
      value: 56.70735546356339
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (da)
      config: da
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 75.05716207128448
    - type: f1
      value: 74.8413712365364
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (de)
      config: de
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 74.69737726967047
    - type: f1
      value: 74.7664341963
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (el)
      config: el
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 73.90383322125084
    - type: f1
      value: 73.59201554448323
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (en)
      config: en
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 77.51176866173503
    - type: f1
      value: 77.46104434577758
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (es)
      config: es
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 74.31069266980496
    - type: f1
      value: 74.61048660675635
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (fa)
      config: fa
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 72.95225285810356
    - type: f1
      value: 72.33160006574627
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (fi)
      config: fi
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 73.12373907195696
    - type: f1
      value: 73.20921012557481
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (fr)
      config: fr
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 73.86684599865501
    - type: f1
      value: 73.82348774610831
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (he)
      config: he
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 71.40215198386012
    - type: f1
      value: 71.11945183971858
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (hi)
      config: hi
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 72.12844653665098
    - type: f1
      value: 71.34450495911766
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (hu)
      config: hu
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 74.52252858103566
    - type: f1
      value: 73.98878711342999
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (hy)
      config: hy
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 64.93611297915265
    - type: f1
      value: 63.723200467653385
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (id)
      config: id
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 74.11903160726295
    - type: f1
      value: 73.82138439467096
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (is)
      config: is
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 67.15198386012105
    - type: f1
      value: 66.02172193802167
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (it)
      config: it
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 74.32414256893072
    - type: f1
      value: 74.30943421170574
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (ja)
      config: ja
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 77.46805648957633
    - type: f1
      value: 77.62808409298209
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (jv)
      config: jv
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 63.318762609280434
    - type: f1
      value: 62.094284066075076
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (ka)
      config: ka
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 58.34902488231338
    - type: f1
      value: 57.12893860987984
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (km)
      config: km
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 50.88433086751849
    - type: f1
      value: 48.2272350802058
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (kn)
      config: kn
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 66.4425016812374
    - type: f1
      value: 64.61463095996173
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (ko)
      config: ko
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 75.04707464694015
    - type: f1
      value: 75.05099199098998
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (lv)
      config: lv
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 70.50437121721586
    - type: f1
      value: 69.83397721096314
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (ml)
      config: ml
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 69.94283792871553
    - type: f1
      value: 68.8704663703913
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (mn)
      config: mn
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 64.79488903833222
    - type: f1
      value: 63.615424063345436
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (ms)
      config: ms
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 69.88231338264963
    - type: f1
      value: 68.57892302593237
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (my)
      config: my
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 63.248150638870214
    - type: f1
      value: 61.06680605338809
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (nb)
      config: nb
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 74.84196368527236
    - type: f1
      value: 74.52566464968763
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (nl)
      config: nl
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 74.8285137861466
    - type: f1
      value: 74.8853197608802
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (pl)
      config: pl
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 74.13248150638869
    - type: f1
      value: 74.3982040999179
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (pt)
      config: pt
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 73.49024882313383
    - type: f1
      value: 73.82153848368573
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (ro)
      config: ro
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 71.72158708809684
    - type: f1
      value: 71.85049433180541
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (ru)
      config: ru
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 75.137861466039
    - type: f1
      value: 75.37628348188467
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (sl)
      config: sl
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 71.86953597848016
    - type: f1
      value: 71.87537624521661
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (sq)
      config: sq
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 70.27572293207801
    - type: f1
      value: 68.80017302344231
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (sv)
      config: sv
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 76.09952925353059
    - type: f1
      value: 76.07992707688408
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (sw)
      config: sw
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 63.140551445864155
    - type: f1
      value: 61.73855010331415
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (ta)
      config: ta
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 66.27774041694687
    - type: f1
      value: 64.83664868894539
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (te)
      config: te
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 66.69468728984533
    - type: f1
      value: 64.76239666920868
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (th)
      config: th
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 73.44653665097512
    - type: f1
      value: 73.14646052013873
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (tl)
      config: tl
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 67.71351714862139
    - type: f1
      value: 66.67212180163382
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (tr)
      config: tr
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 73.9946200403497
    - type: f1
      value: 73.87348793725525
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (ur)
      config: ur
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 68.15400134498992
    - type: f1
      value: 67.09433241421094
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (vi)
      config: vi
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 73.11365164761264
    - type: f1
      value: 73.59502539433753
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (zh-CN)
      config: zh-CN
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 76.82582380632145
    - type: f1
      value: 76.89992945316313
  - task:
      type: Classification
    dataset:
      type: mteb/amazon_massive_scenario
      name: MTEB MassiveScenarioClassification (zh-TW)
      config: zh-TW
      split: test
      revision: 7d571f92784cd94a019292a1f45445077d0ef634
    metrics:
    - type: accuracy
      value: 71.81237390719569
    - type: f1
      value: 72.36499770986265
  - task:
      type: Clustering
    dataset:
      type: mteb/medrxiv-clustering-p2p
      name: MTEB MedrxivClusteringP2P
      config: default
      split: test
      revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
    metrics:
    - type: v_measure
      value: 31.480506569594695
  - task:
      type: Clustering
    dataset:
      type: mteb/medrxiv-clustering-s2s
      name: MTEB MedrxivClusteringS2S
      config: default
      split: test
      revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
    metrics:
    - type: v_measure
      value: 29.71252128004552
  - task:
      type: Reranking
    dataset:
      type: mteb/mind_small
      name: MTEB MindSmallReranking
      config: default
      split: test
      revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
    metrics:
    - type: map
      value: 31.421396787056548
    - type: mrr
      value: 32.48155274872267
  - task:
      type: Retrieval
    dataset:
      type: nfcorpus
      name: MTEB NFCorpus
      config: default
      split: test
      revision: None
    metrics:
    - type: map_at_1
      value: 5.595
    - type: map_at_10
      value: 12.642000000000001
    - type: map_at_100
      value: 15.726
    - type: map_at_1000
      value: 17.061999999999998
    - type: map_at_3
      value: 9.125
    - type: map_at_5
      value: 10.866000000000001
    - type: mrr_at_1
      value: 43.344
    - type: mrr_at_10
      value: 52.227999999999994
    - type: mrr_at_100
      value: 52.898999999999994
    - type: mrr_at_1000
      value: 52.944
    - type: mrr_at_3
      value: 49.845
    - type: mrr_at_5
      value: 51.115
    - type: ndcg_at_1
      value: 41.949999999999996
    - type: ndcg_at_10
      value: 33.995
    - type: ndcg_at_100
      value: 30.869999999999997
    - type: ndcg_at_1000
      value: 39.487
    - type: ndcg_at_3
      value: 38.903999999999996
    - type: ndcg_at_5
      value: 37.236999999999995
    - type: precision_at_1
      value: 43.344
    - type: precision_at_10
      value: 25.480000000000004
    - type: precision_at_100
      value: 7.672
    - type: precision_at_1000
      value: 2.028
    - type: precision_at_3
      value: 36.636
    - type: precision_at_5
      value: 32.632
    - type: recall_at_1
      value: 5.595
    - type: recall_at_10
      value: 16.466
    - type: recall_at_100
      value: 31.226
    - type: recall_at_1000
      value: 62.778999999999996
    - type: recall_at_3
      value: 9.931
    - type: recall_at_5
      value: 12.884
  - task:
      type: Retrieval
    dataset:
      type: nq
      name: MTEB NQ
      config: default
      split: test
      revision: None
    metrics:
    - type: map_at_1
      value: 40.414
    - type: map_at_10
      value: 56.754000000000005
    - type: map_at_100
      value: 57.457
    - type: map_at_1000
      value: 57.477999999999994
    - type: map_at_3
      value: 52.873999999999995
    - type: map_at_5
      value: 55.175
    - type: mrr_at_1
      value: 45.278
    - type: mrr_at_10
      value: 59.192
    - type: mrr_at_100
      value: 59.650000000000006
    - type: mrr_at_1000
      value: 59.665
    - type: mrr_at_3
      value: 56.141
    - type: mrr_at_5
      value: 57.998000000000005
    - type: ndcg_at_1
      value: 45.278
    - type: ndcg_at_10
      value: 64.056
    - type: ndcg_at_100
      value: 66.89
    - type: ndcg_at_1000
      value: 67.364
    - type: ndcg_at_3
      value: 56.97
    - type: ndcg_at_5
      value: 60.719
    - type: precision_at_1
      value: 45.278
    - type: precision_at_10
      value: 9.994
    - type: precision_at_100
      value: 1.165
    - type: precision_at_1000
      value: 0.121
    - type: precision_at_3
      value: 25.512
    - type: precision_at_5
      value: 17.509
    - type: recall_at_1
      value: 40.414
    - type: recall_at_10
      value: 83.596
    - type: recall_at_100
      value: 95.72
    - type: recall_at_1000
      value: 99.24
- type: recall_at_3 value: 65.472 - type: recall_at_5 value: 74.039 - task: type: Retrieval dataset: type: quora name: MTEB QuoraRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 70.352 - type: map_at_10 value: 84.369 - type: map_at_100 value: 85.02499999999999 - type: map_at_1000 value: 85.04 - type: map_at_3 value: 81.42399999999999 - type: map_at_5 value: 83.279 - type: mrr_at_1 value: 81.05 - type: mrr_at_10 value: 87.401 - type: mrr_at_100 value: 87.504 - type: mrr_at_1000 value: 87.505 - type: mrr_at_3 value: 86.443 - type: mrr_at_5 value: 87.10799999999999 - type: ndcg_at_1 value: 81.04 - type: ndcg_at_10 value: 88.181 - type: ndcg_at_100 value: 89.411 - type: ndcg_at_1000 value: 89.507 - type: ndcg_at_3 value: 85.28099999999999 - type: ndcg_at_5 value: 86.888 - type: precision_at_1 value: 81.04 - type: precision_at_10 value: 13.406 - type: precision_at_100 value: 1.5350000000000001 - type: precision_at_1000 value: 0.157 - type: precision_at_3 value: 37.31 - type: precision_at_5 value: 24.54 - type: recall_at_1 value: 70.352 - type: recall_at_10 value: 95.358 - type: recall_at_100 value: 99.541 - type: recall_at_1000 value: 99.984 - type: recall_at_3 value: 87.111 - type: recall_at_5 value: 91.643 - task: type: Clustering dataset: type: mteb/reddit-clustering name: MTEB RedditClustering config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: v_measure value: 46.54068723291946 - task: type: Clustering dataset: type: mteb/reddit-clustering-p2p name: MTEB RedditClusteringP2P config: default split: test revision: 282350215ef01743dc01b456c7f5241fa8937f16 metrics: - type: v_measure value: 63.216287629895994 - task: type: Retrieval dataset: type: scidocs name: MTEB SCIDOCS config: default split: test revision: None metrics: - type: map_at_1 value: 4.023000000000001 - type: map_at_10 value: 10.071 - type: map_at_100 value: 11.892 - type: map_at_1000 value: 12.196 - type: map_at_3 value: 
7.234 - type: map_at_5 value: 8.613999999999999 - type: mrr_at_1 value: 19.900000000000002 - type: mrr_at_10 value: 30.516 - type: mrr_at_100 value: 31.656000000000002 - type: mrr_at_1000 value: 31.723000000000003 - type: mrr_at_3 value: 27.400000000000002 - type: mrr_at_5 value: 29.270000000000003 - type: ndcg_at_1 value: 19.900000000000002 - type: ndcg_at_10 value: 17.474 - type: ndcg_at_100 value: 25.020999999999997 - type: ndcg_at_1000 value: 30.728 - type: ndcg_at_3 value: 16.588 - type: ndcg_at_5 value: 14.498 - type: precision_at_1 value: 19.900000000000002 - type: precision_at_10 value: 9.139999999999999 - type: precision_at_100 value: 2.011 - type: precision_at_1000 value: 0.33899999999999997 - type: precision_at_3 value: 15.667 - type: precision_at_5 value: 12.839999999999998 - type: recall_at_1 value: 4.023000000000001 - type: recall_at_10 value: 18.497 - type: recall_at_100 value: 40.8 - type: recall_at_1000 value: 68.812 - type: recall_at_3 value: 9.508 - type: recall_at_5 value: 12.983 - task: type: STS dataset: type: mteb/sickr-sts name: MTEB SICK-R config: default split: test revision: a6ea5a8cab320b040a23452cc28066d9beae2cee metrics: - type: cos_sim_pearson value: 83.967008785134 - type: cos_sim_spearman value: 80.23142141101837 - type: euclidean_pearson value: 81.20166064704539 - type: euclidean_spearman value: 80.18961335654585 - type: manhattan_pearson value: 81.13925443187625 - type: manhattan_spearman value: 80.07948723044424 - task: type: STS dataset: type: mteb/sts12-sts name: MTEB STS12 config: default split: test revision: a0d554a64d88156834ff5ae9920b964011b16384 metrics: - type: cos_sim_pearson value: 86.94262461316023 - type: cos_sim_spearman value: 80.01596278563865 - type: euclidean_pearson value: 83.80799622922581 - type: euclidean_spearman value: 79.94984954947103 - type: manhattan_pearson value: 83.68473841756281 - type: manhattan_spearman value: 79.84990707951822 - task: type: STS dataset: type: mteb/sts13-sts name: MTEB STS13 
config: default split: test revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca metrics: - type: cos_sim_pearson value: 80.57346443146068 - type: cos_sim_spearman value: 81.54689837570866 - type: euclidean_pearson value: 81.10909881516007 - type: euclidean_spearman value: 81.56746243261762 - type: manhattan_pearson value: 80.87076036186582 - type: manhattan_spearman value: 81.33074987964402 - task: type: STS dataset: type: mteb/sts14-sts name: MTEB STS14 config: default split: test revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 metrics: - type: cos_sim_pearson value: 79.54733787179849 - type: cos_sim_spearman value: 77.72202105610411 - type: euclidean_pearson value: 78.9043595478849 - type: euclidean_spearman value: 77.93422804309435 - type: manhattan_pearson value: 78.58115121621368 - type: manhattan_spearman value: 77.62508135122033 - task: type: STS dataset: type: mteb/sts15-sts name: MTEB STS15 config: default split: test revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 metrics: - type: cos_sim_pearson value: 88.59880017237558 - type: cos_sim_spearman value: 89.31088630824758 - type: euclidean_pearson value: 88.47069261564656 - type: euclidean_spearman value: 89.33581971465233 - type: manhattan_pearson value: 88.40774264100956 - type: manhattan_spearman value: 89.28657485627835 - task: type: STS dataset: type: mteb/sts16-sts name: MTEB STS16 config: default split: test revision: 4d8694f8f0e0100860b497b999b3dbed754a0513 metrics: - type: cos_sim_pearson value: 84.08055117917084 - type: cos_sim_spearman value: 85.78491813080304 - type: euclidean_pearson value: 84.99329155500392 - type: euclidean_spearman value: 85.76728064677287 - type: manhattan_pearson value: 84.87947428989587 - type: manhattan_spearman value: 85.62429454917464 - task: type: STS dataset: type: mteb/sts17-crosslingual-sts name: MTEB STS17 (ko-ko) config: ko-ko split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 82.14190939287384 - type: 
cos_sim_spearman value: 82.27331573306041 - type: euclidean_pearson value: 81.891896953716 - type: euclidean_spearman value: 82.37695542955998 - type: manhattan_pearson value: 81.73123869460504 - type: manhattan_spearman value: 82.19989168441421 - task: type: STS dataset: type: mteb/sts17-crosslingual-sts name: MTEB STS17 (ar-ar) config: ar-ar split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 76.84695301843362 - type: cos_sim_spearman value: 77.87790986014461 - type: euclidean_pearson value: 76.91981583106315 - type: euclidean_spearman value: 77.88154772749589 - type: manhattan_pearson value: 76.94953277451093 - type: manhattan_spearman value: 77.80499230728604 - task: type: STS dataset: type: mteb/sts17-crosslingual-sts name: MTEB STS17 (en-ar) config: en-ar split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 75.44657840482016 - type: cos_sim_spearman value: 75.05531095119674 - type: euclidean_pearson value: 75.88161755829299 - type: euclidean_spearman value: 74.73176238219332 - type: manhattan_pearson value: 75.63984765635362 - type: manhattan_spearman value: 74.86476440770737 - task: type: STS dataset: type: mteb/sts17-crosslingual-sts name: MTEB STS17 (en-de) config: en-de split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 85.64700140524133 - type: cos_sim_spearman value: 86.16014210425672 - type: euclidean_pearson value: 86.49086860843221 - type: euclidean_spearman value: 86.09729326815614 - type: manhattan_pearson value: 86.43406265125513 - type: manhattan_spearman value: 86.17740150939994 - task: type: STS dataset: type: mteb/sts17-crosslingual-sts name: MTEB STS17 (en-en) config: en-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 87.91170098764921 - type: cos_sim_spearman value: 88.12437004058931 - type: euclidean_pearson value: 88.81828254494437 
- type: euclidean_spearman value: 88.14831794572122 - type: manhattan_pearson value: 88.93442183448961 - type: manhattan_spearman value: 88.15254630778304 - task: type: STS dataset: type: mteb/sts17-crosslingual-sts name: MTEB STS17 (en-tr) config: en-tr split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 72.91390577997292 - type: cos_sim_spearman value: 71.22979457536074 - type: euclidean_pearson value: 74.40314008106749 - type: euclidean_spearman value: 72.54972136083246 - type: manhattan_pearson value: 73.85687539530218 - type: manhattan_spearman value: 72.09500771742637 - task: type: STS dataset: type: mteb/sts17-crosslingual-sts name: MTEB STS17 (es-en) config: es-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 80.9301067983089 - type: cos_sim_spearman value: 80.74989828346473 - type: euclidean_pearson value: 81.36781301814257 - type: euclidean_spearman value: 80.9448819964426 - type: manhattan_pearson value: 81.0351322685609 - type: manhattan_spearman value: 80.70192121844177 - task: type: STS dataset: type: mteb/sts17-crosslingual-sts name: MTEB STS17 (es-es) config: es-es split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 87.13820465980005 - type: cos_sim_spearman value: 86.73532498758757 - type: euclidean_pearson value: 87.21329451846637 - type: euclidean_spearman value: 86.57863198601002 - type: manhattan_pearson value: 87.06973713818554 - type: manhattan_spearman value: 86.47534918791499 - task: type: STS dataset: type: mteb/sts17-crosslingual-sts name: MTEB STS17 (fr-en) config: fr-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 85.48720108904415 - type: cos_sim_spearman value: 85.62221757068387 - type: euclidean_pearson value: 86.1010129512749 - type: euclidean_spearman value: 85.86580966509942 - type: manhattan_pearson value: 
86.26800938808971 - type: manhattan_spearman value: 85.88902721678429 - task: type: STS dataset: type: mteb/sts17-crosslingual-sts name: MTEB STS17 (it-en) config: it-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 83.98021347333516 - type: cos_sim_spearman value: 84.53806553803501 - type: euclidean_pearson value: 84.61483347248364 - type: euclidean_spearman value: 85.14191408011702 - type: manhattan_pearson value: 84.75297588825967 - type: manhattan_spearman value: 85.33176753669242 - task: type: STS dataset: type: mteb/sts17-crosslingual-sts name: MTEB STS17 (nl-en) config: nl-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 84.51856644893233 - type: cos_sim_spearman value: 85.27510748506413 - type: euclidean_pearson value: 85.09886861540977 - type: euclidean_spearman value: 85.62579245860887 - type: manhattan_pearson value: 84.93017860464607 - type: manhattan_spearman value: 85.5063988898453 - task: type: STS dataset: type: mteb/sts22-crosslingual-sts name: MTEB STS22 (en) config: en split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 62.581573200584195 - type: cos_sim_spearman value: 63.05503590247928 - type: euclidean_pearson value: 63.652564812602094 - type: euclidean_spearman value: 62.64811520876156 - type: manhattan_pearson value: 63.506842893061076 - type: manhattan_spearman value: 62.51289573046917 - task: type: STS dataset: type: mteb/sts22-crosslingual-sts name: MTEB STS22 (de) config: de split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 48.2248801729127 - type: cos_sim_spearman value: 56.5936604678561 - type: euclidean_pearson value: 43.98149464089 - type: euclidean_spearman value: 56.108561882423615 - type: manhattan_pearson value: 43.86880305903564 - type: manhattan_spearman value: 56.04671150510166 - task: type: STS dataset: type: 
mteb/sts22-crosslingual-sts name: MTEB STS22 (es) config: es split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 55.17564527009831 - type: cos_sim_spearman value: 64.57978560979488 - type: euclidean_pearson value: 58.8818330154583 - type: euclidean_spearman value: 64.99214839071281 - type: manhattan_pearson value: 58.72671436121381 - type: manhattan_spearman value: 65.10713416616109 - task: type: STS dataset: type: mteb/sts22-crosslingual-sts name: MTEB STS22 (pl) config: pl split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 26.772131864023297 - type: cos_sim_spearman value: 34.68200792408681 - type: euclidean_pearson value: 16.68082419005441 - type: euclidean_spearman value: 34.83099932652166 - type: manhattan_pearson value: 16.52605949659529 - type: manhattan_spearman value: 34.82075801399475 - task: type: STS dataset: type: mteb/sts22-crosslingual-sts name: MTEB STS22 (tr) config: tr split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 54.42415189043831 - type: cos_sim_spearman value: 63.54594264576758 - type: euclidean_pearson value: 57.36577498297745 - type: euclidean_spearman value: 63.111466379158074 - type: manhattan_pearson value: 57.584543715873885 - type: manhattan_spearman value: 63.22361054139183 - task: type: STS dataset: type: mteb/sts22-crosslingual-sts name: MTEB STS22 (ar) config: ar split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 47.55216762405518 - type: cos_sim_spearman value: 56.98670142896412 - type: euclidean_pearson value: 50.15318757562699 - type: euclidean_spearman value: 56.524941926541906 - type: manhattan_pearson value: 49.955618528674904 - type: manhattan_spearman value: 56.37102209240117 - task: type: STS dataset: type: mteb/sts22-crosslingual-sts name: MTEB STS22 (ru) config: ru split: test revision: 
6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 49.20540980338571 - type: cos_sim_spearman value: 59.9009453504406 - type: euclidean_pearson value: 49.557749853620535 - type: euclidean_spearman value: 59.76631621172456 - type: manhattan_pearson value: 49.62340591181147 - type: manhattan_spearman value: 59.94224880322436 - task: type: STS dataset: type: mteb/sts22-crosslingual-sts name: MTEB STS22 (zh) config: zh split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 51.508169956576985 - type: cos_sim_spearman value: 66.82461565306046 - type: euclidean_pearson value: 56.2274426480083 - type: euclidean_spearman value: 66.6775323848333 - type: manhattan_pearson value: 55.98277796300661 - type: manhattan_spearman value: 66.63669848497175 - task: type: STS dataset: type: mteb/sts22-crosslingual-sts name: MTEB STS22 (fr) config: fr split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 72.86478788045507 - type: cos_sim_spearman value: 76.7946552053193 - type: euclidean_pearson value: 75.01598530490269 - type: euclidean_spearman value: 76.83618917858281 - type: manhattan_pearson value: 74.68337628304332 - type: manhattan_spearman value: 76.57480204017773 - task: type: STS dataset: type: mteb/sts22-crosslingual-sts name: MTEB STS22 (de-en) config: de-en split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 55.922619099401984 - type: cos_sim_spearman value: 56.599362477240774 - type: euclidean_pearson value: 56.68307052369783 - type: euclidean_spearman value: 54.28760436777401 - type: manhattan_pearson value: 56.67763566500681 - type: manhattan_spearman value: 53.94619541711359 - task: type: STS dataset: type: mteb/sts22-crosslingual-sts name: MTEB STS22 (es-en) config: es-en split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 66.74357206710913 - 
type: cos_sim_spearman value: 72.5208244925311 - type: euclidean_pearson value: 67.49254562186032 - type: euclidean_spearman value: 72.02469076238683 - type: manhattan_pearson value: 67.45251772238085 - type: manhattan_spearman value: 72.05538819984538 - task: type: STS dataset: type: mteb/sts22-crosslingual-sts name: MTEB STS22 (it) config: it split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 71.25734330033191 - type: cos_sim_spearman value: 76.98349083946823 - type: euclidean_pearson value: 73.71642838667736 - type: euclidean_spearman value: 77.01715504651384 - type: manhattan_pearson value: 73.61712711868105 - type: manhattan_spearman value: 77.01392571153896 - task: type: STS dataset: type: mteb/sts22-crosslingual-sts name: MTEB STS22 (pl-en) config: pl-en split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 63.18215462781212 - type: cos_sim_spearman value: 65.54373266117607 - type: euclidean_pearson value: 64.54126095439005 - type: euclidean_spearman value: 65.30410369102711 - type: manhattan_pearson value: 63.50332221148234 - type: manhattan_spearman value: 64.3455878104313 - task: type: STS dataset: type: mteb/sts22-crosslingual-sts name: MTEB STS22 (zh-en) config: zh-en split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 62.30509221440029 - type: cos_sim_spearman value: 65.99582704642478 - type: euclidean_pearson value: 63.43818859884195 - type: euclidean_spearman value: 66.83172582815764 - type: manhattan_pearson value: 63.055779168508764 - type: manhattan_spearman value: 65.49585020501449 - task: type: STS dataset: type: mteb/sts22-crosslingual-sts name: MTEB STS22 (es-it) config: es-it split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 59.587830825340404 - type: cos_sim_spearman value: 68.93467614588089 - type: euclidean_pearson value: 62.3073527367404 
- type: euclidean_spearman value: 69.69758171553175 - type: manhattan_pearson value: 61.9074580815789 - type: manhattan_spearman value: 69.57696375597865 - task: type: STS dataset: type: mteb/sts22-crosslingual-sts name: MTEB STS22 (de-fr) config: de-fr split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 57.143220125577066 - type: cos_sim_spearman value: 67.78857859159226 - type: euclidean_pearson value: 55.58225107923733 - type: euclidean_spearman value: 67.80662907184563 - type: manhattan_pearson value: 56.24953502726514 - type: manhattan_spearman value: 67.98262125431616 - task: type: STS dataset: type: mteb/sts22-crosslingual-sts name: MTEB STS22 (de-pl) config: de-pl split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 21.826928900322066 - type: cos_sim_spearman value: 49.578506634400405 - type: euclidean_pearson value: 27.939890138843214 - type: euclidean_spearman value: 52.71950519136242 - type: manhattan_pearson value: 26.39878683847546 - type: manhattan_spearman value: 47.54609580342499 - task: type: STS dataset: type: mteb/sts22-crosslingual-sts name: MTEB STS22 (fr-pl) config: fr-pl split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 57.27603854632001 - type: cos_sim_spearman value: 50.709255283710995 - type: euclidean_pearson value: 59.5419024445929 - type: euclidean_spearman value: 50.709255283710995 - type: manhattan_pearson value: 59.03256832438492 - type: manhattan_spearman value: 61.97797868009122 - task: type: STS dataset: type: mteb/stsbenchmark-sts name: MTEB STSBenchmark config: default split: test revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 metrics: - type: cos_sim_pearson value: 85.00757054859712 - type: cos_sim_spearman value: 87.29283629622222 - type: euclidean_pearson value: 86.54824171775536 - type: euclidean_spearman value: 87.24364730491402 - type: manhattan_pearson value: 
86.5062156915074 - type: manhattan_spearman value: 87.15052170378574 - task: type: Reranking dataset: type: mteb/scidocs-reranking name: MTEB SciDocsRR config: default split: test revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab metrics: - type: map value: 82.03549357197389 - type: mrr value: 95.05437645143527 - task: type: Retrieval dataset: type: scifact name: MTEB SciFact config: default split: test revision: None metrics: - type: map_at_1 value: 57.260999999999996 - type: map_at_10 value: 66.259 - type: map_at_100 value: 66.884 - type: map_at_1000 value: 66.912 - type: map_at_3 value: 63.685 - type: map_at_5 value: 65.35499999999999 - type: mrr_at_1 value: 60.333000000000006 - type: mrr_at_10 value: 67.5 - type: mrr_at_100 value: 68.013 - type: mrr_at_1000 value: 68.038 - type: mrr_at_3 value: 65.61099999999999 - type: mrr_at_5 value: 66.861 - type: ndcg_at_1 value: 60.333000000000006 - type: ndcg_at_10 value: 70.41 - type: ndcg_at_100 value: 73.10600000000001 - type: ndcg_at_1000 value: 73.846 - type: ndcg_at_3 value: 66.133 - type: ndcg_at_5 value: 68.499 - type: precision_at_1 value: 60.333000000000006 - type: precision_at_10 value: 9.232999999999999 - type: precision_at_100 value: 1.0630000000000002 - type: precision_at_1000 value: 0.11299999999999999 - type: precision_at_3 value: 25.667 - type: precision_at_5 value: 17.067 - type: recall_at_1 value: 57.260999999999996 - type: recall_at_10 value: 81.94399999999999 - type: recall_at_100 value: 93.867 - type: recall_at_1000 value: 99.667 - type: recall_at_3 value: 70.339 - type: recall_at_5 value: 76.25 - task: type: PairClassification dataset: type: mteb/sprintduplicatequestions-pairclassification name: MTEB SprintDuplicateQuestions config: default split: test revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 metrics: - type: cos_sim_accuracy value: 99.74356435643564 - type: cos_sim_ap value: 93.13411948212683 - type: cos_sim_f1 value: 86.80521991300147 - type: cos_sim_precision value: 84.00374181478017 - 
type: cos_sim_recall value: 89.8 - type: dot_accuracy value: 99.67920792079208 - type: dot_ap value: 89.27277565444479 - type: dot_f1 value: 83.9276990718124 - type: dot_precision value: 82.04393505253104 - type: dot_recall value: 85.9 - type: euclidean_accuracy value: 99.74257425742574 - type: euclidean_ap value: 93.17993008259062 - type: euclidean_f1 value: 86.69396110542476 - type: euclidean_precision value: 88.78406708595388 - type: euclidean_recall value: 84.7 - type: manhattan_accuracy value: 99.74257425742574 - type: manhattan_ap value: 93.14413755550099 - type: manhattan_f1 value: 86.82483594144371 - type: manhattan_precision value: 87.66564729867483 - type: manhattan_recall value: 86 - type: max_accuracy value: 99.74356435643564 - type: max_ap value: 93.17993008259062 - type: max_f1 value: 86.82483594144371 - task: type: Clustering dataset: type: mteb/stackexchange-clustering name: MTEB StackExchangeClustering config: default split: test revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 metrics: - type: v_measure value: 57.525863806168566 - task: type: Clustering dataset: type: mteb/stackexchange-clustering-p2p name: MTEB StackExchangeClusteringP2P config: default split: test revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: v_measure value: 32.68850574423839 - task: type: Reranking dataset: type: mteb/stackoverflowdupquestions-reranking name: MTEB StackOverflowDupQuestions config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - type: map value: 49.71580650644033 - type: mrr value: 50.50971903913081 - task: type: Summarization dataset: type: mteb/summeval name: MTEB SummEval config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cos_sim_pearson value: 29.152190498799484 - type: cos_sim_spearman value: 29.686180371952727 - type: dot_pearson value: 27.248664793816342 - type: dot_spearman value: 28.37748983721745 - task: type: Retrieval dataset: type: trec-covid 
name: MTEB TRECCOVID config: default split: test revision: None metrics: - type: map_at_1 value: 0.20400000000000001 - type: map_at_10 value: 1.6209999999999998 - type: map_at_100 value: 9.690999999999999 - type: map_at_1000 value: 23.733 - type: map_at_3 value: 0.575 - type: map_at_5 value: 0.885 - type: mrr_at_1 value: 78 - type: mrr_at_10 value: 86.56700000000001 - type: mrr_at_100 value: 86.56700000000001 - type: mrr_at_1000 value: 86.56700000000001 - type: mrr_at_3 value: 85.667 - type: mrr_at_5 value: 86.56700000000001 - type: ndcg_at_1 value: 76 - type: ndcg_at_10 value: 71.326 - type: ndcg_at_100 value: 54.208999999999996 - type: ndcg_at_1000 value: 49.252 - type: ndcg_at_3 value: 74.235 - type: ndcg_at_5 value: 73.833 - type: precision_at_1 value: 78 - type: precision_at_10 value: 74.8 - type: precision_at_100 value: 55.50000000000001 - type: precision_at_1000 value: 21.836 - type: precision_at_3 value: 78 - type: precision_at_5 value: 78 - type: recall_at_1 value: 0.20400000000000001 - type: recall_at_10 value: 1.894 - type: recall_at_100 value: 13.245999999999999 - type: recall_at_1000 value: 46.373 - type: recall_at_3 value: 0.613 - type: recall_at_5 value: 0.991 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (sqi-eng) config: sqi-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 95.89999999999999 - type: f1 value: 94.69999999999999 - type: precision value: 94.11666666666667 - type: recall value: 95.89999999999999 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (fry-eng) config: fry-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 68.20809248554913 - type: f1 value: 63.431048720066066 - type: precision value: 61.69143958161298 - type: recall value: 68.20809248554913 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (kur-eng) config: 
kur-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 71.21951219512195 - type: f1 value: 66.82926829268293 - type: precision value: 65.1260162601626 - type: recall value: 71.21951219512195 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (tur-eng) config: tur-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 97.2 - type: f1 value: 96.26666666666667 - type: precision value: 95.8 - type: recall value: 97.2 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (deu-eng) config: deu-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 99.3 - type: f1 value: 99.06666666666666 - type: precision value: 98.95 - type: recall value: 99.3 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (nld-eng) config: nld-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 97.39999999999999 - type: f1 value: 96.63333333333333 - type: precision value: 96.26666666666668 - type: recall value: 97.39999999999999 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (ron-eng) config: ron-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 96 - type: f1 value: 94.86666666666666 - type: precision value: 94.31666666666668 - type: recall value: 96 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (ang-eng) config: ang-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 47.01492537313433 - type: f1 value: 40.178867566927266 - type: precision value: 38.179295828549556 - type: recall value: 47.01492537313433 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (ido-eng) config: ido-eng split: test 
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 86.5 - type: f1 value: 83.62537480063796 - type: precision value: 82.44555555555554 - type: recall value: 86.5 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (jav-eng) config: jav-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 80.48780487804879 - type: f1 value: 75.45644599303138 - type: precision value: 73.37398373983739 - type: recall value: 80.48780487804879 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (isl-eng) config: isl-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 93.7 - type: f1 value: 91.95666666666666 - type: precision value: 91.125 - type: recall value: 93.7 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (slv-eng) config: slv-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 91.73754556500607 - type: f1 value: 89.65168084244632 - type: precision value: 88.73025516403402 - type: recall value: 91.73754556500607 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (cym-eng) config: cym-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 81.04347826086956 - type: f1 value: 76.2128364389234 - type: precision value: 74.2 - type: recall value: 81.04347826086956 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (kaz-eng) config: kaz-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 83.65217391304348 - type: f1 value: 79.4376811594203 - type: precision value: 77.65797101449274 - type: recall value: 83.65217391304348 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (est-eng) config: est-eng split: 
test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 87.5 - type: f1 value: 85.02690476190476 - type: precision value: 83.96261904761904 - type: recall value: 87.5 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (heb-eng) config: heb-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 89.3 - type: f1 value: 86.52333333333333 - type: precision value: 85.22833333333332 - type: recall value: 89.3 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (gla-eng) config: gla-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 65.01809408926418 - type: f1 value: 59.00594446432805 - type: precision value: 56.827215807915444 - type: recall value: 65.01809408926418 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (mar-eng) config: mar-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 91.2 - type: f1 value: 88.58 - type: precision value: 87.33333333333334 - type: recall value: 91.2 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (lat-eng) config: lat-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 59.199999999999996 - type: f1 value: 53.299166276284915 - type: precision value: 51.3383908045977 - type: recall value: 59.199999999999996 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (bel-eng) config: bel-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 93.2 - type: f1 value: 91.2 - type: precision value: 90.25 - type: recall value: 93.2 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (pms-eng) config: pms-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 
metrics: - type: accuracy value: 64.76190476190476 - type: f1 value: 59.867110667110666 - type: precision value: 58.07390192653351 - type: recall value: 64.76190476190476 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (gle-eng) config: gle-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 76.2 - type: f1 value: 71.48147546897547 - type: precision value: 69.65409090909091 - type: recall value: 76.2 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (pes-eng) config: pes-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 93.8 - type: f1 value: 92.14 - type: precision value: 91.35833333333333 - type: recall value: 93.8 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (nob-eng) config: nob-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 97.89999999999999 - type: f1 value: 97.2 - type: precision value: 96.85000000000001 - type: recall value: 97.89999999999999 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (bul-eng) config: bul-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 94.6 - type: f1 value: 92.93333333333334 - type: precision value: 92.13333333333333 - type: recall value: 94.6 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (cbk-eng) config: cbk-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 74.1 - type: f1 value: 69.14817460317461 - type: precision value: 67.2515873015873 - type: recall value: 74.1 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (hun-eng) config: hun-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 
95.19999999999999 - type: f1 value: 94.01333333333335 - type: precision value: 93.46666666666667 - type: recall value: 95.19999999999999 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (uig-eng) config: uig-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 76.9 - type: f1 value: 72.07523809523809 - type: precision value: 70.19777777777779 - type: recall value: 76.9 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (rus-eng) config: rus-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 94.1 - type: f1 value: 92.31666666666666 - type: precision value: 91.43333333333332 - type: recall value: 94.1 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (spa-eng) config: spa-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 97.8 - type: f1 value: 97.1 - type: precision value: 96.76666666666668 - type: recall value: 97.8 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (hye-eng) config: hye-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 92.85714285714286 - type: f1 value: 90.92093441150045 - type: precision value: 90.00449236298293 - type: recall value: 92.85714285714286 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (tel-eng) config: tel-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 93.16239316239316 - type: f1 value: 91.33903133903132 - type: precision value: 90.56267806267806 - type: recall value: 93.16239316239316 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (afr-eng) config: afr-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 92.4 - 
type: f1 value: 90.25666666666666 - type: precision value: 89.25833333333334 - type: recall value: 92.4 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (mon-eng) config: mon-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 90.22727272727272 - type: f1 value: 87.53030303030303 - type: precision value: 86.37121212121211 - type: recall value: 90.22727272727272 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (arz-eng) config: arz-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 79.03563941299791 - type: f1 value: 74.7349505840072 - type: precision value: 72.9035639412998 - type: recall value: 79.03563941299791 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (hrv-eng) config: hrv-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 97 - type: f1 value: 96.15 - type: precision value: 95.76666666666668 - type: recall value: 97 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (nov-eng) config: nov-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 76.26459143968872 - type: f1 value: 71.55642023346303 - type: precision value: 69.7544932369835 - type: recall value: 76.26459143968872 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (gsw-eng) config: gsw-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 58.119658119658126 - type: f1 value: 51.65242165242165 - type: precision value: 49.41768108434775 - type: recall value: 58.119658119658126 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (nds-eng) config: nds-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy 
value: 74.3 - type: f1 value: 69.52055555555555 - type: precision value: 67.7574938949939 - type: recall value: 74.3 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (ukr-eng) config: ukr-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 94.8 - type: f1 value: 93.31666666666666 - type: precision value: 92.60000000000001 - type: recall value: 94.8 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (uzb-eng) config: uzb-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 76.63551401869158 - type: f1 value: 72.35202492211837 - type: precision value: 70.60358255451713 - type: recall value: 76.63551401869158 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (lit-eng) config: lit-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 90.4 - type: f1 value: 88.4811111111111 - type: precision value: 87.7452380952381 - type: recall value: 90.4 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (ina-eng) config: ina-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 95 - type: f1 value: 93.60666666666667 - type: precision value: 92.975 - type: recall value: 95 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (lfn-eng) config: lfn-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 67.2 - type: f1 value: 63.01595782872099 - type: precision value: 61.596587301587306 - type: recall value: 67.2 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (zsm-eng) config: zsm-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 95.7 - type: f1 value: 94.52999999999999 - type: 
precision value: 94 - type: recall value: 95.7 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (ita-eng) config: ita-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 94.6 - type: f1 value: 93.28999999999999 - type: precision value: 92.675 - type: recall value: 94.6 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (cmn-eng) config: cmn-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 96.39999999999999 - type: f1 value: 95.28333333333333 - type: precision value: 94.75 - type: recall value: 96.39999999999999 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (lvs-eng) config: lvs-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 91.9 - type: f1 value: 89.83 - type: precision value: 88.92 - type: recall value: 91.9 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (glg-eng) config: glg-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 94.69999999999999 - type: f1 value: 93.34222222222223 - type: precision value: 92.75416666666668 - type: recall value: 94.69999999999999 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (ceb-eng) config: ceb-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 60.333333333333336 - type: f1 value: 55.31203703703703 - type: precision value: 53.39971108326371 - type: recall value: 60.333333333333336 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (bre-eng) config: bre-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 12.9 - type: f1 value: 11.099861903031458 - type: precision value: 10.589187932631877 - type: recall 
value: 12.9 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (ben-eng) config: ben-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 86.7 - type: f1 value: 83.0152380952381 - type: precision value: 81.37833333333333 - type: recall value: 86.7 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (swg-eng) config: swg-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 63.39285714285714 - type: f1 value: 56.832482993197274 - type: precision value: 54.56845238095237 - type: recall value: 63.39285714285714 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (arq-eng) config: arq-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 48.73765093304062 - type: f1 value: 41.555736920720456 - type: precision value: 39.06874531737319 - type: recall value: 48.73765093304062 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (kab-eng) config: kab-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 41.099999999999994 - type: f1 value: 36.540165945165946 - type: precision value: 35.05175685425686 - type: recall value: 41.099999999999994 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (fra-eng) config: fra-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 94.89999999999999 - type: f1 value: 93.42333333333333 - type: precision value: 92.75833333333333 - type: recall value: 94.89999999999999 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (por-eng) config: por-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 94.89999999999999 - type: f1 value: 93.63333333333334 - type: 
precision value: 93.01666666666665 - type: recall value: 94.89999999999999 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (tat-eng) config: tat-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 77.9 - type: f1 value: 73.64833333333334 - type: precision value: 71.90282106782105 - type: recall value: 77.9 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (oci-eng) config: oci-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 59.4 - type: f1 value: 54.90521367521367 - type: precision value: 53.432840025471606 - type: recall value: 59.4 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (pol-eng) config: pol-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 97.39999999999999 - type: f1 value: 96.6 - type: precision value: 96.2 - type: recall value: 97.39999999999999 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (war-eng) config: war-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 67.2 - type: f1 value: 62.25926129426129 - type: precision value: 60.408376623376626 - type: recall value: 67.2 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (aze-eng) config: aze-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 90.2 - type: f1 value: 87.60666666666667 - type: precision value: 86.45277777777778 - type: recall value: 90.2 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (vie-eng) config: vie-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 97.7 - type: f1 value: 97 - type: precision value: 96.65 - type: recall value: 97.7 - task: type: 
BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (nno-eng) config: nno-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 93.2 - type: f1 value: 91.39746031746031 - type: precision value: 90.6125 - type: recall value: 93.2 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (cha-eng) config: cha-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 32.11678832116788 - type: f1 value: 27.210415386260234 - type: precision value: 26.20408990846947 - type: recall value: 32.11678832116788 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (mhr-eng) config: mhr-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 8.5 - type: f1 value: 6.787319277832475 - type: precision value: 6.3452094433344435 - type: recall value: 8.5 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (dan-eng) config: dan-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 96.1 - type: f1 value: 95.08 - type: precision value: 94.61666666666667 - type: recall value: 96.1 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (ell-eng) config: ell-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 95.3 - type: f1 value: 93.88333333333333 - type: precision value: 93.18333333333332 - type: recall value: 95.3 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (amh-eng) config: amh-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 85.11904761904762 - type: f1 value: 80.69444444444444 - type: precision value: 78.72023809523809 - type: recall value: 85.11904761904762 - task: type: BitextMining dataset: type: 
mteb/tatoeba-bitext-mining name: MTEB Tatoeba (pam-eng) config: pam-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 11.1 - type: f1 value: 9.276381801735853 - type: precision value: 8.798174603174601 - type: recall value: 11.1 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (hsb-eng) config: hsb-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 63.56107660455487 - type: f1 value: 58.70433569191332 - type: precision value: 56.896926581464015 - type: recall value: 63.56107660455487 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (srp-eng) config: srp-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 94.69999999999999 - type: f1 value: 93.10000000000001 - type: precision value: 92.35 - type: recall value: 94.69999999999999 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (epo-eng) config: epo-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 96.8 - type: f1 value: 96.01222222222222 - type: precision value: 95.67083333333332 - type: recall value: 96.8 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (kzj-eng) config: kzj-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 9.2 - type: f1 value: 7.911555250305249 - type: precision value: 7.631246556216846 - type: recall value: 9.2 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (awa-eng) config: awa-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 77.48917748917748 - type: f1 value: 72.27375798804371 - type: precision value: 70.14430014430013 - type: recall value: 77.48917748917748 - task: type: BitextMining dataset: type: 
mteb/tatoeba-bitext-mining name: MTEB Tatoeba (fao-eng) config: fao-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 77.09923664122137 - type: f1 value: 72.61541257724463 - type: precision value: 70.8998380754106 - type: recall value: 77.09923664122137 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (mal-eng) config: mal-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 98.2532751091703 - type: f1 value: 97.69529354682193 - type: precision value: 97.42843279961184 - type: recall value: 98.2532751091703 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (ile-eng) config: ile-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 82.8 - type: f1 value: 79.14672619047619 - type: precision value: 77.59489247311828 - type: recall value: 82.8 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (bos-eng) config: bos-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 94.35028248587571 - type: f1 value: 92.86252354048965 - type: precision value: 92.2080979284369 - type: recall value: 94.35028248587571 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (cor-eng) config: cor-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 8.5 - type: f1 value: 6.282429263935621 - type: precision value: 5.783274240739785 - type: recall value: 8.5 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (cat-eng) config: cat-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 92.7 - type: f1 value: 91.025 - type: precision value: 90.30428571428571 - type: recall value: 92.7 - task: type: BitextMining dataset: type: 
mteb/tatoeba-bitext-mining name: MTEB Tatoeba (eus-eng) config: eus-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 81 - type: f1 value: 77.8232380952381 - type: precision value: 76.60194444444444 - type: recall value: 81 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (yue-eng) config: yue-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 91 - type: f1 value: 88.70857142857142 - type: precision value: 87.7 - type: recall value: 91 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (swe-eng) config: swe-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 96.39999999999999 - type: f1 value: 95.3 - type: precision value: 94.76666666666667 - type: recall value: 96.39999999999999 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (dtp-eng) config: dtp-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 8.1 - type: f1 value: 7.001008218834307 - type: precision value: 6.708329562594269 - type: recall value: 8.1 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (kat-eng) config: kat-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 87.1313672922252 - type: f1 value: 84.09070598748882 - type: precision value: 82.79171454104429 - type: recall value: 87.1313672922252 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (jpn-eng) config: jpn-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 96.39999999999999 - type: f1 value: 95.28333333333333 - type: precision value: 94.73333333333332 - type: recall value: 96.39999999999999 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: 
MTEB Tatoeba (csb-eng) config: csb-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 42.29249011857708 - type: f1 value: 36.981018542283365 - type: precision value: 35.415877813576024 - type: recall value: 42.29249011857708 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (xho-eng) config: xho-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 83.80281690140845 - type: f1 value: 80.86854460093896 - type: precision value: 79.60093896713614 - type: recall value: 83.80281690140845 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (orv-eng) config: orv-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 45.26946107784431 - type: f1 value: 39.80235464678088 - type: precision value: 38.14342660001342 - type: recall value: 45.26946107784431 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (ind-eng) config: ind-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 94.3 - type: f1 value: 92.9 - type: precision value: 92.26666666666668 - type: recall value: 94.3 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (tuk-eng) config: tuk-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 37.93103448275862 - type: f1 value: 33.15192743764172 - type: precision value: 31.57456528146183 - type: recall value: 37.93103448275862 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (max-eng) config: max-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 69.01408450704226 - type: f1 value: 63.41549295774648 - type: precision value: 61.342778895595806 - type: recall value: 69.01408450704226 - task: type: BitextMining 
dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (swh-eng) config: swh-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 76.66666666666667 - type: f1 value: 71.60705960705961 - type: precision value: 69.60683760683762 - type: recall value: 76.66666666666667 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (hin-eng) config: hin-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 95.8 - type: f1 value: 94.48333333333333 - type: precision value: 93.83333333333333 - type: recall value: 95.8 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (dsb-eng) config: dsb-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 52.81837160751566 - type: f1 value: 48.435977731384824 - type: precision value: 47.11291973845539 - type: recall value: 52.81837160751566 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (ber-eng) config: ber-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 44.9 - type: f1 value: 38.88962621607783 - type: precision value: 36.95936507936508 - type: recall value: 44.9 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (tam-eng) config: tam-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 90.55374592833876 - type: f1 value: 88.22553125484721 - type: precision value: 87.26927252985884 - type: recall value: 90.55374592833876 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (slk-eng) config: slk-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 94.6 - type: f1 value: 93.13333333333333 - type: precision value: 92.45333333333333 - type: recall value: 94.6 - task: type: 
BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (tgl-eng) config: tgl-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 93.7 - type: f1 value: 91.99666666666667 - type: precision value: 91.26666666666668 - type: recall value: 93.7 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (ast-eng) config: ast-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 85.03937007874016 - type: f1 value: 81.75853018372703 - type: precision value: 80.34120734908137 - type: recall value: 85.03937007874016 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (mkd-eng) config: mkd-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 88.3 - type: f1 value: 85.5 - type: precision value: 84.25833333333334 - type: recall value: 88.3 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (khm-eng) config: khm-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 65.51246537396122 - type: f1 value: 60.02297410192148 - type: precision value: 58.133467727289236 - type: recall value: 65.51246537396122 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (ces-eng) config: ces-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 96 - type: f1 value: 94.89 - type: precision value: 94.39166666666667 - type: recall value: 96 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (tzl-eng) config: tzl-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 57.692307692307686 - type: f1 value: 53.162393162393165 - type: precision value: 51.70673076923077 - type: recall value: 57.692307692307686 - task: type: BitextMining 
dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (urd-eng) config: urd-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 91.60000000000001 - type: f1 value: 89.21190476190475 - type: precision value: 88.08666666666667 - type: recall value: 91.60000000000001 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (ara-eng) config: ara-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 88 - type: f1 value: 85.47 - type: precision value: 84.43266233766234 - type: recall value: 88 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (kor-eng) config: kor-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 92.7 - type: f1 value: 90.64999999999999 - type: precision value: 89.68333333333332 - type: recall value: 92.7 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (yid-eng) config: yid-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 80.30660377358491 - type: f1 value: 76.33044137466307 - type: precision value: 74.78970125786164 - type: recall value: 80.30660377358491 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (fin-eng) config: fin-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 96.39999999999999 - type: f1 value: 95.44 - type: precision value: 94.99166666666666 - type: recall value: 96.39999999999999 - task: type: BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (tha-eng) config: tha-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 96.53284671532847 - type: f1 value: 95.37712895377129 - type: precision value: 94.7992700729927 - type: recall value: 96.53284671532847 - task: type: 
BitextMining dataset: type: mteb/tatoeba-bitext-mining name: MTEB Tatoeba (wuu-eng) config: wuu-eng split: test revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553 metrics: - type: accuracy value: 89 - type: f1 value: 86.23190476190476 - type: precision value: 85.035 - type: recall value: 89 - task: type: Retrieval dataset: type: webis-touche2020 name: MTEB Touche2020 config: default split: test revision: None metrics: - type: map_at_1 value: 2.585 - type: map_at_10 value: 9.012 - type: map_at_100 value: 14.027000000000001 - type: map_at_1000 value: 15.565000000000001 - type: map_at_3 value: 5.032 - type: map_at_5 value: 6.657 - type: mrr_at_1 value: 28.571 - type: mrr_at_10 value: 45.377 - type: mrr_at_100 value: 46.119 - type: mrr_at_1000 value: 46.127 - type: mrr_at_3 value: 41.156 - type: mrr_at_5 value: 42.585 - type: ndcg_at_1 value: 27.551 - type: ndcg_at_10 value: 23.395 - type: ndcg_at_100 value: 33.342 - type: ndcg_at_1000 value: 45.523 - type: ndcg_at_3 value: 25.158 - type: ndcg_at_5 value: 23.427 - type: precision_at_1 value: 28.571 - type: precision_at_10 value: 21.429000000000002 - type: precision_at_100 value: 6.714 - type: precision_at_1000 value: 1.473 - type: precision_at_3 value: 27.211000000000002 - type: precision_at_5 value: 24.490000000000002 - type: recall_at_1 value: 2.585 - type: recall_at_10 value: 15.418999999999999 - type: recall_at_100 value: 42.485 - type: recall_at_1000 value: 79.536 - type: recall_at_3 value: 6.239999999999999 - type: recall_at_5 value: 8.996 - task: type: Classification dataset: type: mteb/toxic_conversations_50k name: MTEB ToxicConversationsClassification config: default split: test revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c metrics: - type: accuracy value: 71.3234 - type: ap value: 14.361688653847423 - type: f1 value: 54.819068624319044 - task: type: Classification dataset: type: mteb/tweet_sentiment_extraction name: MTEB TweetSentimentExtractionClassification config: default split: test revision: 
d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 61.97792869269949 - type: f1 value: 62.28965628513728 - task: type: Clustering dataset: type: mteb/twentynewsgroups-clustering name: MTEB TwentyNewsgroupsClustering config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: v_measure value: 38.90540145385218 - task: type: PairClassification dataset: type: mteb/twittersemeval2015-pairclassification name: MTEB TwitterSemEval2015 config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 86.53513739047506 - type: cos_sim_ap value: 75.27741586677557 - type: cos_sim_f1 value: 69.18792902473774 - type: cos_sim_precision value: 67.94708725515136 - type: cos_sim_recall value: 70.47493403693932 - type: dot_accuracy value: 84.7052512368123 - type: dot_ap value: 69.36075482849378 - type: dot_f1 value: 64.44688376631296 - type: dot_precision value: 59.92288500793831 - type: dot_recall value: 69.70976253298153 - type: euclidean_accuracy value: 86.60666388508076 - type: euclidean_ap value: 75.47512772621097 - type: euclidean_f1 value: 69.413872536473 - type: euclidean_precision value: 67.39562624254472 - type: euclidean_recall value: 71.55672823218997 - type: manhattan_accuracy value: 86.52917684925792 - type: manhattan_ap value: 75.34000110496703 - type: manhattan_f1 value: 69.28489190226429 - type: manhattan_precision value: 67.24608889992551 - type: manhattan_recall value: 71.45118733509234 - type: max_accuracy value: 86.60666388508076 - type: max_ap value: 75.47512772621097 - type: max_f1 value: 69.413872536473 - task: type: PairClassification dataset: type: mteb/twitterurlcorpus-pairclassification name: MTEB TwitterURLCorpus config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 89.01695967710637 - type: cos_sim_ap value: 85.8298270742901 - type: cos_sim_f1 value: 78.46988128389272 - type: 
cos_sim_precision value: 74.86017897091722 - type: cos_sim_recall value: 82.44533415460425 - type: dot_accuracy value: 88.19420188613343 - type: dot_ap value: 83.82679165901324 - type: dot_f1 value: 76.55833777304208 - type: dot_precision value: 75.6884875846501 - type: dot_recall value: 77.44841392054204 - type: euclidean_accuracy value: 89.03054294252338 - type: euclidean_ap value: 85.89089555185325 - type: euclidean_f1 value: 78.62997658079624 - type: euclidean_precision value: 74.92329149232914 - type: euclidean_recall value: 82.72251308900523 - type: manhattan_accuracy value: 89.0266620095471 - type: manhattan_ap value: 85.86458997929147 - type: manhattan_f1 value: 78.50685331000291 - type: manhattan_precision value: 74.5499861534201 - type: manhattan_recall value: 82.90729904527257 - type: max_accuracy value: 89.03054294252338 - type: max_ap value: 85.89089555185325 - type: max_f1 value: 78.62997658079624 language: - multilingual - af - am - ar - as - az - be - bg - bn - br - bs - ca - cs - cy - da - de - el - en - eo - es - et - eu - fa - fi - fr - fy - ga - gd - gl - gu - ha - he - hi - hr - hu - hy - id - is - it - ja - jv - ka - kk - km - kn - ko - ku - ky - la - lo - lt - lv - mg - mk - ml - mn - mr - ms - my - ne - nl - 'no' - om - or - pa - pl - ps - pt - ro - ru - sa - sd - si - sk - sl - so - sq - sr - su - sv - sw - ta - te - th - tl - tr - ug - uk - ur - uz - vi - xh - yi - zh license: mit --- ## Multilingual-E5-large [Text Embeddings by Weakly-Supervised Contrastive Pre-training](https://arxiv.org/pdf/2212.03533.pdf). Liang Wang, Nan Yang, Xiaolong Huang, Binxing Jiao, Linjun Yang, Daxin Jiang, Rangan Majumder, Furu Wei, arXiv 2022 This model has 24 layers and the embedding size is 1024. ## Usage Below is an example to encode queries and passages from the MS-MARCO passage ranking dataset. 
```python import torch.nn.functional as F from torch import Tensor from transformers import AutoTokenizer, AutoModel def average_pool(last_hidden_states: Tensor, attention_mask: Tensor) -> Tensor: last_hidden = last_hidden_states.masked_fill(~attention_mask[..., None].bool(), 0.0) return last_hidden.sum(dim=1) / attention_mask.sum(dim=1)[..., None] # Each input text should start with "query: " or "passage: ", even for non-English texts. # For tasks other than retrieval, you can simply use the "query: " prefix. input_texts = ['query: how much protein should a female eat', 'query: 南瓜的家常做法', "passage: As a general guideline, the CDC's average requirement of protein for women ages 19 to 70 is 46 grams per day. But, as you can see from this chart, you'll need to increase that if you're expecting or training for a marathon. Check out the chart below to see how much protein you should be eating each day.", "passage: 1.清炒南瓜丝 原料:嫩南瓜半个 调料:葱、盐、白糖、鸡精 做法: 1、南瓜用刀薄薄的削去表面一层皮,用勺子刮去瓤 2、擦成细丝(没有擦菜板就用刀慢慢切成细丝) 3、锅烧热放油,入葱花煸出香味 4、入南瓜丝快速翻炒一分钟左右,放盐、一点白糖和鸡精调味出锅 2.香葱炒南瓜 原料:南瓜1只 调料:香葱、蒜末、橄榄油、盐 做法: 1、将南瓜去皮,切成片 2、油锅8成热后,将蒜末放入爆香 3、爆香后,将南瓜片放入,翻炒 4、在翻炒的同时,可以不时地往锅里加水,但不要太多 5、放入盐,炒匀 6、南瓜差不多软和绵了之后,就可以关火 7、撒入香葱,即可出锅"] tokenizer = AutoTokenizer.from_pretrained('intfloat/multilingual-e5-large') model = AutoModel.from_pretrained('intfloat/multilingual-e5-large') # Tokenize the input texts batch_dict = tokenizer(input_texts, max_length=512, padding=True, truncation=True, return_tensors='pt') outputs = model(**batch_dict) embeddings = average_pool(outputs.last_hidden_state, batch_dict['attention_mask']) # normalize embeddings embeddings = F.normalize(embeddings, p=2, dim=1) scores = (embeddings[:2] @ embeddings[2:].T) * 100 print(scores.tolist()) ``` ## Supported Languages This model is initialized from [xlm-roberta-large](https://huggingface.co/xlm-roberta-large) and continually trained on a mixture of multilingual datasets. 
It supports 100 languages from xlm-roberta, but low-resource languages may see performance degradation. ## Training Details **Initialization**: [xlm-roberta-large](https://huggingface.co/xlm-roberta-large) **First stage**: contrastive pre-training with weak supervision | Dataset | Weak supervision | # of text pairs | |--------------------------------------------------------------------------------------------------------|---------------------------------------|-----------------| | Filtered [mC4](https://huggingface.co/datasets/mc4) | (title, page content) | 1B | | [CC News](https://huggingface.co/datasets/intfloat/multilingual_cc_news) | (title, news content) | 400M | | [NLLB](https://huggingface.co/datasets/allenai/nllb) | translation pairs | 2.4B | | [Wikipedia](https://huggingface.co/datasets/intfloat/wikipedia) | (hierarchical section title, passage) | 150M | | Filtered [Reddit](https://www.reddit.com/) | (comment, response) | 800M | | [S2ORC](https://github.com/allenai/s2orc) | (title, abstract) and citation pairs | 100M | | [Stackexchange](https://stackexchange.com/) | (question, answer) | 50M | | [xP3](https://huggingface.co/datasets/bigscience/xP3) | (input prompt, response) | 80M | | [Miscellaneous unsupervised SBERT data](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2) | - | 10M | **Second stage**: supervised fine-tuning | Dataset | Language | # of text pairs | |----------------------------------------------------------------------------------------|--------------|-----------------| | [MS MARCO](https://microsoft.github.io/msmarco/) | English | 500k | | [NQ](https://github.com/facebookresearch/DPR) | English | 70k | | [Trivia QA](https://github.com/facebookresearch/DPR) | English | 60k | | [NLI from SimCSE](https://github.com/princeton-nlp/SimCSE) | English | <300k | | [ELI5](https://huggingface.co/datasets/eli5) | English | 500k | | [DuReader Retrieval](https://github.com/baidu/DuReader/tree/master/DuReader-Retrieval) | Chinese | 86k | | 
[KILT Fever](https://huggingface.co/datasets/kilt_tasks) | English | 70k | | [KILT HotpotQA](https://huggingface.co/datasets/kilt_tasks) | English | 70k | | [SQuAD](https://huggingface.co/datasets/squad) | English | 87k | | [Quora](https://huggingface.co/datasets/quora) | English | 150k | | [Mr. TyDi](https://huggingface.co/datasets/castorini/mr-tydi) | 11 languages | 50k | | [MIRACL](https://huggingface.co/datasets/miracl/miracl) | 16 languages | 40k | For all labeled datasets, we only use their training sets for fine-tuning. For other training details, please refer to our paper at [https://arxiv.org/pdf/2212.03533.pdf](https://arxiv.org/pdf/2212.03533.pdf). ## Benchmark Results on [Mr. TyDi](https://arxiv.org/abs/2108.08787) | Model | Avg MRR@10 | | ar | bn | en | fi | id | ja | ko | ru | sw | te | th | |-----------------------|------------|-------|------| --- | --- | --- | --- | --- | --- | --- |------| --- | --- | | BM25 | 33.3 | | 36.7 | 41.3 | 15.1 | 28.8 | 38.2 | 21.7 | 28.1 | 32.9 | 39.6 | 42.4 | 41.7 | | mDPR | 16.7 | | 26.0 | 25.8 | 16.2 | 11.3 | 14.6 | 18.1 | 21.9 | 18.5 | 7.3 | 10.6 | 13.5 | | BM25 + mDPR | 41.7 | | 49.1 | 53.5 | 28.4 | 36.5 | 45.5 | 35.5 | 36.2 | 42.7 | 40.5 | 42.0 | 49.2 | | | | | multilingual-e5-small | 64.4 | | 71.5 | 66.3 | 54.5 | 57.7 | 63.2 | 55.4 | 54.3 | 60.8 | 65.4 | 89.1 | 70.1 | | multilingual-e5-base | 65.9 | | 72.3 | 65.0 | 58.5 | 60.8 | 64.9 | 56.6 | 55.8 | 62.7 | 69.0 | 86.6 | 72.7 | | multilingual-e5-large | **70.5** | | 77.5 | 73.2 | 60.8 | 66.8 | 68.5 | 62.5 | 61.6 | 65.8 | 72.7 | 90.2 | 76.2 | ## MTEB Benchmark Evaluation Check out [unilm/e5](https://github.com/microsoft/unilm/tree/master/e5) to reproduce evaluation results on the [BEIR](https://arxiv.org/abs/2104.08663) and [MTEB](https://arxiv.org/abs/2210.07316) benchmarks. ## Support for Sentence Transformers Below is an example for usage with sentence_transformers. 
```python from sentence_transformers import SentenceTransformer model = SentenceTransformer('intfloat/multilingual-e5-large') input_texts = [ 'query: how much protein should a female eat', 'query: 南瓜的家常做法', "passage: As a general guideline, the CDC's average requirement of protein for women ages 19 to 70 is 46 grams per day. But, as you can see from this chart, you'll need to increase that if you're expecting or training for a marathon. Check out the chart below to see how much protein you should be eating each day.", "passage: 1.清炒南瓜丝 原料:嫩南瓜半个 调料:葱、盐、白糖、鸡精 做法: 1、南瓜用刀薄薄的削去表面一层皮,用勺子刮去瓤 2、擦成细丝(没有擦菜板就用刀慢慢切成细丝) 3、锅烧热放油,入葱花煸出香味 4、入南瓜丝快速翻炒一分钟左右,放盐、一点白糖和鸡精调味出锅 2.香葱炒南瓜 原料:南瓜1只 调料:香葱、蒜末、橄榄油、盐 做法: 1、将南瓜去皮,切成片 2、油锅8成热后,将蒜末放入爆香 3、爆香后,将南瓜片放入,翻炒 4、在翻炒的同时,可以不时地往锅里加水,但不要太多 5、放入盐,炒匀 6、南瓜差不多软和绵了之后,就可以关火 7、撒入香葱,即可出锅" ] embeddings = model.encode(input_texts, normalize_embeddings=True) ``` Package requirements: `pip install sentence_transformers~=2.2.2` Contributors: [michaelfeil](https://huggingface.co/michaelfeil) ## FAQ **1. Do I need to add the prefix "query: " and "passage: " to input texts?** Yes, this is how the model is trained; otherwise you will see a performance degradation. Here are some rules of thumb: - Use "query: " and "passage: " correspondingly for asymmetric tasks such as passage retrieval in open QA and ad-hoc information retrieval. - Use the "query: " prefix for symmetric tasks such as semantic similarity, bitext mining, and paraphrase retrieval. - Use the "query: " prefix if you want to use embeddings as features, such as for linear probing classification or clustering. **2. Why are my reproduced results slightly different from those reported in the model card?** Different versions of `transformers` and `pytorch` can cause negligible but non-zero performance differences. **3. Why do the cosine similarity scores distribute around 0.7 to 1.0?** This is known and expected behavior, as we use a low temperature of 0.01 for the InfoNCE contrastive loss. 
For text embedding tasks like text retrieval or semantic similarity, what matters is the relative order of the scores rather than their absolute values, so this should not be an issue. ## Citation If you find our paper or models helpful, please consider citing us as follows: ``` @article{wang2022text, title={Text Embeddings by Weakly-Supervised Contrastive Pre-training}, author={Wang, Liang and Yang, Nan and Huang, Xiaolong and Jiao, Binxing and Yang, Linjun and Jiang, Daxin and Majumder, Rangan and Wei, Furu}, journal={arXiv preprint arXiv:2212.03533}, year={2022} } ``` ## Limitations Long texts will be truncated to at most 512 tokens.
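To make the FAQ point concrete, here is a small self-contained sketch — with toy 3-dimensional vectors standing in for real 1024-dimensional model outputs, so the numbers are illustrative only — showing that for L2-normalized embeddings the dot product equals cosine similarity, and that retrieval depends only on the relative order of the scores:

```python
import math

def l2_normalize(v):
    # Scale a vector to unit length, mirroring F.normalize(..., p=2) above.
    norm = math.sqrt(sum(x * x for x in v))
    return [x / norm for x in v]

def cosine(a, b):
    # For unit vectors, the dot product *is* the cosine similarity.
    return sum(x * y for x, y in zip(l2_normalize(a), l2_normalize(b)))

# Toy "embeddings"; real ones come from the model.
query = [0.2, 0.9, 0.1]
passages = {
    "relevant": [0.25, 0.85, 0.05],
    "unrelated": [0.9, 0.1, 0.4],
}
scores = {name: cosine(query, p) for name, p in passages.items()}

# Even when all scores sit in a narrow high band (as the FAQ notes for this
# model), ranking passages by score is what retrieval relies on, and that
# ranking is unaffected by any monotonic rescaling of the scores.
ranking = sorted(scores, key=scores.get, reverse=True)
print(ranking)
```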
159,206
[ [ -0.036163330078125, -0.037994384765625, 0.0091705322265625, 0.0243377685546875, -0.0165252685546875, -0.007457733154296875, -0.024322509765625, -0.032958984375, 0.02783203125, 0.01702880859375, -0.036651611328125, -0.055206298828125, -0.054046630859375, 0.0241241455078125, 0.005096435546875, 0.0777587890625, -0.01104736328125, 0.002288818359375, -0.0011091232299804688, -0.0265960693359375, -0.0185546875, -0.033203125, -0.04852294921875, -0.015838623046875, 0.027191162109375, 0.0282745361328125, 0.03485107421875, 0.04632568359375, 0.032623291015625, 0.0254974365234375, -0.012115478515625, 0.0239105224609375, -0.0298614501953125, -0.0146026611328125, 0.0101470947265625, -0.031463623046875, -0.033447265625, -0.00150299072265625, 0.045928955078125, 0.0428466796875, 0.0008935928344726562, 0.0181121826171875, 0.015899658203125, 0.0638427734375, -0.027435302734375, 0.020660400390625, -0.016021728515625, 0.00916290283203125, -0.0192718505859375, 0.0006394386291503906, -0.01142120361328125, -0.007305145263671875, 0.0012445449829101562, -0.040130615234375, -0.003265380859375, 0.003841400146484375, 0.0924072265625, 0.0127105712890625, -0.0311126708984375, -0.0157318115234375, -0.00933074951171875, 0.06591796875, -0.054901123046875, 0.037841796875, 0.048675537109375, -0.0019102096557617188, -0.0008945465087890625, -0.04962158203125, -0.038909912109375, 0.0004012584686279297, -0.031890869140625, 0.0272369384765625, -0.01026153564453125, -0.01122283935546875, 0.0238037109375, 0.03289794921875, -0.06109619140625, 0.00862884521484375, -0.024566650390625, -0.01171112060546875, 0.0572509765625, 0.00621795654296875, 0.0262603759765625, -0.035675048828125, -0.0245208740234375, -0.01325225830078125, -0.031280517578125, 0.02166748046875, 0.026611328125, 0.018218994140625, -0.04388427734375, 0.035552978515625, -0.0213623046875, 0.044586181640625, 0.00922393798828125, -0.013916015625, 0.053009033203125, -0.051788330078125, -0.01605224609375, -0.01291656494140625, 0.08648681640625, 
0.027801513671875, 0.003398895263671875, 0.00750732421875, -0.00978851318359375, -0.005733489990234375, -0.01806640625, -0.06451416015625, -0.02130126953125, 0.034332275390625, -0.038787841796875, -0.0169525146484375, 0.0012969970703125, -0.05633544921875, 0.0033111572265625, -0.01611328125, 0.0212860107421875, -0.046783447265625, -0.022979736328125, 0.00679779052734375, 0.0034046173095703125, 0.016357421875, 0.010528564453125, -0.056304931640625, 0.0151519775390625, 0.01436614990234375, 0.069091796875, -0.01031494140625, -0.023834228515625, -0.020843505859375, 0.0018548965454101562, -0.01125335693359375, 0.043701171875, -0.0174560546875, -0.00945281982421875, -0.005130767822265625, 0.029571533203125, -0.0291595458984375, -0.0226593017578125, 0.0465087890625, -0.015838623046875, 0.0369873046875, -0.022705078125, -0.030670166015625, -0.0063629150390625, 0.028839111328125, -0.04791259765625, 0.0997314453125, 0.0164642333984375, -0.07763671875, 0.0178070068359375, -0.044189453125, -0.028045654296875, -0.0125579833984375, -0.0157012939453125, -0.0467529296875, -0.0262908935546875, 0.03668212890625, 0.041290283203125, -0.01342010498046875, 0.00984954833984375, -0.004413604736328125, -0.0173187255859375, 0.0003364086151123047, -0.011474609375, 0.08428955078125, 0.0224151611328125, -0.045806884765625, -0.001705169677734375, -0.0654296875, 0.0023555755615234375, 0.0213470458984375, -0.03204345703125, -0.00940704345703125, -0.0159759521484375, 0.00380706787109375, 0.0380859375, 0.03045654296875, -0.037841796875, 0.0162200927734375, -0.045013427734375, 0.039276123046875, 0.04046630859375, -0.00876617431640625, 0.0256805419921875, -0.03009033203125, 0.0299224853515625, 0.029510498046875, 0.0008130073547363281, -0.0185394287109375, -0.04107666015625, -0.06597900390625, -0.02935791015625, 0.0295562744140625, 0.04803466796875, -0.059417724609375, 0.04534912109375, -0.0243682861328125, -0.031585693359375, -0.055816650390625, 0.01953125, 0.038787841796875, 0.02972412109375, 
0.045684814453125, -0.0002751350402832031, -0.0460205078125, -0.071533203125, -0.00809478759765625, 0.004405975341796875, 0.004337310791015625, 0.023956298828125, 0.057891845703125, -0.0232696533203125, 0.042724609375, -0.038909912109375, -0.040863037109375, -0.026763916015625, -0.00571441650390625, 0.0369873046875, 0.045166015625, 0.048065185546875, -0.06280517578125, -0.059234619140625, 0.00826263427734375, -0.0655517578125, 0.00775146484375, 0.00286865234375, -0.01279449462890625, 0.04278564453125, 0.040191650390625, -0.04388427734375, 0.020477294921875, 0.05303955078125, -0.0294647216796875, 0.03680419921875, -0.0234375, 0.00954437255859375, -0.1053466796875, 0.0173797607421875, 0.0077056884765625, -0.007781982421875, -0.030731201171875, 0.0010423660278320312, 0.006999969482421875, 0.002567291259765625, -0.0288543701171875, 0.047607421875, -0.05560302734375, 0.01538848876953125, 0.01012420654296875, 0.0233154296875, 0.0004239082336425781, 0.0484619140625, 0.005725860595703125, 0.053619384765625, 0.039031982421875, -0.0439453125, 0.00848388671875, 0.0303497314453125, -0.03289794921875, 0.0296173095703125, -0.046783447265625, -0.00843048095703125, -0.001007080078125, 0.022125244140625, -0.078369140625, -0.02093505859375, 0.0167694091796875, -0.0421142578125, 0.03240966796875, -0.007389068603515625, -0.041107177734375, -0.04791259765625, -0.046173095703125, 0.01027679443359375, 0.017974853515625, -0.031463623046875, 0.03314208984375, 0.019134521484375, -0.006130218505859375, -0.06201171875, -0.06976318359375, -0.004863739013671875, -0.00730133056640625, -0.06134033203125, 0.033538818359375, -0.01145172119140625, 0.0101776123046875, 0.0015048980712890625, 0.001621246337890625, 0.00934600830078125, -0.00554656982421875, 0.004344940185546875, 0.013427734375, -0.01139068603515625, -0.005641937255859375, -0.0011615753173828125, -0.00374603271484375, -0.016265869140625, -0.00860595703125, 0.047027587890625, -0.0274810791015625, 0.0036067962646484375, -0.0413818359375, 
0.021148681640625, 0.03326416015625, -0.0214996337890625, 0.078369140625, 0.072265625, -0.0196075439453125, 0.007221221923828125, -0.038330078125, -0.0027923583984375, -0.034393310546875, 0.040313720703125, -0.04364013671875, -0.06298828125, 0.05255126953125, 0.009796142578125, 0.00919342041015625, 0.05682373046875, 0.03607177734375, -0.01080322265625, 0.0819091796875, 0.0296783447265625, -0.016143798828125, 0.031219482421875, -0.052459716796875, -0.0006928443908691406, -0.065185546875, -0.0272216796875, -0.0367431640625, -0.0213623046875, -0.07098388671875, -0.031463623046875, 0.01629638671875, 0.0025634765625, -0.028564453125, 0.0300445556640625, -0.04180908203125, 0.01181793212890625, 0.038116455078125, 0.0194549560546875, 0.00469970703125, 0.006725311279296875, -0.0263824462890625, -0.0142974853515625, -0.05987548828125, -0.0286712646484375, 0.08062744140625, 0.0313720703125, 0.0352783203125, 0.010467529296875, 0.051239013671875, -0.0014009475708007812, 0.0011043548583984375, -0.037811279296875, 0.029205322265625, -0.021759033203125, -0.039276123046875, -0.0164794921875, -0.05126953125, -0.0771484375, 0.0247650146484375, -0.022705078125, -0.0556640625, 0.010650634765625, 0.0006465911865234375, -0.019287109375, 0.0224456787109375, -0.065673828125, 0.07257080078125, -0.02313232421875, -0.037841796875, 0.0113983154296875, -0.060791015625, 0.0206756591796875, 0.0056915283203125, 0.02685546875, 0.00021708011627197266, -0.009033203125, 0.06793212890625, -0.0302581787109375, 0.052764892578125, -0.01032257080078125, 0.0020885467529296875, 0.01168060302734375, -0.014190673828125, 0.051055908203125, -0.0021533966064453125, -0.0093994140625, 0.0269927978515625, 0.01346588134765625, -0.043304443359375, -0.0236358642578125, 0.0643310546875, -0.078369140625, -0.0340576171875, -0.048248291015625, -0.0299835205078125, -0.004863739013671875, 0.0276031494140625, 0.038177490234375, 0.0137481689453125, 0.00818634033203125, 0.02978515625, 0.047698974609375, -0.03619384765625, 
0.039459228515625, 0.03082275390625, -0.0031681060791015625, -0.053070068359375, 0.06640625, 0.022308349609375, 0.01302337646484375, 0.03399658203125, 0.01221466064453125, -0.028900146484375, -0.032989501953125, -0.041259765625, 0.03594970703125, -0.03082275390625, -0.0231475830078125, -0.07086181640625, -0.0190582275390625, -0.05084228515625, -0.001922607421875, -0.024688720703125, -0.0277862548828125, -0.02447509765625, -0.00875091552734375, 0.0225067138671875, 0.03497314453125, -0.000995635986328125, 0.012603759765625, -0.046905517578125, 0.01715087890625, -0.006626129150390625, 0.0217132568359375, -0.010589599609375, -0.052520751953125, -0.04107666015625, 0.0132293701171875, -0.0265350341796875, -0.051055908203125, 0.0494384765625, 0.0211639404296875, 0.0418701171875, 0.028839111328125, -0.0018415451049804688, 0.05029296875, -0.025787353515625, 0.07025146484375, 0.0212554931640625, -0.054901123046875, 0.046844482421875, -0.026123046875, 0.035125732421875, 0.04620361328125, 0.055450439453125, -0.03759765625, -0.02301025390625, -0.048675537109375, -0.0743408203125, 0.06768798828125, 0.0269012451171875, -0.0031585693359375, 0.0033435821533203125, 0.015838623046875, -0.0128173828125, 0.0126800537109375, -0.07257080078125, -0.051361083984375, -0.0183868408203125, -0.034637451171875, -0.01308441162109375, -0.016265869140625, -0.00341796875, -0.03875732421875, 0.05889892578125, -0.0078277587890625, 0.033416748046875, 0.0254974365234375, -0.0141143798828125, 0.004909515380859375, 0.00516510009765625, 0.052490234375, 0.03643798828125, -0.0212860107421875, 0.01012420654296875, 0.02325439453125, -0.04913330078125, -0.007778167724609375, 0.0166168212890625, -0.01418304443359375, 0.0005035400390625, 0.0307769775390625, 0.056549072265625, 0.0178985595703125, -0.03460693359375, 0.03948974609375, -0.00490570068359375, -0.028472900390625, -0.0237579345703125, -0.01074981689453125, 0.0168914794921875, 0.01457977294921875, 0.0255279541015625, -0.016815185546875, 
-0.0023365020751953125, -0.04229736328125, 0.01361083984375, 0.015716552734375, -0.0272369384765625, -0.0211944580078125, 0.04522705078125, 0.0082244873046875, 0.008270263671875, 0.042816162109375, -0.013641357421875, -0.048858642578125, 0.04541015625, 0.03753662109375, 0.040924072265625, -0.02557373046875, 0.0124664306640625, 0.06982421875, 0.037750244140625, 0.0004146099090576172, 0.0242462158203125, 0.0126800537109375, -0.04736328125, -0.02117919921875, -0.06134033203125, 0.00353240966796875, 0.0206298828125, -0.0345458984375, 0.0234832763671875, -0.0160369873046875, -0.0103759765625, 0.00013887882232666016, 0.044281005859375, -0.055206298828125, 0.018280029296875, 0.00022614002227783203, 0.07354736328125, -0.061248779296875, 0.06463623046875, 0.060516357421875, -0.061614990234375, -0.0655517578125, -0.0164794921875, -0.007312774658203125, -0.05218505859375, 0.055694580078125, 0.024505615234375, 0.01425933837890625, -0.0027751922607421875, -0.0300445556640625, -0.07086181640625, 0.0941162109375, 0.0164947509765625, -0.038909912109375, -0.001163482666015625, 0.0188751220703125, 0.043670654296875, -0.0220947265625, 0.034393310546875, 0.043792724609375, 0.044036865234375, -0.0100250244140625, -0.06927490234375, 0.01439666748046875, -0.04449462890625, -0.003448486328125, 0.004169464111328125, -0.079345703125, 0.0745849609375, -0.0118255615234375, -0.00464630126953125, -0.0089263916015625, 0.051727294921875, 0.02520751953125, 0.022186279296875, 0.023956298828125, 0.061859130859375, 0.054229736328125, -0.0130767822265625, 0.086181640625, -0.0364990234375, 0.036773681640625, 0.0693359375, 0.015899658203125, 0.05987548828125, 0.0313720703125, -0.0270843505859375, 0.047149658203125, 0.06622314453125, -0.0020008087158203125, 0.034332275390625, -0.00868988037109375, -0.0068359375, -0.006687164306640625, 0.0013017654418945312, -0.03961181640625, 0.024627685546875, 0.0216522216796875, -0.037445068359375, 0.0021038055419921875, 0.0100555419921875, 0.0296630859375, 
-0.00623321533203125, -0.014892578125, 0.0557861328125, 0.012786865234375, -0.040771484375, 0.065673828125, 0.0016489028930664062, 0.0665283203125, -0.0516357421875, 0.01204681396484375, -0.02801513671875, 0.018035888671875, -0.033203125, -0.06488037109375, 0.00909423828125, -0.0077056884765625, -0.0159454345703125, -0.013427734375, 0.033782958984375, -0.0469970703125, -0.040008544921875, 0.03497314453125, 0.03387451171875, 0.0159759521484375, 0.00531005859375, -0.08099365234375, 0.004261016845703125, 0.0192718505859375, -0.042510986328125, 0.0321044921875, 0.039154052734375, 0.004337310791015625, 0.043212890625, 0.049468994140625, 0.01080322265625, 0.0014009475708007812, -0.0016021728515625, 0.0718994140625, -0.055328369140625, -0.03875732421875, -0.05517578125, 0.035003662109375, -0.02294921875, -0.031280517578125, 0.07659912109375, 0.057708740234375, 0.070068359375, 0.004421234130859375, 0.052734375, -0.01395416259765625, 0.03326416015625, -0.036834716796875, 0.0567626953125, -0.0628662109375, -0.00392913818359375, -0.033782958984375, -0.05780029296875, -0.023529052734375, 0.058074951171875, -0.02935791015625, 0.0131683349609375, 0.059783935546875, 0.05877685546875, 0.0008058547973632812, -0.01611328125, 0.01494598388671875, 0.03460693359375, 0.014892578125, 0.0587158203125, 0.026275634765625, -0.06939697265625, 0.052459716796875, -0.03369140625, -0.01044464111328125, -0.0205230712890625, -0.04522705078125, -0.06549072265625, -0.05950927734375, -0.03167724609375, -0.036865234375, -0.0030765533447265625, 0.0743408203125, 0.045928955078125, -0.06488037109375, -0.0210113525390625, 0.0016422271728515625, 0.003292083740234375, -0.0200958251953125, -0.017303466796875, 0.0599365234375, -0.025238037109375, -0.06787109375, 0.00955963134765625, -0.000179290771484375, 0.004791259765625, -0.005489349365234375, -0.020660400390625, -0.048553466796875, -0.00859832763671875, 0.05694580078125, 0.0191192626953125, -0.0386962890625, -0.0030956268310546875, 0.006038665771484375, 
-0.0255279541015625, 0.010498046875, 0.00626373291015625, -0.027435302734375, 0.0270233154296875, 0.0361328125, 0.0335693359375, 0.060302734375, -0.00556182861328125, 0.015411376953125, -0.047607421875, 0.0220947265625, 0.0007390975952148438, 0.032989501953125, 0.0274505615234375, -0.02642822265625, 0.05767822265625, 0.023681640625, -0.037811279296875, -0.055389404296875, -0.0177764892578125, -0.084228515625, -0.0163116455078125, 0.09332275390625, -0.0222625732421875, -0.035125732421875, -0.004627227783203125, -0.0191802978515625, 0.0236663818359375, -0.033721923828125, 0.04296875, 0.0582275390625, -0.009033203125, -0.0203399658203125, -0.052764892578125, 0.043243408203125, 0.03143310546875, -0.06195068359375, -0.00823974609375, 0.01641845703125, 0.02838134765625, 0.016571044921875, 0.051788330078125, -0.01351165771484375, 0.0005903244018554688, -0.004123687744140625, 0.008758544921875, -0.0058441162109375, 0.00012022256851196289, -0.0098876953125, 0.0004432201385498047, -0.0247344970703125, -0.0124969482421875 ] ]
apanc/russian-sensitive-topics
2021-05-18T22:41:20.000Z
[ "transformers", "pytorch", "tf", "jax", "bert", "text-classification", "toxic comments classification", "ru", "arxiv:2103.05345", "endpoints_compatible", "region:us" ]
text-classification
apanc
null
null
apanc/russian-sensitive-topics
9
114,337
transformers
2022-03-02T23:29:05
--- language: - ru tags: - toxic comments classification licenses: - cc-by-nc-sa --- ## General concept of the model This model is trained on the dataset of sensitive topics of the Russian language. The concept of sensitive topics is described [in this article](https://www.aclweb.org/anthology/2021.bsnlp-1.4/) presented at the workshop on Balto-Slavic NLP at the EACL-2021 conference. Please note that the article describes the first version of the dataset, while the model is trained on the extended version of the dataset, open-sourced on our [GitHub](https://github.com/skoltech-nlp/inappropriate-sensitive-topics/blob/main/Version2/sensitive_topics/sensitive_topics.csv) or on [Kaggle](https://www.kaggle.com/nigula/russian-sensitive-topics). The properties of the dataset are the same as those described in the article; the only difference is its size. ## Instructions The model predicts combinations of the 18 sensitive topics described in the [article](https://arxiv.org/abs/2103.05345). You can find step-by-step instructions for using the model [here](https://github.com/skoltech-nlp/inappropriate-sensitive-topics/blob/main/Version2/sensitive_topics/Inference.ipynb). ## Metrics The dataset consists partially of manually labeled samples and partially of semi-automatically labeled samples; learn more in our article. We tested the performance of the classifier only on the manually labeled part of the data, which is why some topics are not well represented in the test set. 
| | precision | recall | f1-score | support | |-------------------|-----------|--------|----------|---------| | offline_crime | 0.65 | 0.55 | 0.6 | 132 | | online_crime | 0.5 | 0.46 | 0.48 | 37 | | drugs | 0.87 | 0.9 | 0.88 | 87 | | gambling | 0.5 | 0.67 | 0.57 | 6 | | pornography | 0.73 | 0.59 | 0.65 | 204 | | prostitution | 0.75 | 0.69 | 0.72 | 91 | | slavery | 0.72 | 0.72 | 0.73 | 40 | | suicide | 0.33 | 0.29 | 0.31 | 7 | | terrorism | 0.68 | 0.57 | 0.62 | 47 | | weapons | 0.89 | 0.83 | 0.86 | 138 | | body_shaming | 0.9 | 0.67 | 0.77 | 109 | | health_shaming | 0.84 | 0.55 | 0.66 | 108 | | politics | 0.68 | 0.54 | 0.6 | 241 | | racism | 0.81 | 0.59 | 0.68 | 204 | | religion | 0.94 | 0.72 | 0.81 | 102 | | sexual_minorities | 0.69 | 0.46 | 0.55 | 102 | | sexism | 0.66 | 0.64 | 0.65 | 132 | | social_injustice | 0.56 | 0.37 | 0.45 | 181 | | none | 0.62 | 0.67 | 0.64 | 250 | | micro avg | 0.72 | 0.61 | 0.66 | 2218 | | macro avg | 0.7 | 0.6 | 0.64 | 2218 | | weighted avg | 0.73 | 0.61 | 0.66 | 2218 | | samples avg | 0.75 | 0.66 | 0.68 | 2218 | ## Licensing Information [Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License][cc-by-nc-sa]. 
[![CC BY-NC-SA 4.0][cc-by-nc-sa-image]][cc-by-nc-sa] [cc-by-nc-sa]: http://creativecommons.org/licenses/by-nc-sa/4.0/ [cc-by-nc-sa-image]: https://i.creativecommons.org/l/by-nc-sa/4.0/88x31.png ## Citation If you find this repository helpful, feel free to cite our publication: ``` @inproceedings{babakov-etal-2021-detecting, title = "Detecting Inappropriate Messages on Sensitive Topics that Could Harm a Company{'}s Reputation", author = "Babakov, Nikolay and Logacheva, Varvara and Kozlova, Olga and Semenov, Nikita and Panchenko, Alexander", booktitle = "Proceedings of the 8th Workshop on Balto-Slavic Natural Language Processing", month = apr, year = "2021", address = "Kiyv, Ukraine", publisher = "Association for Computational Linguistics", url = "https://www.aclweb.org/anthology/2021.bsnlp-1.4", pages = "26--36", abstract = "Not all topics are equally {``}flammable{''} in terms of toxicity: a calm discussion of turtles or fishing less often fuels inappropriate toxic dialogues than a discussion of politics or sexual minorities. We define a set of sensitive topics that can yield inappropriate and toxic messages and describe the methodology of collecting and labelling a dataset for appropriateness. While toxicity in user-generated data is well-studied, we aim at defining a more fine-grained notion of inappropriateness. The core of inappropriateness is that it can harm the reputation of a speaker. This is different from toxicity in two respects: (i) inappropriateness is topic-related, and (ii) inappropriate message is not toxic but still unacceptable. We collect and release two datasets for Russian: a topic-labelled dataset and an appropriateness-labelled dataset. We also release pre-trained classification models trained on this data.", } ```
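Because the model predicts *combinations* of topics, inference is multi-label: each topic's logit is typically squashed through a sigmoid and thresholded independently. The sketch below illustrates only that post-processing step, with hypothetical logits for three of the 18 topics and an assumed 0.5 cut-off — the actual id-to-topic mapping and any tuned threshold are defined in the inference notebook linked above:

```python
import math

def sigmoid(x):
    # Map a logit to a per-topic probability.
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical logits; in practice they come from the BERT classification
# head, and the label order follows the repository's id-to-topic mapping.
topics = ["politics", "religion", "none"]
logits = [2.1, -1.3, -0.4]
threshold = 0.5  # assumed cut-off; tune on validation data

# Keep every topic whose independent probability clears the threshold —
# zero, one, or several topics may be predicted for a single message.
predicted = [t for t, z in zip(topics, logits) if sigmoid(z) > threshold]
print(predicted)
```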
5,040
[ [ -0.02777099609375, -0.05047607421875, 0.00457763671875, 0.0012903213500976562, -0.0216522216796875, 0.0022430419921875, -0.01119232177734375, -0.04296875, 0.007640838623046875, 0.0260467529296875, -0.032867431640625, -0.061737060546875, -0.04522705078125, -0.0028076171875, -0.0225830078125, 0.092529296875, 0.028167724609375, 0.008148193359375, -0.0070037841796875, -0.000804901123046875, -0.03692626953125, -0.037322998046875, -0.0280609130859375, -0.007808685302734375, 0.01287841796875, 0.0278778076171875, 0.0304412841796875, 0.0360107421875, 0.038909912109375, 0.023895263671875, -0.029083251953125, -0.01139068603515625, -0.0221099853515625, 0.005657196044921875, -0.0157623291015625, -0.03411865234375, -0.035186767578125, 0.005977630615234375, 0.03863525390625, 0.038330078125, -0.0016489028930664062, 0.0263214111328125, 0.0048675537109375, 0.04193115234375, -0.034393310546875, -0.008148193359375, -0.049468994140625, 0.0057373046875, -0.02081298828125, -0.004642486572265625, -0.02227783203125, -0.0131683349609375, 0.01611328125, -0.03143310546875, 0.033721923828125, 0.0099334716796875, 0.08819580078125, -0.0015964508056640625, -0.034393310546875, -0.0223236083984375, -0.03253173828125, 0.07427978515625, -0.059600830078125, 0.014556884765625, 0.060028076171875, -0.01293182373046875, -0.0012340545654296875, -0.035430908203125, -0.029388427734375, -0.0039825439453125, -0.0228424072265625, 0.01074981689453125, -0.03326416015625, -0.01322174072265625, 0.0321044921875, 0.028045654296875, -0.059600830078125, 0.00621795654296875, -0.041351318359375, -0.03717041015625, 0.06549072265625, 0.01238250732421875, 0.00675201416015625, -0.026092529296875, -0.0161590576171875, 0.00846099853515625, -0.0180816650390625, 0.0114288330078125, 0.047698974609375, 0.037017822265625, -0.021240234375, 0.041107177734375, -0.026885986328125, 0.061126708984375, 0.01514434814453125, -0.003078460693359375, 0.042572021484375, -0.043670654296875, -0.009124755859375, -0.00124359130859375, 
0.0634765625, 0.0249786376953125, -0.0019350051879882812, 0.00493621826171875, -0.0073394775390625, 0.019927978515625, 0.0194244384765625, -0.0555419921875, -0.019561767578125, 0.0297698974609375, -0.040069580078125, -0.031646728515625, -0.0091705322265625, -0.08245849609375, -0.003047943115234375, -0.022003173828125, 0.0273284912109375, -0.0261993408203125, -0.020355224609375, 0.0051727294921875, 0.0070037841796875, 0.01425933837890625, 0.009124755859375, -0.06378173828125, 0.025482177734375, 0.041015625, 0.06011962890625, -0.01349639892578125, 0.008514404296875, -0.00881195068359375, -0.01335906982421875, -0.021636962890625, 0.06414794921875, -0.033843994140625, -0.017333984375, -0.00914764404296875, 0.0206146240234375, 0.01081085205078125, -0.033538818359375, 0.054412841796875, -0.0201568603515625, 0.048919677734375, -0.0137786865234375, -0.04010009765625, -0.0014810562133789062, 0.02313232421875, -0.031585693359375, 0.08575439453125, 0.003086090087890625, -0.0850830078125, 0.03753662109375, -0.052215576171875, -0.020050048828125, -0.0014705657958984375, 0.01358795166015625, -0.05462646484375, -0.02899169921875, -0.0157928466796875, 0.041656494140625, -0.02325439453125, 0.035125732421875, -0.0222930908203125, -0.01525115966796875, 0.0255889892578125, -0.013946533203125, 0.0831298828125, 0.032867431640625, -0.036895751953125, 0.004886627197265625, -0.065185546875, 0.006870269775390625, 0.004985809326171875, -0.0302276611328125, -0.02642822265625, -0.00977325439453125, 0.01074981689453125, 0.028228759765625, 0.007785797119140625, -0.045806884765625, -0.0079803466796875, -0.0311737060546875, 0.0233612060546875, 0.06658935546875, 0.00457763671875, 0.038055419921875, -0.04693603515625, 0.03753662109375, 0.00902557373046875, 0.0114898681640625, 0.01303863525390625, -0.048370361328125, -0.050201416015625, -0.0274505615234375, 0.02685546875, 0.04779052734375, -0.023529052734375, 0.06622314453125, -0.00634765625, -0.05548095703125, -0.03179931640625, 
-0.004756927490234375, 0.04119873046875, 0.04510498046875, 0.0223846435546875, -0.016998291015625, -0.035247802734375, -0.07867431640625, -0.020721435546875, -0.0185089111328125, 0.0134735107421875, 0.0291900634765625, 0.052459716796875, 0.001434326171875, 0.0633544921875, -0.0455322265625, -0.02166748046875, -0.0023555755615234375, 0.0131378173828125, 0.0240020751953125, 0.060791015625, 0.037322998046875, -0.07403564453125, -0.05743408203125, 0.004547119140625, -0.03472900390625, -0.01519775390625, 0.0102691650390625, -0.018798828125, 0.03564453125, 0.00811767578125, -0.033905029296875, 0.03594970703125, 0.018157958984375, -0.06158447265625, 0.05517578125, 0.00390625, 0.002208709716796875, -0.08575439453125, 0.034881591796875, 0.0173797607421875, -0.004047393798828125, -0.07635498046875, -0.01406097412109375, -0.0021762847900390625, -0.0037746429443359375, -0.036285400390625, 0.043243408203125, -0.0203094482421875, 0.01371002197265625, -0.00030612945556640625, -0.0078125, -0.0037097930908203125, 0.039398193359375, 0.01654052734375, 0.05230712890625, 0.052337646484375, -0.03961181640625, 0.0169525146484375, 0.022430419921875, -0.039459228515625, 0.050872802734375, -0.039337158203125, 0.002773284912109375, -0.01020050048828125, -0.006000518798828125, -0.06732177734375, -0.021270751953125, 0.041259765625, -0.0545654296875, 0.03253173828125, -0.01470184326171875, -0.02691650390625, -0.0238037109375, -0.0190887451171875, 0.00821685791015625, 0.035736083984375, 0.003875732421875, 0.017242431640625, 0.04693603515625, -0.0185089111328125, -0.07257080078125, -0.0633544921875, -0.01544952392578125, -0.042755126953125, -0.0380859375, 0.022125244140625, -0.020050048828125, -0.02215576171875, 0.01087188720703125, -0.01125335693359375, -0.0304412841796875, 0.012847900390625, -0.00450897216796875, 0.0238189697265625, -0.006805419921875, 0.03076171875, -0.017120361328125, -0.00927734375, 0.00225067138671875, 0.00555419921875, 0.04681396484375, -0.02923583984375, 
-0.005016326904296875, -0.045684814453125, 0.034515380859375, 0.0217132568359375, -0.01490020751953125, 0.05218505859375, 0.039215087890625, -0.035247802734375, -0.0017948150634765625, -0.02789306640625, -0.0169219970703125, -0.0318603515625, 0.0379638671875, -0.0070037841796875, -0.05389404296875, 0.058349609375, 0.01346588134765625, -0.007556915283203125, 0.050384521484375, 0.040802001953125, -0.0069732666015625, 0.068115234375, 0.034759521484375, -0.0021343231201171875, 0.030181884765625, -0.037689208984375, 0.01546478271484375, -0.04840087890625, -0.0416259765625, -0.064208984375, -0.0248565673828125, -0.0635986328125, -0.00997161865234375, -0.00582122802734375, -0.0160064697265625, -0.03533935546875, 0.0248870849609375, -0.06048583984375, 0.027191162109375, 0.038177490234375, 0.0111083984375, 0.00824737548828125, 0.0005850791931152344, -0.005176544189453125, -0.020904541015625, -0.05413818359375, -0.042205810546875, 0.0791015625, 0.022491455078125, 0.045684814453125, 0.01363372802734375, 0.033660888671875, 0.032379150390625, 0.0158233642578125, -0.058349609375, 0.043914794921875, -0.0210113525390625, -0.10491943359375, -0.015838623046875, -0.039398193359375, -0.083984375, 0.01751708984375, -0.03253173828125, -0.079345703125, 0.043731689453125, 0.00693511962890625, -0.01328277587890625, 0.0258026123046875, -0.050445556640625, 0.07025146484375, -0.01145172119140625, -0.0513916015625, 0.0008411407470703125, -0.0631103515625, 0.024658203125, -0.0202178955078125, 0.029083251953125, -0.03839111328125, 0.00014543533325195312, 0.07080078125, -0.0037441253662109375, 0.0731201171875, -0.0167388916015625, 0.01067352294921875, 0.0188140869140625, -0.0113983154296875, 0.02215576171875, -0.0006232261657714844, -0.01293182373046875, 0.00717926025390625, 0.01459503173828125, -0.035919189453125, -0.0112762451171875, 0.046630859375, -0.062744140625, -0.0350341796875, -0.06561279296875, -0.03607177734375, -0.004459381103515625, 0.030792236328125, 0.0311431884765625, 
0.0254058837890625, -0.0052642822265625, 0.02984619140625, 0.0692138671875, -0.03009033203125, 0.005344390869140625, 0.042999267578125, -0.0028972625732421875, -0.0251922607421875, 0.06207275390625, 0.017303466796875, 0.03070068359375, 0.017333984375, 0.0288238525390625, -0.0178985595703125, -0.03668212890625, 0.00420379638671875, 0.015960693359375, -0.05609130859375, -0.034759521484375, -0.06195068359375, -0.0160980224609375, -0.0382080078125, -0.007556915283203125, -0.0231170654296875, -0.0256195068359375, -0.039398193359375, -0.006038665771484375, 0.034393310546875, 0.024871826171875, -0.0160369873046875, 0.018890380859375, -0.030120849609375, 0.01322174072265625, 0.01763916015625, 0.0224609375, -0.022705078125, -0.057342529296875, 0.00017309188842773438, 0.00998687744140625, -0.0286712646484375, -0.0634765625, 0.046905517578125, 0.0094146728515625, 0.054229736328125, 0.0269012451171875, 0.023681640625, 0.03179931640625, 0.0026988983154296875, 0.06036376953125, 0.006183624267578125, -0.0718994140625, 0.05755615234375, -0.03277587890625, 0.01922607421875, 0.050872802734375, 0.051544189453125, -0.05230712890625, -0.0440673828125, -0.07318115234375, -0.06939697265625, 0.08160400390625, 0.0243988037109375, 0.01605224609375, 0.005321502685546875, 0.014556884765625, 0.00667572021484375, 0.0158233642578125, -0.08056640625, -0.0430908203125, -0.0102691650390625, -0.0218505859375, 0.0036258697509765625, -0.0234527587890625, -0.020172119140625, -0.035736083984375, 0.0753173828125, 0.005558013916015625, 0.041351318359375, 0.0196990966796875, 0.0074920654296875, -0.0025424957275390625, 0.028717041015625, 0.050079345703125, 0.045013427734375, -0.0272216796875, 0.011993408203125, 0.01493072509765625, -0.055694580078125, 0.0160369873046875, 0.0013742446899414062, -0.024566650390625, -0.00693511962890625, 0.02435302734375, 0.05517578125, -0.0065155029296875, -0.03399658203125, 0.04180908203125, -0.0081787109375, -0.0274200439453125, -0.016632080078125, -0.006969451904296875, 
-0.0084381103515625, 0.0018892288208007812, 0.0304412841796875, 0.015899658203125, 0.017730712890625, -0.0489501953125, 0.02239990234375, 0.029052734375, -0.03118896484375, -0.022125244140625, 0.034576416015625, 0.00630950927734375, -0.03076171875, 0.035888671875, -0.0199737548828125, -0.050384521484375, 0.06036376953125, 0.044830322265625, 0.048797607421875, -0.0306549072265625, 0.030517578125, 0.060455322265625, 0.02850341796875, 0.0174407958984375, 0.04205322265625, 0.010711669921875, -0.05255126953125, -0.016265869140625, -0.049072265625, -0.009185791015625, 0.03839111328125, -0.057220458984375, 0.023345947265625, -0.0302886962890625, -0.027496337890625, 0.01326751708984375, 0.034332275390625, -0.040313720703125, 0.03216552734375, -0.00487518310546875, 0.059356689453125, -0.08624267578125, 0.049652099609375, 0.05548095703125, -0.0122833251953125, -0.043060302734375, -0.0102996826171875, 0.0004982948303222656, -0.03155517578125, 0.04058837890625, 0.01100921630859375, 0.0301361083984375, -0.01557159423828125, -0.048309326171875, -0.08306884765625, 0.06744384765625, 0.00356292724609375, -0.031951904296875, 0.024932861328125, 0.0099639892578125, 0.05328369140625, -0.00821685791015625, 0.002292633056640625, 0.030120849609375, 0.03680419921875, -0.002780914306640625, -0.0684814453125, 0.01043701171875, -0.048370361328125, -0.0184173583984375, 0.01776123046875, -0.05462646484375, 0.05419921875, 0.01119232177734375, -0.00704193115234375, -0.005977630615234375, 0.022857666015625, 0.024017333984375, 0.04034423828125, 0.036834716796875, 0.041351318359375, 0.0626220703125, -0.0204925537109375, 0.050140380859375, -0.01546478271484375, 0.043914794921875, 0.07672119140625, -0.0024433135986328125, 0.046234130859375, 0.02374267578125, -0.036163330078125, 0.01166534423828125, 0.0718994140625, -0.015838623046875, 0.058685302734375, 0.0167083740234375, -0.0238800048828125, -0.0140838623046875, -0.01200103759765625, -0.036285400390625, 0.0438232421875, 0.039031982421875, 
-0.031280517578125, 0.0006494522094726562, -0.0012874603271484375, 0.01145172119140625, 0.00021445751190185547, -0.015655517578125, 0.061492919921875, -0.010589599609375, -0.030181884765625, 0.039703369140625, -0.005191802978515625, 0.055908203125, -0.055694580078125, 0.016265869140625, -0.0016489028930664062, 0.0080413818359375, -0.0252227783203125, -0.0709228515625, 0.043243408203125, 0.010009765625, -0.0134429931640625, -0.00928497314453125, 0.07904052734375, -0.032928466796875, -0.038543701171875, 0.010467529296875, 0.0193328857421875, 0.02783203125, 0.0228729248046875, -0.0706787109375, -0.010162353515625, 0.0104827880859375, -0.0251922607421875, 0.0161285400390625, 0.0335693359375, -0.004230499267578125, 0.046478271484375, 0.055755615234375, 0.011962890625, 0.0019779205322265625, -0.03076171875, 0.06610107421875, -0.04144287109375, -0.044677734375, -0.044769287109375, 0.0556640625, -0.02667236328125, -0.040313720703125, 0.053314208984375, 0.054962158203125, 0.06549072265625, 0.004482269287109375, 0.054229736328125, -0.0261077880859375, 0.049285888671875, -0.02801513671875, 0.06207275390625, -0.043121337890625, 0.00481414794921875, -0.0203704833984375, -0.058380126953125, -0.04205322265625, 0.06640625, -0.0377197265625, 0.009307861328125, 0.02984619140625, 0.06866455078125, -0.003002166748046875, -0.00384521484375, 0.019195556640625, 0.03118896484375, 0.009429931640625, 0.022613525390625, 0.05548095703125, -0.0258026123046875, 0.035247802734375, -0.056884765625, -0.0091552734375, -0.00732421875, -0.06414794921875, -0.055877685546875, -0.0635986328125, -0.04693603515625, -0.024932861328125, 0.00203704833984375, 0.059814453125, 0.05133056640625, -0.06878662109375, -0.0117034912109375, -0.0017671585083007812, 0.00484466552734375, 0.002529144287109375, -0.0215911865234375, 0.0278472900390625, -0.0012712478637695312, -0.050262451171875, -0.01340484619140625, -0.015533447265625, 0.01087188720703125, -0.0011568069458007812, 0.00481414794921875, -0.048309326171875, 
-0.00174713134765625, 0.0250244140625, 0.007366180419921875, -0.046875, -0.0207672119140625, 0.0047760009765625, -0.0119171142578125, 0.01134490966796875, 0.0196380615234375, -0.025665283203125, 0.0372314453125, 0.046905517578125, 0.0133819580078125, 0.04052734375, -0.009033203125, 0.0118560791015625, -0.053497314453125, 0.0170745849609375, 0.040985107421875, 0.0183258056640625, 0.02020263671875, -0.048065185546875, 0.04278564453125, 0.025390625, -0.0601806640625, -0.08245849609375, 0.0007004737854003906, -0.0765380859375, -0.0261688232421875, 0.09246826171875, -0.015899658203125, -0.0138397216796875, -0.026153564453125, -0.0200653076171875, 0.02252197265625, -0.03900146484375, 0.06317138671875, 0.06781005859375, 0.0045013427734375, 0.007785797119140625, -0.056793212890625, 0.046661376953125, 0.0233612060546875, -0.06500244140625, 0.0169677734375, 0.0302886962890625, 0.0313720703125, 0.0304412841796875, 0.040069580078125, -0.0138702392578125, 0.0185699462890625, -0.0024509429931640625, 0.01226806640625, 0.020233154296875, -0.007198333740234375, -0.040802001953125, -0.00719451904296875, -0.0160064697265625, -0.005084991455078125 ] ]
EleutherAI/gpt-j-6b
2023-06-21T14:33:36.000Z
[ "transformers", "pytorch", "tf", "jax", "gptj", "text-generation", "causal-lm", "en", "dataset:EleutherAI/pile", "arxiv:2104.09864", "arxiv:2101.00027", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
text-generation
EleutherAI
null
null
EleutherAI/gpt-j-6b
1,300
114,048
transformers
2022-03-02T23:29:04
--- language: - en tags: - pytorch - causal-lm license: apache-2.0 datasets: - EleutherAI/pile --- # GPT-J 6B ## Model Description GPT-J 6B is a transformer model trained using Ben Wang's [Mesh Transformer JAX](https://github.com/kingoflolz/mesh-transformer-jax/). "GPT-J" refers to the class of model, while "6B" represents the number of trainable parameters. <figure> | Hyperparameter | Value | |----------------------|------------| | \\(n_{parameters}\\) | 6053381344 | | \\(n_{layers}\\) | 28&ast; | | \\(d_{model}\\) | 4096 | | \\(d_{ff}\\) | 16384 | | \\(n_{heads}\\) | 16 | | \\(d_{head}\\) | 256 | | \\(n_{ctx}\\) | 2048 | | \\(n_{vocab}\\) | 50257/50400&dagger; (same tokenizer as GPT-2/3) | | Positional Encoding | [Rotary Position Embedding (RoPE)](https://arxiv.org/abs/2104.09864) | | RoPE Dimensions | [64](https://github.com/kingoflolz/mesh-transformer-jax/blob/f2aa66e0925de6593dcbb70e72399b97b4130482/mesh_transformer/layers.py#L223) | <figcaption><p><strong>&ast;</strong> Each layer consists of one feedforward block and one self-attention block.</p> <p><strong>&dagger;</strong> Although the embedding matrix has a size of 50400, only 50257 entries are used by the GPT-2 tokenizer.</p></figcaption></figure> The model consists of 28 layers with a model dimension of 4096, and a feedforward dimension of 16384. The model dimension is split into 16 heads, each with a dimension of 256. Rotary Position Embedding (RoPE) is applied to 64 dimensions of each head. The model is trained with a tokenization vocabulary of 50257, using the same set of BPEs as GPT-2/GPT-3. ## Intended Use and Limitations GPT-J learns an inner representation of the English language that can be used to extract features useful for downstream tasks. However, the model is best at what it was pretrained for, which is generating text from a prompt. ### Out-of-scope use GPT-J-6B is **not** intended for deployment without fine-tuning, supervision, and/or moderation. 
It is not in itself a product and cannot be used for human-facing interactions. For example, the model may generate harmful or offensive text. Please evaluate the risks associated with your particular use case. GPT-J-6B was trained on an English-language-only dataset, and is thus **not** suitable for translation or generating text in other languages. GPT-J-6B has not been fine-tuned for downstream contexts in which language models are commonly deployed, such as writing genre prose or commercial chatbots. This means GPT-J-6B will **not** respond to a given prompt the way a product like ChatGPT does. This is because, unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement Learning from Human Feedback (RLHF) to better “follow” human instructions. ### Limitations and Biases The core functionality of GPT-J is taking a string of text and predicting the next token. While language models are widely used for tasks other than this, there are a lot of unknowns with this work. When prompting GPT-J, it is important to remember that the statistically most likely next token is often not the token that produces the most "accurate" text. Never depend upon GPT-J to produce factually accurate output. GPT-J was trained on the Pile, a dataset known to contain profanity, lewd, and otherwise abrasive language. Depending upon the use case, GPT-J may produce socially unacceptable text. See [Sections 5 and 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a more detailed analysis of the biases in the Pile. As with all language models, it is hard to predict in advance how GPT-J will respond to particular prompts, and offensive content may occur without warning. We recommend having a human curate or filter the outputs before releasing them, both to censor undesirable content and to improve the quality of the results. 
### How to use This model can be easily loaded using the `AutoModelForCausalLM` functionality: ```python from transformers import AutoTokenizer, AutoModelForCausalLM tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6B") model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-j-6B") ``` ## Training data GPT-J 6B was trained on [the Pile](https://pile.eleuther.ai), a large-scale curated dataset created by [EleutherAI](https://www.eleuther.ai). ## Training procedure This model was trained for 402 billion tokens over 383,500 steps on a TPU v3-256 pod. It was trained as an autoregressive language model, using cross-entropy loss to maximize the likelihood of predicting the next token correctly. ## Evaluation results <figure> | Model | Public | Training FLOPs | LAMBADA PPL ↓ | LAMBADA Acc ↑ | Winogrande ↑ | Hellaswag ↑ | PIQA ↑ | Dataset Size (GB) | |--------------------------|-------------|----------------|--- |--- |--- |--- |--- |-------------------| | Random Chance | &check; | 0 | ~a lot | ~0% | 50% | 25% | 25% | 0 | | GPT-3 Ada&ddagger; | &cross; | ----- | 9.95 | 51.6% | 52.9% | 43.4% | 70.5% | ----- | | GPT-2 1.5B | &check; | ----- | 10.63 | 51.21% | 59.4% | 50.9% | 70.8% | 40 | | GPT-Neo 1.3B&ddagger; | &check; | 3.0e21 | 7.50 | 57.2% | 55.0% | 48.9% | 71.1% | 825 | | Megatron-2.5B&ast; | &cross; | 2.4e21 | ----- | 61.7% | ----- | ----- | ----- | 174 | | GPT-Neo 2.7B&ddagger; | &check; | 6.8e21 | 5.63 | 62.2% | 56.5% | 55.8% | 73.0% | 825 | | GPT-3 1.3B&ast;&ddagger; | &cross; | 2.4e21 | 5.44 | 63.6% | 58.7% | 54.7% | 75.1% | ~800 | | GPT-3 Babbage&ddagger; | &cross; | ----- | 5.58 | 62.4% | 59.0% | 54.5% | 75.5% | ----- | | Megatron-8.3B&ast; | &cross; | 7.8e21 | ----- | 66.5% | ----- | ----- | ----- | 174 | | GPT-3 2.7B&ast;&ddagger; | &cross; | 4.8e21 | 4.60 | 67.1% | 62.3% | 62.8% | 75.6% | ~800 | | Megatron-11B&dagger; | &check; | 1.0e22 | ----- | ----- | ----- | ----- | ----- | 161 | | **GPT-J 6B&ddagger;** | **&check;** | **1.5e22** | **3.99** | 
**69.7%** | **65.3%** | **66.1%** | **76.5%** | **825** | | GPT-3 6.7B&ast;&ddagger; | &cross; | 1.2e22 | 4.00 | 70.3% | 64.5% | 67.4% | 78.0% | ~800 | | GPT-3 Curie&ddagger; | &cross; | ----- | 4.00 | 69.3% | 65.6% | 68.5% | 77.9% | ----- | | GPT-3 13B&ast;&ddagger; | &cross; | 2.3e22 | 3.56 | 72.5% | 67.9% | 70.9% | 78.5% | ~800 | | GPT-3 175B&ast;&ddagger; | &cross; | 3.1e23 | 3.00 | 76.2% | 70.2% | 78.9% | 81.0% | ~800 | | GPT-3 Davinci&ddagger; | &cross; | ----- | 3.0 | 75% | 72% | 78% | 80% | ----- | <figcaption><p>Models roughly sorted by performance, or by FLOPs if not available.</p> <p><strong>&ast;</strong> Evaluation numbers reported by their respective authors. All other numbers are provided by running <a href="https://github.com/EleutherAI/lm-evaluation-harness/"><code>lm-evaluation-harness</code></a> either with released weights or with API access. Due to subtle implementation differences as well as different zero shot task framing, these might not be directly comparable. See <a href="https://blog.eleuther.ai/gpt3-model-sizes/">this blog post</a> for more details.</p> <p><strong>†</strong> Megatron-11B provides no comparable metrics, and several implementations using the released weights do not reproduce the generation quality and evaluations. (see <a href="https://github.com/huggingface/transformers/pull/10301">1</a> <a href="https://github.com/pytorch/fairseq/issues/2358">2</a> <a href="https://github.com/pytorch/fairseq/issues/2719">3</a>) Thus, evaluation was not attempted.</p> <p><strong>‡</strong> These models have been trained with data which contains possible test set contamination. 
The OpenAI GPT-3 models failed to deduplicate training data for certain test sets, while the GPT-Neo models, as well as this one, are trained on the Pile, which has not been deduplicated against any test sets.</p></figcaption></figure> ## Citation and Related Information ### BibTeX entry To cite this model: ```bibtex @misc{gpt-j, author = {Wang, Ben and Komatsuzaki, Aran}, title = {{GPT-J-6B: A 6 Billion Parameter Autoregressive Language Model}}, howpublished = {\url{https://github.com/kingoflolz/mesh-transformer-jax}}, year = 2021, month = May } ``` To cite the codebase that trained this model: ```bibtex @misc{mesh-transformer-jax, author = {Wang, Ben}, title = {{Mesh-Transformer-JAX: Model-Parallel Implementation of Transformer Language Model with JAX}}, howpublished = {\url{https://github.com/kingoflolz/mesh-transformer-jax}}, year = 2021, month = May } ``` If you use this model, we would love to hear about it! Reach out on [GitHub](https://github.com/kingoflolz/mesh-transformer-jax), Discord, or shoot Ben an email. ## Acknowledgements This project would not have been possible without compute generously provided by Google through the [TPU Research Cloud](https://sites.research.google/trc/), as well as the Cloud TPU team for providing early access to the [Cloud TPU VM](https://cloud.google.com/blog/products/compute/introducing-cloud-tpu-vms) Alpha. Thanks to everyone who has helped out one way or another (listed alphabetically): - [James Bradbury](https://twitter.com/jekbradbury) for valuable assistance with debugging JAX issues. - [Stella Biderman](https://www.stellabiderman.com), [Eric Hallahan](https://twitter.com/erichallahan), [Kurumuz](https://github.com/kurumuz/), and [Finetune](https://github.com/finetuneanon/) for converting the model to be compatible with the `transformers` package. - [Leo Gao](https://twitter.com/nabla_theta) for running zero-shot evaluations for the baseline models for the table. 
- [Laurence Golding](https://github.com/researcher2/) for adding some features to the web demo. - [Aran Komatsuzaki](https://twitter.com/arankomatsuzaki) for advice with experiment design and writing the blog posts. - [Janko Prester](https://github.com/jprester/) for creating the web demo frontend.
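The hyperparameter table in the GPT-J card above (28 layers, model dimension 4096, feedforward dimension 16384, embedding matrix size 50400) can be cross-checked against the stated count of 6,053,381,344 trainable parameters. A minimal arithmetic sketch, assuming an untied output projection and ignoring biases and layer-norm parameters, so the total only approximates the reported figure:

```python
# Rough parameter-count check for GPT-J 6B, using the card's hyperparameters.
# Biases and layer norms are omitted, so this approximates rather than
# reproduces the reported 6,053,381,344.
n_layers, d_model, d_ff, n_vocab = 28, 4096, 16384, 50400

attn = 4 * d_model * d_model   # Q, K, V, and output projections
ffn = 2 * d_model * d_ff       # feedforward up- and down-projections
per_layer = attn + ffn         # ~201M parameters per layer

embed = n_vocab * d_model      # token embedding matrix
lm_head = n_vocab * d_model    # output projection (assumed untied)

total = n_layers * per_layer + embed + lm_head
print(f"{total:,}")            # ~6.05B, within ~0.1% of the reported count
```

The small residual relative to 6,053,381,344 is consistent with the omitted bias and layer-norm terms.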
10,978
[ [ -0.038421630859375, -0.052490234375, 0.0172576904296875, 0.00278472900390625, -0.016021728515625, 0.0008997917175292969, 0.005558013916015625, -0.030670166015625, 0.0135040283203125, 0.01287841796875, -0.034423828125, -0.029388427734375, -0.06280517578125, -0.0013866424560546875, -0.0153045654296875, 0.08123779296875, 0.00682830810546875, -0.01204681396484375, 0.0191497802734375, 0.006809234619140625, -0.030487060546875, -0.023101806640625, -0.043609619140625, -0.035003662109375, 0.034820556640625, -0.004756927490234375, 0.070068359375, 0.056243896484375, 0.0303802490234375, 0.0234832763671875, -0.01849365234375, -0.01027679443359375, -0.0308380126953125, -0.0298004150390625, -0.004913330078125, -0.020477294921875, -0.0390625, -0.001529693603515625, 0.03485107421875, 0.0323486328125, 0.0026531219482421875, 0.018341064453125, 0.003971099853515625, 0.049163818359375, -0.032958984375, 0.0249481201171875, -0.0304718017578125, 0.00925445556640625, -0.017059326171875, 0.005863189697265625, -0.02252197265625, -0.0139007568359375, 0.0072021484375, -0.04071044921875, 0.0174713134765625, 0.018463134765625, 0.091796875, 0.01265716552734375, -0.024200439453125, -0.007198333740234375, -0.03887939453125, 0.0592041015625, -0.057159423828125, 0.031982421875, 0.01971435546875, 0.01214599609375, -0.005390167236328125, -0.059478759765625, -0.05035400390625, -0.00983428955078125, -0.01708984375, 0.033111572265625, -0.0146331787109375, -0.0032558441162109375, 0.03594970703125, 0.03424072265625, -0.0694580078125, 0.0004973411560058594, -0.038238525390625, -0.01380157470703125, 0.045562744140625, 0.01181793212890625, 0.0311737060546875, -0.041168212890625, -0.050445556640625, -0.02471923828125, -0.035888671875, 0.0162811279296875, 0.034027099609375, 0.0105438232421875, -0.02069091796875, 0.040740966796875, 0.00182342529296875, 0.04296875, 0.020965576171875, -0.00836944580078125, 0.037445068359375, -0.0308990478515625, -0.04052734375, -0.01320648193359375, 0.07940673828125, 
0.0271453857421875, 0.0113525390625, -0.0018472671508789062, -0.0189666748046875, -0.0022945404052734375, 0.0254974365234375, -0.0712890625, -0.0308380126953125, 0.0153045654296875, -0.0364990234375, -0.02301025390625, 0.0160675048828125, -0.047637939453125, 0.0019969940185546875, -0.0116119384765625, 0.046051025390625, -0.035614013671875, -0.03411865234375, 0.00984954833984375, -0.01558685302734375, 0.029052734375, 0.01514434814453125, -0.06915283203125, 0.00862884521484375, 0.03216552734375, 0.060302734375, -0.006900787353515625, -0.0182647705078125, 0.00485992431640625, 0.0153045654296875, -0.027008056640625, 0.045501708984375, -0.0204315185546875, -0.0343017578125, -0.0201873779296875, 0.0166015625, -0.021484375, -0.016632080078125, 0.034088134765625, -0.0274810791015625, 0.03802490234375, -0.0238037109375, -0.036895751953125, -0.0201568603515625, 0.0244598388671875, -0.052490234375, 0.07464599609375, 0.023681640625, -0.0726318359375, 0.03692626953125, -0.053863525390625, -0.01043701171875, 0.002044677734375, -0.0103759765625, -0.057281494140625, -0.01149749755859375, 0.0262298583984375, 0.0272064208984375, -0.0277862548828125, 0.022369384765625, -0.0130462646484375, -0.0352783203125, -0.001209259033203125, -0.041961669921875, 0.0797119140625, 0.0212860107421875, -0.054656982421875, 0.00278472900390625, -0.05035400390625, 0.0054931640625, 0.038909912109375, -0.015899658203125, 0.006282806396484375, -0.02813720703125, -0.0019130706787109375, 0.027496337890625, 0.0174102783203125, -0.0281524658203125, 0.020965576171875, -0.03802490234375, 0.0263214111328125, 0.0626220703125, 0.0035991668701171875, 0.022491455078125, -0.040374755859375, 0.051025390625, 0.00911712646484375, 0.0237884521484375, -0.0005884170532226562, -0.05615234375, -0.050323486328125, -0.0162506103515625, 0.02392578125, 0.03973388671875, -0.049713134765625, 0.04693603515625, -0.008575439453125, -0.048614501953125, -0.025634765625, -0.0175933837890625, 0.044036865234375, 0.0364990234375, 
0.032501220703125, -0.0171661376953125, -0.03094482421875, -0.07293701171875, -0.0085296630859375, -0.022064208984375, -0.002635955810546875, 0.0175933837890625, 0.053131103515625, -0.007965087890625, 0.061767578125, -0.033935546875, -0.01020050048828125, -0.0293731689453125, 0.01009368896484375, 0.051849365234375, 0.0399169921875, 0.057159423828125, -0.04730224609375, -0.054046630859375, 0.006317138671875, -0.04974365234375, 0.008087158203125, -0.00873565673828125, -0.004505157470703125, 0.018463134765625, 0.004558563232421875, -0.07147216796875, 0.044464111328125, 0.03509521484375, -0.041473388671875, 0.059600830078125, -0.01117706298828125, 0.01416778564453125, -0.08740234375, 0.0175323486328125, -0.0016641616821289062, -0.005504608154296875, -0.04266357421875, -0.002933502197265625, 0.010040283203125, 0.00136566162109375, -0.0406494140625, 0.0494384765625, -0.04827880859375, 0.0019483566284179688, -0.005352020263671875, -0.00007510185241699219, 0.0008387565612792969, 0.06011962890625, -0.01031494140625, 0.0726318359375, 0.038665771484375, -0.035552978515625, 0.0193939208984375, 0.016357421875, -0.028045654296875, 0.0211944580078125, -0.059295654296875, 0.00513458251953125, -0.01023101806640625, 0.017974853515625, -0.07830810546875, -0.00765228271484375, 0.038421630859375, -0.049224853515625, 0.0333251953125, -0.0189056396484375, -0.032623291015625, -0.061431884765625, -0.034149169921875, 0.019012451171875, 0.05059814453125, -0.0252838134765625, 0.0345458984375, 0.0207366943359375, -0.003551483154296875, -0.055633544921875, -0.056640625, -0.015350341796875, -0.0210418701171875, -0.05133056640625, 0.03143310546875, -0.0124664306640625, 0.0012798309326171875, 0.0026264190673828125, -0.01103973388671875, 0.00750732421875, -0.00841522216796875, 0.0142059326171875, 0.0278778076171875, -0.007282257080078125, -0.016265869140625, -0.004291534423828125, -0.0223541259765625, 0.00688934326171875, -0.0244140625, 0.04852294921875, -0.01422882080078125, -0.0277099609375, 
-0.03192138671875, 0.0020084381103515625, 0.048797607421875, -0.002666473388671875, 0.052947998046875, 0.06951904296875, -0.0243377685546875, 0.00959014892578125, -0.034759521484375, -0.0018768310546875, -0.039520263671875, 0.039581298828125, -0.0430908203125, -0.057708740234375, 0.0462646484375, 0.0216217041015625, 0.00901031494140625, 0.06549072265625, 0.053619384765625, 0.004489898681640625, 0.09271240234375, 0.030609130859375, -0.023162841796875, 0.0303802490234375, -0.04974365234375, 0.0204315185546875, -0.0618896484375, -0.0229644775390625, -0.0302581787109375, -0.029022216796875, -0.05291748046875, -0.0263671875, 0.02850341796875, 0.00421142578125, -0.04449462890625, 0.0462646484375, -0.055511474609375, 0.024658203125, 0.05023193359375, -0.0038623809814453125, 0.0023975372314453125, -0.005767822265625, -0.02569580078125, -0.0020885467529296875, -0.056915283203125, -0.02044677734375, 0.080322265625, 0.0298614501953125, 0.044342041015625, 0.0170440673828125, 0.04644775390625, 0.0121612548828125, 0.0087432861328125, -0.039215087890625, 0.0281524658203125, 0.002651214599609375, -0.06842041015625, -0.02886962890625, -0.034912109375, -0.0794677734375, 0.030487060546875, -0.00400543212890625, -0.07208251953125, 0.0144500732421875, 0.0038776397705078125, -0.0249481201171875, 0.044342041015625, -0.06005859375, 0.06695556640625, -0.0150146484375, -0.029693603515625, 0.0029621124267578125, -0.0562744140625, 0.032562255859375, 0.0102691650390625, 0.01432037353515625, -0.0005359649658203125, 0.010345458984375, 0.05511474609375, -0.043670654296875, 0.051910400390625, -0.0175933837890625, -0.0017995834350585938, 0.03521728515625, -0.0009350776672363281, 0.062255859375, 0.01345062255859375, 0.00798797607421875, 0.007732391357421875, -0.0072479248046875, -0.03106689453125, -0.033233642578125, 0.046905517578125, -0.07818603515625, -0.05255126953125, -0.049591064453125, -0.042205810546875, 0.005413055419921875, 0.024627685546875, 0.04522705078125, 0.031402587890625, 
0.01020050048828125, 0.015838623046875, 0.039398193359375, -0.02239990234375, 0.051177978515625, 0.019866943359375, -0.0177154541015625, -0.0452880859375, 0.07073974609375, 0.01389312744140625, 0.0272064208984375, 0.0277099609375, 0.0233917236328125, -0.045623779296875, -0.0374755859375, -0.043609619140625, 0.0259552001953125, -0.0357666015625, -0.0180511474609375, -0.051055908203125, -0.03277587890625, -0.04498291015625, -0.01001739501953125, -0.0260162353515625, -0.034423828125, -0.01617431640625, -0.0183868408203125, 0.0294189453125, 0.061431884765625, 0.00431060791015625, 0.0099639892578125, -0.04296875, 0.01299285888671875, 0.026702880859375, 0.0293121337890625, 0.0012359619140625, -0.0565185546875, -0.01081085205078125, -0.0009293556213378906, -0.0338134765625, -0.05902099609375, 0.035736083984375, -0.01458740234375, 0.037567138671875, 0.030609130859375, -0.005645751953125, 0.058349609375, -0.028778076171875, 0.07904052734375, 0.037109375, -0.0560302734375, 0.0279388427734375, -0.044342041015625, 0.03631591796875, 0.0173187255859375, 0.03240966796875, -0.0305633544921875, -0.032806396484375, -0.08251953125, -0.06805419921875, 0.066650390625, 0.034912109375, -0.0023937225341796875, 0.0007576942443847656, 0.0045318603515625, -0.0000832676887512207, 0.0153045654296875, -0.07562255859375, -0.04534912109375, -0.029998779296875, -0.0029239654541015625, 0.00519561767578125, -0.01377105712890625, 0.0068206787109375, -0.02880859375, 0.0601806640625, 0.004779815673828125, 0.055419921875, 0.004238128662109375, -0.0076446533203125, -0.0003304481506347656, 0.00505828857421875, 0.04339599609375, 0.057342529296875, -0.0251312255859375, -0.006465911865234375, 0.0153045654296875, -0.04931640625, 0.0016965866088867188, 0.0114288330078125, -0.028472900390625, 0.0030117034912109375, 0.0192413330078125, 0.07000732421875, 0.003063201904296875, -0.01519012451171875, 0.045257568359375, 0.0033283233642578125, -0.03521728515625, -0.02789306640625, 0.0096435546875, 
0.0036792755126953125, 0.0177459716796875, 0.0223541259765625, -0.0015926361083984375, 0.00787353515625, -0.031494140625, 0.019927978515625, 0.033721923828125, -0.028411865234375, -0.0266876220703125, 0.07000732421875, -0.0013093948364257812, -0.01439666748046875, 0.03411865234375, -0.0226287841796875, -0.045257568359375, 0.060516357421875, 0.049713134765625, 0.0682373046875, -0.0201568603515625, 0.014862060546875, 0.05810546875, 0.034393310546875, -0.00524139404296875, 0.01389312744140625, 0.018585205078125, -0.0400390625, -0.028411865234375, -0.045501708984375, -0.0212554931640625, 0.0285797119140625, -0.039276123046875, 0.023712158203125, -0.0418701171875, -0.02154541015625, -0.020233154296875, 0.019378662109375, -0.06158447265625, 0.031951904296875, -0.0006494522094726562, 0.056671142578125, -0.058197021484375, 0.0643310546875, 0.040740966796875, -0.056915283203125, -0.07568359375, -0.00978851318359375, 0.004711151123046875, -0.07403564453125, 0.035552978515625, 0.015533447265625, 0.0173187255859375, 0.01015472412109375, -0.03204345703125, -0.088623046875, 0.10528564453125, 0.00978851318359375, -0.0372314453125, -0.0036945343017578125, 0.0142822265625, 0.031982421875, -0.003131866455078125, 0.056671142578125, 0.0372314453125, 0.044921875, 0.00524139404296875, -0.0760498046875, 0.01444244384765625, -0.040802001953125, 0.0101318359375, 0.0263671875, -0.07354736328125, 0.08013916015625, -0.007793426513671875, -0.005550384521484375, -0.01149749755859375, 0.042022705078125, 0.0276031494140625, 0.0006451606750488281, 0.03997802734375, 0.051422119140625, 0.044189453125, -0.0241546630859375, 0.092529296875, -0.0311737060546875, 0.05023193359375, 0.05865478515625, 0.0107269287109375, 0.032470703125, 0.0164337158203125, -0.042144775390625, 0.02862548828125, 0.047271728515625, -0.015472412109375, 0.03289794921875, 0.00820159912109375, -0.001285552978515625, 0.00011599063873291016, 0.01824951171875, -0.05108642578125, 0.00711822509765625, 0.02093505859375, 
-0.0294952392578125, -0.0017652511596679688, -0.013671875, 0.01227569580078125, -0.02679443359375, -0.02032470703125, 0.04046630859375, 0.0035190582275390625, -0.035125732421875, 0.0499267578125, -0.0017452239990234375, 0.04925537109375, -0.04571533203125, 0.01445770263671875, -0.0139923095703125, 0.0055389404296875, -0.01474761962890625, -0.062255859375, 0.0093536376953125, -0.005535125732421875, -0.0123748779296875, -0.01546478271484375, 0.050872802734375, -0.01288604736328125, -0.043914794921875, 0.018341064453125, 0.0240936279296875, 0.0128326416015625, 0.004222869873046875, -0.08251953125, -0.0008249282836914062, 0.01357269287109375, -0.052581787109375, 0.033233642578125, 0.036285400390625, 0.00836181640625, 0.05047607421875, 0.045074462890625, -0.0034732818603515625, 0.0108489990234375, 0.00833892822265625, 0.068603515625, -0.061126708984375, -0.034759521484375, -0.059295654296875, 0.061492919921875, -0.005077362060546875, -0.0345458984375, 0.056365966796875, 0.044708251953125, 0.06622314453125, -0.0141448974609375, 0.0648193359375, -0.0250091552734375, 0.0251617431640625, -0.036529541015625, 0.05572509765625, -0.04180908203125, 0.0028438568115234375, -0.036468505859375, -0.0792236328125, -0.0179595947265625, 0.056365966796875, -0.0206451416015625, 0.0374755859375, 0.060394287109375, 0.05511474609375, 0.0011701583862304688, -0.00010317564010620117, 0.027618408203125, 0.028076171875, 0.03326416015625, 0.059844970703125, 0.044342041015625, -0.054412841796875, 0.050445556640625, -0.0303192138671875, -0.00957489013671875, -0.015533447265625, -0.051177978515625, -0.07470703125, -0.0264129638671875, -0.02532958984375, -0.0247802734375, 0.0005578994750976562, 0.06268310546875, 0.0447998046875, -0.05584716796875, -0.008880615234375, -0.0177764892578125, 0.00435638427734375, -0.0235595703125, -0.0229339599609375, 0.04669189453125, -0.0142822265625, -0.06842041015625, 0.0136260986328125, 0.0136260986328125, 0.021453857421875, -0.017547607421875, -0.01160430908203125, 
-0.014801025390625, -0.0069122314453125, 0.035552978515625, 0.0120086669921875, -0.05706787109375, -0.037689208984375, -0.0020751953125, -0.00861358642578125, 0.0071258544921875, 0.0258331298828125, -0.046356201171875, 0.018402099609375, 0.04180908203125, 0.037811279296875, 0.056793212890625, -0.00035643577575683594, 0.0304107666015625, -0.04180908203125, 0.00795745849609375, 0.00823974609375, 0.03125, 0.0032196044921875, -0.0295562744140625, 0.04925537109375, 0.031982421875, -0.05419921875, -0.048004150390625, 0.0014286041259765625, -0.08062744140625, -0.0166015625, 0.08734130859375, -0.0041961669921875, -0.0268096923828125, -0.00405120849609375, -0.013824462890625, 0.01776123046875, -0.031646728515625, 0.051727294921875, 0.049896240234375, -0.006053924560546875, -0.0179595947265625, -0.0557861328125, 0.032867431640625, 0.0220184326171875, -0.03985595703125, -0.0102691650390625, 0.0137786865234375, 0.0276947021484375, 0.02008056640625, 0.052459716796875, -0.0205230712890625, 0.019317626953125, 0.006832122802734375, 0.0122528076171875, -0.01384735107421875, -0.0137786865234375, -0.008331298828125, 0.016143798828125, -0.00502777099609375, -0.0158233642578125 ] ]
facebook/sam-vit-large
2023-07-11T15:08:45.000Z
[ "transformers", "pytorch", "tf", "sam", "mask-generation", "license:apache-2.0", "endpoints_compatible", "region:us", "has_space" ]
null
facebook
null
null
facebook/sam-vit-large
13
114,038
transformers
2023-04-19T14:17:03
--- license: apache-2.0 --- # Model Card for Segment Anything Model (SAM) - ViT Large (ViT-L) version <p> <img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/sam-architecture.png" alt="Model architecture"> <em> Detailed architecture of Segment Anything Model (SAM).</em> </p> # Table of Contents 0. [TL;DR](#TL;DR) 1. [Model Details](#model-details) 2. [Usage](#usage) 3. [Citation](#citation) # TL;DR [Link to original repository](https://github.com/facebookresearch/segment-anything) | <img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/sam-beancans.png" alt="Snow" width="600" height="600"> | <img src="https://huggingface.co/facebook/sam-vit-huge/discussions/7" alt="Forest" width="600" height="600"> | <img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/sam-car-seg.png" alt="Mountains" width="600" height="600"> | |---------------------------------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------| The **Segment Anything Model (SAM)** produces high-quality object masks from input prompts such as points or boxes, and it can be used to generate masks for all objects in an image. It has been trained on a [dataset](https://segment-anything.com/dataset/index.html) of 11 million images and 1.1 billion masks, and has strong zero-shot performance on a variety of segmentation tasks. The abstract of the paper states: > We introduce the Segment Anything (SA) project: a new task, model, and dataset for image segmentation. 
Using our efficient model in a data collection loop, we built the largest segmentation dataset to date (by far), with over 1 billion masks on 11M licensed and privacy-respecting images. The model is designed and trained to be promptable, so it can transfer zero-shot to new image distributions and tasks. We evaluate its capabilities on numerous tasks and find that its zero-shot performance is impressive -- often competitive with or even superior to prior fully supervised results. We are releasing the Segment Anything Model (SAM) and corresponding dataset (SA-1B) of 1B masks and 11M images at [https://segment-anything.com](https://segment-anything.com) to foster research into foundation models for computer vision. **Disclaimer**: Content from **this** model card has been written by the Hugging Face team, and parts of it were copy-pasted from the original [SAM model card](https://github.com/facebookresearch/segment-anything). # Model Details The SAM model is made up of 4 modules: - The `VisionEncoder`: a ViT-based image encoder. It computes the image embeddings using attention on patches of the image, with relative positional embeddings. - The `PromptEncoder`: generates embeddings for points and bounding boxes. - The `MaskDecoder`: a two-way transformer which performs cross-attention between the image embedding and the point embeddings, and between the point embeddings and the image embeddings. Its outputs are fed to the `Neck`. - The `Neck`: predicts the output masks based on the contextualized masks produced by the `MaskDecoder`. 
# Usage ## Prompted-Mask-Generation ```python from PIL import Image import requests from transformers import SamModel, SamProcessor model = SamModel.from_pretrained("facebook/sam-vit-large").to("cuda") processor = SamProcessor.from_pretrained("facebook/sam-vit-large") img_url = "https://huggingface.co/ybelkada/segment-anything/resolve/main/assets/car.png" raw_image = Image.open(requests.get(img_url, stream=True).raw).convert("RGB") input_points = [[[450, 600]]] # 2D localization of a window ``` ```python inputs = processor(raw_image, input_points=input_points, return_tensors="pt").to("cuda") outputs = model(**inputs) masks = processor.image_processor.post_process_masks(outputs.pred_masks.cpu(), inputs["original_sizes"].cpu(), inputs["reshaped_input_sizes"].cpu()) scores = outputs.iou_scores ``` Among other arguments to generate masks, you can pass 2D locations on the approximate position of your object of interest, a bounding box wrapping the object of interest (the format should be the x, y coordinates of the top-left and bottom-right corners of the bounding box), or a segmentation mask. At the time of writing, passing text as input is not supported by the official model, according to [the official repository](https://github.com/facebookresearch/segment-anything/issues/4#issuecomment-1497626844). For more details, refer to this notebook, which shows a walkthrough of how to use the model, with a visual example! ## Automatic-Mask-Generation The model can be used for generating segmentation masks in a "zero-shot" fashion, given an input image. The model is automatically prompted with a grid of `1024` points, which are all fed to the model. The pipeline is made for automatic mask generation. The following snippet demonstrates how easily you can run it (on any device! 
Simply feed the appropriate `points_per_batch` argument) ```python from transformers import pipeline generator = pipeline("mask-generation", model="facebook/sam-vit-large", device=0, points_per_batch=256) image_url = "https://huggingface.co/ybelkada/segment-anything/resolve/main/assets/car.png" outputs = generator(image_url, points_per_batch=256) ``` Now to display the image: ```python import matplotlib.pyplot as plt import requests from PIL import Image import numpy as np raw_image = Image.open(requests.get(image_url, stream=True).raw).convert("RGB") def show_mask(mask, ax, random_color=False): if random_color: color = np.concatenate([np.random.random(3), np.array([0.6])], axis=0) else: color = np.array([30 / 255, 144 / 255, 255 / 255, 0.6]) h, w = mask.shape[-2:] mask_image = mask.reshape(h, w, 1) * color.reshape(1, 1, -1) ax.imshow(mask_image) plt.imshow(np.array(raw_image)) ax = plt.gca() for mask in outputs["masks"]: show_mask(mask, ax=ax, random_color=True) plt.axis("off") plt.show() ``` # Citation If you use this model, please use the following BibTeX entry. ``` @article{kirillov2023segany, title={Segment Anything}, author={Kirillov, Alexander and Mintun, Eric and Ravi, Nikhila and Mao, Hanzi and Rolland, Chloe and Gustafson, Laura and Xiao, Tete and Whitehead, Spencer and Berg, Alexander C. and Lo, Wan-Yen and Doll{\'a}r, Piotr and Girshick, Ross}, journal={arXiv:2304.02643}, year={2023} } ```
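The box-prompt format mentioned in the Usage section is easy to get wrong. Below is a minimal, dependency-free sketch — the helper name `to_sam_box` is hypothetical, not part of `transformers` — that normalizes two opposite corner points into the `[x_min, y_min, x_max, y_max]` layout, nested per image and per box as the processor expects:

```python
def to_sam_box(corner_a, corner_b):
    """Normalize two opposite corners into a [x_min, y_min, x_max, y_max] box."""
    (xa, ya), (xb, yb) = corner_a, corner_b
    return [min(xa, xb), min(ya, yb), max(xa, xb), max(ya, yb)]

# The processor takes boxes nested per image and per box: [[box, ...]]
input_boxes = [[to_sam_box((650, 900), (75, 275))]]
print(input_boxes)  # [[[75, 275, 650, 900]]]
```

The nested `input_boxes` list could then be passed to the processor in place of `input_points`.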
6,656
[ [ -0.03704833984375, -0.05517578125, 0.038818359375, 0.00574493408203125, -0.033966064453125, -0.0137786865234375, 0.021270751953125, -0.03948974609375, 0.046234130859375, 0.03460693359375, -0.042999267578125, -0.045806884765625, -0.042144775390625, -0.0205841064453125, -0.02520751953125, 0.043060302734375, 0.0118865966796875, -0.01393890380859375, -0.00693511962890625, -0.005138397216796875, -0.0283203125, -0.033538818359375, -0.0546875, -0.0024852752685546875, 0.017669677734375, 0.007427215576171875, 0.051483154296875, 0.0947265625, 0.03839111328125, 0.0231475830078125, -0.0140380859375, 0.0024662017822265625, -0.004688262939453125, -0.00809478759765625, -0.0013751983642578125, -0.0214691162109375, -0.0204620361328125, -0.000530242919921875, 0.05877685546875, 0.033447265625, 0.0014476776123046875, 0.0194244384765625, -0.024566650390625, 0.03448486328125, -0.03790283203125, -0.0009946823120117188, -0.038665771484375, -0.00754547119140625, -0.0125732421875, 0.0108489990234375, -0.013702392578125, -0.02557373046875, -0.002750396728515625, -0.037322998046875, 0.01123046875, 0.00600433349609375, 0.11932373046875, 0.031982421875, -0.010650634765625, 0.006755828857421875, -0.036834716796875, 0.043182373046875, -0.042083740234375, 0.024017333984375, 0.00988006591796875, 0.021514892578125, 0.018157958984375, -0.064208984375, -0.03564453125, 0.0121917724609375, -0.0142974853515625, 0.002735137939453125, -0.0300140380859375, -0.010009765625, 0.0280609130859375, 0.018157958984375, -0.039398193359375, -0.0233306884765625, -0.05804443359375, -0.015106201171875, 0.05999755859375, 0.01263427734375, 0.02947998046875, -0.04638671875, -0.049835205078125, -0.018157958984375, -0.035797119140625, 0.0311737060546875, -0.0012607574462890625, 0.0028667449951171875, -0.0276336669921875, 0.046112060546875, -0.01641845703125, 0.0650634765625, 0.02545166015625, -0.0294342041015625, 0.02734375, -0.006694793701171875, -0.03570556640625, -0.0012369155883789062, 0.042999267578125, 
0.041534423828125, -0.0090484619140625, 0.005523681640625, -0.006378173828125, 0.0083465576171875, 0.01541900634765625, -0.07958984375, -0.027069091796875, 0.0195465087890625, -0.043365478515625, -0.01444244384765625, 0.0231475830078125, -0.043243408203125, -0.00954437255859375, -0.0206756591796875, 0.044677734375, -0.040802001953125, -0.01204681396484375, 0.005992889404296875, -0.016998291015625, 0.046295166015625, 0.007556915283203125, -0.036834716796875, -0.00678253173828125, 0.030487060546875, 0.07562255859375, -0.01045989990234375, -0.0133514404296875, -0.0195159912109375, 0.01453399658203125, -0.0205230712890625, 0.07440185546875, -0.051361083984375, -0.01922607421875, -0.0211181640625, 0.037322998046875, -0.0272979736328125, -0.03778076171875, 0.0347900390625, -0.0257568359375, -0.0102081298828125, -0.010101318359375, -0.0301513671875, -0.04095458984375, 0.0143585205078125, -0.037322998046875, 0.0648193359375, 0.017730712890625, -0.038543701171875, 0.017608642578125, -0.057586669921875, -0.031707763671875, -0.002155303955078125, -0.004810333251953125, -0.0538330078125, 0.0064697265625, 0.0288848876953125, 0.042999267578125, -0.0049896240234375, 0.01041412353515625, -0.03466796875, -0.0183258056640625, 0.019378662109375, 0.0029544830322265625, 0.076171875, 0.01568603515625, -0.03570556640625, 0.018096923828125, -0.05596923828125, -0.0011310577392578125, 0.03497314453125, 0.01141357421875, -0.00380706787109375, -0.0226593017578125, -0.01377105712890625, 0.030059814453125, 0.0149993896484375, -0.050750732421875, -0.0034332275390625, -0.0060272216796875, 0.049102783203125, 0.054412841796875, 0.03485107421875, 0.04071044921875, -0.03668212890625, 0.0390625, 0.0092620849609375, 0.042938232421875, -0.056610107421875, -0.04681396484375, -0.071044921875, -0.055328369140625, 0.015289306640625, 0.0294342041015625, -0.038360595703125, 0.035797119140625, 0.0019483566284179688, -0.051849365234375, -0.037811279296875, -0.026702880859375, 0.016082763671875, 
0.036041259765625, 0.0151824951171875, -0.03460693359375, -0.045196533203125, -0.07562255859375, 0.0203857421875, -0.00637054443359375, -0.004596710205078125, 0.034210205078125, 0.04022216796875, -0.0129241943359375, 0.069580078125, -0.07318115234375, -0.0204620361328125, 0.00046706199645996094, -0.0307159423828125, 0.0010318756103515625, 0.05322265625, 0.046661376953125, -0.057098388671875, -0.037384033203125, -0.01230621337890625, -0.05548095703125, 0.005764007568359375, 0.0007777214050292969, -0.0308837890625, 0.016143798828125, 0.029205322265625, -0.053802490234375, 0.051605224609375, 0.019989013671875, -0.03326416015625, 0.0369873046875, 0.015594482421875, -0.0031147003173828125, -0.0833740234375, 0.029571533203125, 0.01326751708984375, -0.0283355712890625, -0.047943115234375, 0.01389312744140625, -0.00046944618225097656, -0.029815673828125, -0.05010986328125, 0.047515869140625, -0.017578125, -0.00608062744140625, -0.00757598876953125, -0.0003211498260498047, 0.0208740234375, 0.05828857421875, 0.015106201171875, 0.0248870849609375, 0.060821533203125, -0.047515869140625, 0.0194549560546875, 0.03216552734375, -0.036376953125, 0.06280517578125, -0.062347412109375, -0.00203704833984375, -0.0183258056640625, 0.02069091796875, -0.0758056640625, -0.0469970703125, 0.037353515625, -0.03131103515625, 0.018157958984375, -0.017486572265625, -0.0088653564453125, -0.0308990478515625, -0.01358795166015625, 0.0281219482421875, 0.04864501953125, -0.04052734375, 0.043609619140625, 0.0496826171875, -0.0035400390625, -0.0160369873046875, -0.044189453125, -0.0233917236328125, -0.025115966796875, -0.0675048828125, 0.0404052734375, 0.0015087127685546875, -0.00508880615234375, 0.02008056640625, -0.004764556884765625, -0.0018053054809570312, -0.017791748046875, 0.049041748046875, 0.0467529296875, -0.00804901123046875, -0.01390838623046875, -0.0037059783935546875, -0.0162506103515625, -0.003185272216796875, -0.01535797119140625, 0.05706787109375, -0.00923919677734375, 
-0.03656005859375, -0.047393798828125, 0.01032257080078125, 0.0343017578125, -0.03564453125, 0.0305938720703125, 0.052520751953125, -0.03277587890625, -0.0004303455352783203, -0.054412841796875, -0.01580810546875, -0.035980224609375, 0.0234222412109375, -0.03045654296875, -0.0692138671875, 0.05718994140625, 0.00734710693359375, 0.0014810562133789062, 0.055450439453125, 0.0284423828125, -0.01148223876953125, 0.0809326171875, 0.044189453125, 0.0100555419921875, 0.043365478515625, -0.04400634765625, 0.0183868408203125, -0.0731201171875, -0.04949951171875, -0.024169921875, -0.03485107421875, -0.0299224853515625, -0.05474853515625, 0.035125732421875, 0.01148223876953125, -0.041595458984375, 0.036773681640625, -0.0682373046875, 0.046295166015625, 0.04034423828125, 0.01416778564453125, -0.00757598876953125, 0.017669677734375, -0.001529693603515625, 0.01348876953125, -0.049346923828125, -0.0192718505859375, 0.05419921875, 0.025360107421875, 0.035186767578125, -0.03094482421875, 0.04547119140625, 0.0037841796875, 0.0048675537109375, -0.042816162109375, 0.0477294921875, -0.00400543212890625, -0.06903076171875, -0.0078277587890625, -0.00724029541015625, -0.0657958984375, 0.0226287841796875, 0.00452423095703125, -0.0712890625, 0.04803466796875, -0.0015115737915039062, -0.048126220703125, 0.047271728515625, -0.059600830078125, 0.0694580078125, -0.006481170654296875, -0.01265716552734375, 0.0268096923828125, -0.059356689453125, 0.036224365234375, 0.007770538330078125, -0.00574493408203125, -0.019256591796875, 0.00989532470703125, 0.0633544921875, -0.0268096923828125, 0.0635986328125, -0.028472900390625, 0.0204010009765625, 0.0555419921875, -0.01029205322265625, 0.032196044921875, -0.0034542083740234375, 0.006168365478515625, 0.0248870849609375, 0.01155853271484375, -0.039642333984375, -0.027740478515625, 0.04888916015625, -0.05438232421875, -0.040771484375, -0.0296783447265625, -0.0245819091796875, 0.024200439453125, 0.01326751708984375, 0.031829833984375, 0.025360107421875, 
0.01055908203125, 0.01117706298828125, 0.032501220703125, -0.0251922607421875, 0.043212890625, 0.0179290771484375, -0.031341552734375, -0.045928955078125, 0.088623046875, 0.005229949951171875, 0.018096923828125, 0.00756072998046875, -0.004913330078125, -0.0246734619140625, -0.004207611083984375, -0.039886474609375, 0.036651611328125, -0.045196533203125, -0.043731689453125, -0.043609619140625, -0.047088623046875, -0.0262603759765625, -0.033294677734375, -0.033721923828125, -0.042266845703125, -0.017364501953125, -0.0038928985595703125, 0.024200439453125, 0.03790283203125, -0.016265869140625, 0.041900634765625, -0.039306640625, 0.01355743408203125, 0.0266265869140625, 0.02398681640625, 0.0016927719116210938, -0.041839599609375, -0.007640838623046875, 0.0006537437438964844, -0.046600341796875, -0.0447998046875, 0.037628173828125, -0.006072998046875, 0.032073974609375, 0.0526123046875, -0.0017995834350585938, 0.07708740234375, -0.0195159912109375, 0.06524658203125, 0.024078369140625, -0.068115234375, 0.03839111328125, -0.0103759765625, 0.0184173583984375, 0.022491455078125, 0.02239990234375, -0.0386962890625, 0.00038933753967285156, -0.0704345703125, -0.0633544921875, 0.080810546875, 0.0135040283203125, -0.007297515869140625, 0.0073394775390625, 0.0237884521484375, -0.0112762451171875, 0.017669677734375, -0.0579833984375, -0.036834716796875, -0.0267791748046875, 0.01428985595703125, 0.0142364501953125, -0.005718231201171875, -0.004016876220703125, -0.039154052734375, 0.061553955078125, 0.01181793212890625, 0.053741455078125, 0.0268402099609375, -0.004154205322265625, -0.01209259033203125, -0.0171051025390625, 0.0496826171875, 0.045806884765625, -0.023590087890625, 0.0023975372314453125, -0.005481719970703125, -0.01358795166015625, 0.007785797119140625, 0.017669677734375, -0.04595947265625, 0.00868988037109375, 0.00823211669921875, 0.08599853515625, -0.009521484375, -0.0299224853515625, 0.033111572265625, 0.0222930908203125, -0.0253753662109375, -0.023193359375, 
0.00817108154296875, 0.006298065185546875, 0.036834716796875, 0.0347900390625, 0.017608642578125, -0.0062103271484375, -0.0299530029296875, 0.01543426513671875, 0.037689208984375, -0.03582763671875, -0.0260467529296875, 0.05499267578125, -0.0034332275390625, -0.0255126953125, 0.0282135009765625, -0.0283203125, -0.054443359375, 0.0643310546875, 0.042327880859375, 0.06817626953125, -0.03533935546875, 0.0419921875, 0.05853271484375, 0.024993896484375, 0.0168304443359375, 0.0032176971435546875, 0.00559234619140625, -0.033416748046875, -0.0308380126953125, -0.07012939453125, -0.007190704345703125, 0.0255584716796875, -0.035125732421875, 0.0184478759765625, -0.042999267578125, -0.0098114013671875, 0.0096588134765625, -0.01015472412109375, -0.040802001953125, 0.03326416015625, 0.006160736083984375, 0.0572509765625, -0.0635986328125, 0.041473388671875, 0.05328369140625, -0.03924560546875, -0.0699462890625, -0.016632080078125, -0.01079559326171875, -0.079833984375, 0.0238800048828125, 0.033447265625, 0.0187225341796875, -0.0019893646240234375, -0.057891845703125, -0.0838623046875, 0.09136962890625, 0.0239410400390625, -0.0189971923828125, -0.00574493408203125, 0.0187225341796875, 0.008880615234375, -0.0494384765625, 0.0100860595703125, 0.045318603515625, 0.04425048828125, 0.04205322265625, -0.03375244140625, 0.0199432373046875, -0.022125244140625, 0.01308441162109375, 0.0204010009765625, -0.07183837890625, 0.0723876953125, -0.01140594482421875, -0.0279388427734375, 0.0087127685546875, 0.031768798828125, 0.0280303955078125, 0.034759521484375, 0.04876708984375, 0.055145263671875, 0.03802490234375, -0.021026611328125, 0.073974609375, -0.027130126953125, 0.0175933837890625, 0.05670166015625, 0.0003485679626464844, 0.04278564453125, 0.0226898193359375, -0.015655517578125, 0.033447265625, 0.06866455078125, -0.035614013671875, 0.037445068359375, 0.0003864765167236328, -0.0035247802734375, -0.0308990478515625, -0.027923583984375, -0.03411865234375, 0.041412353515625, 
0.01261138916015625, -0.04412841796875, -0.01065826416015625, 0.005832672119140625, 0.0030536651611328125, -0.03387451171875, -0.028717041015625, 0.046600341796875, 0.00310516357421875, -0.043212890625, 0.04913330078125, 0.01739501953125, 0.0288238525390625, -0.043060302734375, 0.00952911376953125, -0.0177154541015625, 0.00495147705078125, -0.0284271240234375, -0.045074462890625, 0.043060302734375, -0.007595062255859375, -0.00949859619140625, 0.013031005859375, 0.06695556640625, -0.01904296875, -0.061431884765625, -0.0014963150024414062, 0.01116943359375, 0.0264434814453125, -0.0185699462890625, -0.058868408203125, 0.0300445556640625, 0.0220184326171875, -0.01690673828125, 0.0225982666015625, 0.015899658203125, 0.00016224384307861328, 0.036956787109375, 0.0611572265625, -0.004878997802734375, 0.0178070068359375, -0.02734375, 0.0799560546875, -0.057586669921875, -0.040557861328125, -0.057037353515625, 0.061492919921875, -0.0209197998046875, -0.00799560546875, 0.05194091796875, 0.05865478515625, 0.0679931640625, -0.005260467529296875, 0.041839599609375, -0.031219482421875, 0.021392822265625, -0.01535797119140625, 0.041107177734375, -0.0562744140625, -0.00733184814453125, -0.0228729248046875, -0.0875244140625, -0.0286712646484375, 0.0689697265625, -0.019256591796875, 0.01544952392578125, 0.04364013671875, 0.06866455078125, -0.01482391357421875, -0.00305938720703125, 0.0204315185546875, 0.010528564453125, 0.0157318115234375, 0.0252838134765625, 0.047088623046875, -0.039764404296875, 0.043853759765625, -0.0604248046875, -0.00745391845703125, -0.005588531494140625, -0.045928955078125, -0.06219482421875, -0.06134033203125, -0.041351318359375, -0.02520751953125, -0.01401519775390625, 0.04443359375, 0.088623046875, -0.04644775390625, -0.0063629150390625, 0.0104827880859375, 0.0138397216796875, -0.01264190673828125, -0.0172271728515625, 0.051422119140625, 0.0038814544677734375, -0.06793212890625, 0.007732391357421875, 0.038116455078125, 0.007671356201171875, 
-0.007015228271484375, -0.002513885498046875, -0.014923095703125, 0.0160675048828125, 0.05023193359375, 0.029998779296875, -0.048126220703125, -0.020050048828125, 0.002239227294921875, 0.011199951171875, 0.0185394287109375, 0.035186767578125, -0.035430908203125, 0.0299530029296875, 0.02301025390625, 0.032806396484375, 0.06414794921875, 0.0249176025390625, -0.004016876220703125, -0.045654296875, 0.0141448974609375, -0.00592803955078125, 0.0226898193359375, 0.0309295654296875, -0.01325225830078125, 0.045806884765625, 0.0258941650390625, -0.04095458984375, -0.0657958984375, 0.0092926025390625, -0.0897216796875, -0.0247039794921875, 0.08538818359375, -0.0189971923828125, -0.0548095703125, 0.01342010498046875, -0.0218963623046875, 0.020355224609375, -0.03167724609375, 0.04193115234375, 0.02313232421875, -0.0158538818359375, -0.0236663818359375, -0.016754150390625, 0.0272216796875, 0.01139068603515625, -0.0538330078125, -0.0290679931640625, 0.029083251953125, 0.0299224853515625, 0.036651611328125, 0.035858154296875, -0.01898193359375, 0.01265716552734375, 0.001575469970703125, 0.01287078857421875, -0.0226287841796875, -0.0229339599609375, -0.020660400390625, 0.0259246826171875, -0.032318115234375, -0.0447998046875 ] ]
pyannote/embedding
2023-10-22T08:44:44.000Z
[ "pyannote-audio", "pytorch", "tensorboard", "pyannote", "pyannote-audio-model", "audio", "voice", "speech", "speaker", "speaker-recognition", "speaker-verification", "speaker-identification", "speaker-embedding", "dataset:voxceleb", "license:mit", "region:us" ]
null
pyannote
null
null
pyannote/embedding
46
113,012
pyannote-audio
2022-03-02T23:29:05
--- tags: - pyannote - pyannote-audio - pyannote-audio-model - audio - voice - speech - speaker - speaker-recognition - speaker-verification - speaker-identification - speaker-embedding datasets: - voxceleb license: mit inference: false extra_gated_prompt: "The collected information will help acquire a better knowledge of pyannote.audio userbase and help its maintainers apply for grants to improve it further. If you are an academic researcher, please cite the relevant papers in your own publications using the model. If you work for a company, please consider contributing back to pyannote.audio development (e.g. through unrestricted gifts). We also provide scientific consulting services around speaker diarization and machine listening." extra_gated_fields: Company/university: text Website: text I plan to use this model for (task, type of audio data, etc): text --- Using this open-source model in production? Make the most of it thanks to our [consulting services](https://herve.niderb.fr/consulting.html). # 🎹 Speaker embedding Relies on pyannote.audio 2.1: see [installation instructions](https://github.com/pyannote/pyannote-audio/). This model is based on the [canonical x-vector TDNN-based architecture](https://ieeexplore.ieee.org/abstract/document/8461375), but with filter banks replaced with [trainable SincNet features](https://ieeexplore.ieee.org/document/8639585). See [`XVectorSincNet`](https://github.com/pyannote/pyannote-audio/blob/3c988c028dc505c64fe776720372f6fe816b585a/pyannote/audio/models/embedding/xvector.py#L104-L169) architecture for implementation details. ## Basic usage ```python # 1. visit hf.co/pyannote/embedding and accept user conditions # 2. visit hf.co/settings/tokens to create an access token # 3. 
instantiate pretrained model from pyannote.audio import Model model = Model.from_pretrained("pyannote/embedding", use_auth_token="ACCESS_TOKEN_GOES_HERE") ``` ```python from pyannote.audio import Inference inference = Inference(model, window="whole") embedding1 = inference("speaker1.wav") embedding2 = inference("speaker2.wav") # `embeddingX` is (1 x D) numpy array extracted from the file as a whole. from scipy.spatial.distance import cdist distance = cdist(embedding1, embedding2, metric="cosine")[0,0] # `distance` is a `float` describing how dissimilar speakers 1 and 2 are. ``` Using cosine distance directly, this model reaches 2.8% equal error rate (EER) on VoxCeleb 1 test set. This is without voice activity detection (VAD) nor probabilistic linear discriminant analysis (PLDA). Expect even better results when adding one of those. ## Advanced usage ### Running on GPU ```python import torch inference.to(torch.device("cuda")) embedding = inference("audio.wav") ``` ### Extract embedding from an excerpt ```python from pyannote.audio import Inference from pyannote.core import Segment inference = Inference(model, window="whole") excerpt = Segment(13.37, 19.81) embedding = inference.crop("audio.wav", excerpt) # `embedding` is (1 x D) numpy array extracted from the file excerpt. ``` ### Extract embeddings using a sliding window ```python from pyannote.audio import Inference inference = Inference(model, window="sliding", duration=3.0, step=1.0) embeddings = inference("audio.wav") # `embeddings` is a (N x D) pyannote.core.SlidingWindowFeature # `embeddings[i]` is the embedding of the ith position of the # sliding window, i.e. from [i * step, i * step + duration]. 
``` ## Citation ```bibtex @inproceedings{Bredin2020, Title = {{pyannote.audio: neural building blocks for speaker diarization}}, Author = {{Bredin}, Herv{\'e} and {Yin}, Ruiqing and {Coria}, Juan Manuel and {Gelly}, Gregory and {Korshunov}, Pavel and {Lavechin}, Marvin and {Fustes}, Diego and {Titeux}, Hadrien and {Bouaziz}, Wassim and {Gill}, Marie-Philippe}, Booktitle = {ICASSP 2020, IEEE International Conference on Acoustics, Speech, and Signal Processing}, Address = {Barcelona, Spain}, Month = {May}, Year = {2020}, } ``` ```bibtex @inproceedings{Coria2020, author="Coria, Juan M. and Bredin, Herv{\'e} and Ghannay, Sahar and Rosset, Sophie", editor="Espinosa-Anke, Luis and Mart{\'i}n-Vide, Carlos and Spasi{\'{c}}, Irena", title="{A Comparison of Metric Learning Loss Functions for End-To-End Speaker Verification}", booktitle="Statistical Language and Speech Processing", year="2020", publisher="Springer International Publishing", pages="137--148", isbn="978-3-030-59430-5" } ```
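To make the cosine-distance verification decision described above concrete, here is a small dependency-free sketch. The decision threshold is hypothetical and should be tuned on held-out data; only the 2.8% EER figure on VoxCeleb 1 comes from the card, and real embeddings are produced by `Inference`, not hand-written as here:

```python
import math

def cosine_distance(u, v):
    """Plain-Python cosine distance, matching scipy's cdist(..., metric="cosine")
    for a single pair of vectors: 1 - (u . v) / (|u| |v|)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return 1.0 - dot / (norm_u * norm_v)

THRESHOLD = 0.5  # hypothetical; calibrate on your own trials
emb1 = [0.1, 0.3, -0.2, 0.7]   # stand-ins for inference("speaker1.wav")
emb2 = [0.09, 0.31, -0.18, 0.69]
same_speaker = cosine_distance(emb1, emb2) < THRESHOLD
print(same_speaker)  # True
```

The same comparison generalizes to many trials: scoring every pair and sweeping the threshold is how the equal error rate above is computed.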
4,553
[ [ -0.0207061767578125, -0.04705810546875, 0.02197265625, 0.022186279296875, -0.01103973388671875, -0.015411376953125, -0.041717529296875, -0.0254669189453125, 0.0205535888671875, 0.0239715576171875, -0.017608642578125, -0.058929443359375, -0.02099609375, -0.026519775390625, -0.0269622802734375, 0.051177978515625, 0.020172119140625, 0.00848388671875, -0.019287109375, -0.0004177093505859375, -0.0184478759765625, -0.022857666015625, -0.033172607421875, -0.0272216796875, 0.00988006591796875, 0.0211181640625, 0.0186004638671875, 0.035186767578125, 0.012298583984375, 0.0252227783203125, -0.034271240234375, -0.004302978515625, -0.00014710426330566406, -0.010894775390625, 0.00637054443359375, -0.0257720947265625, -0.033233642578125, 0.01424407958984375, 0.055877685546875, 0.04278564453125, -0.0218963623046875, 0.0132598876953125, 0.0038471221923828125, 0.004962921142578125, -0.023529052734375, 0.01241302490234375, -0.036956787109375, 0.00511932373046875, -0.0121307373046875, -0.0099945068359375, -0.03936767578125, -0.0142974853515625, 0.0298614501953125, -0.05438232421875, 0.00908660888671875, -0.0130157470703125, 0.070556640625, 0.005229949951171875, -0.001544952392578125, -0.021331787109375, -0.04864501953125, 0.05377197265625, -0.078369140625, 0.03778076171875, 0.0347900390625, 0.00710296630859375, -0.0157470703125, -0.059844970703125, -0.0396728515625, -0.03326416015625, 0.0062713623046875, 0.0176849365234375, -0.01055908203125, 0.012786865234375, 0.0172576904296875, 0.026641845703125, -0.03369140625, -0.00310516357421875, -0.040557861328125, -0.026947021484375, 0.049713134765625, -0.03033447265625, 0.0302886962890625, -0.0222015380859375, -0.0243072509765625, -0.0323486328125, -0.0306854248046875, 0.005306243896484375, 0.042236328125, 0.029022216796875, -0.040557861328125, 0.031494140625, 0.0146942138671875, 0.038970947265625, 0.006763458251953125, -0.01763916015625, 0.051239013671875, -0.029815673828125, -0.01163482666015625, 0.0430908203125, 0.0740966796875, 
0.004589080810546875, 0.01410675048828125, 0.00934600830078125, -0.0007548332214355469, -0.021514892578125, -0.00525665283203125, -0.0399169921875, -0.049652099609375, 0.0231781005859375, -0.040740966796875, -0.0030994415283203125, 0.0012950897216796875, -0.051361083984375, -0.01528167724609375, -0.0157470703125, 0.06353759765625, -0.03955078125, -0.05328369140625, 0.0087738037109375, -0.0283203125, 0.00798797607421875, 0.0030498504638671875, -0.0682373046875, 0.0265045166015625, 0.045135498046875, 0.0869140625, 0.01824951171875, -0.018585205078125, -0.042449951171875, -0.0034084320068359375, -0.023773193359375, 0.032135009765625, -0.02154541015625, -0.03662109375, -0.00848388671875, -0.01558685302734375, -0.01232147216796875, -0.05023193359375, 0.05023193359375, 0.008392333984375, 0.0198211669921875, -0.004550933837890625, -0.045928955078125, -0.010284423828125, -0.0290679931640625, -0.032958984375, 0.0794677734375, 0.004627227783203125, -0.0543212890625, 0.014495849609375, -0.045501708984375, -0.0120849609375, -0.016937255859375, -0.00849151611328125, -0.050994873046875, -0.0007796287536621094, 0.01396942138671875, 0.022796630859375, -0.003688812255859375, 0.0064697265625, -0.02191162109375, -0.036407470703125, 0.01495361328125, -0.0248870849609375, 0.0887451171875, 0.0031490325927734375, -0.04888916015625, -0.004383087158203125, -0.08306884765625, -0.014556884765625, -0.0016832351684570312, -0.043853759765625, -0.03765869140625, 0.0100860595703125, 0.0197906494140625, 0.005413055419921875, 0.0032100677490234375, -0.0655517578125, -0.018951416015625, -0.050079345703125, 0.045623779296875, 0.0458984375, 0.00496673583984375, 0.024139404296875, -0.0183868408203125, 0.006603240966796875, 0.0097808837890625, -0.0005068778991699219, -0.02313232421875, -0.048492431640625, -0.0399169921875, -0.04046630859375, 0.028533935546875, 0.041290283203125, -0.0229949951171875, 0.03680419921875, -0.00848388671875, -0.053131103515625, -0.055816650390625, 0.0036106109619140625, 
0.031280517578125, 0.034393310546875, 0.04608154296875, -0.015106201171875, -0.045684814453125, -0.058990478515625, -0.01280975341796875, -0.032012939453125, -0.01154327392578125, 0.039642333984375, 0.0275726318359375, 0.01378631591796875, 0.06341552734375, -0.0208587646484375, -0.0160675048828125, 0.01194000244140625, 0.005893707275390625, 0.039642333984375, 0.06378173828125, 0.0391845703125, -0.05389404296875, -0.042083740234375, -0.0001951456069946289, -0.03460693359375, -0.0205230712890625, -0.0254669189453125, -0.004425048828125, -0.00012165307998657227, 0.038543701171875, -0.059600830078125, 0.03717041015625, 0.022613525390625, -0.0242767333984375, 0.04754638671875, -0.00829315185546875, -0.0096893310546875, -0.07489013671875, -0.0016841888427734375, 0.0249176025390625, -0.0074310302734375, -0.046417236328125, -0.046905517578125, -0.00628662109375, -0.0028018951416015625, -0.02996826171875, 0.03839111328125, -0.04327392578125, -0.0185089111328125, 0.007476806640625, 0.046722412109375, -0.007205963134765625, 0.0533447265625, -0.0022792816162109375, 0.047119140625, 0.049102783203125, -0.04583740234375, 0.0238189697265625, 0.041900634765625, -0.054901123046875, 0.037200927734375, -0.07244873046875, 0.01342010498046875, 0.0181121826171875, 0.015289306640625, -0.086181640625, 0.015869140625, 0.0426025390625, -0.0645751953125, 0.0347900390625, -0.0281524658203125, -0.01395416259765625, -0.0177764892578125, -0.018463134765625, 0.0255889892578125, 0.03369140625, -0.055572509765625, 0.037200927734375, 0.031890869140625, -0.026092529296875, -0.034881591796875, -0.06475830078125, -0.000576019287109375, -0.02099609375, -0.05029296875, 0.04632568359375, -0.0044708251953125, -0.0286865234375, 0.0066986083984375, -0.0112457275390625, 0.0121002197265625, -0.0151519775390625, 0.0288543701171875, 0.00713348388671875, -0.0341796875, 0.01401519775390625, -0.005573272705078125, -0.0006651878356933594, 0.006473541259765625, -0.056488037109375, 0.0266571044921875, 
0.0090484619140625, -0.03131103515625, -0.05322265625, 0.0205078125, 0.036468505859375, -0.03948974609375, 0.034515380859375, 0.07159423828125, -0.02178955078125, -0.0158233642578125, -0.041412353515625, 0.0005359649658203125, -0.037628173828125, 0.051910400390625, -0.0147857666015625, -0.043304443359375, 0.03692626953125, 0.014251708984375, 0.0116424560546875, 0.033233642578125, 0.047119140625, -0.00860595703125, 0.05316162109375, 0.0246124267578125, 0.006053924560546875, 0.064697265625, -0.04327392578125, 0.019073486328125, -0.078125, -0.0335693359375, -0.0570068359375, -0.00591278076171875, -0.0574951171875, -0.044342041015625, 0.0246429443359375, -0.00018405914306640625, -0.0183868408203125, 0.029205322265625, -0.046600341796875, 0.01129913330078125, 0.05902099609375, 0.011932373046875, -0.01435089111328125, 0.0178680419921875, -0.0272369384765625, -0.0022735595703125, -0.0435791015625, -0.022674560546875, 0.07830810546875, 0.044769287109375, 0.037384033203125, 0.006946563720703125, 0.06060791015625, 0.01113128662109375, -0.0182342529296875, -0.05963134765625, 0.032257080078125, -0.01520538330078125, -0.033447265625, -0.03509521484375, -0.0304107666015625, -0.05853271484375, 0.0286102294921875, 0.0019092559814453125, -0.08367919921875, 0.051422119140625, -0.0065155029296875, -0.029510498046875, 0.0347900390625, -0.055145263671875, 0.051788330078125, -0.001983642578125, -0.026153564453125, -0.0241241455078125, -0.03253173828125, 0.0043487548828125, 0.0256805419921875, 0.0121612548828125, -0.007476806640625, 0.01531982421875, 0.09075927734375, -0.026519775390625, 0.061309814453125, -0.050262451171875, -0.0086517333984375, 0.04864501953125, -0.016571044921875, 0.0201568603515625, 0.010894775390625, -0.0118255615234375, 0.0176544189453125, 0.0079498291015625, -0.0198211669921875, -0.021881103515625, 0.05859375, -0.0758056640625, -0.038177490234375, -0.01038360595703125, -0.028839111328125, -0.0012388229370117188, 0.00414276123046875, 0.0159912109375, 
0.06744384765625, -0.00821685791015625, 0.0313720703125, 0.06243896484375, -0.0428466796875, 0.0616455078125, 0.0202484130859375, 0.0036869049072265625, -0.06439208984375, 0.06854248046875, 0.0223846435546875, 0.0152740478515625, 0.03887939453125, 0.026519775390625, -0.026397705078125, -0.04754638671875, -0.0230712890625, 0.0290679931640625, -0.046417236328125, 0.01654052734375, -0.047882080078125, -0.0174560546875, -0.0386962890625, 0.01654052734375, -0.04827880859375, -0.04443359375, -0.0316162109375, -0.00986480712890625, 0.0211639404296875, 0.02728271484375, -0.0386962890625, 0.0245208740234375, -0.039154052734375, 0.005748748779296875, 0.020416259765625, 0.018890380859375, 0.01198577880859375, -0.061187744140625, -0.0511474609375, 0.01520538330078125, -0.017364501953125, -0.069580078125, 0.018096923828125, 0.035125732421875, 0.0721435546875, 0.0243072509765625, -0.01248931884765625, 0.034576416015625, -0.02239990234375, 0.077880859375, 0.0230865478515625, -0.07830810546875, 0.043243408203125, -0.030120849609375, 0.00992584228515625, 0.026611328125, 0.01026153564453125, -0.041107177734375, -0.01132965087890625, -0.050933837890625, -0.0789794921875, 0.0660400390625, 0.0220947265625, 0.01319122314453125, 0.00534820556640625, 0.0156402587890625, -0.0104522705078125, 0.0020236968994140625, -0.0460205078125, -0.03326416015625, -0.0406494140625, -0.0105438232421875, -0.021728515625, -0.0166168212890625, -0.0020904541015625, -0.043060302734375, 0.07061767578125, 0.0162811279296875, 0.050201416015625, 0.049407958984375, -0.00806427001953125, 0.0003285408020019531, 0.0001971721649169922, 0.055755615234375, 0.036285400390625, -0.03466796875, -0.0007748603820800781, 0.003314971923828125, -0.056854248046875, 0.01114654541015625, 0.009918212890625, 0.0023250579833984375, 0.040771484375, 0.036834716796875, 0.08209228515625, 0.02777099609375, -0.0198822021484375, 0.0457763671875, -0.0036773681640625, -0.0299835205078125, -0.049652099609375, -0.0044097900390625, 
0.0288543701171875, 0.0295867919921875, 0.0294952392578125, -0.0030422210693359375, -0.00623321533203125, -0.0253448486328125, 0.0265655517578125, 0.011077880859375, -0.034149169921875, -0.0247955322265625, 0.04986572265625, 0.016845703125, -0.053955078125, 0.052825927734375, -0.0030651092529296875, -0.02435302734375, 0.05413818359375, 0.05084228515625, 0.08642578125, -0.0341796875, 0.01885986328125, 0.04827880859375, 0.0176849365234375, 0.004695892333984375, 0.018798828125, -0.03704833984375, -0.036956787109375, -0.0179595947265625, -0.046295166015625, -0.0228424072265625, 0.0102996826171875, -0.0447998046875, 0.018951416015625, -0.03851318359375, -0.02099609375, 0.0152740478515625, 0.0122222900390625, -0.0270538330078125, 0.002864837646484375, 0.02587890625, 0.07672119140625, -0.0574951171875, 0.05419921875, 0.045135498046875, -0.0301971435546875, -0.06396484375, 0.0050201416015625, -0.007030487060546875, -0.022125244140625, 0.0235595703125, -0.00449371337890625, -0.0030498504638671875, -0.006229400634765625, -0.0389404296875, -0.06671142578125, 0.086181640625, 0.031890869140625, -0.057647705078125, 0.01467132568359375, -0.01392364501953125, 0.0279388427734375, -0.024078369140625, 0.0248870849609375, 0.036376953125, 0.040679931640625, -0.0065460205078125, -0.0914306640625, -0.0111541748046875, -0.03460693359375, -0.01094818115234375, -0.0030498504638671875, -0.05133056640625, 0.08282470703125, -0.01285552978515625, 0.0005145072937011719, 0.0015516281127929688, 0.054168701171875, 0.022674560546875, 0.035064697265625, 0.0389404296875, 0.04388427734375, 0.06585693359375, -0.0010013580322265625, 0.049072265625, -0.0203094482421875, 0.0225982666015625, 0.0924072265625, 0.0178070068359375, 0.06982421875, 0.035064697265625, -0.035736083984375, 0.049072265625, 0.044342041015625, -0.01361083984375, 0.055816650390625, 0.024322509765625, -0.018402099609375, -0.006885528564453125, -0.0012693405151367188, -0.041290283203125, 0.054656982421875, 0.0235595703125, 
-0.030548095703125, 0.0160369873046875, -0.0038471221923828125, 0.0067901611328125, -0.00409698486328125, -0.009490966796875, 0.039642333984375, 0.035064697265625, -0.0233306884765625, 0.052520751953125, 0.0008187294006347656, 0.055328369140625, -0.0411376953125, -0.0021305084228515625, -0.0054779052734375, 0.0121612548828125, -0.025054931640625, -0.0244293212890625, 0.00302886962890625, -0.01666259765625, -0.01035308837890625, -0.0025634765625, 0.036407470703125, -0.05523681640625, -0.022979736328125, 0.040130615234375, 0.0160064697265625, 0.016754150390625, 0.0077362060546875, -0.04095458984375, -0.0027599334716796875, 0.006069183349609375, -0.029510498046875, 0.0161285400390625, 0.0267333984375, 0.0159149169921875, 0.0228424072265625, 0.0501708984375, 0.0278167724609375, 0.019744873046875, 0.01477813720703125, 0.0401611328125, -0.03363037109375, -0.0699462890625, -0.0606689453125, 0.040435791015625, -0.012298583984375, -0.0282135009765625, 0.0609130859375, 0.04669189453125, 0.06744384765625, 0.007793426513671875, 0.056396484375, 0.00414276123046875, 0.0298919677734375, -0.03106689453125, 0.06634521484375, -0.0416259765625, 0.0295867919921875, -0.047637939453125, -0.06903076171875, 0.0001386404037475586, 0.066650390625, -0.0178070068359375, 0.0098724365234375, 0.036102294921875, 0.07891845703125, -0.0109710693359375, 0.003665924072265625, 0.0195465087890625, 0.037506103515625, 0.034576416015625, 0.041351318359375, 0.053131103515625, -0.04443359375, 0.0477294921875, -0.03350830078125, -0.01108551025390625, -0.0081634521484375, -0.050689697265625, -0.0621337890625, -0.06591796875, -0.04705810546875, -0.025970458984375, -0.00039505958557128906, 0.0897216796875, 0.06292724609375, -0.06427001953125, -0.042877197265625, 0.013580322265625, 0.0160675048828125, -0.03424072265625, -0.0133209228515625, 0.0576171875, 0.016876220703125, -0.049072265625, 0.052825927734375, 0.01464080810546875, 0.0086822509765625, -0.0186920166015625, -0.02142333984375, -0.045745849609375, 
0.01035308837890625, 0.01181793212890625, 0.0179595947265625, -0.041046142578125, -0.017181396484375, -0.0289306640625, 0.00567626953125, 0.016937255859375, 0.038818359375, -0.0426025390625, 0.042236328125, 0.05535888671875, 0.01558685302734375, 0.0606689453125, -0.019287109375, 0.0303802490234375, -0.051361083984375, 0.0257110595703125, 0.01318359375, 0.0244140625, 0.045684814453125, -0.011474609375, 0.0130462646484375, 0.034576416015625, -0.03704833984375, -0.0767822265625, -0.03033447265625, -0.0692138671875, -0.00920867919921875, 0.08343505859375, -0.0249176025390625, -0.02239990234375, -0.0086822509765625, -0.0138092041015625, 0.046966552734375, -0.038421630859375, 0.040557861328125, 0.041229248046875, 0.0038089752197265625, -0.0068359375, -0.050994873046875, 0.037628173828125, 0.01617431640625, -0.0216064453125, -0.005130767822265625, 0.01389312744140625, 0.0304412841796875, 0.036163330078125, 0.061279296875, -0.00954437255859375, 0.0313720703125, 0.03887939453125, 0.01885986328125, -0.03094482421875, -0.02215576171875, -0.035614013671875, 0.0055999755859375, -0.006969451904296875, -0.05218505859375 ] ]
google/mobilenet_v1_0.75_192
2023-05-16T16:38:23.000Z
[ "transformers", "pytorch", "mobilenet_v1", "image-classification", "vision", "dataset:imagenet-1k", "arxiv:1704.04861", "license:other", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
image-classification
google
null
null
google/mobilenet_v1_0.75_192
2
112,627
transformers
2022-11-10T16:06:51
--- license: other tags: - vision - image-classification datasets: - imagenet-1k widget: - src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/tiger.jpg example_title: Tiger - src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/teapot.jpg example_title: Teapot - src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/palace.jpg example_title: Palace --- # MobileNet V1 MobileNet V1 model pre-trained on ImageNet-1k at resolution 192x192. It was introduced in [MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications](https://arxiv.org/abs/1704.04861) by Howard et al, and first released in [this repository](https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet_v1.md). Disclaimer: The team releasing MobileNet V1 did not write a model card for this model so this model card has been written by the Hugging Face team. ## Model description From the [original README](https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet_v1.md): > MobileNets are small, low-latency, low-power models parameterized to meet the resource constraints of a variety of use cases. They can be built upon for classification, detection, embeddings and segmentation similar to how other popular large scale models, such as Inception, are used. MobileNets can be run efficiently on mobile devices [...] MobileNets trade off between latency, size and accuracy while comparing favorably with popular models from the literature. ## Intended uses & limitations You can use the raw model for image classification. See the [model hub](https://huggingface.co/models?search=mobilenet_v1) to look for fine-tuned versions on a task that interests you. 
### How to use Here is how to use this model to classify an image of the COCO 2017 dataset into one of the 1,000 ImageNet classes: ```python from transformers import AutoImageProcessor, AutoModelForImageClassification from PIL import Image import requests url = "http://images.cocodataset.org/val2017/000000039769.jpg" image = Image.open(requests.get(url, stream=True).raw) preprocessor = AutoImageProcessor.from_pretrained("google/mobilenet_v1_0.75_192") model = AutoModelForImageClassification.from_pretrained("google/mobilenet_v1_0.75_192") inputs = preprocessor(images=image, return_tensors="pt") outputs = model(**inputs) logits = outputs.logits # model predicts one of the 1000 ImageNet classes predicted_class_idx = logits.argmax(-1).item() print("Predicted class:", model.config.id2label[predicted_class_idx]) ``` Note: This model actually predicts 1001 classes, the 1000 classes from ImageNet plus an extra “background” class (index 0). Currently, both the feature extractor and model support PyTorch.
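To make the "background" offset concrete, here is a minimal, self-contained sketch (with made-up logits, not an actual model output) of how index 0 shifts the mapping relative to a standard 0-based 1000-class ImageNet label list:

```python
# Hypothetical 1001-way logits: index 0 is the extra "background" class,
# indices 1..1000 correspond to the 1000 ImageNet classes.
logits = [0.0] * 1001
logits[282] = 5.0  # pretend the model is most confident about index 282

predicted_class_idx = max(range(len(logits)), key=logits.__getitem__)
print(predicted_class_idx)  # 282

# model.config.id2label already accounts for the offset; if you instead
# look labels up in a standard 0-based 1000-class ImageNet list, subtract 1:
imagenet_idx = predicted_class_idx - 1
print(imagenet_idx)  # 281
```

This is only an illustration of the index arithmetic; in practice `model.config.id2label` is the authoritative mapping for this checkpoint.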
2,778
[ [ -0.042816162109375, -0.0174407958984375, -0.020294189453125, 0.005603790283203125, -0.02490234375, -0.031341552734375, 0.019866943359375, -0.047210693359375, 0.033416748046875, 0.030853271484375, -0.03216552734375, -0.025726318359375, -0.04669189453125, -0.01378631591796875, -0.0296783447265625, 0.0523681640625, 0.005214691162109375, -0.00009143352508544922, -0.033203125, -0.0266571044921875, -0.01232147216796875, -0.037811279296875, -0.07452392578125, -0.026702880859375, 0.04736328125, 0.0341796875, 0.044586181640625, 0.043548583984375, 0.04852294921875, 0.02880859375, 0.0003325939178466797, 0.0007228851318359375, -0.0200042724609375, -0.0311737060546875, 0.0149688720703125, -0.0267791748046875, -0.035064697265625, 0.03369140625, 0.017791748046875, 0.021209716796875, 0.00843048095703125, 0.040069580078125, -0.0018301010131835938, 0.046112060546875, -0.048492431640625, -0.006328582763671875, -0.040618896484375, 0.017303466796875, -0.001300811767578125, 0.00574493408203125, -0.02117919921875, -0.005340576171875, 0.019256591796875, -0.0338134765625, 0.0189666748046875, -0.01026153564453125, 0.08917236328125, 0.0238494873046875, -0.033782958984375, -0.007266998291015625, -0.027862548828125, 0.0423583984375, -0.0341796875, 0.02215576171875, 0.0399169921875, 0.0382080078125, 0.00946807861328125, -0.0909423828125, -0.036346435546875, -0.01033782958984375, 0.0005741119384765625, 0.0049285888671875, -0.0189056396484375, 0.0009503364562988281, 0.01025390625, 0.032470703125, -0.043701171875, 0.01363372802734375, -0.066162109375, -0.02008056640625, 0.056793212890625, 0.0021839141845703125, 0.0084075927734375, -0.00756072998046875, -0.043609619140625, -0.003314971923828125, -0.035736083984375, 0.026336669921875, 0.009124755859375, 0.00007778406143188477, -0.031890869140625, 0.032867431640625, -0.0199737548828125, 0.0504150390625, 0.004901885986328125, -0.0178985595703125, 0.0236053466796875, -0.0084075927734375, -0.04156494140625, 0.0205841064453125, 0.07330322265625, 
0.0311126708984375, 0.0091552734375, 0.013275146484375, -0.002960205078125, 0.0058135986328125, 0.032867431640625, -0.09149169921875, -0.0180206298828125, 0.023651123046875, -0.06097412109375, -0.054779052734375, 0.00426483154296875, -0.024993896484375, -0.0213470458984375, -0.004016876220703125, 0.030914306640625, -0.0034198760986328125, -0.028594970703125, 0.0009918212890625, 0.0146331787109375, 0.02374267578125, 0.020782470703125, -0.054229736328125, 0.0273895263671875, 0.0213775634765625, 0.08251953125, 0.0040740966796875, -0.0169525146484375, 0.0013523101806640625, -0.052581787109375, -0.01654052734375, 0.03509521484375, -0.00423431396484375, -0.01617431640625, -0.0279083251953125, 0.0252532958984375, -0.0110931396484375, -0.045623779296875, 0.0445556640625, -0.041015625, -0.0008172988891601562, 0.01412200927734375, -0.01000213623046875, -0.035919189453125, 0.0232696533203125, -0.04833984375, 0.070556640625, 0.0153656005859375, -0.062347412109375, 0.018341064453125, -0.041351318359375, -0.018829345703125, -0.005977630615234375, 0.01070404052734375, -0.054962158203125, -0.004791259765625, -0.01096343994140625, 0.05267333984375, -0.0170745849609375, -0.00029349327087402344, -0.03070068359375, -0.031951904296875, 0.005634307861328125, -0.0126190185546875, 0.07855224609375, 0.049774169921875, -0.0207366943359375, 0.01018524169921875, -0.050933837890625, 0.02471923828125, 0.0169677734375, -0.011627197265625, -0.00324249267578125, -0.0179290771484375, 0.01523590087890625, 0.039398193359375, 0.00731658935546875, -0.033233642578125, 0.01548004150390625, 0.005725860595703125, 0.053497314453125, 0.018341064453125, -0.0225830078125, 0.04461669921875, -0.0224151611328125, 0.0321044921875, 0.02093505859375, 0.0355224609375, -0.03173828125, -0.036468505859375, -0.058197021484375, -0.020782470703125, 0.0263519287109375, 0.042724609375, -0.0439453125, 0.01291656494140625, -0.0215911865234375, -0.06695556640625, -0.026824951171875, -0.00019800662994384766, 0.0208892822265625, 
0.03375244140625, 0.0181732177734375, -0.03363037109375, -0.0673828125, -0.07147216796875, 0.01064300537109375, 0.0018949508666992188, 0.019744873046875, 0.032073974609375, 0.04443359375, -0.0273590087890625, 0.062347412109375, -0.0006999969482421875, -0.01076507568359375, -0.006977081298828125, -0.00710296630859375, 0.0196380615234375, 0.06683349609375, 0.044281005859375, -0.08587646484375, -0.026611328125, -0.00024580955505371094, -0.07562255859375, 0.0312347412109375, 0.01058197021484375, 0.0004050731658935547, -0.0018987655639648438, 0.031890869140625, -0.040008544921875, 0.0550537109375, 0.04180908203125, -0.0182647705078125, 0.032318115234375, 0.004985809326171875, 0.0041351318359375, -0.0833740234375, 0.00426483154296875, 0.02874755859375, -0.02880859375, -0.037872314453125, 0.0018253326416015625, 0.006374359130859375, -0.022979736328125, -0.06536865234375, 0.052154541015625, -0.0213775634765625, -0.0001665353775024414, -0.0300445556640625, -0.035797119140625, -0.0020999908447265625, 0.0273895263671875, 0.006633758544921875, 0.03839111328125, 0.058990478515625, -0.06451416015625, 0.041259765625, 0.0132904052734375, -0.0268402099609375, 0.01253509521484375, -0.06463623046875, 0.01015472412109375, -0.00710296630859375, 0.0293731689453125, -0.057373046875, -0.0239105224609375, 0.03253173828125, -0.044830322265625, 0.016998291015625, -0.042877197265625, -0.01012420654296875, -0.056243896484375, -0.00757598876953125, 0.0419921875, 0.040252685546875, -0.053558349609375, 0.040985107421875, 0.028289794921875, 0.034454345703125, -0.0445556640625, -0.06451416015625, -0.00542449951171875, -0.01971435546875, -0.060760498046875, 0.031951904296875, 0.018768310546875, -0.0011205673217773438, 0.0006747245788574219, -0.019012451171875, -0.0272064208984375, 0.00640869140625, 0.0592041015625, 0.0286865234375, -0.024932861328125, -0.00669097900390625, -0.0172882080078125, -0.005859375, -0.004756927490234375, -0.043060302734375, 0.035400390625, -0.0301055908203125, 
0.01064300537109375, -0.056671142578125, -0.00800323486328125, 0.058074951171875, -0.0194854736328125, 0.050384521484375, 0.059967041015625, -0.042144775390625, 0.005222320556640625, -0.0297698974609375, -0.0071868896484375, -0.039703369140625, 0.0261077880859375, -0.0411376953125, -0.04901123046875, 0.0472412109375, -0.002960205078125, -0.0226287841796875, 0.03448486328125, 0.0163726806640625, 0.0003955364227294922, 0.059600830078125, 0.04400634765625, 0.00730133056640625, 0.045501708984375, -0.058074951171875, -0.005016326904296875, -0.061279296875, -0.03106689453125, -0.02008056640625, -0.0303192138671875, -0.06646728515625, -0.01442718505859375, 0.0160064697265625, 0.030731201171875, -0.0341796875, 0.05181884765625, -0.05096435546875, 0.0283050537109375, 0.049102783203125, 0.05279541015625, -0.02093505859375, 0.0211639404296875, -0.00414276123046875, 0.0258331298828125, -0.07147216796875, -0.040008544921875, 0.08013916015625, 0.041534423828125, 0.040740966796875, -0.018096923828125, 0.038421630859375, 0.0037593841552734375, 0.033477783203125, -0.061920166015625, 0.03955078125, -0.0263214111328125, -0.05987548828125, -0.0025920867919921875, -0.027313232421875, -0.062347412109375, 0.0187225341796875, -0.0257415771484375, -0.0550537109375, 0.034454345703125, 0.0222015380859375, -0.025421142578125, 0.0271148681640625, -0.0672607421875, 0.095947265625, -0.0118865966796875, -0.056182861328125, 0.01117706298828125, -0.052398681640625, 0.031463623046875, 0.00911712646484375, -0.01312255859375, -0.00769805908203125, 0.0190887451171875, 0.055145263671875, -0.05401611328125, 0.058746337890625, -0.0233917236328125, 0.027923583984375, 0.06683349609375, -0.0022125244140625, 0.03265380859375, -0.00313568115234375, -0.01415252685546875, 0.034942626953125, 0.01053619384765625, -0.043060302734375, -0.0269317626953125, 0.054290771484375, -0.059722900390625, -0.01690673828125, -0.0166015625, -0.006481170654296875, 0.0159454345703125, 0.0194549560546875, 0.0592041015625, 
0.051422119140625, 0.0054168701171875, 0.0191650390625, 0.03515625, -0.01505279541015625, 0.04083251953125, -0.01068878173828125, -0.0316162109375, -0.01654052734375, 0.06561279296875, 0.01385498046875, 0.006862640380859375, 0.0028285980224609375, 0.00832366943359375, -0.021636962890625, -0.033203125, -0.035919189453125, -0.0152435302734375, -0.05078125, -0.0219268798828125, -0.046783447265625, -0.036285400390625, -0.032745361328125, -0.01097869873046875, -0.05206298828125, -0.0225830078125, -0.03271484375, 0.0086669921875, 0.01776123046875, 0.0325927734375, -0.003841400146484375, 0.05389404296875, -0.046600341796875, 0.0003399848937988281, 0.0295562744140625, 0.032379150390625, 0.001033782958984375, -0.04766845703125, -0.0214691162109375, 0.01186370849609375, -0.02655029296875, -0.03387451171875, 0.0290069580078125, 0.006717681884765625, 0.0230865478515625, 0.04327392578125, -0.0230712890625, 0.044952392578125, -0.0052947998046875, 0.045623779296875, 0.052032470703125, -0.046142578125, 0.033172607421875, -0.0032711029052734375, 0.0254974365234375, 0.033477783203125, 0.056976318359375, -0.0164642333984375, 0.032989501953125, -0.047027587890625, -0.043731689453125, 0.0418701171875, 0.005100250244140625, 0.019775390625, 0.01221466064453125, 0.03509521484375, -0.00537872314453125, 0.007137298583984375, -0.060943603515625, -0.033935546875, -0.058349609375, -0.01507568359375, -0.0074920654296875, -0.0229644775390625, 0.0264892578125, -0.055145263671875, 0.04058837890625, 0.00643157958984375, 0.040496826171875, 0.0297698974609375, -0.0029659271240234375, 0.00665283203125, -0.0311126708984375, 0.057464599609375, 0.0330810546875, -0.01551055908203125, 0.020660400390625, 0.0011243820190429688, -0.060943603515625, 0.0186614990234375, -0.0011587142944335938, -0.01284027099609375, 0.0020294189453125, 0.009765625, 0.0802001953125, -0.00397491455078125, -0.0093841552734375, 0.04132080078125, -0.0059356689453125, -0.0298614501953125, -0.0389404296875, 0.0080108642578125, 
-0.01055145263671875, 0.016998291015625, 0.0224151611328125, 0.042510986328125, 0.01079559326171875, -0.034210205078125, 0.017791748046875, 0.01548004150390625, -0.048736572265625, -0.0162353515625, 0.059356689453125, 0.00647735595703125, -0.0242462158203125, 0.053497314453125, -0.031402587890625, -0.048187255859375, 0.08538818359375, 0.03765869140625, 0.058349609375, -0.01438140869140625, 0.018310546875, 0.0791015625, 0.01422119140625, -0.01433563232421875, 0.00006878376007080078, 0.0007257461547851562, -0.0643310546875, -0.0160369873046875, -0.037017822265625, 0.004451751708984375, 0.0288543701171875, -0.03759765625, 0.02880859375, -0.042022705078125, -0.0302734375, 0.02203369140625, 0.0025463104248046875, -0.06256103515625, 0.04608154296875, 0.018096923828125, 0.08355712890625, -0.031463623046875, 0.07574462890625, 0.0638427734375, -0.0237274169921875, -0.07489013671875, -0.037872314453125, -0.0081329345703125, -0.04498291015625, 0.065185546875, 0.042694091796875, 0.005512237548828125, 0.0185089111328125, -0.07757568359375, -0.0557861328125, 0.09063720703125, 0.002208709716796875, -0.035797119140625, 0.02069091796875, -0.004909515380859375, 0.00580596923828125, -0.05029296875, 0.03863525390625, -0.0020885467529296875, 0.01495361328125, 0.0297698974609375, -0.061798095703125, -0.0019521713256835938, -0.03863525390625, 0.004180908203125, -0.0005440711975097656, -0.0523681640625, 0.068603515625, -0.0325927734375, -0.00968170166015625, 0.00496673583984375, 0.039794921875, -0.00954437255859375, 0.047119140625, 0.040069580078125, 0.046142578125, 0.043426513671875, -0.007568359375, 0.07647705078125, -0.004962921142578125, 0.02899169921875, 0.06695556640625, 0.01261138916015625, 0.04034423828125, 0.021881103515625, -0.002590179443359375, 0.01390838623046875, 0.08428955078125, -0.0208282470703125, 0.042236328125, 0.0218353271484375, -0.0240478515625, -0.01537322998046875, 0.0018711090087890625, -0.0282440185546875, 0.057464599609375, 0.0045013427734375, 
-0.049591064453125, 0.0010967254638671875, 0.02960205078125, -0.0089874267578125, -0.042236328125, -0.059417724609375, 0.023101806640625, 0.0011119842529296875, -0.04083251953125, 0.06658935546875, 0.01126861572265625, 0.0654296875, -0.021636962890625, 0.00200653076171875, -0.021697998046875, 0.0114593505859375, -0.041717529296875, -0.0275726318359375, 0.02593994140625, -0.01763916015625, -0.01113128662109375, 0.00974273681640625, 0.08001708984375, -0.00856781005859375, -0.034820556640625, -0.0005574226379394531, 0.00030112266540527344, 0.03216552734375, -0.0196990966796875, -0.05877685546875, 0.027099609375, -0.01329803466796875, -0.0193023681640625, 0.0163726806640625, -0.00015294551849365234, -0.001018524169921875, 0.06719970703125, 0.044219970703125, -0.0243682861328125, 0.02581787109375, -0.04345703125, 0.06390380859375, -0.029541015625, -0.0205230712890625, -0.050262451171875, 0.04534912109375, -0.01168060302734375, -0.043731689453125, 0.030792236328125, 0.058319091796875, 0.0765380859375, -0.01248931884765625, 0.042755126953125, -0.0264892578125, -0.0089111328125, -0.019439697265625, 0.049774169921875, -0.052978515625, -0.012359619140625, 0.0103607177734375, -0.038177490234375, -0.01373291015625, 0.0733642578125, -0.019866943359375, 0.01215362548828125, 0.03265380859375, 0.067626953125, -0.03839111328125, 0.0050811767578125, 0.0280914306640625, -0.0084991455078125, -0.0169830322265625, 0.0265960693359375, 0.042236328125, -0.0810546875, 0.0270233154296875, -0.05108642578125, -0.00949859619140625, -0.0241241455078125, -0.055389404296875, -0.053863525390625, -0.05133056640625, -0.043212890625, -0.061126708984375, -0.0019855499267578125, 0.072265625, 0.10870361328125, -0.049652099609375, -0.010833740234375, -0.0229034423828125, 0.005859375, -0.0027408599853515625, -0.01554107666015625, 0.0160369873046875, 0.00640869140625, -0.041259765625, 0.0037212371826171875, -0.0127105712890625, 0.0249786376953125, -0.003452301025390625, -0.02374267578125, 
-0.00991058349609375, -0.017852783203125, 0.051788330078125, 0.0433349609375, -0.0239105224609375, -0.004535675048828125, -0.00730133056640625, -0.0300445556640625, 0.01270294189453125, 0.049591064453125, -0.0479736328125, 0.0183258056640625, 0.013092041015625, 0.035400390625, 0.0660400390625, -0.012359619140625, -0.0090179443359375, -0.0297698974609375, 0.044769287109375, 0.004535675048828125, 0.0228729248046875, 0.00893402099609375, -0.022308349609375, 0.05438232421875, 0.017364501953125, -0.03668212890625, -0.056243896484375, 0.005428314208984375, -0.09344482421875, -0.01003265380859375, 0.070556640625, -0.01776123046875, -0.03631591796875, 0.02362060546875, -0.0022449493408203125, 0.0159454345703125, 0.00821685791015625, 0.0272064208984375, 0.020477294921875, 0.0118408203125, -0.05133056640625, -0.04833984375, 0.024810791015625, -0.0005507469177246094, -0.04248046875, -0.056427001953125, 0.006622314453125, 0.0340576171875, 0.00943756103515625, 0.04376220703125, -0.0127410888671875, 0.0237579345703125, 0.028472900390625, 0.032684326171875, -0.038177490234375, -0.0237274169921875, -0.0109710693359375, 0.006763458251953125, -0.02911376953125, -0.052459716796875 ] ]
facebook/mms-1b-all
2023-06-15T10:45:44.000Z
[ "transformers", "pytorch", "safetensors", "wav2vec2", "automatic-speech-recognition", "mms", "ab", "af", "ak", "am", "ar", "as", "av", "ay", "az", "ba", "bm", "be", "bn", "bi", "bo", "sh", "br", "bg", "ca", "cs", "ce", "cv", "ku", "cy", "da", "de", "dv", "dz", "el", "en", "eo", "et", "eu", "ee", "fo", "fa", "fj", "fi", "fr", "fy", "ff", "ga", "gl", "gn", "gu", "zh", "ht", "ha", "he", "hi", "hu", "hy", "ig", "ia", "ms", "is", "it", "jv", "ja", "kn", "ka", "kk", "kr", "km", "ki", "rw", "ky", "ko", "kv", "lo", "la", "lv", "ln", "lt", "lb", "lg", "mh", "ml", "mr", "mk", "mg", "mt", "mn", "mi", "my", "nl", "no", "ne", "ny", "oc", "om", "or", "os", "pa", "pl", "pt", "ps", "qu", "ro", "rn", "ru", "sg", "sk", "sl", "sm", "sn", "sd", "so", "es", "sq", "su", "sv", "sw", "ta", "tt", "te", "tg", "tl", "th", "ti", "ts", "tr", "uk", "vi", "wo", "xh", "yo", "zu", "za", "dataset:google/fleurs", "arxiv:2305.13516", "license:cc-by-nc-4.0", "endpoints_compatible", "has_space", "region:us" ]
automatic-speech-recognition
facebook
null
null
facebook/mms-1b-all
59
112,017
transformers
2023-05-27T11:43:21
--- tags: - mms language: - ab - af - ak - am - ar - as - av - ay - az - ba - bm - be - bn - bi - bo - sh - br - bg - ca - cs - ce - cv - ku - cy - da - de - dv - dz - el - en - eo - et - eu - ee - fo - fa - fj - fi - fr - fy - ff - ga - gl - gn - gu - zh - ht - ha - he - hi - sh - hu - hy - ig - ia - ms - is - it - jv - ja - kn - ka - kk - kr - km - ki - rw - ky - ko - kv - lo - la - lv - ln - lt - lb - lg - mh - ml - mr - ms - mk - mg - mt - mn - mi - my - zh - nl - 'no' - 'no' - ne - ny - oc - om - or - os - pa - pl - pt - ms - ps - qu - qu - qu - qu - qu - qu - qu - qu - qu - qu - qu - qu - qu - qu - qu - qu - qu - qu - qu - qu - qu - qu - ro - rn - ru - sg - sk - sl - sm - sn - sd - so - es - sq - su - sv - sw - ta - tt - te - tg - tl - th - ti - ts - tr - uk - ms - vi - wo - xh - ms - yo - ms - zu - za license: cc-by-nc-4.0 datasets: - google/fleurs metrics: - wer --- # Massively Multilingual Speech (MMS) - Finetuned ASR - ALL This checkpoint is a model fine-tuned for multi-lingual ASR and part of Facebook's [Massive Multilingual Speech project](https://research.facebook.com/publications/scaling-speech-technology-to-1000-languages/). This checkpoint is based on the [Wav2Vec2 architecture](https://huggingface.co/docs/transformers/model_doc/wav2vec2) and makes use of adapter models to transcribe 1000+ languages. The checkpoint consists of **1 billion parameters** and has been fine-tuned from [facebook/mms-1b](https://huggingface.co/facebook/mms-1b) on 1162 languages. ## Table Of Content - [Example](#example) - [Supported Languages](#supported-languages) - [Model details](#model-details) - [Additional links](#additional-links) ## Example This MMS checkpoint can be used with [Transformers](https://github.com/huggingface/transformers) to transcribe audio of 1107 different languages. Let's look at a simple example. 
First, we install transformers and some other libraries ``` pip install torch accelerate torchaudio datasets pip install --upgrade transformers ``` **Note**: In order to use MMS you need to have at least `transformers >= 4.30` installed. If the `4.30` version is not yet available [on PyPI](https://pypi.org/project/transformers/) make sure to install `transformers` from source: ``` pip install git+https://github.com/huggingface/transformers.git ``` Next, we load a couple of audio samples via `datasets`. Make sure that the audio data is sampled at 16,000 Hz (16 kHz). ```py from datasets import load_dataset, Audio # English stream_data = load_dataset("mozilla-foundation/common_voice_13_0", "en", split="test", streaming=True) stream_data = stream_data.cast_column("audio", Audio(sampling_rate=16000)) en_sample = next(iter(stream_data))["audio"]["array"] # French stream_data = load_dataset("mozilla-foundation/common_voice_13_0", "fr", split="test", streaming=True) stream_data = stream_data.cast_column("audio", Audio(sampling_rate=16000)) fr_sample = next(iter(stream_data))["audio"]["array"] ``` Next, we load the model and processor ```py from transformers import Wav2Vec2ForCTC, AutoProcessor import torch model_id = "facebook/mms-1b-all" processor = AutoProcessor.from_pretrained(model_id) model = Wav2Vec2ForCTC.from_pretrained(model_id) ``` Now we process the audio data, pass the processed audio data to the model and transcribe the model output, just like we usually do for Wav2Vec2 models such as [facebook/wav2vec2-base-960h](https://huggingface.co/facebook/wav2vec2-base-960h) ```py inputs = processor(en_sample, sampling_rate=16_000, return_tensors="pt") with torch.no_grad(): outputs = model(**inputs).logits ids = torch.argmax(outputs, dim=-1)[0] transcription = processor.decode(ids) # 'joe keton disapproved of films and buster also had reservations about the media' ``` We can now keep the same model in memory and simply switch out the language adapters by calling the convenient 
`load_adapter()` function for the model and `set_target_lang()` for the tokenizer. We pass the target language as an input - "fra" for French. ```py processor.tokenizer.set_target_lang("fra") model.load_adapter("fra") inputs = processor(fr_sample, sampling_rate=16_000, return_tensors="pt") with torch.no_grad(): outputs = model(**inputs).logits ids = torch.argmax(outputs, dim=-1)[0] transcription = processor.decode(ids) # "ce dernier est volé tout au long de l'histoire romaine" ``` In the same way the language can be switched out for all other supported languages. To list all supported language codes, have a look at: ```py processor.tokenizer.vocab.keys() ``` For more details, please have a look at [the official docs](https://huggingface.co/docs/transformers/main/en/model_doc/mms). ## Supported Languages This model supports 1162 languages. Click the following to toggle all supported languages of this checkpoint in [ISO 639-3 code](https://en.wikipedia.org/wiki/ISO_639-3). You can find more details about the languages and their ISO 639-3 codes in the [MMS Language Coverage Overview](https://dl.fbaipublicfiles.com/mms/misc/language_coverage_mms.html). 
<details> <summary>Click to toggle</summary> - abi - abk - abp - aca - acd - ace - acf - ach - acn - acr - acu - ade - adh - adj - adx - aeu - afr - agd - agg - agn - agr - agu - agx - aha - ahk - aia - aka - akb - ake - akp - alj - alp - alt - alz - ame - amf - amh - ami - amk - ann - any - aoz - apb - apr - ara - arl - asa - asg - asm - ast - ata - atb - atg - ati - atq - ava - avn - avu - awa - awb - ayo - ayr - ayz - azb - azg - azj-script_cyrillic - azj-script_latin - azz - bak - bam - ban - bao - bas - bav - bba - bbb - bbc - bbo - bcc-script_arabic - bcc-script_latin - bcl - bcw - bdg - bdh - bdq - bdu - bdv - beh - bel - bem - ben - bep - bex - bfa - bfo - bfy - bfz - bgc - bgq - bgr - bgt - bgw - bha - bht - bhz - bib - bim - bis - biv - bjr - bjv - bjw - bjz - bkd - bkv - blh - blt - blx - blz - bmq - bmr - bmu - bmv - bng - bno - bnp - boa - bod - boj - bom - bor - bos - bov - box - bpr - bps - bqc - bqi - bqj - bqp - bre - bru - bsc - bsq - bss - btd - bts - btt - btx - bud - bul - bus - bvc - bvz - bwq - bwu - byr - bzh - bzi - bzj - caa - cab - cac-dialect_sanmateoixtatan - cac-dialect_sansebastiancoatan - cak-dialect_central - cak-dialect_santamariadejesus - cak-dialect_santodomingoxenacoj - cak-dialect_southcentral - cak-dialect_western - cak-dialect_yepocapa - cap - car - cas - cat - cax - cbc - cbi - cbr - cbs - cbt - cbu - cbv - cce - cco - cdj - ceb - ceg - cek - ces - cfm - cgc - che - chf - chv - chz - cjo - cjp - cjs - ckb - cko - ckt - cla - cle - cly - cme - cmn-script_simplified - cmo-script_khmer - cmo-script_latin - cmr - cnh - cni - cnl - cnt - coe - cof - cok - con - cot - cou - cpa - cpb - cpu - crh - crk-script_latin - crk-script_syllabics - crn - crq - crs - crt - csk - cso - ctd - ctg - cto - ctu - cuc - cui - cuk - cul - cwa - cwe - cwt - cya - cym - daa - dah - dan - dar - dbj - dbq - ddn - ded - des - deu - dga - dgi - dgk - dgo - dgr - dhi - did - dig - dik - dip - div - djk - dnj-dialect_blowowest - dnj-dialect_gweetaawueast - 
dnt - dnw - dop - dos - dsh - dso - dtp - dts - dug - dwr - dyi - dyo - dyu - dzo - eip - eka - ell - emp - enb - eng - enx - epo - ese - ess - est - eus - evn - ewe - eza - fal - fao - far - fas - fij - fin - flr - fmu - fon - fra - frd - fry - ful - gag-script_cyrillic - gag-script_latin - gai - gam - gau - gbi - gbk - gbm - gbo - gde - geb - gej - gil - gjn - gkn - gld - gle - glg - glk - gmv - gna - gnd - gng - gof-script_latin - gog - gor - gqr - grc - gri - grn - grt - gso - gub - guc - gud - guh - guj - guk - gum - guo - guq - guu - gux - gvc - gvl - gwi - gwr - gym - gyr - had - hag - hak - hap - hat - hau - hay - heb - heh - hif - hig - hil - hin - hlb - hlt - hne - hnn - hns - hoc - hoy - hrv - hsb - hto - hub - hui - hun - hus-dialect_centralveracruz - hus-dialect_westernpotosino - huu - huv - hvn - hwc - hye - hyw - iba - ibo - icr - idd - ifa - ifb - ife - ifk - ifu - ify - ign - ikk - ilb - ilo - imo - ina - inb - ind - iou - ipi - iqw - iri - irk - isl - ita - itl - itv - ixl-dialect_sangasparchajul - ixl-dialect_sanjuancotzal - ixl-dialect_santamarianebaj - izr - izz - jac - jam - jav - jbu - jen - jic - jiv - jmc - jmd - jpn - jun - juy - jvn - kaa - kab - kac - kak - kam - kan - kao - kaq - kat - kay - kaz - kbo - kbp - kbq - kbr - kby - kca - kcg - kdc - kde - kdh - kdi - kdj - kdl - kdn - kdt - kea - kek - ken - keo - ker - key - kez - kfb - kff-script_telugu - kfw - kfx - khg - khm - khq - kia - kij - kik - kin - kir - kjb - kje - kjg - kjh - kki - kkj - kle - klu - klv - klw - kma - kmd - kml - kmr-script_arabic - kmr-script_cyrillic - kmr-script_latin - kmu - knb - kne - knf - knj - knk - kno - kog - kor - kpq - kps - kpv - kpy - kpz - kqe - kqp - kqr - kqy - krc - kri - krj - krl - krr - krs - kru - ksb - ksr - kss - ktb - ktj - kub - kue - kum - kus - kvn - kvw - kwd - kwf - kwi - kxc - kxf - kxm - kxv - kyb - kyc - kyf - kyg - kyo - kyq - kyu - kyz - kzf - lac - laj - lam - lao - las - lat - lav - law - lbj - lbw - lcp - lee - lef - lem - 
lew - lex - lgg - lgl - lhu - lia - lid - lif - lin - lip - lis - lit - lje - ljp - llg - lln - lme - lnd - lns - lob - lok - lom - lon - loq - lsi - lsm - ltz - luc - lug - luo - lwo - lww - lzz - maa-dialect_sanantonio - maa-dialect_sanjeronimo - mad - mag - mah - mai - maj - mak - mal - mam-dialect_central - mam-dialect_northern - mam-dialect_southern - mam-dialect_western - maq - mar - maw - maz - mbb - mbc - mbh - mbj - mbt - mbu - mbz - mca - mcb - mcd - mco - mcp - mcq - mcu - mda - mdf - mdv - mdy - med - mee - mej - men - meq - met - mev - mfe - mfh - mfi - mfk - mfq - mfy - mfz - mgd - mge - mgh - mgo - mhi - mhr - mhu - mhx - mhy - mib - mie - mif - mih - mil - mim - min - mio - mip - miq - mit - miy - miz - mjl - mjv - mkd - mkl - mkn - mlg - mlt - mmg - mnb - mnf - mnk - mnw - mnx - moa - mog - mon - mop - mor - mos - mox - moz - mpg - mpm - mpp - mpx - mqb - mqf - mqj - mqn - mri - mrw - msy - mtd - mtj - mto - muh - mup - mur - muv - muy - mvp - mwq - mwv - mxb - mxq - mxt - mxv - mya - myb - myk - myl - myv - myx - myy - mza - mzi - mzj - mzk - mzm - mzw - nab - nag - nan - nas - naw - nca - nch - ncj - ncl - ncu - ndj - ndp - ndv - ndy - ndz - neb - new - nfa - nfr - nga - ngl - ngp - ngu - nhe - nhi - nhu - nhw - nhx - nhy - nia - nij - nim - nin - nko - nlc - nld - nlg - nlk - nmz - nnb - nno - nnq - nnw - noa - nob - nod - nog - not - npi - npl - npy - nso - nst - nsu - ntm - ntr - nuj - nus - nuz - nwb - nxq - nya - nyf - nyn - nyo - nyy - nzi - obo - oci - ojb-script_latin - ojb-script_syllabics - oku - old - omw - onb - ood - orm - ory - oss - ote - otq - ozm - pab - pad - pag - pam - pan - pao - pap - pau - pbb - pbc - pbi - pce - pcm - peg - pez - pib - pil - pir - pis - pjt - pkb - pls - plw - pmf - pny - poh-dialect_eastern - poh-dialect_western - poi - pol - por - poy - ppk - pps - prf - prk - prt - pse - pss - ptu - pui - pus - pwg - pww - pxm - qub - quc-dialect_central - quc-dialect_east - quc-dialect_north - quf - quh - qul - quw - 
quy - quz - qvc - qve - qvh - qvm - qvn - qvo - qvs - qvw - qvz - qwh - qxh - qxl - qxn - qxo - qxr - rah - rai - rap - rav - raw - rej - rel - rgu - rhg - rif-script_arabic - rif-script_latin - ril - rim - rjs - rkt - rmc-script_cyrillic - rmc-script_latin - rmo - rmy-script_cyrillic - rmy-script_latin - rng - rnl - roh-dialect_sursilv - roh-dialect_vallader - rol - ron - rop - rro - rub - ruf - rug - run - rus - sab - sag - sah - saj - saq - sas - sat - sba - sbd - sbl - sbp - sch - sck - sda - sea - seh - ses - sey - sgb - sgj - sgw - shi - shk - shn - sho - shp - sid - sig - sil - sja - sjm - sld - slk - slu - slv - sml - smo - sna - snd - sne - snn - snp - snw - som - soy - spa - spp - spy - sqi - sri - srm - srn - srp-script_cyrillic - srp-script_latin - srx - stn - stp - suc - suk - sun - sur - sus - suv - suz - swe - swh - sxb - sxn - sya - syl - sza - tac - taj - tam - tao - tap - taq - tat - tav - tbc - tbg - tbk - tbl - tby - tbz - tca - tcc - tcs - tcz - tdj - ted - tee - tel - tem - teo - ter - tes - tew - tex - tfr - tgj - tgk - tgl - tgo - tgp - tha - thk - thl - tih - tik - tir - tkr - tlb - tlj - tly - tmc - tmf - tna - tng - tnk - tnn - tnp - tnr - tnt - tob - toc - toh - tom - tos - tpi - tpm - tpp - tpt - trc - tri - trn - trs - tso - tsz - ttc - tte - ttq-script_tifinagh - tue - tuf - tuk-script_arabic - tuk-script_latin - tuo - tur - tvw - twb - twe - twu - txa - txq - txu - tye - tzh-dialect_bachajon - tzh-dialect_tenejapa - tzj-dialect_eastern - tzj-dialect_western - tzo-dialect_chamula - tzo-dialect_chenalho - ubl - ubu - udm - udu - uig-script_arabic - uig-script_cyrillic - ukr - umb - unr - upv - ura - urb - urd-script_arabic - urd-script_devanagari - urd-script_latin - urk - urt - ury - usp - uzb-script_cyrillic - uzb-script_latin - vag - vid - vie - vif - vmw - vmy - vot - vun - vut - wal-script_ethiopic - wal-script_latin - wap - war - waw - way - wba - wlo - wlx - wmw - wob - wol - wsg - wwa - xal - xdy - xed - xer - xho - xmm - xnj - 
xnr - xog - xon - xrb - xsb - xsm - xsr - xsu - xta - xtd - xte - xtm - xtn - xua - xuo - yaa - yad - yal - yam - yao - yas - yat - yaz - yba - ybb - ycl - ycn - yea - yka - yli - yor - yre - yua - yue-script_traditional - yuz - yva - zaa - zab - zac - zad - zae - zai - zam - zao - zaq - zar - zas - zav - zaw - zca - zga - zim - ziw - zlm - zmz - zne - zos - zpc - zpg - zpi - zpl - zpm - zpo - zpt - zpu - zpz - ztq - zty - zul - zyb - zyp - zza </details> ## Model details - **Developed by:** Vineel Pratap et al. - **Model type:** Multi-Lingual Automatic Speech Recognition model - **Language(s):** 1000+ languages, see [supported languages](#supported-languages) - **License:** CC-BY-NC 4.0 license - **Num parameters**: 1 billion - **Audio sampling rate**: 16,000 Hz (16 kHz) - **Cite as:** @article{pratap2023mms, title={Scaling Speech Technology to 1,000+ Languages}, author={Vineel Pratap and Andros Tjandra and Bowen Shi and Paden Tomasello and Arun Babu and Sayani Kundu and Ali Elkahky and Zhaoheng Ni and Apoorv Vyas and Maryam Fazel-Zarandi and Alexei Baevski and Yossi Adi and Xiaohui Zhang and Wei-Ning Hsu and Alexis Conneau and Michael Auli}, journal={arXiv}, year={2023} } ## Additional Links - [Blog post](https://ai.facebook.com/blog/multilingual-model-speech-recognition/) - [Transformers documentation](https://huggingface.co/docs/transformers/main/en/model_doc/mms) - [Paper](https://arxiv.org/abs/2305.13516) - [GitHub Repository](https://github.com/facebookresearch/fairseq/tree/main/examples/mms#asr) - [Other **MMS** checkpoints](https://huggingface.co/models?other=mms) - MMS base checkpoints: - [facebook/mms-1b](https://huggingface.co/facebook/mms-1b) - [facebook/mms-300m](https://huggingface.co/facebook/mms-300m) - [Official Space](https://huggingface.co/spaces/facebook/MMS)
14,885
[ [ -0.0386962890625, -0.03338623046875, 0.0164642333984375, 0.03216552734375, -0.018585205078125, 0.006069183349609375, -0.0291748046875, -0.01513671875, 0.0386962890625, 0.028045654296875, -0.044769287109375, -0.042724609375, -0.0391845703125, 0.017059326171875, -0.00862884521484375, 0.052764892578125, 0.00632476806640625, 0.008636474609375, 0.027618408203125, -0.01171875, -0.01097869873046875, -0.017059326171875, -0.033599853515625, 0.0031986236572265625, 0.01800537109375, 0.035247802734375, 0.0452880859375, 0.035400390625, 0.0237579345703125, 0.0282440185546875, -0.0149688720703125, -0.005008697509765625, -0.004695892333984375, -0.019287109375, -0.002773284912109375, -0.0297698974609375, -0.0198516845703125, -0.0031261444091796875, 0.0518798828125, 0.048797607421875, -0.001255035400390625, 0.018157958984375, -0.006809234619140625, 0.054473876953125, -0.0238800048828125, 0.01526641845703125, -0.0240325927734375, -0.0125579833984375, -0.0171966552734375, -0.010650634765625, -0.0172271728515625, -0.04058837890625, 0.0005245208740234375, -0.035980224609375, 0.0011434555053710938, -0.0162811279296875, 0.0972900390625, -0.00629425048828125, -0.0272369384765625, -0.0240631103515625, -0.0438232421875, 0.06298828125, -0.05023193359375, 0.03656005859375, 0.037506103515625, 0.0155792236328125, -0.01230621337890625, -0.046875, -0.060791015625, 0.00870513916015625, -0.00536346435546875, 0.0300750732421875, -0.0216064453125, -0.01226043701171875, 0.015899658203125, 0.0234222412109375, -0.04205322265625, -0.01114654541015625, -0.059173583984375, -0.0296783447265625, 0.043731689453125, -0.002986907958984375, 0.037384033203125, -0.035400390625, -0.0250244140625, -0.006877899169921875, -0.036163330078125, 0.0245361328125, 0.0282440185546875, 0.0240631103515625, -0.034088134765625, 0.0499267578125, -0.030364990234375, 0.046844482421875, 0.008758544921875, -0.0350341796875, 0.050933837890625, -0.04449462890625, -0.01522064208984375, -0.0007538795471191406, 0.07220458984375, 
0.0413818359375, 0.003574371337890625, 0.0036792755126953125, -0.00022304058074951172, 0.026336669921875, -0.0255126953125, -0.05841064453125, 0.00016236305236816406, 0.0289459228515625, -0.0245819091796875, 0.005828857421875, 0.00045943260192871094, -0.06396484375, 0.0135345458984375, -0.0174407958984375, 0.036895751953125, -0.051055908203125, -0.035186767578125, 0.00307464599609375, -0.015716552734375, 0.01983642578125, 0.0139617919921875, -0.0758056640625, 0.005126953125, 0.0276947021484375, 0.07568359375, 0.0033664703369140625, -0.0301513671875, -0.01654052734375, 0.016357421875, -0.0217132568359375, 0.045928955078125, -0.0277099609375, -0.03204345703125, 0.0038700103759765625, 0.0255126953125, -0.0295257568359375, -0.0295257568359375, 0.0400390625, -0.0187530517578125, 0.034881591796875, -0.028594970703125, -0.01715087890625, -0.01081085205078125, -0.007110595703125, -0.0360107421875, 0.085205078125, 0.0335693359375, -0.056427001953125, 0.0312347412109375, -0.0287322998046875, -0.036773681640625, -0.01049041748046875, -0.00031828880310058594, -0.030914306640625, -0.010589599609375, 0.0245819091796875, 0.021820068359375, -0.01178741455078125, 0.005466461181640625, 0.004547119140625, -0.03369140625, 0.00403594970703125, -0.019775390625, 0.101318359375, 0.04840087890625, -0.040252685546875, 0.0009207725524902344, -0.0614013671875, 0.0024433135986328125, -0.0029296875, -0.058258056640625, 0.004390716552734375, -0.0243377685546875, 0.0169830322265625, 0.038116455078125, 0.0142822265625, -0.05908203125, -0.0010366439819335938, -0.037445068359375, 0.03857421875, 0.046539306640625, -0.005680084228515625, 0.0316162109375, -0.038818359375, 0.0400390625, 0.016021728515625, 0.002643585205078125, -0.0194091796875, -0.035614013671875, -0.05902099609375, -0.042083740234375, 0.0122222900390625, 0.060302734375, -0.0518798828125, 0.043212890625, -0.0250091552734375, -0.05194091796875, -0.05267333984375, -0.006069183349609375, 0.015289306640625, 0.026641845703125, 
0.02374267578125, -0.018646240234375, -0.054290771484375, -0.060791015625, -0.01068115234375, -0.023956298828125, -0.0024566650390625, 0.0284423828125, 0.046051025390625, -0.01959228515625, 0.06103515625, -0.015899658203125, -0.042724609375, -0.026336669921875, -0.021636962890625, 0.04315185546875, 0.03546142578125, 0.051177978515625, -0.050445556640625, -0.057342529296875, 0.00859832763671875, -0.046295166015625, -0.008209228515625, -0.0045166015625, 0.00899505615234375, 0.040618896484375, 0.029266357421875, -0.0474853515625, 0.0279693603515625, 0.047088623046875, -0.0391845703125, 0.04608154296875, -0.01751708984375, 0.032989501953125, -0.09014892578125, 0.002086639404296875, -0.0193328857421875, 0.003345489501953125, -0.040069580078125, -0.01287078857421875, 0.003692626953125, -0.00818634033203125, -0.040679931640625, 0.045379638671875, -0.042633056640625, 0.00688934326171875, 0.00658416748046875, 0.013702392578125, -0.015777587890625, 0.0465087890625, 0.0107421875, 0.06402587890625, 0.06634521484375, -0.0418701171875, 0.0293731689453125, 0.0196533203125, -0.049407958984375, 0.044158935546875, -0.035675048828125, -0.0218505859375, -0.007274627685546875, 0.0187530517578125, -0.0880126953125, -0.0214996337890625, 0.022125244140625, -0.055938720703125, 0.01097869873046875, -0.020050048828125, -0.035614013671875, -0.041778564453125, -0.01091766357421875, 0.00511932373046875, 0.0333251953125, -0.025238037109375, 0.039398193359375, 0.036102294921875, -0.020416259765625, -0.0482177734375, -0.0654296875, 0.0120849609375, -0.0223541259765625, -0.067138671875, 0.0173187255859375, -0.0008611679077148438, -0.006855010986328125, -0.0048675537109375, 0.0003502368927001953, -0.006561279296875, -0.006839752197265625, 0.0211181640625, 0.01319122314453125, -0.020263671875, -0.016326904296875, 0.004116058349609375, -0.00806427001953125, -0.01247406005859375, -0.01399993896484375, 0.06396484375, -0.01001739501953125, -0.00901031494140625, -0.0577392578125, 0.0271759033203125, 
0.04852294921875, -0.035888671875, 0.062255859375, 0.060272216796875, -0.020050048828125, 0.0172271728515625, -0.045257568359375, -0.006137847900390625, -0.03558349609375, 0.025604248046875, -0.04443359375, -0.06524658203125, 0.0645751953125, -0.0123748779296875, -0.0018491744995117188, 0.051177978515625, 0.0604248046875, -0.0056304931640625, 0.062255859375, 0.025390625, -0.01522064208984375, 0.041778564453125, -0.0428466796875, -0.0034732818603515625, -0.04949951171875, -0.02789306640625, -0.05792236328125, -0.01044464111328125, -0.05487060546875, -0.037506103515625, 0.033294677734375, 0.00225830078125, -0.01174163818359375, 0.04278564453125, -0.029937744140625, 0.020416259765625, 0.034393310546875, -0.0026645660400390625, 0.00974273681640625, 0.013397216796875, -0.03521728515625, -0.0094451904296875, -0.03472900390625, -0.0447998046875, 0.09088134765625, 0.01198577880859375, 0.032196044921875, 0.033538818359375, 0.058013916015625, 0.00525665283203125, -0.00428009033203125, -0.0440673828125, 0.033905029296875, -0.00405120849609375, -0.05755615234375, -0.0271759033203125, -0.019317626953125, -0.07049560546875, 0.031402587890625, -0.00011134147644042969, -0.0802001953125, 0.0377197265625, -0.01450347900390625, -0.0217132568359375, 0.016143798828125, -0.049163818359375, 0.055206298828125, 0.014617919921875, -0.0202178955078125, -0.0244140625, -0.053863525390625, 0.0201263427734375, 0.01358795166015625, 0.0357666015625, -0.01482391357421875, 0.010589599609375, 0.06695556640625, -0.03619384765625, 0.037017822265625, -0.0159912109375, -0.0025768280029296875, 0.037017822265625, -0.0048980712890625, 0.0236053466796875, 0.01212310791015625, -0.0304412841796875, 0.0177001953125, 0.0263824462890625, -0.0288848876953125, -0.015777587890625, 0.062164306640625, -0.06976318359375, -0.04345703125, -0.05035400390625, -0.0270843505859375, 0.013336181640625, 0.033599853515625, 0.03717041015625, 0.037384033203125, -0.00247955322265625, 0.01641845703125, 0.0301513671875, 
-0.02459716796875, 0.045074462890625, 0.04693603515625, -0.0169525146484375, -0.06396484375, 0.07049560546875, 0.021636962890625, 0.0305023193359375, 0.0196533203125, 0.0083465576171875, -0.0266571044921875, -0.025543212890625, -0.0460205078125, 0.0231170654296875, -0.033843994140625, -0.0006775856018066406, -0.05755615234375, -0.0045928955078125, -0.0635986328125, -0.0097503662109375, -0.01849365234375, -0.044097900390625, -0.01049041748046875, 0.00494384765625, 0.022918701171875, 0.0228424072265625, -0.0301361083984375, 0.031402587890625, -0.0361328125, 0.03387451171875, 0.00141143798828125, 0.00960540771484375, -0.0245208740234375, -0.059539794921875, -0.0255889892578125, 0.0170135498046875, -0.025970458984375, -0.07989501953125, 0.041656494140625, 0.012237548828125, 0.0291290283203125, 0.0309600830078125, -0.0037689208984375, 0.05169677734375, -0.0308380126953125, 0.06890869140625, 0.01178741455078125, -0.07659912109375, 0.047515869140625, -0.035888671875, 0.0263824462890625, 0.039764404296875, 0.0303955078125, -0.0699462890625, -0.043701171875, -0.038360595703125, -0.0631103515625, 0.08135986328125, 0.0268402099609375, 0.01470184326171875, -0.012054443359375, -0.0025348663330078125, -0.011810302734375, -0.000736236572265625, -0.045257568359375, -0.04833984375, -0.021148681640625, -0.0125732421875, -0.00853729248046875, -0.01123809814453125, -0.00783538818359375, -0.04522705078125, 0.055999755859375, 0.01483917236328125, 0.0362548828125, 0.037506103515625, -0.01029205322265625, -0.01678466796875, 0.03094482421875, 0.06512451171875, 0.048919677734375, -0.02496337890625, 0.006683349609375, 0.016448974609375, -0.05047607421875, 0.0166168212890625, -0.00004750490188598633, -0.025665283203125, 0.0203704833984375, 0.0150146484375, 0.0611572265625, -0.006591796875, -0.0391845703125, 0.03924560546875, -0.0057373046875, -0.016357421875, -0.051544189453125, -0.00909423828125, 0.025848388671875, 0.01416015625, 0.037841796875, 0.007221221923828125, -0.005519866943359375, 
-0.06280517578125, 0.020751953125, 0.032257080078125, -0.03668212890625, -0.01788330078125, 0.0594482421875, 0.0045013427734375, -0.01107025146484375, 0.0283203125, -0.00972747802734375, -0.0540771484375, 0.052215576171875, 0.045135498046875, 0.045562744140625, -0.046142578125, 0.019134521484375, 0.05755615234375, 0.042266845703125, 0.007061004638671875, 0.049072265625, -0.00008493661880493164, -0.04449462890625, -0.01093292236328125, -0.061981201171875, -0.004852294921875, 0.00890350341796875, -0.04522705078125, 0.0328369140625, -0.035003662109375, -0.0164031982421875, 0.00023436546325683594, 0.01538848876953125, -0.04296875, 0.0207672119140625, -0.001644134521484375, 0.06695556640625, -0.077392578125, 0.08074951171875, 0.0389404296875, -0.038665771484375, -0.06695556640625, -0.01540374755859375, -0.00067901611328125, -0.045928955078125, 0.045684814453125, 0.01415252685546875, 0.00017642974853515625, 0.0085296630859375, -0.0157470703125, -0.078369140625, 0.07958984375, 0.0156707763671875, -0.02813720703125, 0.027008056640625, 0.0198822021484375, 0.0360107421875, -0.01318359375, 0.036163330078125, 0.060882568359375, 0.053314208984375, 0.0087432861328125, -0.08538818359375, 0.0102691650390625, -0.03411865234375, -0.0217437744140625, 0.006809234619140625, -0.0667724609375, 0.07147216796875, -0.0253753662109375, -0.0301361083984375, 0.00434112548828125, 0.056793212890625, 0.0228271484375, 0.0182952880859375, 0.02764892578125, 0.038360595703125, 0.03857421875, -0.0252838134765625, 0.0638427734375, -0.0268707275390625, 0.0234832763671875, 0.04412841796875, 0.01031494140625, 0.059173583984375, 0.03399658203125, -0.037506103515625, 0.03314208984375, 0.04315185546875, -0.01309967041015625, 0.040740966796875, -0.0032749176025390625, -0.03558349609375, -0.0023212432861328125, -0.0190582275390625, -0.03466796875, 0.04840087890625, 0.031158447265625, -0.0289764404296875, 0.005870819091796875, -0.0001983642578125, 0.030029296875, -0.0006022453308105469, -0.01165771484375, 
0.044281005859375, 0.0019664764404296875, -0.05230712890625, 0.060882568359375, 0.0149993896484375, 0.059906005859375, -0.04736328125, 0.01751708984375, -0.01511383056640625, 0.0172882080078125, -0.0247039794921875, -0.05584716796875, 0.024261474609375, -0.01540374755859375, -0.01483154296875, 0.0032329559326171875, 0.0147552490234375, -0.0518798828125, -0.045379638671875, 0.0251007080078125, 0.0168304443359375, 0.02813720703125, 0.00920867919921875, -0.04364013671875, 0.00855255126953125, 0.021026611328125, -0.021484375, 0.01444244384765625, 0.025390625, 0.0063323974609375, 0.043060302734375, 0.048828125, 0.037261962890625, 0.0213623046875, -0.007755279541015625, 0.06280517578125, -0.053802490234375, -0.04034423828125, -0.056488037109375, 0.02874755859375, 0.006099700927734375, -0.027069091796875, 0.07373046875, 0.0635986328125, 0.0704345703125, -0.0135040283203125, 0.068603515625, -0.02886962890625, 0.042022705078125, -0.0350341796875, 0.0673828125, -0.041961669921875, 0.001499176025390625, -0.03955078125, -0.068603515625, -0.0193328857421875, 0.053985595703125, -0.0204010009765625, 0.0023212432861328125, 0.056243896484375, 0.0771484375, -0.0014543533325195312, -0.00489044189453125, 0.0175933837890625, 0.0212554931640625, 0.0139312744140625, 0.053863525390625, 0.03955078125, -0.054229736328125, 0.05328369140625, -0.05230712890625, -0.0098419189453125, -0.007358551025390625, -0.0361328125, -0.05230712890625, -0.064697265625, -0.021881103515625, -0.03875732421875, -0.01551055908203125, 0.06414794921875, 0.02484130859375, -0.06927490234375, -0.04144287109375, 0.02850341796875, 0.01096343994140625, -0.0300750732421875, -0.0177001953125, 0.068603515625, 0.0087432861328125, -0.07421875, 0.0306549072265625, 0.0196533203125, 0.00603485107421875, -0.00258636474609375, -0.0178680419921875, -0.03143310546875, 0.0162353515625, 0.032806396484375, 0.029510498046875, -0.06353759765625, -0.01093292236328125, -0.00827789306640625, -0.0248870849609375, 0.00745391845703125, 
0.03192138671875, -0.036590576171875, 0.036590576171875, 0.052276611328125, 0.00091552734375, 0.0389404296875, -0.00897979736328125, 0.0181884765625, -0.0296630859375, 0.034149169921875, -0.006381988525390625, 0.03411865234375, 0.01303863525390625, -0.0111236572265625, 0.03314208984375, 0.033782958984375, -0.03814697265625, -0.061431884765625, -0.0010395050048828125, -0.09307861328125, -0.00534820556640625, 0.09393310546875, -0.0199737548828125, -0.0254669189453125, -0.0254669189453125, -0.0287322998046875, 0.042236328125, -0.042205810546875, 0.04791259765625, 0.041839599609375, 0.0062103271484375, -0.0018663406372070312, -0.0482177734375, 0.041168212890625, 0.031585693359375, -0.04205322265625, -0.002880096435546875, 0.025115966796875, 0.03778076171875, 0.01922607421875, 0.065673828125, -0.0287322998046875, 0.0281219482421875, 0.0033206939697265625, 0.011444091796875, 0.008209228515625, -0.004352569580078125, -0.0231170654296875, -0.0016393661499023438, -0.0252227783203125, -0.01519775390625 ] ]
impira/layoutlm-document-qa
2023-03-18T00:54:24.000Z
[ "transformers", "pytorch", "tf", "safetensors", "layoutlm", "document-question-answering", "pdf", "en", "license:mit", "endpoints_compatible", "has_space", "region:us" ]
document-question-answering
impira
null
null
impira/layoutlm-document-qa
576
111,804
transformers
2022-08-07T21:07:19
--- language: en license: mit pipeline_tag: document-question-answering tags: - layoutlm - document-question-answering - pdf widget: - text: "What is the invoice number?" src: "https://huggingface.co/spaces/impira/docquery/resolve/2359223c1837a7587402bda0f2643382a6eefeab/invoice.png" - text: "What is the purchase amount?" src: "https://huggingface.co/spaces/impira/docquery/resolve/2359223c1837a7587402bda0f2643382a6eefeab/contract.jpeg" --- # LayoutLM for Visual Question Answering This is a fine-tuned version of the multi-modal [LayoutLM](https://aka.ms/layoutlm) model for the task of question answering on documents. It has been fine-tuned using both the [SQuAD2.0](https://huggingface.co/datasets/squad_v2) and [DocVQA](https://www.docvqa.org/) datasets. ## Getting started with the model To run these examples, you must have [PIL](https://pillow.readthedocs.io/en/stable/installation.html), [pytesseract](https://pypi.org/project/pytesseract/), and [PyTorch](https://pytorch.org/get-started/locally/) installed in addition to [transformers](https://huggingface.co/docs/transformers/index). ```python from transformers import pipeline nlp = pipeline( "document-question-answering", model="impira/layoutlm-document-qa", ) nlp( "https://templates.invoicehome.com/invoice-template-us-neat-750px.png", "What is the invoice number?" ) # {'score': 0.9943977, 'answer': 'us-001', 'start': 15, 'end': 15} nlp( "https://miro.medium.com/max/787/1*iECQRIiOGTmEFLdWkVIH2g.jpeg", "What is the purchase amount?" ) # {'score': 0.9912159, 'answer': '$1,000,000,000', 'start': 97, 'end': 97} nlp( "https://www.accountingcoach.com/wp-content/uploads/2013/10/income-statement-example@2x.png", "What are the 2020 net sales?" 
) # {'score': 0.59147286, 'answer': '$ 3,750', 'start': 19, 'end': 20} ``` **NOTE**: This model and pipeline were recently added to transformers via [PR #18407](https://github.com/huggingface/transformers/pull/18407) and [PR #18414](https://github.com/huggingface/transformers/pull/18414), so you'll need to use a recent version of transformers, for example: ```bash pip install git+https://github.com/huggingface/transformers.git@2ef774211733f0acf8d3415f9284c49ef219e991 ``` ## About us This model was created by the team at [Impira](https://www.impira.com/).
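When extracting several fields from one document with the pipeline shown above, the `{'score', 'answer', ...}` dicts it returns can be post-processed with a small helper. The sketch below is a hypothetical example — the dicts imitate the pipeline's output format, and a real workflow would populate them by calling `nlp(image, question)` once per field:

```python
# Hypothetical post-processing helper: pick the best answer per field from
# document-QA pipeline outputs, keeping only answers above a confidence
# threshold. The sample dicts mimic the pipeline's output format.

def extract_fields(predictions, threshold=0.5):
    """predictions maps a field name to a list of {'score', 'answer'} dicts."""
    fields = {}
    for name, answers in predictions.items():
        best = max(answers, key=lambda a: a["score"], default=None)
        if best is not None and best["score"] >= threshold:
            fields[name] = best["answer"]
    return fields

preds = {
    "invoice_number": [{"score": 0.9944, "answer": "us-001"}],
    "total": [{"score": 0.31, "answer": "$120.00"}],  # below threshold
}
print(extract_fields(preds))  # -> {'invoice_number': 'us-001'}
```

Thresholding like this is a common guard against low-confidence extractions; the cutoff value itself is application-specific.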
2,326
[ [ -0.0258026123046875, -0.062103271484375, 0.0243988037109375, 0.0238800048828125, -0.006244659423828125, 0.005405426025390625, 0.02484130859375, -0.0165252685546875, 0.014678955078125, 0.05072021484375, -0.0628662109375, -0.0228271484375, -0.026519775390625, -0.012725830078125, -0.0182037353515625, 0.0826416015625, -0.006534576416015625, -0.00174713134765625, -0.043548583984375, -0.007602691650390625, -0.023834228515625, -0.0200958251953125, -0.043792724609375, -0.0130767822265625, 0.01678466796875, 0.020965576171875, 0.044219970703125, 0.018280029296875, 0.044952392578125, 0.0240631103515625, -0.01300048828125, 0.01042938232421875, -0.0137176513671875, 0.021331787109375, 0.004154205322265625, -0.028350830078125, -0.0413818359375, 0.005832672119140625, 0.032989501953125, 0.04058837890625, -0.01537322998046875, 0.04132080078125, -0.01141357421875, 0.04345703125, -0.0252685546875, 0.0200958251953125, -0.02655029296875, -0.0083770751953125, 0.0221710205078125, -0.000675201416015625, -0.01849365234375, -0.02337646484375, 0.0211029052734375, -0.0526123046875, 0.0098724365234375, 0.01351165771484375, 0.09375, 0.032379150390625, -0.0228271484375, -0.0238800048828125, -0.021331787109375, 0.050506591796875, -0.051513671875, 0.021209716796875, 0.02154541015625, 0.023040771484375, 0.00809478759765625, -0.08880615234375, -0.036529541015625, -0.00708770751953125, -0.011505126953125, 0.046417236328125, -0.0269012451171875, -0.0175933837890625, 0.03826904296875, 0.02215576171875, -0.068115234375, -0.0215301513671875, -0.050872802734375, -0.02325439453125, 0.0426025390625, 0.0224609375, 0.0248260498046875, -0.040618896484375, -0.044219970703125, -0.0194244384765625, -0.0200042724609375, 0.02978515625, 0.02734375, 0.003902435302734375, -0.04345703125, 0.0633544921875, -0.003406524658203125, 0.0521240234375, 0.0007390975952148438, -0.0074462890625, 0.019561767578125, -0.01702880859375, -0.026336669921875, -0.02154541015625, 0.05181884765625, 0.037994384765625, 0.0095672607421875, 
0.01416778564453125, -0.0175933837890625, -0.005207061767578125, 0.0286865234375, -0.06243896484375, -0.01885986328125, 0.04315185546875, -0.03826904296875, -0.02459716796875, 0.00774383544921875, -0.05352783203125, -0.0154876708984375, -0.0166473388671875, 0.04205322265625, -0.040008544921875, -0.0157012939453125, 0.0096588134765625, -0.028961181640625, 0.043426513671875, 0.041107177734375, -0.038604736328125, 0.01971435546875, 0.040740966796875, 0.060333251953125, 0.0167388916015625, -0.00885772705078125, -0.055633544921875, -0.00992584228515625, -0.00954437255859375, 0.0699462890625, -0.0141143798828125, -0.0207977294921875, 0.00522613525390625, 0.0079803466796875, -0.0211944580078125, -0.032806396484375, 0.0308074951171875, -0.042083740234375, 0.06243896484375, -0.0019626617431640625, -0.05645751953125, -0.00870513916015625, 0.02984619140625, -0.042877197265625, 0.07220458984375, 0.047607421875, -0.068603515625, 0.0004773139953613281, -0.06256103515625, -0.01483917236328125, 0.016815185546875, 0.00975799560546875, -0.049224853515625, -0.010162353515625, 0.00420379638671875, 0.00811767578125, -0.0177459716796875, 0.0032520294189453125, -0.0196075439453125, -0.006305694580078125, 0.018402099609375, -0.0141143798828125, 0.0953369140625, 0.0243072509765625, -0.007541656494140625, 0.02069091796875, -0.050689697265625, 0.01023101806640625, 0.016754150390625, -0.027008056640625, 0.0166473388671875, -0.028106689453125, 0.0309295654296875, 0.0249481201171875, 0.03253173828125, -0.040191650390625, 0.028350830078125, -0.0250396728515625, 0.0335693359375, 0.04461669921875, -0.00046539306640625, 0.0347900390625, -0.033203125, 0.072021484375, -0.010467529296875, 0.01425933837890625, -0.0217437744140625, -0.0611572265625, -0.046539306640625, -0.025970458984375, 0.0092620849609375, 0.052337646484375, -0.061859130859375, 0.01342010498046875, 0.007556915283203125, -0.043243408203125, -0.031463623046875, -0.00917816162109375, 0.0179443359375, 0.06890869140625, 0.030120849609375, 
-0.003948211669921875, -0.038726806640625, -0.07049560546875, -0.007434844970703125, -0.026031494140625, -0.022857666015625, 0.0189361572265625, 0.052337646484375, 0.005649566650390625, 0.0760498046875, -0.06341552734375, -0.0129852294921875, -0.020050048828125, 0.0121307373046875, 0.031982421875, 0.035369873046875, 0.042694091796875, -0.0670166015625, -0.037200927734375, -0.0223236083984375, -0.05035400390625, -0.0094146728515625, -0.0081634521484375, -0.00733184814453125, 0.016998291015625, 0.0299530029296875, -0.0650634765625, 0.03533935546875, 0.035003662109375, -0.0509033203125, 0.042022705078125, -0.0275421142578125, -0.00888824462890625, -0.1070556640625, 0.0037670135498046875, 0.0190582275390625, -0.0225982666015625, -0.04266357421875, 0.024200439453125, 0.0242462158203125, -0.022247314453125, -0.0465087890625, 0.061737060546875, -0.0163421630859375, 0.003910064697265625, 0.00405120849609375, -0.0008511543273925781, 0.0274200439453125, 0.04345703125, -0.0068359375, 0.0635986328125, 0.033416748046875, -0.02655029296875, 0.034637451171875, 0.04034423828125, -0.0267333984375, 0.049560546875, -0.0721435546875, 0.0052337646484375, -0.0196990966796875, 0.021942138671875, -0.0924072265625, -0.0238800048828125, 0.037841796875, -0.042510986328125, 0.0212554931640625, -0.025238037109375, -0.03057861328125, -0.03564453125, -0.02215576171875, 0.00954437255859375, 0.051483154296875, -0.0234527587890625, 0.0616455078125, 0.024749755859375, -0.0149688720703125, -0.029937744140625, -0.049102783203125, -0.01059722900390625, -0.028961181640625, -0.07855224609375, 0.022735595703125, -0.022369384765625, -0.0203704833984375, -0.013214111328125, -0.0061798095703125, -0.015899658203125, -0.00197601318359375, 0.003269195556640625, 0.036376953125, -0.019805908203125, 0.00897979736328125, -0.00576019287109375, -0.0135650634765625, 0.013824462890625, -0.029876708984375, 0.052703857421875, -0.029083251953125, -0.0204620361328125, -0.0198974609375, 0.0270538330078125, 0.0576171875, 
-0.0426025390625, 0.04425048828125, 0.049285888671875, -0.031219482421875, 0.0083770751953125, -0.0374755859375, 0.004756927490234375, -0.03387451171875, 0.027130126953125, -0.02325439453125, -0.04608154296875, 0.057373046875, 0.017333984375, 0.00858306884765625, 0.0570068359375, 0.04669189453125, -0.03204345703125, 0.0701904296875, 0.031890869140625, 0.0186309814453125, 0.032470703125, -0.041717529296875, 0.0140228271484375, -0.061614990234375, -0.03631591796875, -0.045196533203125, -0.01091766357421875, -0.040924072265625, -0.03668212890625, 0.049072265625, 0.0177154541015625, -0.030181884765625, 0.028961181640625, -0.04754638671875, 0.00804901123046875, 0.051605224609375, 0.0015125274658203125, 0.0010805130004882812, -0.0036334991455078125, -0.006153106689453125, 0.0014190673828125, -0.0443115234375, -0.03863525390625, 0.061279296875, 0.0275421142578125, 0.05230712890625, 0.0184326171875, 0.0391845703125, -0.0004050731658935547, 0.0128173828125, -0.05572509765625, 0.02520751953125, 0.006702423095703125, -0.0445556640625, -0.02294921875, -0.02471923828125, -0.06884765625, 0.01432037353515625, 0.00030159950256347656, -0.06256103515625, 0.00759124755859375, 0.0020923614501953125, -0.0262298583984375, 0.01537322998046875, -0.072021484375, 0.083984375, -0.0200653076171875, -0.02978515625, 0.01406097412109375, -0.053009033203125, 0.012237548828125, 0.01367950439453125, 0.0087890625, -0.005817413330078125, -0.005275726318359375, 0.055206298828125, -0.0452880859375, 0.04852294921875, -0.0168609619140625, -0.0126800537109375, 0.0288543701171875, -0.0035343170166015625, 0.03411865234375, 0.0233612060546875, -0.006916046142578125, -0.0152130126953125, 0.0301666259765625, -0.037994384765625, -0.058624267578125, 0.034210205078125, -0.042205810546875, -0.034820556640625, -0.0296630859375, -0.060028076171875, 0.0003714561462402344, 0.0154876708984375, 0.02618408203125, 0.046875, 0.0093536376953125, 0.0169830322265625, 0.050262451171875, -0.0299072265625, 0.02880859375, 
0.02880859375, -0.0301361083984375, -0.034454345703125, 0.058502197265625, 0.00124359130859375, 0.0171661376953125, 0.0426025390625, 0.023345947265625, -0.050079345703125, -0.00620269775390625, -0.051422119140625, 0.01453399658203125, -0.05206298828125, -0.0301666259765625, -0.054718017578125, -0.0286712646484375, -0.04644775390625, 0.00008660554885864258, -0.01030731201171875, -0.046539306640625, -0.0261688232421875, 0.0011167526245117188, 0.050140380859375, 0.046112060546875, 0.0152740478515625, 0.006740570068359375, -0.0521240234375, 0.0384521484375, 0.034820556640625, 0.0309906005859375, -0.01776123046875, -0.047088623046875, -0.00555419921875, 0.016845703125, -0.04278564453125, -0.0751953125, 0.03179931640625, -0.02313232421875, 0.0297698974609375, 0.00606536865234375, 0.00832366943359375, 0.043212890625, -0.0141143798828125, 0.06640625, 0.01468658447265625, -0.0626220703125, 0.033477783203125, -0.039215087890625, 0.039031982421875, 0.020721435546875, 0.01541900634765625, -0.052764892578125, -0.0303497314453125, -0.06719970703125, -0.06829833984375, 0.059906005859375, 0.03582763671875, 0.0189361572265625, 0.01043701171875, 0.005382537841796875, -0.01195526123046875, 0.02874755859375, -0.03515625, -0.0260772705078125, -0.04010009765625, -0.0171356201171875, 0.0193634033203125, -0.00836944580078125, 0.0017032623291015625, -0.0416259765625, 0.04766845703125, -0.0189056396484375, 0.02239990234375, 0.0239105224609375, -0.00751495361328125, -0.002349853515625, 0.0045013427734375, 0.05645751953125, 0.059906005859375, -0.039947509765625, -0.0259246826171875, 0.0027332305908203125, -0.0262298583984375, -0.0206298828125, 0.0282745361328125, -0.0233612060546875, 0.003665924072265625, 0.02032470703125, 0.0589599609375, -0.005336761474609375, -0.05059814453125, 0.044921875, -0.01045989990234375, -0.03680419921875, -0.046539306640625, -0.0236663818359375, 0.032379150390625, 0.0275421142578125, 0.0244598388671875, 0.01172637939453125, 0.0021953582763671875, 
-0.035919189453125, 0.0171661376953125, 0.034820556640625, -0.0428466796875, 0.0028667449951171875, 0.06439208984375, -0.0019512176513671875, -0.04962158203125, 0.059814453125, -0.00507354736328125, -0.04412841796875, 0.0751953125, 0.0252685546875, 0.06378173828125, 0.01342010498046875, 0.04345703125, 0.02728271484375, 0.0226287841796875, 0.02099609375, 0.042816162109375, 0.0067138671875, -0.041046142578125, -0.0160675048828125, -0.05072021484375, -0.03497314453125, 0.011199951171875, -0.0538330078125, 0.0295867919921875, -0.03204345703125, 0.01261138916015625, 0.00017714500427246094, 0.0001302957534790039, -0.06829833984375, 0.0271759033203125, 0.002864837646484375, 0.08038330078125, -0.0372314453125, 0.034881591796875, 0.07720947265625, -0.0677490234375, -0.07611083984375, -0.01163482666015625, -0.00452423095703125, -0.06365966796875, 0.035675048828125, 0.0035552978515625, 0.01232147216796875, 0.00003725290298461914, -0.060455322265625, -0.0634765625, 0.0721435546875, 0.021331787109375, -0.0026569366455078125, -0.008819580078125, -0.0016756057739257812, 0.0433349609375, -0.029022216796875, 0.0546875, 0.0191650390625, 0.038055419921875, 0.01213836669921875, -0.053619384765625, 0.000286102294921875, -0.04052734375, -0.013641357421875, 0.0021820068359375, -0.0574951171875, 0.07781982421875, -0.007488250732421875, 0.01296234130859375, 0.032958984375, 0.038665771484375, 0.0309906005859375, 0.028045654296875, 0.040740966796875, 0.043853759765625, 0.061431884765625, -0.0182952880859375, 0.09783935546875, -0.0211639404296875, 0.05792236328125, 0.07421875, -0.005313873291015625, 0.045989990234375, 0.04058837890625, -0.0236663818359375, 0.045196533203125, 0.05072021484375, -0.0181732177734375, 0.01165771484375, 0.02105712890625, 0.014007568359375, -0.025238037109375, 0.0006699562072753906, -0.0297698974609375, 0.027435302734375, 0.00975799560546875, -0.0167083740234375, -0.01971435546875, -0.0158843994140625, -0.00963592529296875, -0.0024929046630859375, 
-0.0010156631469726562, 0.051513671875, -0.0013904571533203125, -0.040740966796875, 0.031951904296875, -0.00945281982421875, 0.035400390625, -0.05755615234375, -0.0141754150390625, -0.0301361083984375, -0.002979278564453125, -0.0126190185546875, -0.07098388671875, 0.01275634765625, -0.004638671875, -0.02593994140625, -0.03741455078125, 0.038330078125, -0.0274200439453125, -0.06915283203125, 0.00249481201171875, 0.048858642578125, 0.0036449432373046875, -0.007633209228515625, -0.06842041015625, -0.0190277099609375, 0.00681304931640625, -0.0214691162109375, 0.00891876220703125, 0.01226806640625, -0.00200653076171875, 0.061279296875, 0.040924072265625, -0.023040771484375, 0.0037708282470703125, 0.015289306640625, 0.06378173828125, -0.0347900390625, -0.047393798828125, -0.02978515625, 0.08404541015625, -0.0211944580078125, -0.041595458984375, 0.053466796875, 0.06280517578125, 0.061981201171875, -0.02490234375, 0.042266845703125, -0.0115814208984375, 0.041107177734375, -0.03643798828125, 0.0657958984375, -0.07958984375, 0.00778961181640625, -0.044830322265625, -0.07122802734375, -0.0147705078125, 0.047454833984375, -0.01483917236328125, 0.0026645660400390625, 0.0567626953125, 0.057830810546875, -0.00727081298828125, 0.004581451416015625, 0.01047515869140625, 0.00982666015625, 0.02325439453125, 0.018646240234375, 0.06805419921875, -0.05267333984375, 0.05865478515625, -0.03466796875, 0.001422882080078125, 0.0037441253662109375, -0.046630859375, -0.04583740234375, -0.04986572265625, -0.039031982421875, -0.054473876953125, -0.034271240234375, 0.06280517578125, 0.04754638671875, -0.05084228515625, -0.021148681640625, 0.002994537353515625, 0.020538330078125, -0.0158843994140625, -0.0267333984375, 0.032196044921875, -0.0147705078125, -0.062347412109375, 0.003246307373046875, 0.01690673828125, 0.00882720947265625, -0.005035400390625, 0.005035400390625, -0.028350830078125, -0.0012483596801757812, 0.03863525390625, 0.01495361328125, -0.05267333984375, -0.0186309814453125, 
0.025115966796875, 0.0042877197265625, 0.01488494873046875, 0.0521240234375, -0.04547119140625, 0.0264739990234375, 0.058868408203125, 0.033355712890625, 0.054351806640625, 0.01080322265625, 0.036712646484375, -0.04150390625, 0.008453369140625, 0.018798828125, 0.04522705078125, 0.0274810791015625, -0.0262603759765625, 0.0318603515625, 0.01335906982421875, -0.0300140380859375, -0.045257568359375, 0.0198516845703125, -0.0836181640625, -0.0226287841796875, 0.0675048828125, 0.0022296905517578125, -0.02081298828125, -0.00530242919921875, -0.04248046875, 0.014434814453125, -0.0364990234375, 0.03582763671875, 0.043670654296875, -0.006809234619140625, -0.023712158203125, -0.0236053466796875, 0.0182952880859375, 0.0382080078125, -0.058746337890625, -0.0217742919921875, 0.03448486328125, 0.02435302734375, 0.025054931640625, 0.03631591796875, -0.00047779083251953125, 0.0281219482421875, 0.0015048980712890625, -0.006740570068359375, -0.007358551025390625, -0.0010366439819335938, -0.021820068359375, 0.0189056396484375, -0.0216064453125, -0.02215576171875 ] ]
SG161222/Realistic_Vision_V1.4
2023-05-02T14:39:13.000Z
[ "diffusers", "stable-diffusion", "text-to-image", "license:creativeml-openrail-m", "endpoints_compatible", "has_space", "diffusers:StableDiffusionPipeline", "region:us" ]
text-to-image
SG161222
null
null
SG161222/Realistic_Vision_V1.4
320
111,740
diffusers
2023-03-05T13:17:54
---
license: creativeml-openrail-m
tags:
- stable-diffusion
- text-to-image
---
<b>Please read this!</b><br>
My model has always been free and always will be free. There are no restrictions on the use of the model. The rights to this model still belong to me.
<hr/>
<b>Important note: "RAW photo" in the prompt may degrade the result.</b>

<b>I use this template to get good generation results:

Prompt:</b>
*subject*, (high detailed skin:1.2), 8k uhd, dslr, soft lighting, high quality, film grain, Fujifilm XT3

<b>Example:</b> a close up portrait photo of 26 y.o woman in wastelander clothes, long haircut, pale skin, slim body, background is city ruins, (high detailed skin:1.2), 8k uhd, dslr, soft lighting, high quality, film grain, Fujifilm XT3

<b>Negative Prompt:</b>
(deformed iris, deformed pupils, semi-realistic, cgi, 3d, render, sketch, cartoon, drawing, anime:1.4), text, close up, cropped, out of frame, worst quality, low quality, jpeg artifacts, ugly, duplicate, morbid, mutilated, extra fingers, mutated hands, poorly drawn hands, poorly drawn face, mutation, deformed, blurry, dehydrated, bad anatomy, bad proportions, extra limbs, cloned face, disfigured, gross proportions, malformed limbs, missing arms, missing legs, extra arms, extra legs, fused fingers, too many fingers, long neck<br>
<b>OR</b><br>
(deformed iris, deformed pupils, semi-realistic, cgi, 3d, render, sketch, cartoon, drawing, anime, mutated hands and fingers:1.4), (deformed, distorted, disfigured:1.3), poorly drawn, bad anatomy, wrong anatomy, extra limb, missing limb, floating limbs, disconnected limbs, mutation, mutated, ugly, disgusting, amputation

<b>Euler A or DPM++ 2M Karras with 25 steps<br>
CFG Scale 3.5 - 7<br>
Hires. fix with Latent upscaler<br>
0 Hires steps and Denoising strength 0.25-0.45<br>
Upscale by 1.1-2.0</b>
1,833
[ [ -0.02777099609375, -0.052032470703125, 0.0247039794921875, 0.01262664794921875, -0.040985107421875, -0.0123748779296875, 0.013031005859375, -0.03973388671875, 0.029327392578125, 0.0465087890625, -0.048858642578125, -0.04205322265625, -0.038543701171875, 0.0021305084228515625, -0.0239715576171875, 0.058349609375, 0.01117706298828125, 0.015899658203125, -0.0003314018249511719, 0.01108551025390625, -0.036285400390625, -0.007488250732421875, -0.0386962890625, -0.00397491455078125, 0.031768798828125, 0.044036865234375, 0.055450439453125, 0.06396484375, 0.01568603515625, 0.015777587890625, -0.01366424560546875, -0.00991058349609375, -0.03875732421875, -0.005096435546875, -0.01010894775390625, -0.023895263671875, -0.0567626953125, 0.0172271728515625, 0.0338134765625, 0.035247802734375, 0.004085540771484375, 0.0167999267578125, -0.003681182861328125, 0.047760009765625, -0.03717041015625, -0.007526397705078125, -0.01435089111328125, 0.01085662841796875, -0.0297088623046875, 0.0007214546203613281, -0.01456451416015625, -0.036163330078125, -0.044097900390625, -0.054718017578125, 0.05126953125, 0.0026378631591796875, 0.0836181640625, 0.0012979507446289062, -0.0221710205078125, 0.00548553466796875, -0.06134033203125, 0.044921875, -0.060638427734375, 0.0153961181640625, 0.0180816650390625, 0.046661376953125, 0.003887176513671875, -0.0716552734375, -0.0565185546875, -0.004985809326171875, 0.002109527587890625, 0.0234832763671875, -0.039031982421875, -0.005756378173828125, 0.049896240234375, 0.061553955078125, -0.06195068359375, 0.001651763916015625, -0.0555419921875, -0.0233612060546875, 0.0697021484375, 0.020263671875, 0.039459228515625, -0.0156707763671875, -0.040283203125, -0.020660400390625, -0.06365966796875, 0.0034942626953125, 0.04583740234375, 0.00970458984375, -0.0333251953125, 0.06988525390625, -0.00804901123046875, 0.04852294921875, 0.0294189453125, -0.01318359375, 0.01690673828125, -0.02593994140625, -0.01100921630859375, -0.022125244140625, 0.061981201171875, 
0.0657958984375, 0.005767822265625, 0.0092620849609375, -0.0025177001953125, -0.0016021728515625, 0.0167388916015625, -0.07745361328125, -0.01351165771484375, 0.01262664794921875, -0.06219482421875, -0.0192718505859375, -0.0189971923828125, -0.08514404296875, -0.008148193359375, -0.0136260986328125, 0.0272064208984375, -0.0155029296875, -0.024261474609375, -0.004695892333984375, -0.016448974609375, 0.0302581787109375, 0.03466796875, -0.06695556640625, 0.0034618377685546875, -0.002597808837890625, 0.04266357421875, 0.024169921875, 0.0188446044921875, -0.00556182861328125, 0.00962066650390625, -0.0406494140625, 0.052581787109375, -0.0071563720703125, -0.034027099609375, -0.0184478759765625, 0.02252197265625, 0.0036449432373046875, -0.03692626953125, 0.050689697265625, -0.0267791748046875, 0.015533447265625, -0.006622314453125, -0.03106689453125, -0.01239776611328125, -0.0162811279296875, -0.0728759765625, 0.032135009765625, 0.036712646484375, -0.0665283203125, 0.032318115234375, -0.03076171875, -0.01556396484375, 0.008819580078125, -0.0013446807861328125, -0.040771484375, 0.0078582763671875, 0.0151824951171875, 0.02978515625, -0.009185791015625, 0.0045928955078125, -0.033355712890625, -0.02606201171875, -0.00101470947265625, -0.0274658203125, 0.062255859375, 0.033660888671875, -0.023681640625, 0.0106658935546875, -0.08251953125, -0.0011196136474609375, 0.04595947265625, -0.0177154541015625, 0.017303466796875, -0.0125579833984375, 0.0311431884765625, 0.035125732421875, 0.019287109375, -0.04510498046875, 0.0195159912109375, -0.02581787109375, -0.0002551078796386719, 0.0428466796875, 0.0230712890625, 0.0209503173828125, -0.044281005859375, 0.06304931640625, 0.0155029296875, 0.00885772705078125, -0.014923095703125, -0.033447265625, -0.0626220703125, -0.01654052734375, 0.0028018951416015625, 0.0180206298828125, -0.054931640625, 0.0263214111328125, 0.0011796951293945312, -0.04779052734375, -0.0272979736328125, -0.004302978515625, 0.033538818359375, 0.042816162109375, 
0.0267486572265625, -0.02947998046875, -0.043304443359375, -0.08721923828125, -0.003246307373046875, 0.002498626708984375, -0.0206298828125, 0.0168609619140625, 0.0265960693359375, 0.00974273681640625, 0.04473876953125, -0.048858642578125, -0.0139923095703125, -0.0277557373046875, -0.0134735107421875, 0.021240234375, 0.056488037109375, 0.061431884765625, -0.05609130859375, -0.0229339599609375, -0.0174560546875, -0.046905517578125, 0.0012664794921875, 0.0032596588134765625, -0.0182037353515625, 0.002452850341796875, 0.021331787109375, -0.04168701171875, 0.053924560546875, 0.02838134765625, -0.04571533203125, 0.065185546875, -0.01336669921875, 0.01387786865234375, -0.09735107421875, 0.034271240234375, 0.01274871826171875, -0.0223846435546875, -0.04095458984375, 0.0306854248046875, -0.00385284423828125, -0.03692626953125, -0.052978515625, 0.05535888671875, -0.0291748046875, 0.0013141632080078125, 0.0024127960205078125, -0.007965087890625, 0.011688232421875, 0.03924560546875, -0.003253936767578125, 0.04052734375, 0.036285400390625, -0.045654296875, 0.03271484375, 0.0287628173828125, -0.0244903564453125, 0.052886962890625, -0.0633544921875, 0.018707275390625, -0.0096435546875, 0.0088958740234375, -0.06201171875, -0.05377197265625, 0.049896240234375, -0.046112060546875, 0.0240325927734375, 0.0027141571044921875, -0.024932861328125, -0.040374755859375, -0.0204315185546875, 0.01517486572265625, 0.0599365234375, -0.0307159423828125, 0.031280517578125, -0.0033473968505859375, 0.0016050338745117188, -0.0171661376953125, -0.053253173828125, -0.005626678466796875, -0.018341064453125, -0.043670654296875, 0.02642822265625, -0.01361083984375, -0.0092926025390625, 0.00951385498046875, -0.00023937225341796875, -0.014404296875, -0.014678955078125, 0.017547607421875, 0.006946563720703125, -0.029052734375, -0.044830322265625, 0.01617431640625, -0.0008044242858886719, -0.00677490234375, 0.015716552734375, 0.04107666015625, 0.025634765625, -0.033538818359375, -0.040618896484375, 
0.0406494140625, 0.059539794921875, -0.009124755859375, 0.018035888671875, 0.06005859375, -0.043365478515625, 0.0126495361328125, -0.0283660888671875, -0.006206512451171875, -0.03009033203125, 0.01459503173828125, -0.0225067138671875, -0.045654296875, 0.056884765625, 0.015533447265625, -0.002872467041015625, 0.07879638671875, 0.04241943359375, -0.01557159423828125, 0.11126708984375, 0.03924560546875, 0.0170135498046875, 0.0259857177734375, -0.049041748046875, -0.004871368408203125, -0.0665283203125, -0.03399658203125, -0.0250701904296875, -0.0391845703125, -0.051361083984375, -0.02642822265625, 0.0198211669921875, 0.0230865478515625, -0.0289764404296875, 0.026092529296875, -0.03375244140625, 0.0188140869140625, 0.032379150390625, 0.033355712890625, 0.00975799560546875, 0.007244110107421875, -0.0197296142578125, -0.01413726806640625, -0.042572021484375, -0.04937744140625, 0.0677490234375, 0.0182952880859375, 0.060699462890625, 0.012451171875, 0.044342041015625, 0.0088958740234375, 0.01367950439453125, -0.044097900390625, 0.053192138671875, -0.0276641845703125, -0.08795166015625, 0.003025054931640625, -0.024078369140625, -0.0780029296875, 0.0039005279541015625, -0.03265380859375, -0.07684326171875, 0.033843994140625, 0.031982421875, -0.0171966552734375, 0.035247802734375, -0.042938232421875, 0.05621337890625, -0.0092620849609375, -0.052154541015625, -0.00766754150390625, -0.054412841796875, 0.04345703125, 0.0061798095703125, 0.003765106201171875, -0.0096282958984375, 0.016510009765625, 0.05389404296875, -0.0357666015625, 0.059112548828125, -0.0252838134765625, 0.01434326171875, 0.036346435546875, 0.01007843017578125, 0.00452423095703125, 0.0039043426513671875, 0.003032684326171875, -0.0104827880859375, -0.009857177734375, -0.0379638671875, -0.0309295654296875, 0.041107177734375, -0.050201416015625, -0.0498046875, -0.02056884765625, -0.0249176025390625, 0.01377105712890625, 0.03668212890625, 0.06610107421875, 0.042938232421875, -0.0167999267578125, 
0.01357269287109375, 0.0211334228515625, -0.0276641845703125, 0.041900634765625, 0.0128936767578125, -0.036407470703125, -0.047882080078125, 0.0634765625, 0.01526641845703125, 0.034210205078125, -0.0060577392578125, 0.01422119140625, -0.02740478515625, -0.0273590087890625, -0.064453125, 0.020965576171875, -0.043670654296875, -0.0258026123046875, -0.034332275390625, -0.01546478271484375, -0.041473388671875, -0.02069091796875, -0.022796630859375, -0.0286102294921875, -0.053436279296875, -0.00278472900390625, 0.0261688232421875, 0.054168701171875, -0.01239013671875, 0.01381683349609375, -0.048492431640625, 0.04296875, 0.0156707763671875, 0.026031494140625, 0.0103607177734375, -0.0277862548828125, -0.015350341796875, 0.01293182373046875, -0.064453125, -0.07598876953125, 0.04974365234375, -0.007720947265625, 0.04376220703125, 0.039947509765625, -0.007282257080078125, 0.0625, -0.0131683349609375, 0.0794677734375, 0.03680419921875, -0.0452880859375, 0.04986572265625, -0.05322265625, 0.0338134765625, 0.02349853515625, 0.0211029052734375, -0.032318115234375, -0.0304718017578125, -0.086181640625, -0.06610107421875, 0.05035400390625, 0.02459716796875, 0.0274200439453125, 0.00341033935546875, 0.03082275390625, 0.004024505615234375, 0.01285552978515625, -0.06707763671875, -0.02947998046875, -0.03802490234375, 0.007183074951171875, -0.005413055419921875, -0.0224609375, 0.0034122467041015625, -0.038909912109375, 0.05413818359375, 0.004085540771484375, 0.041259765625, 0.0121002197265625, 0.0187530517578125, -0.0389404296875, 0.00141143798828125, 0.045806884765625, 0.042755126953125, -0.027587890625, -0.001987457275390625, -0.0137939453125, -0.03369140625, 0.0200958251953125, 0.004062652587890625, -0.032073974609375, 0.00927734375, 0.026519775390625, 0.0614013671875, 0.00074005126953125, -0.0135345458984375, 0.03387451171875, -0.000675201416015625, -0.0218658447265625, -0.00952911376953125, 0.01837158203125, 0.00778961181640625, 0.00412750244140625, 0.032684326171875, 
0.01654052734375, 0.01457977294921875, -0.0297698974609375, 0.01145172119140625, 0.003753662109375, -0.0120391845703125, -0.02581787109375, 0.0718994140625, 0.00005996227264404297, -0.0228729248046875, 0.04443359375, -0.040069580078125, -0.023895263671875, 0.06854248046875, 0.06866455078125, 0.051910400390625, -0.0248260498046875, 0.0186309814453125, 0.0775146484375, 0.007495880126953125, 0.0025119781494140625, 0.040496826171875, 0.0260467529296875, -0.03436279296875, -0.0174713134765625, -0.045806884765625, -0.0187835693359375, 0.06396484375, -0.057891845703125, 0.03399658203125, -0.055755615234375, -0.00992584228515625, -0.0017032623291015625, 0.005542755126953125, -0.045013427734375, 0.059783935546875, 0.006389617919921875, 0.062255859375, -0.07427978515625, 0.031768798828125, 0.0484619140625, -0.0684814453125, -0.0875244140625, -0.01082611083984375, 0.0084686279296875, -0.03692626953125, 0.0245819091796875, 0.01316070556640625, 0.01488494873046875, 0.019134521484375, -0.0718994140625, -0.06353759765625, 0.08172607421875, 0.041473388671875, -0.038665771484375, -0.0133819580078125, -0.02191162109375, 0.04241943359375, -0.0291290283203125, 0.004360198974609375, 0.0126495361328125, 0.03033447265625, 0.032928466796875, -0.04266357421875, 0.01422119140625, -0.0367431640625, 0.0244293212890625, -0.007640838623046875, -0.05352783203125, 0.0653076171875, -0.0250091552734375, -0.042205810546875, 0.034423828125, 0.040069580078125, 0.0268402099609375, 0.04058837890625, 0.03424072265625, 0.05633544921875, 0.01232147216796875, 0.00017905235290527344, 0.0849609375, -0.0160980224609375, 0.0246734619140625, 0.055023193359375, 0.0134124755859375, 0.02642822265625, 0.00989532470703125, -0.01296234130859375, 0.0234222412109375, 0.07391357421875, -0.032257080078125, 0.035552978515625, 0.0208892822265625, -0.005340576171875, -0.01153564453125, -0.007171630859375, -0.036224365234375, 0.0384521484375, 0.01425933837890625, -0.045654296875, -0.0292816162109375, 0.007904052734375, 
0.0018014907836914062, 0.0170745849609375, -0.0212249755859375, 0.046539306640625, -0.00445556640625, -0.02099609375, 0.050994873046875, 0.005016326904296875, 0.0289154052734375, -0.0279083251953125, -0.0209503173828125, -0.0289459228515625, 0.00023889541625976562, -0.032501220703125, -0.04595947265625, 0.036285400390625, 0.0039825439453125, -0.0190277099609375, -0.0171966552734375, 0.059051513671875, -0.014068603515625, -0.0660400390625, 0.0026340484619140625, 0.032440185546875, 0.03533935546875, 0.020660400390625, -0.0548095703125, -0.00008058547973632812, 0.00788116455078125, -0.0224761962890625, 0.00797271728515625, 0.0011081695556640625, -0.001232147216796875, 0.034515380859375, 0.040313720703125, 0.01448822021484375, -0.01549530029296875, 0.005161285400390625, 0.07293701171875, -0.04559326171875, -0.01959228515625, -0.04559326171875, 0.06634521484375, -0.00501251220703125, -0.029266357421875, 0.0435791015625, 0.05517578125, 0.0662841796875, -0.039306640625, 0.044036865234375, -0.031402587890625, 0.0260467529296875, -0.051239013671875, 0.07537841796875, -0.060577392578125, -0.0244140625, -0.03570556640625, -0.08172607421875, -0.0274810791015625, 0.067626953125, -0.0016546249389648438, 0.033599853515625, 0.035064697265625, 0.0567626953125, -0.00740814208984375, -0.0224761962890625, 0.032135009765625, 0.00724029541015625, 0.015960693359375, 0.027984619140625, 0.0537109375, -0.0290679931640625, 0.004199981689453125, -0.04864501953125, -0.01200103759765625, -0.0221405029296875, -0.06488037109375, -0.032623291015625, -0.053314208984375, -0.034637451171875, -0.04168701171875, -0.004184722900390625, 0.05255126953125, 0.06488037109375, -0.04150390625, -0.0023097991943359375, 0.0016422271728515625, -0.0008816719055175781, -0.0175933837890625, -0.021728515625, 0.01235198974609375, 0.03167724609375, -0.06292724609375, 0.00543212890625, -0.00835418701171875, 0.04107666015625, -0.01100921630859375, 0.01183319091796875, -0.0153961181640625, -0.014373779296875, 
0.0282745361328125, 0.029937744140625, -0.046630859375, -0.0205535888671875, -0.004810333251953125, -0.00406646728515625, 0.012359619140625, 0.0389404296875, -0.027587890625, 0.04315185546875, 0.0279388427734375, -0.0133819580078125, 0.04632568359375, 0.01187896728515625, 0.036956787109375, -0.056915283203125, 0.0088653564453125, 0.03387451171875, 0.0257415771484375, 0.03387451171875, -0.0433349609375, 0.0276336669921875, 0.0272216796875, -0.02655029296875, -0.043792724609375, 0.033233642578125, -0.09027099609375, -0.00684356689453125, 0.07763671875, -0.0009298324584960938, -0.01168060302734375, 0.0201873779296875, -0.037567138671875, 0.05596923828125, -0.015838623046875, 0.034942626953125, 0.0259857177734375, -0.0171966552734375, -0.023895263671875, -0.034454345703125, 0.0318603515625, -0.013702392578125, -0.05010986328125, -0.0050811767578125, 0.06866455078125, 0.03179931640625, 0.02490234375, 0.0518798828125, -0.048187255859375, 0.029937744140625, 0.01959228515625, 0.03790283203125, -0.0187835693359375, -0.008087158203125, -0.042266845703125, 0.005039215087890625, 0.00507354736328125, -0.026214599609375 ] ]
guillaumekln/faster-whisper-large-v2
2023-05-12T18:58:25.000Z
[ "ctranslate2", "audio", "automatic-speech-recognition", "en", "zh", "de", "es", "ru", "ko", "fr", "ja", "pt", "tr", "pl", "ca", "nl", "ar", "sv", "it", "id", "hi", "fi", "vi", "he", "uk", "el", "ms", "cs", "ro", "da", "hu", "ta", "no", "th", "ur", "hr", "bg", "lt", "la", "mi", "ml", "cy", "sk", "te", "fa", "lv", "bn", "sr", "az", "sl", "kn", "et", "mk", "br", "eu", "is", "hy", "ne", "mn", "bs", "kk", "sq", "sw", "gl", "mr", "pa", "si", "km", "sn", "yo", "so", "af", "oc", "ka", "be", "tg", "sd", "gu", "am", "yi", "lo", "uz", "fo", "ht", "ps", "tk", "nn", "mt", "sa", "lb", "my", "bo", "tl", "mg", "as", "tt", "haw", "ln", "ha", "ba", "jw", "su", "license:mit", "has_space", "region:us" ]
automatic-speech-recognition
guillaumekln
null
null
guillaumekln/faster-whisper-large-v2
124
111,644
ctranslate2
2023-03-23T10:36:06
---
language:
- en
- zh
- de
- es
- ru
- ko
- fr
- ja
- pt
- tr
- pl
- ca
- nl
- ar
- sv
- it
- id
- hi
- fi
- vi
- he
- uk
- el
- ms
- cs
- ro
- da
- hu
- ta
- 'no'
- th
- ur
- hr
- bg
- lt
- la
- mi
- ml
- cy
- sk
- te
- fa
- lv
- bn
- sr
- az
- sl
- kn
- et
- mk
- br
- eu
- is
- hy
- ne
- mn
- bs
- kk
- sq
- sw
- gl
- mr
- pa
- si
- km
- sn
- yo
- so
- af
- oc
- ka
- be
- tg
- sd
- gu
- am
- yi
- lo
- uz
- fo
- ht
- ps
- tk
- nn
- mt
- sa
- lb
- my
- bo
- tl
- mg
- as
- tt
- haw
- ln
- ha
- ba
- jw
- su
tags:
- audio
- automatic-speech-recognition
license: mit
library_name: ctranslate2
---

# Whisper large-v2 model for CTranslate2

This repository contains the conversion of [openai/whisper-large-v2](https://huggingface.co/openai/whisper-large-v2) to the [CTranslate2](https://github.com/OpenNMT/CTranslate2) model format.

This model can be used in CTranslate2 or projects based on CTranslate2 such as [faster-whisper](https://github.com/guillaumekln/faster-whisper).

## Example

```python
from faster_whisper import WhisperModel

model = WhisperModel("large-v2")

segments, info = model.transcribe("audio.mp3")

for segment in segments:
    print("[%.2fs -> %.2fs] %s" % (segment.start, segment.end, segment.text))
```

## Conversion details

The original model was converted with the following command:

```
ct2-transformers-converter --model openai/whisper-large-v2 --output_dir faster-whisper-large-v2 \
    --copy_files tokenizer.json --quantization float16
```

Note that the model weights are saved in FP16. This type can be changed when the model is loaded using the [`compute_type` option in CTranslate2](https://opennmt.net/CTranslate2/quantization.html).

## More information

**For more information about the original model, see its [model card](https://huggingface.co/openai/whisper-large-v2).**
2,024
[ [ 0.008026123046875, -0.0290679931640625, 0.025726318359375, 0.036865234375, -0.03411865234375, -0.0246734619140625, -0.036468505859375, -0.032867431640625, 0.005451202392578125, 0.061126708984375, -0.035064697265625, -0.038299560546875, -0.044403076171875, -0.0274200439453125, -0.030609130859375, 0.06549072265625, -0.0142364501953125, 0.0183563232421875, 0.0216827392578125, -0.0008754730224609375, -0.027557373046875, -0.0230560302734375, -0.05908203125, -0.0272369384765625, 0.0166015625, 0.0166015625, 0.0440673828125, 0.024932861328125, 0.03424072265625, 0.02325439453125, -0.0250244140625, -0.004985809326171875, -0.0255889892578125, -0.0045623779296875, 0.01532745361328125, -0.042938232421875, -0.050445556640625, 0.0014715194702148438, 0.058074951171875, 0.006290435791015625, -0.0176849365234375, 0.03216552734375, -0.0213165283203125, 0.025421142578125, -0.03521728515625, 0.0273895263671875, -0.0430908203125, 0.0039520263671875, -0.01264190673828125, -0.01849365234375, -0.042144775390625, -0.0235443115234375, 0.03607177734375, -0.06689453125, 0.00345611572265625, -0.004375457763671875, 0.06378173828125, 0.021240234375, -0.04278564453125, -0.019073486328125, -0.0721435546875, 0.0732421875, -0.057525634765625, 0.0171356201171875, 0.02056884765625, 0.041046142578125, 0.0103607177734375, -0.0858154296875, -0.02044677734375, -0.00850677490234375, 0.0176544189453125, 0.0246124267578125, -0.03314208984375, 0.01503753662109375, 0.0182342529296875, 0.0322265625, -0.047393798828125, 0.007221221923828125, -0.059356689453125, -0.034637451171875, 0.039947509765625, 0.0071868896484375, 0.00746917724609375, -0.0111846923828125, -0.0273895263671875, -0.04486083984375, -0.041534423828125, -0.004856109619140625, 0.037628173828125, 0.0122833251953125, -0.04205322265625, 0.05145263671875, 0.002605438232421875, 0.024566650390625, 0.0017690658569335938, -0.02276611328125, 0.035247802734375, -0.0204010009765625, -0.00867462158203125, 0.032745361328125, 0.051177978515625, 
0.0430908203125, 0.006862640380859375, 0.0270843505859375, -0.0230560302734375, -0.01425933837890625, 0.0083160400390625, -0.092529296875, -0.02923583984375, 0.0249176025390625, -0.060394287109375, -0.0183563232421875, -0.0003380775451660156, -0.0266265869140625, -0.0037555694580078125, -0.0182037353515625, 0.05206298828125, -0.03021240234375, -0.045654296875, 0.0287628173828125, -0.0440673828125, 0.0179290771484375, 0.03607177734375, -0.053863525390625, 0.0300140380859375, 0.0469970703125, 0.08221435546875, 0.00937652587890625, -0.0043182373046875, -0.025848388671875, 0.01947021484375, -0.007099151611328125, 0.039459228515625, 0.0023784637451171875, -0.035491943359375, 0.007495880126953125, -0.0043182373046875, 0.0014057159423828125, -0.048126220703125, 0.052093505859375, -0.0223388671875, 0.02490234375, 0.0140838623046875, -0.0206146240234375, -0.0174713134765625, -0.0005674362182617188, -0.050262451171875, 0.06988525390625, 0.034027099609375, -0.05242919921875, -0.01369476318359375, -0.0552978515625, -0.0166015625, -0.01174163818359375, 0.03692626953125, -0.039764404296875, 0.0210113525390625, -0.0098724365234375, 0.0040283203125, -0.035675048828125, 0.020416259765625, -0.014892578125, -0.0285797119140625, 0.0237274169921875, -0.0343017578125, 0.06591796875, 0.0224609375, 0.0010471343994140625, 0.02801513671875, -0.039581298828125, 0.0030231475830078125, 0.00713348388671875, -0.0242919921875, -0.01727294921875, -0.01113128662109375, 0.044586181640625, 0.0011701583862304688, 0.0335693359375, -0.0391845703125, 0.023895263671875, -0.0300140380859375, 0.067626953125, 0.035919189453125, -0.00812530517578125, 0.025299072265625, -0.0265045166015625, 0.0026760101318359375, 0.016265869140625, 0.029876708984375, 0.006496429443359375, -0.03570556640625, -0.052734375, -0.0077667236328125, 0.02935791015625, 0.030609130859375, -0.042236328125, 0.02703857421875, -0.02423095703125, -0.06842041015625, -0.054840087890625, -0.031341552734375, 0.0182342529296875, 
0.01216888427734375, 0.039794921875, -0.002346038818359375, -0.060211181640625, -0.06610107421875, 0.0013608932495117188, -0.0289154052734375, -0.0177459716796875, 0.011474609375, 0.042205810546875, -0.0157623291015625, 0.034942626953125, -0.053619384765625, -0.0379638671875, -0.0175018310546875, 0.0241546630859375, 0.0198822021484375, 0.055755615234375, 0.043853759765625, -0.058380126953125, -0.00919342041015625, -0.01047515869140625, -0.0217437744140625, -0.00043773651123046875, -0.006439208984375, -0.006099700927734375, 0.007503509521484375, 0.0167083740234375, -0.058685302734375, 0.0255126953125, 0.060211181640625, -0.0256805419921875, 0.044891357421875, -0.0068817138671875, -0.00690460205078125, -0.09490966796875, 0.0091552734375, 0.004886627197265625, -0.01004791259765625, -0.040069580078125, 0.0021457672119140625, 0.0160369873046875, 0.0089874267578125, -0.049560546875, 0.048583984375, -0.01116180419921875, -0.00662994384765625, -0.01544189453125, -0.01233673095703125, -0.00942230224609375, 0.0164794921875, 0.028076171875, 0.058258056640625, 0.029266357421875, -0.0280609130859375, 0.023590087890625, 0.043701171875, -0.021820068359375, 0.010040283203125, -0.0828857421875, 0.01220703125, 0.016387939453125, 0.042236328125, -0.044830322265625, -0.01126861572265625, 0.0118255615234375, -0.047698974609375, 0.017059326171875, -0.04498291015625, -0.0426025390625, -0.030792236328125, -0.04083251953125, 0.036041259765625, 0.054840087890625, -0.034423828125, 0.047576904296875, 0.0194854736328125, 0.002590179443359375, -0.00264739990234375, -0.0753173828125, -0.00501251220703125, -0.0133209228515625, -0.0572509765625, 0.04718017578125, -0.0185394287109375, -0.0098114013671875, -0.01509857177734375, -0.00390625, -0.021240234375, -0.01470947265625, 0.032745361328125, 0.0264129638671875, -0.031494140625, -0.01233673095703125, 0.031982421875, -0.0272216796875, 0.01287078857421875, -0.044158935546875, 0.06072998046875, -0.001556396484375, 0.005496978759765625, 
-0.05853271484375, 0.00909423828125, 0.052734375, -0.0238800048828125, 0.03955078125, 0.06427001953125, -0.0235443115234375, -0.010528564453125, -0.037261962890625, -0.01189422607421875, -0.03515625, 0.033203125, -0.02264404296875, -0.06268310546875, 0.0278472900390625, 0.0020313262939453125, 0.00028252601623535156, 0.061126708984375, 0.0433349609375, 0.01216888427734375, 0.083984375, 0.026947021484375, 0.016082763671875, 0.035797119140625, -0.054534912109375, -0.01061248779296875, -0.0908203125, -0.026947021484375, -0.052581787109375, -0.010986328125, -0.0221099853515625, -0.0290679931640625, 0.041351318359375, 0.009185791015625, -0.039764404296875, 0.044219970703125, -0.04803466796875, -0.0032405853271484375, 0.040008544921875, 0.01354217529296875, 0.01546478271484375, 0.0017423629760742188, 0.02899169921875, -0.00656890869140625, -0.0174407958984375, -0.0206146240234375, 0.0867919921875, 0.046417236328125, 0.046539306640625, 0.0273895263671875, 0.04052734375, 0.0022983551025390625, -0.001079559326171875, -0.07012939453125, 0.0208892822265625, -0.020843505859375, -0.039703369140625, -0.010040283203125, -0.0263671875, -0.0516357421875, 0.00395965576171875, -0.004940032958984375, -0.043365478515625, -0.007694244384765625, 0.001445770263671875, -0.006267547607421875, 0.0293731689453125, -0.041748046875, 0.05596923828125, 0.0089874267578125, 0.0188446044921875, -0.022705078125, -0.027618408203125, 0.0435791015625, -0.0010843276977539062, -0.0263671875, 0.011322021484375, -0.00835418701171875, 0.07623291015625, -0.054962158203125, 0.05169677734375, -0.0310211181640625, -0.0216217041015625, 0.04669189453125, 0.0027675628662109375, 0.043853759765625, 0.01000213623046875, -0.01837158203125, 0.033935546875, 0.027557373046875, -0.00313568115234375, -0.0273284912109375, 0.04559326171875, -0.0958251953125, -0.015289306640625, -0.0175323486328125, -0.036834716796875, 0.0284576416015625, 0.00711822509765625, 0.036773681640625, 0.05169677734375, 0.0018053054809570312, 
0.0259552001953125, 0.050537109375, 0.0005574226379394531, 0.0284423828125, 0.043853759765625, -0.014312744140625, -0.055267333984375, 0.0555419921875, 0.0167999267578125, 0.025360107421875, 0.0305023193359375, 0.0279388427734375, -0.0440673828125, -0.0738525390625, -0.0256195068359375, 0.0106201171875, -0.03704833984375, -0.0305023193359375, -0.050048828125, -0.04425048828125, -0.04052734375, 0.0144195556640625, -0.048583984375, -0.049041748046875, -0.03546142578125, 0.023345947265625, 0.053009033203125, 0.0408935546875, 0.0004572868347167969, 0.050140380859375, -0.0745849609375, 0.0182952880859375, 0.006649017333984375, 0.007472991943359375, 0.011383056640625, -0.07794189453125, -0.005329132080078125, 0.006927490234375, -0.027618408203125, -0.048492431640625, 0.029510498046875, -0.0009765625, 0.0019350051879882812, 0.0188140869140625, 0.00679779052734375, 0.05401611328125, -0.018768310546875, 0.072998046875, 0.027984619140625, -0.087890625, 0.041748046875, -0.04010009765625, 0.0177154541015625, 0.0300750732421875, 0.00858306884765625, -0.0443115234375, 0.0004119873046875, -0.0408935546875, -0.060394287109375, 0.05902099609375, 0.02764892578125, 0.0003788471221923828, 0.0213623046875, 0.01194000244140625, 0.00490570068359375, 0.01348114013671875, -0.054107666015625, -0.0176849365234375, -0.040863037109375, -0.023834228515625, 0.006931304931640625, -0.0284423828125, -0.0014009475708007812, -0.01690673828125, 0.05816650390625, -0.0136566162109375, 0.03277587890625, 0.035186767578125, -0.003726959228515625, -0.01042938232421875, -0.0016279220581054688, 0.057159423828125, 0.0125732421875, -0.035675048828125, -0.0040283203125, 0.0109405517578125, -0.060546875, -0.0086517333984375, 0.00688934326171875, -0.021636962890625, 0.00875091552734375, 0.03704833984375, 0.065185546875, 0.0234222412109375, -0.0251617431640625, 0.056793212890625, -0.0068206787109375, -0.038543701171875, -0.0706787109375, 0.0033054351806640625, 0.01715087890625, 0.024749755859375, 
0.0255889892578125, 0.01241302490234375, 0.01837158203125, -0.0323486328125, -0.020904541015625, 0.0124664306640625, -0.03289794921875, -0.034637451171875, 0.055450439453125, 0.01279449462890625, -0.0265045166015625, 0.04986572265625, 0.002681732177734375, -0.0222625732421875, 0.04083251953125, 0.05963134765625, 0.08404541015625, -0.0123748779296875, 0.00826263427734375, 0.04925537109375, 0.018157958984375, -0.0230560302734375, 0.039947509765625, -0.005619049072265625, -0.0198974609375, -0.033599853515625, -0.052276611328125, -0.023284912109375, 0.017852783203125, -0.06561279296875, 0.0302734375, -0.042724609375, -0.01428985595703125, 0.0052947998046875, 0.00974273681640625, -0.03656005859375, -0.00331878662109375, 0.0121612548828125, 0.1014404296875, -0.042572021484375, 0.0828857421875, 0.03582763671875, -0.02490234375, -0.07080078125, -0.0153350830078125, -0.0023517608642578125, -0.050140380859375, 0.039031982421875, 0.01258087158203125, 0.005374908447265625, -0.0019817352294921875, -0.058013916015625, -0.07135009765625, 0.10894775390625, 0.0024242401123046875, -0.0516357421875, -0.01049041748046875, 0.0038604736328125, 0.037628173828125, -0.03167724609375, 0.04693603515625, 0.024932861328125, 0.066650390625, 0.0141754150390625, -0.091064453125, 0.0022220611572265625, -0.0092315673828125, 0.018646240234375, 0.0209808349609375, -0.0716552734375, 0.08587646484375, -0.01343536376953125, 0.0008039474487304688, 0.0697021484375, 0.0545654296875, 0.020050048828125, 0.0235137939453125, 0.03375244140625, 0.031982421875, 0.03204345703125, -0.0230560302734375, 0.047393798828125, -0.005451202392578125, 0.0369873046875, 0.07000732421875, -0.0241851806640625, 0.08221435546875, 0.0263671875, -0.004886627197265625, 0.046417236328125, 0.02496337890625, -0.024749755859375, 0.033599853515625, 0.01070404052734375, -0.0035381317138671875, -0.0112152099609375, -0.002704620361328125, -0.04302978515625, 0.05621337890625, 0.028167724609375, -0.0308380126953125, -0.0033016204833984375, 
-0.01184844970703125, 0.0118255615234375, -0.017608642578125, -0.03326416015625, 0.04876708984375, -0.0079345703125, -0.0345458984375, 0.04949951171875, 0.028076171875, 0.061431884765625, -0.07110595703125, -0.01140594482421875, 0.0252685546875, 0.0181427001953125, -0.0163421630859375, -0.06072998046875, 0.025726318359375, -0.0180816650390625, -0.0226593017578125, 0.005329132080078125, 0.042572021484375, -0.043243408203125, -0.031494140625, 0.0286102294921875, 0.00323486328125, 0.0233612060546875, -0.0283203125, -0.04803466796875, 0.03851318359375, 0.017333984375, -0.0235748291015625, 0.0214691162109375, 0.0014553070068359375, -0.0021877288818359375, 0.033050537109375, 0.06402587890625, 0.0093231201171875, -0.004970550537109375, 0.0060577392578125, 0.0531005859375, -0.049041748046875, -0.039947509765625, -0.02508544921875, 0.037841796875, -0.0107574462890625, -0.058380126953125, 0.034698486328125, 0.0640869140625, 0.04022216796875, -0.022125244140625, 0.044891357421875, 0.005153656005859375, 0.023468017578125, -0.06304931640625, 0.054840087890625, -0.034576416015625, -0.014892578125, -0.00325775146484375, -0.0689697265625, 0.0031280517578125, 0.0233612060546875, -0.002033233642578125, -0.012908935546875, 0.04937744140625, 0.05908203125, -0.005802154541015625, 0.0193939208984375, 0.00307464599609375, 0.046661376953125, 0.0289154052734375, 0.04986572265625, 0.0455322265625, -0.0703125, 0.05279541015625, -0.0144500732421875, -0.005443572998046875, -0.00986480712890625, -0.046966552734375, -0.06640625, -0.050384521484375, -0.034912109375, -0.050048828125, -0.0088653564453125, 0.06683349609375, 0.06744384765625, -0.039947509765625, -0.0223388671875, 0.0102996826171875, -0.01251220703125, 0.00594329833984375, -0.018829345703125, 0.03424072265625, 0.026275634765625, -0.056640625, 0.03582763671875, 0.00884246826171875, 0.032623291015625, -0.027008056640625, -0.02801513671875, 0.0256805419921875, -0.0011425018310546875, 0.039764404296875, 0.00469207763671875, 
-0.060089111328125, -0.019317626953125, -0.025390625, 0.005153656005859375, 0.0057830810546875, 0.0533447265625, -0.0462646484375, 0.002338409423828125, 0.03924560546875, -0.01483154296875, 0.061767578125, -0.031402587890625, 0.0159759521484375, -0.0287017822265625, 0.031982421875, 0.01129913330078125, 0.030242919921875, 0.00937652587890625, -0.006511688232421875, 0.0253143310546875, 0.011810302734375, -0.034576416015625, -0.068603515625, 0.0036525726318359375, -0.1102294921875, -0.008392333984375, 0.09295654296875, 0.00414276123046875, -0.02813720703125, 0.0192108154296875, -0.050384521484375, 0.02545166015625, -0.046051025390625, 0.0141754150390625, 0.011016845703125, 0.03570556640625, -0.0010728836059570312, -0.032928466796875, 0.0255889892578125, -0.004589080810546875, -0.033966064453125, 0.006198883056640625, 0.0174102783203125, 0.033355712890625, 0.03265380859375, 0.032318115234375, -0.035919189453125, 0.031951904296875, 0.0182037353515625, 0.0257415771484375, -0.0274200439453125, -0.0296630859375, -0.025360107421875, 0.002483367919921875, -0.000034928321838378906, -0.0145721435546875 ] ]
oliverguhr/german-sentiment-bert
2023-03-16T18:09:30.000Z
[ "transformers", "pytorch", "tf", "jax", "safetensors", "bert", "text-classification", "sentiment", "de", "license:mit", "endpoints_compatible", "has_space", "region:us" ]
text-classification
oliverguhr
null
null
oliverguhr/german-sentiment-bert
37
111,502
transformers
2022-03-02T23:29:05
--- language: - de tags: - sentiment - bert license: mit widget: - text: "Das ist gar nicht mal so schlecht" metrics: - f1 ---

# German Sentiment Classification with Bert

This model was trained for sentiment classification of German-language texts. To achieve the best results, all model inputs need to be preprocessed with the same procedure that was applied during training. To simplify the usage of the model, we provide a Python package that bundles the code needed for preprocessing and inference.

The model uses Google's BERT architecture and was trained on 1.834 million German-language samples. The training data contains texts from various domains such as Twitter, Facebook, and movie, app, and hotel reviews. You can find more information about the dataset and the training process in the [paper](http://www.lrec-conf.org/proceedings/lrec2020/pdf/2020.lrec-1.202.pdf).

## Using the Python package

To get started, install the package from [pypi](https://pypi.org/project/germansentiment/):

```bash
pip install germansentiment
```

```python
from germansentiment import SentimentModel

model = SentimentModel()

texts = [
    "Mit keinem guten Ergebniss", "Das ist gar nicht mal so gut",
    "Total awesome!", "nicht so schlecht wie erwartet",
    "Der Test verlief positiv.", "Sie fährt ein grünes Auto."]

result = model.predict_sentiment(texts)
print(result)
```

The code above will output the following list:

```python
["negative", "negative", "positive", "positive", "neutral", "neutral"]
```

### Output class probabilities

```python
from germansentiment import SentimentModel

model = SentimentModel()

classes, probabilities = model.predict_sentiment(["das ist super"], output_probabilities=True)
print(classes, probabilities)
```

```python
['positive'] [[['positive', 0.9761366844177246], ['negative', 0.023540444672107697], ['neutral', 0.00032294404809363186]]]
```

## Model and Data

If you are interested in the code and data that were used to train this model, please have a look at [this repository](https://github.com/oliverguhr/german-sentiment) and our [paper](http://www.lrec-conf.org/proceedings/lrec2020/pdf/2020.lrec-1.202.pdf). Here is a table of the F1 scores that this model achieves on different datasets. Since we trained this model with a newer version of the transformers library, the results are slightly better than reported in the paper.

| Dataset | F1 micro Score |
| :----------------------------------------------------------- | -------------: |
| [holidaycheck](https://github.com/oliverguhr/german-sentiment) | 0.9568 |
| [scare](https://www.romanklinger.de/scare/) | 0.9418 |
| [filmstarts](https://github.com/oliverguhr/german-sentiment) | 0.9021 |
| [germeval](https://sites.google.com/view/germeval2017-absa/home) | 0.7536 |
| [PotTS](https://www.aclweb.org/anthology/L16-1181/) | 0.6780 |
| [emotions](https://github.com/oliverguhr/german-sentiment) | 0.9649 |
| [sb10k](https://www.spinningbytes.com/resources/germansentiment/) | 0.7376 |
| [Leipzig Wikipedia Corpus 2016](https://wortschatz.uni-leipzig.de/de/download/german) | 0.9967 |
| all | 0.9639 |

## Cite

For feedback and questions, contact me via mail or Twitter [@oliverguhr](https://twitter.com/oliverguhr). Please cite us if you found this useful:

```
@InProceedings{guhr-EtAl:2020:LREC,
  author    = {Guhr, Oliver and Schumann, Anne-Kathrin and Bahrmann, Frank and Böhme, Hans Joachim},
  title     = {Training a Broad-Coverage German Sentiment Classification Model for Dialog Systems},
  booktitle = {Proceedings of The 12th Language Resources and Evaluation Conference},
  month     = {May},
  year      = {2020},
  address   = {Marseille, France},
  publisher = {European Language Resources Association},
  pages     = {1620--1625},
  url       = {https://www.aclweb.org/anthology/2020.lrec-1.202}
}
```
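The card stresses that inputs must go through the same preprocessing the training data saw. As a rough illustration of what such a step can look like, here is a minimal text-cleaning sketch — note this is NOT the package's actual training-time procedure (the real steps are bundled in `germansentiment` and should be preferred); `clean_text` is a hypothetical stand-in:

```python
import re

# Illustrative only: a minimal cleaning sketch in the spirit of the
# preprocessing described above. The germansentiment package bundles the
# actual procedure used during training; use it rather than re-implementing.

def clean_text(text: str) -> str:
    text = re.sub(r"https?://\S+", "", text)  # drop URLs
    text = re.sub(r"\s+", " ", text)          # collapse runs of whitespace
    return text.strip()

print(clean_text("Das ist   gar nicht mal so schlecht  https://example.com"))
# → Das ist gar nicht mal so schlecht
```

The point of routing all inputs through one such function is that the model only ever sees text distributed like its training data.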
4,081
[ [ -0.04620361328125, -0.040283203125, 0.0186767578125, 0.0122833251953125, -0.01473236083984375, -0.010223388671875, -0.02630615234375, -0.0316162109375, 0.00835418701171875, -0.00827789306640625, -0.04119873046875, -0.061431884765625, -0.037841796875, -0.006256103515625, -0.01480865478515625, 0.1142578125, 0.004261016845703125, 0.042388916015625, -0.0167388916015625, -0.0141143798828125, 0.018646240234375, -0.06683349609375, -0.0369873046875, -0.047149658203125, 0.0411376953125, 0.007549285888671875, 0.033111572265625, -0.0055694580078125, 0.04327392578125, 0.0209503173828125, -0.0225372314453125, -0.021270751953125, -0.0222625732421875, -0.00713348388671875, 0.010894775390625, -0.016815185546875, -0.03936767578125, 0.0196685791015625, 0.0323486328125, 0.030914306640625, -0.00019085407257080078, 0.01467132568359375, 0.0161590576171875, 0.032257080078125, -0.0199737548828125, 0.0281524658203125, -0.013427734375, 0.004852294921875, 0.004344940185546875, 0.007678985595703125, -0.03839111328125, -0.042022705078125, 0.0224456787109375, 0.0007462501525878906, 0.0251617431640625, -0.01509857177734375, 0.094970703125, -0.00733184814453125, -0.0239105224609375, -0.0220489501953125, -0.03692626953125, 0.069091796875, -0.075439453125, 0.0193328857421875, -0.0011835098266601562, 0.0034198760986328125, 0.00966644287109375, -0.03802490234375, -0.04132080078125, -0.0154876708984375, -0.00679779052734375, 0.0350341796875, -0.0240478515625, -0.00458526611328125, 0.00970458984375, 0.045867919921875, -0.035125732421875, -0.004482269287109375, -0.014801025390625, -0.0172576904296875, 0.055755615234375, 0.0024547576904296875, -0.00833892822265625, -0.0311279296875, -0.02301025390625, -0.040740966796875, -0.01113128662109375, 0.033355712890625, 0.0297698974609375, 0.024078369140625, -0.029205322265625, 0.024139404296875, -0.0116119384765625, 0.0340576171875, 0.000705718994140625, -0.01235198974609375, 0.061920166015625, -0.007289886474609375, -0.01389312744140625, -0.017364501953125, 
0.0775146484375, 0.04473876953125, 0.037750244140625, 0.00997161865234375, -0.02972412109375, 0.015655517578125, 0.0014858245849609375, -0.07232666015625, -0.0227813720703125, 0.0189056396484375, -0.044677734375, -0.051361083984375, 0.0220184326171875, -0.05950927734375, -0.0114288330078125, -0.01369476318359375, 0.027252197265625, -0.042816162109375, -0.01544189453125, 0.007007598876953125, -0.0205078125, 0.026458740234375, 0.0193634033203125, -0.042388916015625, -0.004058837890625, 0.024688720703125, 0.051971435546875, 0.00711822509765625, -0.01271820068359375, -0.004608154296875, -0.040008544921875, -0.0163421630859375, 0.056243896484375, -0.0206756591796875, -0.034820556640625, 0.02178955078125, 0.0105743408203125, 0.007843017578125, -0.0208740234375, 0.0579833984375, -0.042022705078125, 0.059326171875, 0.002506256103515625, -0.059722900390625, -0.033355712890625, 0.0165557861328125, -0.030975341796875, 0.0751953125, 0.01102447509765625, -0.07171630859375, 0.0065155029296875, -0.051422119140625, -0.038238525390625, -0.02606201171875, 0.01380157470703125, -0.04254150390625, 0.007228851318359375, 0.01214599609375, 0.0550537109375, -0.007762908935546875, 0.022674560546875, -0.038665771484375, -0.00952911376953125, 0.03900146484375, -0.02911376953125, 0.0828857421875, 0.0178680419921875, -0.022247314453125, 0.0030040740966796875, -0.0552978515625, 0.01116180419921875, 0.0097503662109375, -0.037261962890625, -0.009246826171875, -0.0068359375, 0.02911376953125, 0.0338134765625, 0.02313232421875, -0.041259765625, 0.0011005401611328125, -0.032196044921875, 0.020172119140625, 0.0640869140625, 0.0037555694580078125, 0.03875732421875, -0.0224456787109375, 0.034271240234375, 0.0171051025390625, 0.01806640625, -0.000629425048828125, -0.04156494140625, -0.059661865234375, -0.0199737548828125, 0.031341552734375, 0.057861328125, -0.032318115234375, 0.0643310546875, -0.0211944580078125, -0.07196044921875, -0.039642333984375, -0.009490966796875, 0.0215606689453125, 
0.048065185546875, 0.0228271484375, -0.0065155029296875, -0.041748046875, -0.0635986328125, -0.007083892822265625, -0.0257110595703125, -0.001728057861328125, 0.0161285400390625, 0.049407958984375, -0.034942626953125, 0.06390380859375, -0.049285888671875, -0.01556396484375, -0.018585205078125, 0.0215606689453125, 0.04864501953125, 0.033355712890625, 0.044830322265625, -0.048095703125, -0.04742431640625, -0.009796142578125, -0.062744140625, 0.0007920265197753906, 0.01551055908203125, 0.0009927749633789062, 0.04638671875, 0.01476287841796875, -0.037506103515625, 0.00931549072265625, 0.032379150390625, -0.032470703125, 0.038665771484375, 0.007602691650390625, -0.00010436773300170898, -0.08624267578125, 0.001918792724609375, 0.033599853515625, -0.018646240234375, -0.05303955078125, -0.005390167236328125, -0.006618499755859375, 0.00838470458984375, -0.037109375, 0.04833984375, 0.00473785400390625, 0.032318115234375, 0.006969451904296875, -0.019378662109375, -0.0047149658203125, 0.05279541015625, 0.0096588134765625, 0.03912353515625, 0.040618896484375, -0.0234527587890625, 0.02728271484375, 0.0198211669921875, -0.035491943359375, 0.03314208984375, -0.035888671875, 0.0018358230590820312, -0.015625, 0.0257415771484375, -0.0748291015625, -0.0157318115234375, 0.02606201171875, -0.05657958984375, 0.02484130859375, 0.0051422119140625, -0.0404052734375, -0.028594970703125, -0.041259765625, -0.0089874267578125, 0.068115234375, -0.0281524658203125, 0.04150390625, 0.021697998046875, -0.01387786865234375, -0.04119873046875, -0.054229736328125, -0.027587890625, -0.0255889892578125, -0.04364013671875, 0.00901031494140625, -0.0139007568359375, -0.0187530517578125, 0.002010345458984375, -0.001918792724609375, 0.000053882598876953125, -0.0091552734375, 0.0193328857421875, 0.04022216796875, -0.00875091552734375, 0.0255279541015625, -0.01515960693359375, -0.00623321533203125, 0.0135955810546875, -0.0030956268310546875, 0.054595947265625, -0.05511474609375, 0.003696441650390625, 
-0.0338134765625, 0.0218658447265625, 0.042999267578125, 0.004791259765625, 0.053497314453125, 0.071044921875, -0.0191192626953125, -0.00513458251953125, -0.032745361328125, -0.02545166015625, -0.032989501953125, 0.027069091796875, -0.0199127197265625, -0.0684814453125, 0.046722412109375, 0.006237030029296875, -0.00145721435546875, 0.061248779296875, 0.04119873046875, -0.044342041015625, 0.0888671875, 0.0477294921875, -0.0364990234375, 0.047027587890625, -0.041229248046875, 0.0211029052734375, -0.027801513671875, -0.00917816162109375, -0.04144287109375, -0.0147552490234375, -0.06048583984375, -0.01129150390625, 0.027557373046875, 0.006160736083984375, -0.031829833984375, 0.0052947998046875, -0.032684326171875, 0.0066680908203125, 0.04791259765625, 0.012969970703125, 0.001781463623046875, 0.0257415771484375, -0.006435394287109375, -0.01324462890625, -0.05072021484375, -0.051422119140625, 0.07086181640625, 0.049713134765625, 0.05511474609375, -0.01160430908203125, 0.0684814453125, 0.02923583984375, 0.040496826171875, -0.07025146484375, 0.052581787109375, -0.0290069580078125, -0.05096435546875, -0.011627197265625, -0.0318603515625, -0.03924560546875, 0.0014753341674804688, -0.01117706298828125, -0.0423583984375, 0.0187530517578125, -0.00002181529998779297, -0.018768310546875, 0.0189666748046875, -0.05694580078125, 0.06646728515625, -0.007511138916015625, -0.00750732421875, -0.0223236083984375, -0.048187255859375, 0.00505828857421875, 0.004970550537109375, 0.0279693603515625, -0.00675201416015625, 0.0207672119140625, 0.0643310546875, -0.0158233642578125, 0.08209228515625, -0.027801513671875, -0.017333984375, 0.0235443115234375, -0.016265869140625, 0.035247802734375, -0.006153106689453125, -0.0198516845703125, 0.040313720703125, -0.007343292236328125, -0.0229949951171875, -0.019927978515625, 0.0479736328125, -0.07623291015625, -0.0274200439453125, -0.05517578125, -0.0278778076171875, -0.0111846923828125, 0.01080322265625, 0.042938232421875, 0.0318603515625, 
-0.020050048828125, 0.0184173583984375, 0.047637939453125, -0.028594970703125, 0.037384033203125, 0.03717041015625, -0.0224456787109375, -0.040191650390625, 0.07208251953125, 0.00689697265625, 0.0081939697265625, 0.0301361083984375, 0.01279449462890625, -0.027984619140625, -0.00870513916015625, -0.0074462890625, 0.020660400390625, -0.07855224609375, -0.0218963623046875, -0.038299560546875, -0.040679931640625, -0.042724609375, -0.0184173583984375, -0.005401611328125, -0.0306549072265625, -0.02899169921875, -0.026214599609375, 0.036895751953125, 0.03814697265625, -0.01548004150390625, 0.0273895263671875, -0.05230712890625, -0.007289886474609375, 0.01554107666015625, 0.026123046875, -0.0011701583862304688, -0.049224853515625, -0.0341796875, 0.0014123916625976562, -0.034820556640625, -0.0650634765625, 0.040283203125, 0.0111846923828125, 0.02227783203125, 0.045074462890625, 0.006084442138671875, 0.0240020751953125, -0.02386474609375, 0.06988525390625, 0.0175628662109375, -0.06658935546875, 0.04571533203125, -0.0289154052734375, 0.0235137939453125, 0.03253173828125, 0.041656494140625, -0.039306640625, -0.03814697265625, -0.07159423828125, -0.06280517578125, 0.0648193359375, -0.0029468536376953125, 0.017974853515625, -0.00079345703125, 0.0002675056457519531, 0.00018274784088134766, 0.023284912109375, -0.0841064453125, -0.0290679931640625, -0.0284576416015625, -0.038055419921875, -0.032989501953125, -0.0247650146484375, -0.01502227783203125, -0.046905517578125, 0.08062744140625, 0.0050811767578125, 0.047332763671875, 0.01849365234375, 0.0007185935974121094, -0.002132415771484375, 0.028472900390625, 0.02960205078125, 0.00799560546875, -0.04498291015625, -0.0005879402160644531, 0.0279083251953125, -0.0264129638671875, 0.01384735107421875, 0.015655517578125, -0.023101806640625, 0.0230865478515625, 0.036041259765625, 0.0679931640625, -0.00786590576171875, -0.0325927734375, 0.0479736328125, -0.0014934539794921875, -0.029876708984375, -0.048431396484375, 0.00006365776062011719, 
0.0013895034790039062, 0.0224151611328125, 0.023773193359375, 0.02044677734375, 0.0078887939453125, -0.0301361083984375, 0.00592803955078125, 0.0322265625, -0.041229248046875, -0.033172607421875, 0.02276611328125, 0.01525115966796875, -0.00986480712890625, 0.04638671875, -0.027587890625, -0.059326171875, 0.037506103515625, 0.020172119140625, 0.0772705078125, -0.01104736328125, 0.037933349609375, 0.042694091796875, 0.01605224609375, -0.0008625984191894531, 0.0380859375, 0.00970458984375, -0.0697021484375, -0.01468658447265625, -0.060272216796875, -0.0161285400390625, 0.0199127197265625, -0.05474853515625, 0.0154876708984375, -0.028228759765625, -0.03472900390625, -0.001728057861328125, 0.008758544921875, -0.0546875, 0.031890869140625, 0.027069091796875, 0.09051513671875, -0.08062744140625, 0.06744384765625, 0.07513427734375, -0.041259765625, -0.054046630859375, 0.006561279296875, -0.0096893310546875, -0.032928466796875, 0.035125732421875, 0.024749755859375, -0.0047454833984375, -0.0013637542724609375, -0.046630859375, -0.04205322265625, 0.057769775390625, -0.00479888916015625, -0.0284881591796875, 0.001102447509765625, 0.0070953369140625, 0.068359375, -0.0274658203125, 0.0238189697265625, 0.031829833984375, 0.038055419921875, -0.01396942138671875, -0.039276123046875, -0.01018524169921875, -0.0391845703125, -0.0110321044921875, -0.00284576416015625, -0.054168701171875, 0.06109619140625, 0.0007596015930175781, -0.003200531005859375, -0.007843017578125, 0.050201416015625, 0.004497528076171875, 0.02716064453125, 0.047027587890625, 0.056610107421875, 0.051422119140625, -0.0283660888671875, 0.09112548828125, -0.027496337890625, 0.054168701171875, 0.057861328125, -0.002880096435546875, 0.07275390625, 0.0243988037109375, -0.035675048828125, 0.06298828125, 0.057525634765625, -0.017181396484375, 0.05712890625, -0.003719329833984375, -0.027679443359375, -0.0257415771484375, 0.004802703857421875, -0.03143310546875, 0.01953125, 0.01433563232421875, -0.03436279296875, 
-0.019195556640625, 0.01258087158203125, 0.020538330078125, -0.021270751953125, -0.01245880126953125, 0.0587158203125, 0.01471710205078125, -0.04541015625, 0.047027587890625, 0.0011768341064453125, 0.0631103515625, -0.03961181640625, 0.025482177734375, -0.0214080810546875, 0.032989501953125, -0.0232696533203125, -0.0687255859375, 0.006298065185546875, 0.01702880859375, -0.0338134765625, -0.025726318359375, 0.054351806640625, -0.020111083984375, -0.058502197265625, 0.03570556640625, 0.034210205078125, 0.01178741455078125, 0.006561279296875, -0.061309814453125, -0.016021728515625, 0.0256805419921875, -0.048583984375, 0.0200653076171875, 0.0222625732421875, 0.01067352294921875, 0.02874755859375, 0.03717041015625, -0.005340576171875, -0.01438140869140625, 0.0194091796875, 0.06671142578125, -0.056243896484375, -0.041961669921875, -0.06597900390625, 0.06378173828125, -0.012420654296875, -0.0244903564453125, 0.061492919921875, 0.04339599609375, 0.074951171875, -0.037628173828125, 0.0736083984375, -0.0189056396484375, 0.051605224609375, -0.01007843017578125, 0.04571533203125, -0.044281005859375, -0.002838134765625, -0.024322509765625, -0.060821533203125, -0.01557159423828125, 0.06256103515625, -0.04107666015625, 0.0076751708984375, 0.0361328125, 0.05987548828125, 0.01111602783203125, 0.0041656494140625, -0.0005044937133789062, 0.0340576171875, 0.017242431640625, 0.0277099609375, 0.060028076171875, -0.034515380859375, 0.04022216796875, -0.0390625, -0.0241241455078125, -0.0252532958984375, -0.053741455078125, -0.0677490234375, -0.048614501953125, -0.029022216796875, -0.0509033203125, -0.01445770263671875, 0.0697021484375, 0.03314208984375, -0.086669921875, -0.02484130859375, 0.0022907257080078125, -0.00792694091796875, -0.01195526123046875, -0.027496337890625, 0.02813720703125, -0.038909912109375, -0.062103271484375, -0.003826141357421875, -0.0120849609375, 0.017120361328125, -0.0068359375, -0.0179290771484375, -0.00531005859375, 0.00443267822265625, 0.045745849609375, 
-0.0017185211181640625, -0.048614501953125, -0.02288818359375, 0.00830841064453125, -0.026519775390625, 0.0110931396484375, 0.01885986328125, -0.042755126953125, 0.0100250244140625, 0.050140380859375, 0.020721435546875, 0.0310516357421875, -0.0106353759765625, 0.02423095703125, -0.05523681640625, 0.00634002685546875, 0.0230865478515625, 0.0396728515625, 0.025390625, -0.014862060546875, 0.037078857421875, 0.01593017578125, -0.0484619140625, -0.058197021484375, -0.00466156005859375, -0.078125, -0.026153564453125, 0.08660888671875, -0.018585205078125, -0.00955963134765625, 0.0292816162109375, -0.017303466796875, 0.0280303955078125, -0.05279541015625, 0.061492919921875, 0.08477783203125, -0.006847381591796875, 0.00016415119171142578, -0.0278778076171875, 0.0406494140625, 0.0303192138671875, -0.045562744140625, -0.01419830322265625, 0.031951904296875, 0.0243377685546875, 0.027984619140625, 0.036773681640625, -0.00543212890625, 0.01473236083984375, -0.0166015625, 0.037689208984375, 0.0269927978515625, -0.00737762451171875, -0.0555419921875, 0.019989013671875, -0.01236724853515625, -0.00617218017578125 ] ]
hotshotco/Hotshot-XL
2023-10-11T17:44:31.000Z
[ "diffusers", "text-to-video", "stable-diffusion", "license:openrail++", "diffusers:HotshotXLPipeline", "region:us" ]
text-to-video
hotshotco
null
null
hotshotco/Hotshot-XL
149
111,340
diffusers
2023-10-03T08:20:30
--- license: openrail++ tags: - text-to-video - stable-diffusion --- ![image/gif](https://cdn-uploads.huggingface.co/production/uploads/637a6daf7ce76c3b83497ea2/ux_sZKB9snVPsKRT1TzfG.gif) <font size="32">**Try Hotshot-XL yourself here**: https://www.hotshot.co</font> Hotshot-XL is an AI text-to-GIF model trained to work alongside [Stable Diffusion XL](https://huggingface.co/stabilityai/stable-diffusion-xl-base-1.0). Hotshot-XL can generate GIFs with any fine-tuned SDXL model. This means two things: 1. You’ll be able to make GIFs with any existing or newly fine-tuned SDXL model you may want to use. 2. If you'd like to make GIFs of personalized subjects, you can load your own SDXL based LORAs, and not have to worry about fine-tuning Hotshot-XL. This is awesome because it’s usually much easier to find suitable images for training data than it is to find videos. It also hopefully fits into everyone's existing LORA usage/workflows :) See more [here](https://github.com/hotshotco/Hotshot-XL/blob/main/README.md#text-to-gif-with-personalized-loras). Hotshot-XL is compatible with SDXL ControlNet to make GIFs in the composition/layout you’d like. See [here](https://github.com/hotshotco/Hotshot-XL/blob/main/README.md#text-to-gif-with-controlnet) for more info. Hotshot-XL was trained to generate 1 second GIFs at 8 FPS. Hotshot-XL was trained on various aspect ratios. For best results with the base Hotshot-XL model, we recommend using it with an SDXL model that has been fine-tuned with 512x512 images. You can find an SDXL model we fine-tuned for 512x512 resolutions [here](https://github.com/hotshotco/Hotshot-XL/blob/main/README.md#text-to-gif-with-personalized-loras). ![image/gif](https://cdn-uploads.huggingface.co/production/uploads/637a6daf7ce76c3b83497ea2/XXgnk14nIasPdkvkPlDzn.gif) ![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/637a6daf7ce76c3b83497ea2/6OknWOlsl9Zs_esGtPTlZ.jpeg) Source code is available at https://github.com/hotshotco/Hotshot-XL. 
# Model Description - **Developed by**: Natural Synthetics Inc. - **Model type**: Diffusion-based text-to-GIF generative model - **License**: [CreativeML Open RAIL++-M License](https://huggingface.co/hotshotco/Hotshot-XL/raw/main/LICENSE.md) - **Model Description**: This is a model that can be used to generate and modify GIFs based on text prompts. It is a Latent Diffusion Model that uses two fixed, pretrained text encoders (OpenCLIP-ViT/G and CLIP-ViT/L). - **Resources for more information**: Check out our [GitHub Repository](https://github.com/hotshotco/Hotshot-XL). # Limitations and Bias ## Limitations - The model does not achieve perfect photorealism - The model cannot render legible text - The model struggles with more difficult tasks which involve compositionality, such as rendering an image corresponding to “A red cube on top of a blue sphere” - Faces and people in general may not be generated properly. ## Bias While the capabilities of video generation models are impressive, they can also reinforce or exacerbate social biases.
3,061
[ [ -0.0384521484375, -0.08282470703125, 0.034088134765625, 0.01198577880859375, -0.0240020751953125, -0.005062103271484375, -0.01934814453125, -0.0178375244140625, 0.031982421875, 0.0172271728515625, -0.041229248046875, -0.05950927734375, -0.064453125, -0.0140228271484375, -0.050567626953125, 0.100341796875, -0.0140380859375, -0.0272674560546875, -0.006793975830078125, -0.01378631591796875, -0.0030517578125, -0.001056671142578125, -0.04150390625, -0.0257568359375, 0.05010986328125, 0.00036716461181640625, 0.0758056640625, 0.0303802490234375, 0.0411376953125, 0.020751953125, -0.021148681640625, -0.010101318359375, -0.02642822265625, 0.006114959716796875, -0.0012178421020507812, -0.002490997314453125, -0.029510498046875, -0.01154327392578125, 0.046844482421875, 0.0122222900390625, -0.024444580078125, 0.0022869110107421875, -0.0083770751953125, 0.0692138671875, -0.050933837890625, -0.024139404296875, -0.0279541015625, 0.0285491943359375, -0.00807952880859375, 0.005939483642578125, 0.0154571533203125, -0.024078369140625, 0.0186920166015625, -0.050384521484375, 0.0236053466796875, -0.02423095703125, 0.08062744140625, 0.0191192626953125, -0.0157318115234375, -0.007297515869140625, -0.0640869140625, 0.004138946533203125, -0.054229736328125, -0.0005164146423339844, 0.006778717041015625, 0.039886474609375, 0.0167236328125, -0.0479736328125, -0.04107666015625, 0.010955810546875, 0.025726318359375, 0.0034084320068359375, -0.041473388671875, -0.011871337890625, 0.032623291015625, 0.0248565673828125, -0.0660400390625, -0.01953125, -0.0191497802734375, 0.0151214599609375, 0.042022705078125, 0.0012159347534179688, 0.0294952392578125, 0.0009889602661132812, -0.016082763671875, -0.0131378173828125, -0.030548095703125, -0.0078582763671875, 0.03900146484375, -0.0016689300537109375, -0.04974365234375, 0.053009033203125, 0.006664276123046875, 0.038482666015625, 0.0242462158203125, -0.004405975341796875, 0.032958984375, -0.01444244384765625, -0.034515380859375, -0.0187225341796875, 
0.09100341796875, 0.03778076171875, 0.01300048828125, 0.0163726806640625, -0.012786865234375, 0.0019083023071289062, -0.00345611572265625, -0.07159423828125, -0.00007903575897216797, 0.017120361328125, -0.00809478759765625, -0.036651611328125, 0.004077911376953125, -0.05499267578125, -0.008331298828125, 0.0161895751953125, 0.0369873046875, -0.0438232421875, -0.0411376953125, 0.01702880859375, -0.040252685546875, -0.005279541015625, 0.02874755859375, -0.041839599609375, -0.01096343994140625, 0.005939483642578125, 0.088623046875, -0.006931304931640625, -0.0081787109375, -0.0186309814453125, 0.01776123046875, -0.00591278076171875, 0.0758056640625, -0.0306243896484375, -0.025909423828125, 0.0198211669921875, 0.002147674560546875, 0.00811767578125, -0.032867431640625, 0.0143280029296875, -0.03802490234375, 0.012420654296875, 0.0007748603820800781, -0.035797119140625, -0.002086639404296875, 0.004131317138671875, -0.042327880859375, 0.0645751953125, 0.0174407958984375, -0.0697021484375, 0.01474761962890625, -0.075927734375, -0.0196380615234375, 0.0030517578125, -0.01042938232421875, -0.050628662109375, -0.016998291015625, -0.0031604766845703125, 0.047088623046875, 0.00904083251953125, 0.0114593505859375, -0.03265380859375, -0.0191497802734375, 0.010650634765625, -0.01111602783203125, 0.03076171875, 0.025848388671875, -0.035980224609375, 0.0110931396484375, -0.0445556640625, -0.00215911865234375, 0.0386962890625, -0.034454345703125, -0.01442718505859375, -0.028045654296875, 0.004123687744140625, 0.015594482421875, 0.0029506683349609375, -0.0198516845703125, 0.00421142578125, -0.01495361328125, 0.04229736328125, 0.039886474609375, 0.01111602783203125, 0.05523681640625, -0.00799560546875, 0.054779052734375, 0.03521728515625, -0.0013093948364257812, -0.031402587890625, -0.037017822265625, -0.033599853515625, 0.0008177757263183594, -0.0035762786865234375, 0.023040771484375, -0.08050537109375, 0.0300750732421875, -0.00936126708984375, -0.036041259765625, -0.017364501953125, 
0.0303802490234375, 0.0249481201171875, 0.032012939453125, 0.0031986236572265625, -0.0213775634765625, -0.0172271728515625, -0.044281005859375, 0.018829345703125, 0.0027332305908203125, 0.009796142578125, 0.012298583984375, 0.0496826171875, -0.0137939453125, 0.06805419921875, -0.05853271484375, -0.02496337890625, -0.01666259765625, -0.0033931732177734375, 0.0193939208984375, 0.02783203125, 0.0792236328125, -0.059417724609375, -0.053436279296875, 0.002933502197265625, -0.07391357421875, 0.01238250732421875, 0.0192108154296875, -0.0224151611328125, 0.019317626953125, 0.0243988037109375, -0.0750732421875, 0.043121337890625, 0.04437255859375, -0.04364013671875, 0.03936767578125, -0.0299072265625, 0.01525115966796875, -0.0767822265625, 0.014862060546875, 0.045562744140625, -0.0379638671875, -0.028961181640625, 0.014373779296875, -0.007625579833984375, -0.033355712890625, -0.0701904296875, 0.046173095703125, -0.0301361083984375, -0.006046295166015625, -0.028228759765625, 0.0168304443359375, 0.00586700439453125, 0.030975341796875, 0.01485443115234375, 0.054931640625, 0.07073974609375, -0.044281005859375, 0.02471923828125, 0.01427459716796875, -0.0384521484375, 0.058258056640625, -0.07403564453125, 0.0028934478759765625, -0.04010009765625, 0.029266357421875, -0.051513671875, -0.02545166015625, 0.03656005859375, -0.0372314453125, 0.003314971923828125, -0.0253143310546875, -0.03863525390625, -0.036590576171875, -0.0268096923828125, 0.008544921875, 0.07257080078125, -0.0226898193359375, 0.0251617431640625, 0.0185699462890625, 0.0152130126953125, -0.039581298828125, -0.053314208984375, -0.00902557373046875, -0.03619384765625, -0.064208984375, 0.0408935546875, -0.0267181396484375, -0.0144195556640625, 0.0183258056640625, 0.0247650146484375, -0.00347900390625, -0.0162506103515625, 0.042083740234375, 0.019683837890625, -0.027984619140625, -0.0033550262451171875, 0.01424407958984375, 0.0089111328125, -0.01568603515625, -0.0054931640625, 0.02142333984375, -0.007358551025390625, 
-0.0098724365234375, -0.039276123046875, 0.044036865234375, 0.04608154296875, 0.01071929931640625, 0.0654296875, 0.08697509765625, -0.0299072265625, -0.0015935897827148438, -0.0195770263671875, -0.0014591217041015625, -0.03961181640625, 0.0184478759765625, -0.0018901824951171875, -0.054229736328125, 0.037384033203125, 0.0240936279296875, 0.01126861572265625, 0.02923583984375, 0.053070068359375, -0.0213165283203125, 0.067626953125, 0.044036865234375, 0.0095062255859375, 0.0709228515625, -0.0562744140625, -0.01233673095703125, -0.048553466796875, -0.003032684326171875, -0.008148193359375, -0.01385498046875, -0.002872467041015625, -0.0352783203125, 0.03216552734375, 0.0027923583984375, -0.039825439453125, 0.035614013671875, -0.030975341796875, 0.042327880859375, -0.0015926361083984375, 0.0203094482421875, 0.0078277587890625, 0.023468017578125, -0.0016193389892578125, 0.0016889572143554688, -0.037384033203125, -0.03515625, 0.053375244140625, 0.0266571044921875, 0.06927490234375, 0.0224761962890625, 0.040252685546875, 0.04107666015625, 0.01629638671875, -0.031829833984375, 0.051849365234375, -0.0215301513671875, -0.04315185546875, -0.00960540771484375, -0.0184783935546875, -0.06488037109375, -0.0021724700927734375, -0.038238525390625, -0.029052734375, -0.02349853515625, 0.024871826171875, -0.046142578125, 0.03814697265625, -0.061370849609375, 0.047760009765625, 0.004425048828125, -0.051055908203125, 0.0108795166015625, -0.044281005859375, 0.0235595703125, 0.0153656005859375, 0.005489349365234375, -0.036529541015625, -0.006877899169921875, 0.051971435546875, -0.045989990234375, 0.06329345703125, -0.037139892578125, -0.024871826171875, 0.034210205078125, -0.0110626220703125, 0.0256805419921875, 0.0099945068359375, -0.0164337158203125, 0.0008015632629394531, -0.01100921630859375, -0.01611328125, -0.055999755859375, 0.07183837890625, -0.06549072265625, -0.0380859375, -0.0171661376953125, -0.003032684326171875, 0.0212249755859375, 0.0035533905029296875, 0.04632568359375, 
0.0149993896484375, -0.0236663818359375, -0.00484466552734375, 0.055633544921875, -0.015655517578125, 0.053558349609375, 0.03082275390625, -0.03271484375, -0.0386962890625, 0.0469970703125, 0.0008492469787597656, 0.033355712890625, 0.011932373046875, 0.01470184326171875, -0.00827789306640625, -0.02001953125, -0.04595947265625, 0.0236663818359375, -0.06732177734375, -0.0303802490234375, -0.0604248046875, -0.0352783203125, -0.045684814453125, -0.039703369140625, -0.033355712890625, -0.01166534423828125, -0.069091796875, -0.0005898475646972656, 0.06182861328125, 0.039642333984375, 0.00011116266250610352, 0.0285491943359375, -0.052581787109375, 0.0236053466796875, 0.0220489501953125, 0.0120697021484375, 0.0011920928955078125, -0.05670166015625, -0.01107025146484375, 0.006717681884765625, -0.045867919921875, -0.045684814453125, 0.0413818359375, 0.028076171875, 0.01983642578125, 0.034912109375, -0.0087127685546875, 0.054840087890625, -0.03936767578125, 0.0770263671875, 0.047515869140625, -0.044097900390625, 0.06732177734375, -0.038421630859375, 0.03900146484375, 0.01377105712890625, 0.037078857421875, -0.0301666259765625, -0.026519775390625, -0.056243896484375, -0.0606689453125, 0.0232391357421875, 0.025848388671875, 0.0455322265625, -0.00472259521484375, 0.04638671875, 0.0025463104248046875, -0.003711700439453125, -0.05908203125, -0.03253173828125, -0.0207977294921875, -0.00609588623046875, 0.0003044605255126953, 0.007595062255859375, 0.0013866424560546875, -0.0171966552734375, 0.04534912109375, -0.0237884521484375, 0.04290771484375, 0.0224761962890625, 0.0053253173828125, -0.0268707275390625, -0.0004911422729492188, 0.05645751953125, 0.00742340087890625, -0.01371002197265625, -0.00026035308837890625, -0.01404571533203125, -0.0228729248046875, 0.02435302734375, -0.015960693359375, -0.023345947265625, 0.024871826171875, -0.00290679931640625, 0.09197998046875, 0.015838623046875, -0.0430908203125, 0.057830810546875, -0.00743865966796875, -0.037841796875, -0.04364013671875, 
0.0423583984375, 0.01702880859375, 0.0263824462890625, 0.0113067626953125, 0.0294036865234375, -0.005290985107421875, -0.035064697265625, -0.00015044212341308594, 0.03472900390625, -0.0194091796875, -0.044921875, 0.0728759765625, 0.017242431640625, -0.02880859375, 0.01525115966796875, -0.0170440673828125, -0.01280975341796875, 0.04937744140625, 0.0408935546875, 0.06298828125, 0.01343536376953125, 0.04974365234375, 0.044158935546875, -0.0225677490234375, -0.02325439453125, 0.006465911865234375, -0.005985260009765625, -0.05413818359375, 0.0085906982421875, -0.0302734375, -0.0201416015625, 0.007045745849609375, -0.03759765625, 0.04534912109375, -0.04345703125, -0.043121337890625, 0.0343017578125, -0.02459716796875, -0.051300048828125, 0.0025272369384765625, 0.0179443359375, 0.07275390625, -0.0552978515625, 0.05938720703125, 0.05255126953125, -0.0521240234375, -0.0178985595703125, -0.0029449462890625, 0.01117706298828125, -0.055145263671875, 0.03997802734375, 0.002902984619140625, -0.00687408447265625, -0.0119171142578125, -0.0596923828125, -0.05426025390625, 0.0968017578125, 0.04290771484375, -0.0303497314453125, -0.0311431884765625, -0.01043701171875, 0.04437255859375, -0.048828125, 0.0237274169921875, 0.0181732177734375, 0.0182647705078125, 0.04412841796875, -0.03387451171875, -0.006015777587890625, -0.030242919921875, 0.0293121337890625, -0.00443267822265625, -0.05120849609375, 0.042633056640625, -0.055145263671875, -0.0230255126953125, 0.022613525390625, 0.053009033203125, 0.00969696044921875, 0.0297393798828125, 0.03302001953125, 0.05914306640625, 0.01126861572265625, 0.004390716552734375, 0.07525634765625, -0.00003510713577270508, 0.03021240234375, 0.0716552734375, -0.03216552734375, 0.05950927734375, 0.029632568359375, -0.0091552734375, 0.047119140625, 0.04193115234375, 0.00009441375732421875, 0.051055908203125, -0.007587432861328125, -0.034942626953125, 0.0018138885498046875, 0.00455474853515625, -0.0291290283203125, 0.004047393798828125, 0.014251708984375, 
-0.051116943359375, -0.0269927978515625, 0.040313720703125, 0.0104827880859375, -0.00011551380157470703, -0.006000518798828125, 0.05364990234375, -0.00891876220703125, -0.039306640625, 0.0445556640625, -0.003910064697265625, 0.08135986328125, -0.059234619140625, 0.00962066650390625, -0.02325439453125, 0.0008497238159179688, -0.03460693359375, -0.1070556640625, 0.0438232421875, 0.03125, -0.01259613037109375, -0.01812744140625, 0.06048583984375, -0.03436279296875, -0.022918701171875, 0.012542724609375, 0.025054931640625, 0.0208282470703125, -0.0118865966796875, -0.0841064453125, 0.03179931640625, -0.00524139404296875, -0.031494140625, 0.03057861328125, 0.021942138671875, 0.03759765625, 0.048065185546875, 0.03515625, -0.017578125, -0.01605224609375, 0.01406097412109375, 0.0703125, -0.05364990234375, -0.0257568359375, -0.06988525390625, 0.04083251953125, -0.004146575927734375, -0.0245361328125, 0.040496826171875, 0.0265350341796875, 0.03143310546875, -0.029571533203125, 0.06292724609375, -0.0223541259765625, 0.002201080322265625, -0.041717529296875, 0.06781005859375, -0.061004638671875, 0.001628875732421875, -0.05072021484375, -0.05078125, -0.0218048095703125, 0.042083740234375, -0.0031604766845703125, 0.013031005859375, 0.029754638671875, 0.0699462890625, 0.00009673833847045898, -0.01934814453125, 0.04498291015625, -0.018798828125, 0.02032470703125, 0.0400390625, 0.045257568359375, -0.055419921875, 0.044219970703125, -0.0256500244140625, 0.00039315223693847656, -0.00955963134765625, -0.07403564453125, -0.0693359375, -0.05963134765625, -0.07696533203125, -0.044464111328125, -0.000011086463928222656, 0.06182861328125, 0.07318115234375, -0.004894256591796875, 0.01396942138671875, -0.00011283159255981445, 0.01873779296875, 0.0112152099609375, -0.0188140869140625, -0.00250244140625, 0.0278778076171875, -0.059417724609375, -0.01108551025390625, 0.0311737060546875, 0.06549072265625, -0.021636962890625, -0.0204620361328125, -0.013092041015625, 0.005733489990234375, 
0.05462646484375, 0.040679931640625, -0.026885986328125, -0.0023345947265625, -0.005336761474609375, -0.0177001953125, 0.0262908935546875, 0.041534423828125, -0.049530029296875, -0.01300048828125, 0.0247802734375, 0.01419830322265625, 0.042816162109375, -0.0207366943359375, 0.00897979736328125, -0.0479736328125, 0.0182647705078125, -0.018218994140625, 0.0292205810546875, 0.03863525390625, -0.03228759765625, 0.04083251953125, 0.030670166015625, -0.02801513671875, -0.0430908203125, -0.007904052734375, -0.08807373046875, -0.0128936767578125, 0.10662841796875, -0.0260162353515625, -0.03106689453125, 0.0289154052734375, -0.028839111328125, -0.0157318115234375, -0.0491943359375, 0.0301055908203125, 0.0262298583984375, -0.0022525787353515625, -0.043487548828125, -0.020050048828125, 0.0179901123046875, -0.0016260147094726562, -0.06707763671875, -0.01442718505859375, 0.034210205078125, 0.056854248046875, 0.018890380859375, 0.06256103515625, -0.00830841064453125, 0.029998779296875, -0.0072021484375, -0.0014200210571289062, 0.004856109619140625, 0.01152801513671875, -0.0256500244140625, 0.0179901123046875, -0.0226898193359375, 0.000025510787963867188 ] ]
unitary/toxic-bert
2023-08-18T10:42:55.000Z
[ "transformers", "pytorch", "jax", "bert", "text-classification", "arxiv:1703.04009", "arxiv:1905.12516", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
text-classification
unitary
null
null
unitary/toxic-bert
87
111,015
transformers
2022-03-02T23:29:05
--- license: apache-2.0 --- <div align="center"> **⚠️ Disclaimer:** The huggingface models currently give different results from the detoxify library (see issue [here](https://github.com/unitaryai/detoxify/issues/15)). For the most up-to-date models we recommend using the models from https://github.com/unitaryai/detoxify # 🙊 Detoxify ## Toxic Comment Classification with ⚡ Pytorch Lightning and 🤗 Transformers ![CI testing](https://github.com/unitaryai/detoxify/workflows/CI%20testing/badge.svg) ![Lint](https://github.com/unitaryai/detoxify/workflows/Lint/badge.svg) </div> ![Examples image](examples.png) ## Description Trained models & code to predict toxic comments on 3 Jigsaw challenges: Toxic comment classification, Unintended Bias in Toxic comments, Multilingual toxic comment classification. Built by [Laura Hanu](https://laurahanu.github.io/) at [Unitary](https://www.unitary.ai/), where we are working to stop harmful content online by interpreting visual content in context. Dependencies: - For inference: - 🤗 Transformers - ⚡ Pytorch lightning - For training you will also need: - Kaggle API (to download data) | Challenge | Year | Goal | Original Data Source | Detoxify Model Name | Top Kaggle Leaderboard Score | Detoxify Score |-|-|-|-|-|-|-| | [Toxic Comment Classification Challenge](https://www.kaggle.com/c/jigsaw-toxic-comment-classification-challenge) | 2018 | build a multi-headed model that’s capable of detecting different types of toxicity like threats, obscenity, insults, and identity-based hate. | Wikipedia Comments | `original` | 0.98856 | 0.98636 | [Jigsaw Unintended Bias in Toxicity Classification](https://www.kaggle.com/c/jigsaw-unintended-bias-in-toxicity-classification) | 2019 | build a model that recognizes toxicity and minimizes this type of unintended bias with respect to mentions of identities. You'll be using a dataset labeled for identity mentions and optimizing a metric designed to measure unintended bias. 
| Civil Comments | `unbiased` | 0.94734 | 0.93639 | [Jigsaw Multilingual Toxic Comment Classification](https://www.kaggle.com/c/jigsaw-multilingual-toxic-comment-classification) | 2020 | build effective multilingual models | Wikipedia Comments + Civil Comments | `multilingual` | 0.9536 | 0.91655* *Score not directly comparable since it is obtained on the validation set provided and not on the test set. To update when the test labels are made available. Note that the top leaderboard scores have been achieved using model ensembles. The purpose of this library was to build something user-friendly and straightforward to use. ## Limitations and ethical considerations If words that are associated with swearing, insults or profanity are present in a comment, it is likely that it will be classified as toxic, regardless of the tone or the intent of the author, e.g. humorous/self-deprecating. This could present some biases towards already vulnerable minority groups. The intended use of this library is for research purposes, fine-tuning on carefully constructed datasets that reflect real world demographics and/or to aid content moderators in flagging harmful content more quickly. Some useful resources about the risk of different biases in toxicity or hate speech detection are: - [The Risk of Racial Bias in Hate Speech Detection](https://homes.cs.washington.edu/~msap/pdfs/sap2019risk.pdf) - [Automated Hate Speech Detection and the Problem of Offensive Language](https://arxiv.org/pdf/1703.04009.pdf%201.pdf) - [Racial Bias in Hate Speech and Abusive Language Detection Datasets](https://arxiv.org/pdf/1905.12516.pdf) ## Quick prediction The `multilingual` model has been trained on 7 different languages so it should only be tested on: `english`, `french`, `spanish`, `italian`, `portuguese`, `turkish` or `russian`. 
```bash # install detoxify pip install detoxify ``` ```python from detoxify import Detoxify # each model takes in either a string or a list of strings results = Detoxify('original').predict('example text') results = Detoxify('unbiased').predict(['example text 1','example text 2']) input_text = ['example text','exemple de texte','texto de ejemplo','testo di esempio','texto de exemplo','örnek metin','пример текста'] results = Detoxify('multilingual').predict(input_text) # optional to display results nicely (will need to pip install pandas) import pandas as pd print(pd.DataFrame(results, index=input_text).round(5)) ``` For more details check the Prediction section. ## Labels All challenges have a toxicity label. The toxicity labels represent the aggregate ratings of up to 10 annotators according to the following schema: - **Very Toxic** (a very hateful, aggressive, or disrespectful comment that is very likely to make you leave a discussion or give up on sharing your perspective) - **Toxic** (a rude, disrespectful, or unreasonable comment that is somewhat likely to make you leave a discussion or give up on sharing your perspective) - **Hard to Say** - **Not Toxic** More information about the labelling schema can be found [here](https://www.kaggle.com/c/jigsaw-unintended-bias-in-toxicity-classification/data). ### Toxic Comment Classification Challenge This challenge includes the following labels: - `toxic` - `severe_toxic` - `obscene` - `threat` - `insult` - `identity_hate` ### Jigsaw Unintended Bias in Toxicity Classification This challenge has 2 types of labels: the main toxicity labels and some additional identity labels that represent the identities mentioned in the comments. Only identities with more than 500 examples in the test set (combined public and private) are included during training as additional labels and in the evaluation calculation. 
- `toxicity` - `severe_toxicity` - `obscene` - `threat` - `insult` - `identity_attack` - `sexual_explicit` Identity labels used: - `male` - `female` - `homosexual_gay_or_lesbian` - `christian` - `jewish` - `muslim` - `black` - `white` - `psychiatric_or_mental_illness` A complete list of all the identity labels available can be found [here](https://www.kaggle.com/c/jigsaw-unintended-bias-in-toxicity-classification/data). ### Jigsaw Multilingual Toxic Comment Classification Since this challenge combines the data from the previous 2 challenges, it includes all labels from above; however, the final evaluation is only on: - `toxicity` ## How to run First, install dependencies ```bash # clone project git clone https://github.com/unitaryai/detoxify # create virtual env python3 -m venv toxic-env source toxic-env/bin/activate # install project pip install -e detoxify cd detoxify # for training pip install -r requirements.txt ``` ## Prediction Trained models summary: |Model name| Transformer type| Data from |:--:|:--:|:--:| |`original`| `bert-base-uncased` | Toxic Comment Classification Challenge |`unbiased`| `roberta-base`| Unintended Bias in Toxicity Classification |`multilingual`| `xlm-roberta-base`| Multilingual Toxic Comment Classification For a quick prediction, you can run the example script on a comment directly or from a txt file containing a list of comments. 
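`predict` returns a mapping from each label to a score between 0 and 1 (or a list of scores when a list of inputs is passed). Below is a minimal sketch of turning those scores into a set of flagged labels; note that `sample_results` is a hypothetical example output (label names taken from the first challenge's schema above, actual scores will depend on the input text), and the 0.5 threshold is an arbitrary illustrative choice, not a library default:

```python
# Hypothetical example output shaped like Detoxify('original').predict(...);
# the scores below are made up for illustration.
sample_results = {
    "toxic": 0.97,
    "severe_toxic": 0.02,
    "obscene": 0.83,
    "threat": 0.01,
    "insult": 0.71,
    "identity_hate": 0.03,
}

def flag_labels(results, threshold=0.5):
    """Return the labels whose score meets an (arbitrary) threshold, sorted."""
    return sorted(label for label, score in results.items() if score >= threshold)

print(flag_labels(sample_results))  # → ['insult', 'obscene', 'toxic']
```

Raising the threshold trades recall for precision, so a moderation workflow would tune it on held-out data rather than reuse the 0.5 used here.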
```bash # load model via torch.hub python run_prediction.py --input 'example' --model_name original # load model from a checkpoint path python run_prediction.py --input 'example' --from_ckpt_path model_path # save results to a .csv file python run_prediction.py --input test_set.txt --model_name original --save_to results.csv # to see usage python run_prediction.py --help ``` Checkpoints can be downloaded from the latest release or via the Pytorch hub API with the following names: - `toxic_bert` - `unbiased_toxic_roberta` - `multilingual_toxic_xlm_r` ```python import torch model = torch.hub.load('unitaryai/detoxify','toxic_bert') ``` Importing detoxify in python: ```python from detoxify import Detoxify results = Detoxify('original').predict('some text') results = Detoxify('unbiased').predict(['example text 1','example text 2']) input_text = ['example text','exemple de texte','texto de ejemplo','testo di esempio','texto de exemplo','örnek metin','пример текста'] results = Detoxify('multilingual').predict(input_text) # to display results nicely import pandas as pd print(pd.DataFrame(results, index=input_text).round(5)) ``` ## Training If you do not already have a Kaggle account: - you need to create one to be able to download the data - go to My Account and click on Create New API Token - this will download a kaggle.json file - make sure this file is located in ~/.kaggle ```bash # create data directory mkdir jigsaw_data cd jigsaw_data # download data kaggle competitions download -c jigsaw-toxic-comment-classification-challenge kaggle competitions download -c jigsaw-unintended-bias-in-toxicity-classification kaggle competitions download -c jigsaw-multilingual-toxic-comment-classification ``` ## Start Training ### Toxic Comment Classification Challenge ```bash python create_val_set.py python train.py --config configs/Toxic_comment_classification_BERT.json ``` ### Unintended Bias in Toxicity Challenge ```bash python train.py --config configs/Unintended_bias_toxic_comment_classification_RoBERTa.json ``` 
### Multilingual Toxic Comment Classification This is trained in 2 stages. First, train on all available data, and second, train only on the translated versions of the first challenge. The [translated data](https://www.kaggle.com/miklgr500/jigsaw-train-multilingual-coments-google-api) can be downloaded from Kaggle in french, spanish, italian, portuguese, turkish, and russian (the languages available in the test set). ```bash # stage 1 python train.py --config configs/Multilingual_toxic_comment_classification_XLMR.json # stage 2 python train.py --config configs/Multilingual_toxic_comment_classification_XLMR_stage2.json ``` ### Monitor progress with tensorboard ```bash tensorboard --logdir=./saved ``` ## Model Evaluation ### Toxic Comment Classification Challenge This challenge is evaluated on the mean AUC score of all the labels. ```bash python evaluate.py --checkpoint saved/lightning_logs/checkpoints/example_checkpoint.pth --test_csv test.csv ``` ### Unintended Bias in Toxicity Challenge This challenge is evaluated on a novel bias metric that combines different AUC scores to balance overall performance. More information on this metric [here](https://www.kaggle.com/c/jigsaw-unintended-bias-in-toxicity-classification/overview/evaluation). ```bash python evaluate.py --checkpoint saved/lightning_logs/checkpoints/example_checkpoint.pth --test_csv test.csv # to get the final bias metric python model_eval/compute_bias_metric.py ``` ### Multilingual Toxic Comment Classification This challenge is evaluated on the AUC score of the main toxic label. ```bash python evaluate.py --checkpoint saved/lightning_logs/checkpoints/example_checkpoint.pth --test_csv test.csv ``` ### Citation ``` @misc{Detoxify, title={Detoxify}, author={Hanu, Laura and {Unitary team}}, howpublished={Github. https://github.com/unitaryai/detoxify}, year={2020} } ```
11,095
[ [ -0.0111541748046875, -0.036865234375, 0.0302886962890625, 0.01568603515625, -0.0002989768981933594, -0.003337860107421875, -0.0029315948486328125, -0.036407470703125, 0.007171630859375, 0.0278167724609375, -0.03802490234375, -0.053955078125, -0.047119140625, 0.01541900634765625, -0.0045166015625, 0.10589599609375, 0.011260986328125, 0.0164642333984375, 0.001922607421875, -0.012481689453125, -0.0242156982421875, -0.03778076171875, -0.041961669921875, -0.037078857421875, 0.06829833984375, 0.0178985595703125, 0.042694091796875, 0.01122283935546875, 0.0200042724609375, 0.021209716796875, -0.0279083251953125, -0.019561767578125, -0.0225372314453125, 0.0027523040771484375, -0.036590576171875, -0.0286865234375, -0.016357421875, 0.01520538330078125, -0.002960205078125, 0.0243988037109375, -0.009002685546875, 0.0169830322265625, -0.005191802978515625, 0.032684326171875, -0.044952392578125, 0.018646240234375, -0.040618896484375, 0.0202178955078125, 0.005970001220703125, 0.01273345947265625, -0.047515869140625, -0.0238494873046875, 0.0007176399230957031, -0.034942626953125, -0.023712158203125, -0.0067901611328125, 0.05511474609375, 0.02569580078125, -0.054534912109375, -0.00799560546875, -0.0335693359375, 0.0576171875, -0.0693359375, 0.02154541015625, 0.014190673828125, 0.0062713623046875, -0.0099639892578125, -0.05596923828125, -0.0269927978515625, -0.0176849365234375, -0.0132904052734375, 0.039581298828125, -0.007579803466796875, 0.00862884521484375, 0.026153564453125, 0.01456451416015625, -0.038299560546875, 0.0006628036499023438, -0.030548095703125, -0.03350830078125, 0.0625, 0.01366424560546875, 0.03643798828125, -0.0241851806640625, -0.034637451171875, -0.0053558349609375, -0.019378662109375, -0.005023956298828125, 0.0203094482421875, 0.0269012451171875, -0.0158538818359375, 0.027862548828125, -0.005413055419921875, 0.0056915283203125, -0.010955810546875, -0.0087127685546875, 0.0582275390625, -0.0200042724609375, -0.00691986083984375, -0.0037708282470703125, 
0.0843505859375, 0.03399658203125, 0.00843048095703125, -0.004009246826171875, 0.0126800537109375, 0.0235137939453125, 0.0096435546875, -0.076416015625, -0.052886962890625, 0.029205322265625, -0.041351318359375, -0.047760009765625, -0.030029296875, -0.08697509765625, -0.03399658203125, 0.0006160736083984375, 0.039306640625, -0.02935791015625, -0.040802001953125, -0.0003826618194580078, -0.0298919677734375, -0.020843505859375, 0.0216827392578125, -0.03692626953125, 0.0036468505859375, -0.0043792724609375, 0.060638427734375, -0.0080108642578125, 0.0008635520935058594, -0.022735595703125, -0.005809783935546875, -0.0018749237060546875, 0.0404052734375, -0.049530029296875, -0.0194091796875, 0.0022182464599609375, 0.0206756591796875, -0.000957489013671875, -0.037078857421875, 0.0635986328125, -0.0189208984375, 0.0335693359375, -0.0246429443359375, -0.01800537109375, -0.036102294921875, 0.0140380859375, -0.038909912109375, 0.0831298828125, 0.0145111083984375, -0.1021728515625, 0.00843048095703125, -0.049041748046875, -0.020904541015625, -0.006740570068359375, 0.031829833984375, -0.038604736328125, -0.026153564453125, -0.01055908203125, 0.0198211669921875, 0.0002720355987548828, 0.0110626220703125, -0.049530029296875, -0.015869140625, 0.004634857177734375, -0.006298065185546875, 0.1046142578125, 0.037261962890625, -0.034912109375, 0.0232086181640625, -0.054962158203125, 0.0125579833984375, 0.00627899169921875, -0.03826904296875, -0.019989013671875, 0.0035037994384765625, 0.04156494140625, 0.050079345703125, 0.0015363693237304688, -0.0478515625, -0.0032978057861328125, -0.0126953125, 0.035980224609375, 0.057373046875, 0.012298583984375, -0.004192352294921875, -0.0302276611328125, 0.029052734375, 0.03204345703125, 0.01505279541015625, 0.0095062255859375, -0.050048828125, -0.044097900390625, -0.0048065185546875, 0.0177001953125, 0.072021484375, -0.06341552734375, 0.04290771484375, -0.005908966064453125, -0.06591796875, -0.01087188720703125, 0.0016069412231445312, 
0.05126953125, 0.0293731689453125, 0.039581298828125, -0.032867431640625, -0.046844482421875, -0.061370849609375, -0.0188446044921875, -0.030242919921875, -0.00972747802734375, 0.023406982421875, 0.0662841796875, -0.0310821533203125, 0.0234832763671875, -0.045867919921875, -0.034271240234375, 0.0022296905517578125, 0.0015668869018554688, 0.0099334716796875, 0.033447265625, 0.038909912109375, -0.041351318359375, -0.060943603515625, -0.004306793212890625, -0.0499267578125, -0.01026153564453125, 0.01102447509765625, -0.0194091796875, -0.0024662017822265625, 0.023468017578125, -0.0224761962890625, 0.0110015869140625, 0.0289306640625, -0.01666259765625, 0.049346923828125, -0.0012683868408203125, -0.00560760498046875, -0.0753173828125, 0.0141448974609375, 0.0162353515625, -0.0033054351806640625, -0.06744384765625, -0.000667572021484375, -0.003864288330078125, 0.0018892288208007812, -0.05596923828125, 0.040924072265625, -0.0161590576171875, 0.0504150390625, 0.01343536376953125, 0.00029850006103515625, -0.011199951171875, 0.0626220703125, 0.0179443359375, 0.0300445556640625, 0.044464111328125, -0.03131103515625, 0.01319122314453125, 0.02874755859375, -0.0103607177734375, 0.0360107421875, -0.033416748046875, 0.01036834716796875, -0.01424407958984375, 0.036865234375, -0.07135009765625, -0.022216796875, 0.03289794921875, -0.033233642578125, 0.012451171875, -0.00258636474609375, -0.042816162109375, -0.04888916015625, -0.035552978515625, 0.01256561279296875, 0.044342041015625, -0.02874755859375, 0.028106689453125, 0.042816162109375, 0.006053924560546875, -0.055877685546875, -0.060211181640625, -0.0177154541015625, -0.044403076171875, -0.024017333984375, 0.00811767578125, -0.036102294921875, -0.01224517822265625, -0.01482391357421875, 0.002979278564453125, -0.007358551025390625, 0.0034770965576171875, 0.0178375244140625, 0.01042938232421875, 0.0201568603515625, 0.0157318115234375, 0.0027256011962890625, 0.001178741455078125, 0.033447265625, 0.01055908203125, 0.0259552001953125, 
-0.010711669921875, 0.02197265625, -0.028656005859375, 0.0195770263671875, 0.04541015625, -0.002269744873046875, 0.0517578125, 0.05072021484375, -0.0300750732421875, -0.0029850006103515625, -0.0180511474609375, -0.01129913330078125, -0.033843994140625, 0.04095458984375, -0.0006957054138183594, -0.034271240234375, 0.0595703125, 0.001430511474609375, 0.007450103759765625, 0.042388916015625, 0.056488037109375, 0.00795745849609375, 0.10565185546875, 0.027557373046875, -0.0092315673828125, 0.037689208984375, -0.0162506103515625, 0.01100921630859375, -0.058013916015625, -0.02618408203125, -0.0305633544921875, -0.0128936767578125, -0.050506591796875, -0.01378631591796875, 0.037353515625, -0.020263671875, -0.07061767578125, 0.0262451171875, -0.068603515625, 0.0222320556640625, 0.0350341796875, 0.03125, 0.016632080078125, -0.006053924560546875, -0.021209716796875, -0.01971435546875, -0.040496826171875, -0.030731201171875, 0.07818603515625, 0.047454833984375, 0.046722412109375, 0.004573822021484375, 0.040283203125, 0.007282257080078125, 0.06463623046875, -0.04473876953125, 0.034271240234375, -0.01064300537109375, -0.09075927734375, 0.0011568069458007812, -0.038055419921875, -0.04815673828125, 0.0310821533203125, -0.0160980224609375, -0.08233642578125, 0.00785064697265625, -0.0129852294921875, -0.0157623291015625, 0.0482177734375, -0.058929443359375, 0.06793212890625, -0.01454925537109375, -0.035736083984375, 0.01019287109375, -0.06890869140625, 0.0491943359375, -0.007659912109375, 0.037078857421875, -0.0172882080078125, 0.0151519775390625, 0.07818603515625, -0.02252197265625, 0.07568359375, -0.021087646484375, -0.009002685546875, 0.028533935546875, -0.000362396240234375, 0.0149993896484375, 0.001583099365234375, -0.020843505859375, 0.026885986328125, 0.0212860107421875, -0.0034008026123046875, -0.0084381103515625, 0.037200927734375, -0.06805419921875, -0.0250396728515625, -0.047943115234375, -0.045257568359375, -0.005706787109375, 0.04241943359375, 0.039459228515625, 
-0.005828857421875, 0.004512786865234375, -0.0136260986328125, 0.050079345703125, -0.051055908203125, 0.012908935546875, 0.0233612060546875, -0.0218505859375, -0.03759765625, 0.06146240234375, 0.005207061767578125, 0.023406982421875, 0.01442718505859375, 0.031402587890625, -0.021148681640625, -0.016754150390625, -0.0087890625, 0.020904541015625, -0.0693359375, -0.032257080078125, -0.06182861328125, -0.034271240234375, -0.0241546630859375, 0.0186309814453125, -0.0101470947265625, -0.00739288330078125, -0.0255889892578125, -0.0189056396484375, 0.03656005859375, 0.058013916015625, -0.0274200439453125, 0.019073486328125, -0.0518798828125, -0.0008273124694824219, 0.00951385498046875, 0.0380859375, 0.007720947265625, -0.032684326171875, -0.0013856887817382812, 0.02337646484375, -0.05377197265625, -0.10833740234375, 0.039093017578125, -0.0011615753173828125, 0.036102294921875, 0.0343017578125, 0.0280609130859375, 0.02252197265625, -0.025360107421875, 0.06787109375, 0.023681640625, -0.05364990234375, 0.0430908203125, -0.04168701171875, -0.0016317367553710938, 0.04864501953125, 0.059051513671875, -0.0606689453125, -0.05291748046875, -0.0435791015625, -0.061309814453125, 0.0531005859375, 0.02252197265625, 0.00930023193359375, -0.015869140625, 0.01311492919921875, -0.00092315673828125, 0.0025730133056640625, -0.0960693359375, -0.053802490234375, -0.014739990234375, -0.0187530517578125, 0.007171630859375, -0.0157623291015625, -0.0229949951171875, -0.04290771484375, 0.080322265625, 0.01160430908203125, 0.0159759521484375, 0.00794219970703125, -0.0006403923034667969, -0.00830841064453125, 0.025787353515625, 0.003963470458984375, 0.01090240478515625, -0.038604736328125, 0.01202392578125, 0.034088134765625, -0.050567626953125, 0.026458740234375, -0.0028839111328125, -0.0166168212890625, -0.005840301513671875, 0.006103515625, 0.034271240234375, 0.01080322265625, -0.05023193359375, 0.0311431884765625, -0.00818634033203125, -0.0235443115234375, -0.02801513671875, 0.01318359375, 
0.0189666748046875, 0.00313568115234375, -0.0014400482177734375, 0.005481719970703125, 0.0243988037109375, -0.0478515625, 0.0203857421875, 0.027801513671875, -0.0439453125, -0.0157012939453125, 0.08697509765625, 0.0132293701171875, -0.009735107421875, 0.049346923828125, -0.025604248046875, -0.057952880859375, 0.040069580078125, 0.03692626953125, 0.05621337890625, -0.028656005859375, 0.033599853515625, 0.072021484375, 0.0190582275390625, 0.01317596435546875, 0.01910400390625, 0.016326904296875, -0.045989990234375, 0.0023899078369140625, -0.045135498046875, -0.01088714599609375, 0.034820556640625, -0.0399169921875, 0.0173187255859375, -0.0428466796875, -0.0152435302734375, 0.0157012939453125, 0.02069091796875, -0.036590576171875, 0.0179443359375, 0.015899658203125, 0.0703125, -0.10797119140625, 0.041290283203125, 0.054534912109375, -0.06719970703125, -0.050506591796875, -0.01448822021484375, 0.0216522216796875, -0.0197906494140625, 0.038482666015625, 0.039215087890625, 0.0157623291015625, -0.01666259765625, -0.050750732421875, -0.06719970703125, 0.05902099609375, 0.01123046875, -0.0156402587890625, -0.00225067138671875, -0.0063323974609375, 0.061309814453125, 0.0087432861328125, 0.06451416015625, 0.044464111328125, 0.04730224609375, 0.01245880126953125, -0.06634521484375, 0.025299072265625, -0.046661376953125, 0.00518035888671875, -0.01490020751953125, -0.061309814453125, 0.057769775390625, -0.0182952880859375, -0.0222015380859375, -0.000835418701171875, 0.03289794921875, 0.014404296875, 0.01727294921875, 0.043487548828125, 0.05908203125, 0.052215576171875, 0.0000069141387939453125, 0.05511474609375, -0.0083160400390625, 0.04095458984375, 0.08056640625, -0.01274871826171875, 0.050384521484375, 0.004581451416015625, -0.034942626953125, 0.06988525390625, 0.077880859375, -0.007595062255859375, 0.050262451171875, 0.0222930908203125, -0.0283660888671875, -0.00972747802734375, -0.0164031982421875, -0.031524658203125, 0.005115509033203125, 0.0229339599609375, 
-0.0270538330078125, -0.02679443359375, 0.0012264251708984375, 0.03436279296875, -0.0094146728515625, -0.01439666748046875, 0.047576904296875, 0.007328033447265625, -0.048553466796875, 0.04132080078125, -0.0130157470703125, 0.066650390625, -0.0222015380859375, 0.003070831298828125, 0.0008025169372558594, 0.0252227783203125, -0.035736083984375, -0.063232421875, 0.0169525146484375, -0.0233154296875, 0.0034027099609375, -0.00698089599609375, 0.044464111328125, -0.0462646484375, -0.036651611328125, 0.045562744140625, 0.0140533447265625, 0.0230712890625, 0.0013113021850585938, -0.07623291015625, -0.01629638671875, 0.0139312744140625, -0.025390625, 0.0300445556640625, 0.01197052001953125, -0.0013914108276367188, 0.05108642578125, 0.040771484375, -0.0113525390625, -0.001148223876953125, -0.006259918212890625, 0.07135009765625, -0.0308380126953125, -0.0259857177734375, -0.041168212890625, 0.059295654296875, -0.003368377685546875, -0.0250091552734375, 0.05682373046875, 0.049530029296875, 0.0633544921875, -0.01110076904296875, 0.0657958984375, -0.00980377197265625, 0.033233642578125, -0.006229400634765625, 0.07525634765625, -0.045928955078125, -0.0137176513671875, -0.043853759765625, -0.053497314453125, -0.0308074951171875, 0.046844482421875, -0.0246124267578125, 0.025115966796875, 0.0577392578125, 0.061798095703125, 0.00640106201171875, -0.006679534912109375, -0.004604339599609375, 0.044921875, 0.029449462890625, 0.06365966796875, 0.042999267578125, -0.0389404296875, 0.04315185546875, -0.053436279296875, -0.02197265625, 0.0073089599609375, -0.056182861328125, -0.08184814453125, -0.05419921875, -0.04638671875, -0.06414794921875, -0.02288818359375, 0.04168701171875, 0.0284576416015625, -0.070068359375, 0.00177001953125, 0.0021514892578125, 0.0246429443359375, -0.01068878173828125, -0.0232086181640625, 0.0171356201171875, -0.01126861572265625, -0.0526123046875, -0.01335906982421875, -0.00292205810546875, -0.0082855224609375, -0.00952911376953125, 0.0030612945556640625, 
-0.03472900390625, 0.00731658935546875, 0.0518798828125, 0.0197296142578125, -0.04302978515625, -0.03887939453125, -0.01555633544921875, -0.01019287109375, 0.007633209228515625, 0.00876617431640625, -0.0445556640625, 0.038360595703125, 0.048492431640625, 0.0011205673217773438, 0.0364990234375, -0.01451873779296875, 0.0006704330444335938, -0.0307464599609375, 0.007038116455078125, 0.0232086181640625, 0.0273284912109375, 0.01678466796875, -0.039276123046875, 0.05755615234375, 0.0186767578125, -0.041015625, -0.06964111328125, 0.01202392578125, -0.06707763671875, -0.0280914306640625, 0.098388671875, -0.0168914794921875, -0.021881103515625, -0.0156707763671875, -0.0279998779296875, 0.0310211181640625, -0.0208587646484375, 0.05755615234375, 0.049041748046875, -0.006397247314453125, -0.0039520263671875, -0.05389404296875, 0.045623779296875, 0.020782470703125, -0.07196044921875, 0.0193939208984375, 0.035675048828125, 0.06085205078125, 0.03546142578125, 0.0413818359375, -0.022796630859375, 0.00638580322265625, -0.005413055419921875, 0.01557159423828125, 0.0218505859375, -0.00849151611328125, -0.01422882080078125, -0.01224517822265625, -0.0288848876953125, 0.00008732080459594727 ] ]
facebook/detr-resnet-101
2023-10-17T17:20:46.000Z
[ "transformers", "pytorch", "safetensors", "detr", "object-detection", "vision", "dataset:coco", "arxiv:2005.12872", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
object-detection
facebook
null
null
facebook/detr-resnet-101
66
110,332
transformers
2022-03-02T23:29:05
--- license: apache-2.0 tags: - object-detection - vision datasets: - coco widget: - src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/savanna.jpg example_title: Savanna - src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/football-match.jpg example_title: Football Match - src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/airport.jpg example_title: Airport --- # DETR (End-to-End Object Detection) model with ResNet-101 backbone DEtection TRansformer (DETR) model trained end-to-end on COCO 2017 object detection (118k annotated images). It was introduced in the paper [End-to-End Object Detection with Transformers](https://arxiv.org/abs/2005.12872) by Carion et al. and first released in [this repository](https://github.com/facebookresearch/detr). Disclaimer: The team releasing DETR did not write a model card for this model so this model card has been written by the Hugging Face team. ## Model description The DETR model is an encoder-decoder transformer with a convolutional backbone. Two heads are added on top of the decoder outputs in order to perform object detection: a linear layer for the class labels and an MLP (multi-layer perceptron) for the bounding boxes. The model uses so-called object queries to detect objects in an image. Each object query looks for a particular object in the image. For COCO, the number of object queries is set to 100. The model is trained using a "bipartite matching loss": one compares the predicted classes + bounding boxes of each of the N = 100 object queries to the ground truth annotations, padded up to the same length N (so if an image only contains 4 objects, the remaining 96 annotations will just have "no object" as class and "no bounding box" as bounding box). The Hungarian matching algorithm is used to create an optimal one-to-one mapping between each of the N queries and each of the N annotations. 
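The matching step above can be illustrated with a toy example. This is a sketch under hypothetical numbers, not DETR's implementation: the cost matrix here is made up (DETR builds it from class and box losses), and a brute-force search stands in for the Hungarian algorithm, which computes the same optimal assignment in polynomial time.

```python
from itertools import permutations

# Hypothetical matching costs between 5 object queries (rows) and
# 3 ground-truth annotations (columns); lower means a better match.
cost = [
    [0.9, 0.1, 0.8],
    [0.2, 0.7, 0.9],
    [0.8, 0.9, 0.1],
    [0.5, 0.5, 0.5],
    [0.6, 0.6, 0.6],
]

def optimal_matching(cost, num_targets):
    # Try every assignment of distinct queries to the targets and keep the
    # one with the lowest total cost (fine for toy sizes).
    best_total, best_assignment = None, None
    for queries in permutations(range(len(cost)), num_targets):
        total = sum(cost[q][t] for t, q in enumerate(queries))
        if best_total is None or total < best_total:
            best_total, best_assignment = total, queries
    return best_total, best_assignment

total, assignment = optimal_matching(cost, num_targets=3)
# assignment[t] is the query matched to ground-truth object t;
# every unmatched query is supervised toward the "no object" class.
```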
Next, standard cross-entropy (for the classes) and a linear combination of the L1 and generalized IoU loss (for the bounding boxes) are used to optimize the parameters of the model. ![model image](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/detr_architecture.png) ## Intended uses & limitations You can use the raw model for object detection. See the [model hub](https://huggingface.co/models?search=facebook/detr) to look for all available DETR models. ### How to use Here is how to use this model: ```python from transformers import DetrImageProcessor, DetrForObjectDetection import torch from PIL import Image import requests url = "http://images.cocodataset.org/val2017/000000039769.jpg" image = Image.open(requests.get(url, stream=True).raw) # you can specify the revision tag if you don't want the timm dependency processor = DetrImageProcessor.from_pretrained("facebook/detr-resnet-101", revision="no_timm") model = DetrForObjectDetection.from_pretrained("facebook/detr-resnet-101", revision="no_timm") inputs = processor(images=image, return_tensors="pt") outputs = model(**inputs) # convert outputs (bounding boxes and class logits) to COCO API # let's only keep detections with score > 0.9 target_sizes = torch.tensor([image.size[::-1]]) results = processor.post_process_object_detection(outputs, target_sizes=target_sizes, threshold=0.9)[0] for score, label, box in zip(results["scores"], results["labels"], results["boxes"]): box = [round(i, 2) for i in box.tolist()] print( f"Detected {model.config.id2label[label.item()]} with confidence " f"{round(score.item(), 3)} at location {box}" ) ``` This should output (something along the lines of): ``` Detected cat with confidence 0.998 at location [344.06, 24.85, 640.34, 373.74] Detected remote with confidence 0.997 at location [328.13, 75.93, 372.81, 187.66] Detected remote with confidence 0.997 at location [39.34, 70.13, 175.56, 118.78] Detected cat with confidence 0.998 at 
location [15.36, 51.75, 316.89, 471.16] Detected couch with confidence 0.995 at location [-0.19, 0.71, 639.73, 474.17] ``` Currently, both the feature extractor and model support PyTorch. ## Training data The DETR model was trained on [COCO 2017 object detection](https://cocodataset.org/#download), a dataset consisting of 118k/5k annotated images for training/validation respectively. ## Training procedure ### Preprocessing The exact details of preprocessing of images during training/validation can be found in the [original repository](https://github.com/facebookresearch/detr). Images are resized/rescaled such that the shortest side is at least 800 pixels and the largest side at most 1333 pixels, and normalized across the RGB channels with the ImageNet mean (0.485, 0.456, 0.406) and standard deviation (0.229, 0.224, 0.225). ### Training The model was trained for 300 epochs on 16 V100 GPUs. This takes 3 days, with 4 images per GPU (hence a total batch size of 64). ## Evaluation results This model achieves an AP (average precision) of **43.5** on COCO 2017 validation. For more details regarding evaluation results, we refer to table 1 of the original paper. ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2005-12872, author = {Nicolas Carion and Francisco Massa and Gabriel Synnaeve and Nicolas Usunier and Alexander Kirillov and Sergey Zagoruyko}, title = {End-to-End Object Detection with Transformers}, journal = {CoRR}, volume = {abs/2005.12872}, year = {2020}, url = {https://arxiv.org/abs/2005.12872}, archivePrefix = {arXiv}, eprint = {2005.12872}, timestamp = {Thu, 28 May 2020 17:38:09 +0200}, biburl = {https://dblp.org/rec/journals/corr/abs-2005-12872.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ```
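The resize and normalization rules stated in the card's "Preprocessing" section can be sketched as follows. This is a minimal illustration, assuming a float image in [0, 1] stored as nested `[channel][row][col]` lists; the actual pipeline operates on tensors.

```python
IMAGENET_MEAN = (0.485, 0.456, 0.406)
IMAGENET_STD = (0.229, 0.224, 0.225)

def target_size(height, width, min_side=800, max_side=1333):
    """Scale so the shortest side reaches min_side, unless that would push
    the longest side past max_side (in which case cap at max_side)."""
    scale = min(min_side / min(height, width), max_side / max(height, width))
    return round(height * scale), round(width * scale)

def normalize(image):
    """Per-channel (pixel - mean) / std with the ImageNet statistics."""
    return [
        [[(p - m) / s for p in row] for row in channel]
        for channel, m, s in zip(image, IMAGENET_MEAN, IMAGENET_STD)
    ]

print(target_size(480, 640))   # shortest side scaled up to 800
print(target_size(400, 1600))  # longest side capped at 1333
```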
5,926
[ [ -0.0538330078125, -0.044281005859375, 0.00046753883361816406, 0.0000032782554626464844, -0.019775390625, -0.005489349365234375, -0.01226806640625, -0.053436279296875, 0.0092010498046875, 0.0271453857421875, -0.04248046875, -0.044952392578125, -0.04119873046875, 0.0214385986328125, -0.0355224609375, 0.059906005859375, 0.012176513671875, -0.02362060546875, -0.01505279541015625, -0.00565338134765625, -0.02655029296875, -0.038421630859375, -0.05682373046875, -0.00004559755325317383, 0.023651123046875, 0.027313232421875, 0.04083251953125, 0.039306640625, 0.061248779296875, 0.0249176025390625, -0.0179443359375, 0.01290130615234375, -0.031463623046875, -0.0260162353515625, -0.0190887451171875, -0.0377197265625, -0.0156402587890625, 0.0008349418640136719, 0.02435302734375, 0.005008697509765625, 0.01031494140625, 0.0248260498046875, -0.004398345947265625, 0.045562744140625, -0.047821044921875, 0.017669677734375, -0.04022216796875, 0.01325225830078125, -0.00662994384765625, -0.00412750244140625, -0.03997802734375, -0.004344940185546875, -0.002681732177734375, -0.0302734375, 0.050018310546875, 0.01328277587890625, 0.10858154296875, 0.0166168212890625, -0.005603790283203125, -0.0099334716796875, -0.04010009765625, 0.056640625, -0.032257080078125, 0.037017822265625, 0.029266357421875, 0.030853271484375, -0.00579833984375, -0.061431884765625, -0.045806884765625, -0.02294921875, -0.0171051025390625, 0.0098724365234375, -0.025848388671875, -0.01255035400390625, 0.0304718017578125, 0.034881591796875, -0.0340576171875, -0.0082855224609375, -0.04937744140625, -0.02032470703125, 0.06439208984375, -0.00399017333984375, 0.016693115234375, -0.004650115966796875, -0.0264129638671875, -0.029937744140625, -0.019317626953125, 0.031646728515625, 0.0308380126953125, 0.00771331787109375, -0.028900146484375, 0.047393798828125, -0.0238800048828125, 0.051788330078125, 0.029296875, -0.007419586181640625, 0.0307159423828125, -0.0196685791015625, -0.0146484375, 0.0064544677734375, 
0.0836181640625, 0.0297698974609375, 0.0288238525390625, 0.004207611083984375, -0.0086212158203125, 0.011444091796875, 0.01441192626953125, -0.06915283203125, -0.0288848876953125, 0.01001739501953125, -0.03485107421875, -0.037109375, 0.0198974609375, -0.045928955078125, 0.0020427703857421875, -0.0003523826599121094, 0.0134735107421875, -0.026580810546875, -0.0234527587890625, 0.0217132568359375, -0.0006093978881835938, 0.041473388671875, 0.00977325439453125, -0.049285888671875, 0.0160369873046875, 0.02105712890625, 0.0675048828125, -0.00379180908203125, -0.0008821487426757812, -0.027984619140625, -0.0219573974609375, -0.03289794921875, 0.06439208984375, -0.021240234375, -0.0290985107421875, -0.0217742919921875, 0.042205810546875, -0.00563812255859375, -0.040863037109375, 0.05267333984375, -0.0269775390625, 0.006439208984375, -0.0210723876953125, -0.00608062744140625, -0.037017822265625, 0.036468505859375, -0.046905517578125, 0.09698486328125, 0.028411865234375, -0.067138671875, 0.022552490234375, -0.031585693359375, -0.0229034423828125, -0.0191650390625, 0.0074310302734375, -0.06280517578125, 0.00366973876953125, 0.03704833984375, 0.03564453125, -0.0272369384765625, 0.01763916015625, -0.03582763671875, -0.034393310546875, 0.031494140625, -0.0352783203125, 0.0782470703125, 0.0121307373046875, -0.0172882080078125, 0.005947113037109375, -0.04937744140625, -0.0208892822265625, 0.034393310546875, -0.0116424560546875, 0.017669677734375, -0.01514434814453125, 0.00817108154296875, 0.0384521484375, 0.01250457763671875, -0.049560546875, 0.004291534423828125, -0.0097808837890625, 0.0198974609375, 0.037384033203125, -0.00792694091796875, 0.0135040283203125, -0.020111083984375, 0.0302276611328125, 0.0219573974609375, 0.041839599609375, -0.0183868408203125, -0.04656982421875, -0.0634765625, -0.0251922607421875, -0.00470733642578125, 0.0260009765625, -0.0263214111328125, 0.0562744140625, -0.0188140869140625, -0.0321044921875, -0.033935546875, -0.0091552734375, 0.0196685791015625, 
0.046112060546875, 0.0404052734375, -0.05291748046875, -0.05145263671875, -0.07049560546875, 0.0205535888671875, 0.0002682209014892578, 0.0019044876098632812, 0.0108184814453125, 0.061248779296875, -0.0123291015625, 0.08795166015625, -0.04278564453125, -0.043182373046875, -0.0151519775390625, -0.019317626953125, 0.019744873046875, 0.04107666015625, 0.055419921875, -0.0701904296875, -0.037322998046875, -0.001041412353515625, -0.07440185546875, 0.0122528076171875, 0.0144805908203125, -0.015655517578125, 0.019775390625, 0.0275726318359375, -0.041717529296875, 0.064453125, 0.015655517578125, -0.00531768798828125, 0.051422119140625, 0.004573822021484375, 0.00664520263671875, -0.072021484375, 0.01511383056640625, 0.01898193359375, -0.01727294921875, -0.043701171875, 0.00670623779296875, 0.00525665283203125, -0.00299072265625, -0.052001953125, 0.0219879150390625, -0.03497314453125, -0.010101318359375, -0.01708984375, -0.00022125244140625, 0.0132904052734375, 0.0538330078125, 0.0301971435546875, 0.03131103515625, 0.053009033203125, -0.04327392578125, 0.035369873046875, 0.0168609619140625, -0.024688720703125, 0.04986572265625, -0.054229736328125, 0.01554107666015625, -0.0144500732421875, 0.0251922607421875, -0.07830810546875, -0.0165863037109375, 0.049713134765625, -0.04718017578125, 0.03643798828125, -0.02532958984375, -0.0281829833984375, -0.0770263671875, -0.031036376953125, 0.03253173828125, 0.028900146484375, -0.04766845703125, 0.039794921875, 0.0025043487548828125, 0.038909912109375, -0.06317138671875, -0.061431884765625, -0.005092620849609375, -0.00974273681640625, -0.043182373046875, 0.020263671875, -0.009185791015625, 0.01076507568359375, 0.0053863525390625, -0.021240234375, -0.0005812644958496094, -0.0064849853515625, 0.025421142578125, 0.028778076171875, -0.005767822265625, -0.0198974609375, -0.018463134765625, -0.019256591796875, 0.005924224853515625, -0.01439666748046875, 0.059173583984375, -0.026641845703125, -0.03472900390625, -0.06390380859375, 
-0.010009765625, 0.03851318359375, -0.03448486328125, 0.0506591796875, 0.076171875, -0.040679931640625, 0.0169830322265625, -0.051513671875, -0.017547607421875, -0.039764404296875, 0.0258941650390625, -0.037994384765625, -0.0299224853515625, 0.057586669921875, 0.02899169921875, -0.007549285888671875, 0.047698974609375, 0.0300445556640625, 0.0110931396484375, 0.07177734375, 0.035888671875, -0.007709503173828125, 0.03363037109375, -0.070068359375, 0.02398681640625, -0.080810546875, -0.05706787109375, -0.0296783447265625, -0.024017333984375, -0.033905029296875, -0.046234130859375, 0.009368896484375, 0.029388427734375, -0.028045654296875, 0.0557861328125, -0.0804443359375, 0.0183868408203125, 0.041259765625, 0.0418701171875, 0.0125732421875, 0.007068634033203125, 0.007160186767578125, 0.009521484375, -0.03759765625, -0.0141143798828125, 0.070556640625, 0.030792236328125, 0.05029296875, -0.01629638671875, 0.032989501953125, 0.0049591064453125, 0.0300140380859375, -0.0643310546875, 0.049530029296875, 0.0005755424499511719, -0.05548095703125, 0.001277923583984375, -0.00934600830078125, -0.0582275390625, 0.01430511474609375, -0.0095672607421875, -0.0745849609375, 0.04840087890625, 0.0015048980712890625, -0.0037975311279296875, 0.041259765625, -0.0299224853515625, 0.06719970703125, -0.024078369140625, -0.036102294921875, 0.01666259765625, -0.0606689453125, 0.039947509765625, -0.0019855499267578125, -0.01194000244140625, -0.018524169921875, 0.03497314453125, 0.0557861328125, -0.0275421142578125, 0.0535888671875, -0.002552032470703125, 0.003173828125, 0.061248779296875, -0.00003540515899658203, 0.038421630859375, 0.0000908970832824707, -0.0029735565185546875, 0.04058837890625, -0.01558685302734375, -0.0125579833984375, -0.022796630859375, 0.029052734375, -0.043609619140625, -0.036956787109375, -0.036376953125, -0.05133056640625, 0.031951904296875, 0.00972747802734375, 0.0538330078125, 0.0272369384765625, 0.01285552978515625, 0.0276641845703125, 0.041259765625, 
-0.0360107421875, 0.039398193359375, 0.01374053955078125, -0.0208892822265625, -0.0148162841796875, 0.06414794921875, -0.00525665283203125, 0.01219940185546875, 0.0457763671875, 0.0121002197265625, -0.04742431640625, -0.0127410888671875, -0.018798828125, 0.0251922607421875, -0.051513671875, -0.03692626953125, -0.059906005859375, -0.033935546875, -0.039764404296875, -0.0198516845703125, -0.04241943359375, -0.004596710205078125, -0.047821044921875, -0.01020050048828125, 0.03741455078125, 0.0220489501953125, -0.00920867919921875, 0.03155517578125, -0.02398681640625, 0.0224609375, 0.00331878662109375, 0.01534271240234375, -0.005863189697265625, -0.053436279296875, -0.00986480712890625, 0.020172119140625, -0.038330078125, -0.052642822265625, 0.04083251953125, -0.0017614364624023438, 0.027618408203125, 0.057220458984375, 0.01090240478515625, 0.060028076171875, 0.0017652511596679688, 0.049835205078125, 0.037139892578125, -0.060028076171875, 0.047637939453125, 0.00004857778549194336, 0.006443023681640625, 0.0160980224609375, 0.0218353271484375, -0.0247650146484375, -0.0217437744140625, -0.0421142578125, -0.04833984375, 0.07244873046875, 0.01654052734375, -0.0127410888671875, -0.005062103271484375, 0.0145111083984375, -0.0281829833984375, 0.004169464111328125, -0.060272216796875, -0.0207366943359375, -0.0318603515625, -0.0201416015625, -0.00786590576171875, -0.006870269775390625, 0.006885528564453125, -0.044403076171875, 0.03955078125, -0.006450653076171875, 0.049163818359375, 0.0282745361328125, -0.01763916015625, -0.006885528564453125, -0.0229644775390625, 0.032196044921875, 0.025665283203125, -0.0364990234375, 0.0040435791015625, 0.0167694091796875, -0.040618896484375, 0.00738525390625, 0.00589752197265625, -0.00482940673828125, -0.0024967193603515625, 0.025482177734375, 0.0562744140625, 0.005161285400390625, -0.031585693359375, 0.043670654296875, 0.0033016204833984375, -0.0206298828125, -0.0304718017578125, 0.00867462158203125, -0.0085296630859375, 0.0120849609375, 
0.034942626953125, 0.0177764892578125, -0.0008740425109863281, -0.03326416015625, 0.0272369384765625, 0.03326416015625, -0.039581298828125, -0.0295257568359375, 0.0780029296875, -0.025848388671875, -0.0246734619140625, 0.06640625, -0.0235595703125, -0.048828125, 0.083740234375, 0.04608154296875, 0.0589599609375, -0.0229949951171875, 0.0148162841796875, 0.053619384765625, 0.034210205078125, -0.0200347900390625, -0.004299163818359375, -0.006496429443359375, -0.06134033203125, -0.0007915496826171875, -0.052581787109375, 0.005794525146484375, 0.0135345458984375, -0.067138671875, 0.042144775390625, -0.0251922607421875, -0.0205230712890625, -0.0047454833984375, 0.0073394775390625, -0.08245849609375, 0.0305023193359375, -0.0030918121337890625, 0.07177734375, -0.056640625, 0.044952392578125, 0.0289764404296875, -0.056060791015625, -0.06103515625, -0.02203369140625, -0.0073699951171875, -0.06634521484375, 0.048614501953125, 0.055328369140625, -0.006778717041015625, 0.00566864013671875, -0.0562744140625, -0.06585693359375, 0.0941162109375, 0.022491455078125, -0.04351806640625, 0.0075531005859375, 0.006771087646484375, 0.033843994140625, -0.0282440185546875, 0.04644775390625, 0.03857421875, 0.027374267578125, 0.02789306640625, -0.04071044921875, 0.01078033447265625, -0.0158233642578125, 0.004398345947265625, 0.001979827880859375, -0.05377197265625, 0.057861328125, -0.03240966796875, -0.0195770263671875, 0.0006623268127441406, 0.049530029296875, 0.022247314453125, 0.03289794921875, 0.03515625, 0.061553955078125, 0.03424072265625, -0.0189056396484375, 0.08026123046875, -0.0268402099609375, 0.05926513671875, 0.057861328125, 0.0024166107177734375, 0.04058837890625, 0.0157928466796875, -0.019775390625, 0.03228759765625, 0.07366943359375, -0.027557373046875, 0.0340576171875, 0.015350341796875, -0.00626373291015625, -0.00383758544921875, -0.0231170654296875, -0.03167724609375, 0.03216552734375, 0.0289306640625, -0.032073974609375, -0.00472259521484375, 0.018218994140625, 
-0.005458831787109375, -0.0252838134765625, -0.010650634765625, 0.0220184326171875, -0.0023651123046875, -0.03167724609375, 0.05841064453125, -0.0026702880859375, 0.052032470703125, -0.0229339599609375, 0.005214691162109375, -0.0185089111328125, 0.0225982666015625, -0.0304718017578125, -0.05859375, 0.022369384765625, -0.00479888916015625, -0.0006232261657714844, 0.0189971923828125, 0.06201171875, -0.023651123046875, -0.0599365234375, 0.03515625, -0.004749298095703125, 0.034912109375, -0.0168304443359375, -0.05548095703125, 0.026641845703125, -0.002330780029296875, -0.032257080078125, 0.0016613006591796875, 0.00745391845703125, -0.01313018798828125, 0.0648193359375, 0.062744140625, -0.0248565673828125, 0.011138916015625, -0.00499725341796875, 0.0675048828125, -0.0301361083984375, -0.01393890380859375, -0.05364990234375, 0.047210693359375, -0.0157012939453125, -0.0130767822265625, 0.0413818359375, 0.07000732421875, 0.0865478515625, -0.01177215576171875, 0.02972412109375, -0.0283203125, -0.00960540771484375, -0.006801605224609375, 0.0264739990234375, -0.0404052734375, 0.010162353515625, -0.0196075439453125, -0.0740966796875, -0.0287322998046875, 0.06866455078125, -0.01543426513671875, 0.0164337158203125, 0.04638671875, 0.08587646484375, -0.03424072265625, -0.0177154541015625, 0.0191192626953125, -0.001064300537109375, 0.02655029296875, 0.047576904296875, 0.043182373046875, -0.06634521484375, 0.048797607421875, -0.058319091796875, -0.0017642974853515625, -0.0333251953125, -0.04510498046875, -0.07672119140625, -0.0428466796875, -0.03790283203125, -0.034576416015625, -0.005031585693359375, 0.03814697265625, 0.074462890625, -0.057952880859375, -0.006595611572265625, -0.0245361328125, 0.01267242431640625, -0.035186767578125, -0.025146484375, 0.046051025390625, 0.00762939453125, -0.0689697265625, 0.0019512176513671875, 0.01129913330078125, 0.00859832763671875, -0.006801605224609375, -0.0127410888671875, -0.0287017822265625, -0.017578125, 0.044189453125, 0.027740478515625, 
-0.04864501953125, -0.033355712890625, 0.01059722900390625, 0.0016698837280273438, 0.0218048095703125, 0.035430908203125, -0.05084228515625, 0.04180908203125, 0.0318603515625, 0.01299285888671875, 0.0721435546875, 0.0179901123046875, -0.004985809326171875, -0.04779052734375, 0.03363037109375, -0.002437591552734375, 0.0306243896484375, 0.03564453125, -0.03875732421875, 0.058868408203125, 0.04266357421875, -0.03485107421875, -0.0693359375, 0.007114410400390625, -0.09906005859375, -0.01073455810546875, 0.061859130859375, -0.0164642333984375, -0.039794921875, -0.0023021697998046875, -0.01439666748046875, 0.049652099609375, -0.02337646484375, 0.04656982421875, 0.0204010009765625, -0.000051140785217285156, -0.043609619140625, -0.0298614501953125, 0.00893402099609375, -0.0158843994140625, -0.04766845703125, -0.0316162109375, 0.0132904052734375, 0.039398193359375, 0.03497314453125, 0.042572021484375, -0.0186004638671875, 0.01250457763671875, 0.004085540771484375, 0.0295257568359375, -0.0220184326171875, -0.036285400390625, -0.01194000244140625, -0.003093719482421875, -0.0321044921875, -0.052001953125 ] ]
microsoft/beit-base-patch16-224
2023-02-27T17:56:38.000Z
[ "transformers", "pytorch", "jax", "beit", "image-classification", "vision", "dataset:imagenet", "dataset:imagenet-21k", "arxiv:2106.08254", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
image-classification
microsoft
null
null
microsoft/beit-base-patch16-224
5
110,159
transformers
2022-03-02T23:29:05
--- license: apache-2.0 tags: - image-classification - vision datasets: - imagenet - imagenet-21k --- # BEiT (base-sized model, fine-tuned on ImageNet-1k) BEiT model pre-trained in a self-supervised fashion on ImageNet-21k (14 million images, 21,841 classes) at resolution 224x224, and fine-tuned on ImageNet 2012 (1 million images, 1,000 classes) at resolution 224x224. It was introduced in the paper [BEIT: BERT Pre-Training of Image Transformers](https://arxiv.org/abs/2106.08254) by Hangbo Bao, Li Dong and Furu Wei and first released in [this repository](https://github.com/microsoft/unilm/tree/master/beit). Disclaimer: The team releasing BEiT did not write a model card for this model so this model card has been written by the Hugging Face team. ## Model description The BEiT model is a Vision Transformer (ViT), which is a transformer encoder model (BERT-like). In contrast to the original ViT model, BEiT is pretrained on a large collection of images in a self-supervised fashion, namely ImageNet-21k, at a resolution of 224x224 pixels. The pre-training objective for the model is to predict visual tokens from the encoder of OpenAI's DALL-E's VQ-VAE, based on masked patches. Next, the model was fine-tuned in a supervised fashion on ImageNet (also referred to as ILSVRC2012), a dataset comprising 1 million images and 1,000 classes, also at resolution 224x224. Images are presented to the model as a sequence of fixed-size patches (resolution 16x16), which are linearly embedded. Contrary to the original ViT models, BEiT models do use relative position embeddings (similar to T5) instead of absolute position embeddings, and perform classification of images by mean-pooling the final hidden states of the patches, instead of placing a linear layer on top of the final hidden state of the [CLS] token. 
By pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder. One typically places a linear layer on top of the [CLS] token, as the last hidden state of this token can be seen as a representation of an entire image. Alternatively, one can mean-pool the final hidden states of the patch embeddings, and place a linear layer on top of that. ## Intended uses & limitations You can use the raw model for image classification. See the [model hub](https://huggingface.co/models?search=microsoft/beit) to look for fine-tuned versions on a task that interests you. ### How to use Here is how to use this model to classify an image of the COCO 2017 dataset into one of the 1,000 ImageNet classes: ```python from transformers import BeitImageProcessor, BeitForImageClassification from PIL import Image import requests url = 'http://images.cocodataset.org/val2017/000000039769.jpg' image = Image.open(requests.get(url, stream=True).raw) processor = BeitImageProcessor.from_pretrained('microsoft/beit-base-patch16-224') model = BeitForImageClassification.from_pretrained('microsoft/beit-base-patch16-224') inputs = processor(images=image, return_tensors="pt") outputs = model(**inputs) logits = outputs.logits # model predicts one of the 1000 ImageNet classes predicted_class_idx = logits.argmax(-1).item() print("Predicted class:", model.config.id2label[predicted_class_idx]) ``` Currently, both the feature extractor and model support PyTorch. ## Training data The BEiT model was pretrained on [ImageNet-21k](http://www.image-net.org/), a dataset consisting of 14 million images and 21k classes, and fine-tuned on [ImageNet](http://www.image-net.org/challenges/LSVRC/2012/), a dataset consisting of 1 million images and 1k classes. 
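The mean-pooling classification scheme described above (average the patch hidden states, then apply a linear head) can be sketched offline without downloading a checkpoint. This is a toy illustration only: random arrays stand in for real BEiT-base hidden states, whose shapes — 196 patch tokens (a 14x14 grid at patch size 16 for 224x224 input) plus one [CLS] token, hidden size 768 — are assumptions taken from the base model configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for BEiT-base final hidden states: 2 images,
# 1 [CLS] token + 196 patch tokens, hidden size 768.
hidden_states = rng.standard_normal((2, 197, 768))

# Mean-pool the patch tokens (index 0 is [CLS]), as the card describes
# BEiT doing instead of reading the [CLS] hidden state.
pooled = hidden_states[:, 1:, :].mean(axis=1)   # shape (2, 768)

# A linear classification head on top of the pooled features,
# here with 1000 outputs for the ImageNet-1k classes.
W = rng.standard_normal((768, 1000)) * 0.02
b = np.zeros(1000)
logits = pooled @ W + b

print(pooled.shape, logits.shape)               # (2, 768) (2, 1000)
```

In the real model, `pooled` would come from `BeitModel` outputs and the head would be trained; the sketch only shows the tensor shapes involved.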
## Training procedure ### Preprocessing The exact details of preprocessing of images during training/validation can be found [here](https://github.com/microsoft/unilm/blob/master/beit/datasets.py). Images are resized/rescaled to the same resolution (224x224) and normalized across the RGB channels with mean (0.5, 0.5, 0.5) and standard deviation (0.5, 0.5, 0.5). ### Pretraining For all pre-training related hyperparameters, we refer to page 15 of the [original paper](https://arxiv.org/abs/2106.08254). ## Evaluation results For evaluation results on several image classification benchmarks, we refer to tables 1 and 2 of the original paper. Note that for fine-tuning, the best results are obtained with a higher resolution (384x384). Of course, increasing the model size will result in better performance. ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2106-08254, author = {Hangbo Bao and Li Dong and Furu Wei}, title = {BEiT: {BERT} Pre-Training of Image Transformers}, journal = {CoRR}, volume = {abs/2106.08254}, year = {2021}, url = {https://arxiv.org/abs/2106.08254}, archivePrefix = {arXiv}, eprint = {2106.08254}, timestamp = {Tue, 29 Jun 2021 16:55:04 +0200}, biburl = {https://dblp.org/rec/journals/corr/abs-2106-08254.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ``` ```bibtex @inproceedings{deng2009imagenet, title={Imagenet: A large-scale hierarchical image database}, author={Deng, Jia and Dong, Wei and Socher, Richard and Li, Li-Jia and Li, Kai and Fei-Fei, Li}, booktitle={2009 IEEE conference on computer vision and pattern recognition}, pages={248--255}, year={2009}, organization={IEEE} } ```
5,562
[ [ -0.0504150390625, -0.0203857421875, 0.00019788742065429688, -0.0121612548828125, -0.035552978515625, -0.0080108642578125, -0.0006093978881835938, -0.050811767578125, 0.01885986328125, 0.040191650390625, -0.0267333984375, -0.035552978515625, -0.053802490234375, -0.00885009765625, -0.03631591796875, 0.07696533203125, -0.00751495361328125, 0.0013980865478515625, -0.00809478759765625, -0.0124969482421875, -0.02142333984375, -0.040496826171875, -0.0498046875, -0.0215606689453125, 0.041412353515625, 0.01058197021484375, 0.043731689453125, 0.059661865234375, 0.05047607421875, 0.035736083984375, -0.002384185791015625, 0.0037593841552734375, -0.024383544921875, -0.0288543701171875, 0.005237579345703125, -0.035888671875, -0.0239715576171875, 0.01555633544921875, 0.046112060546875, 0.0311737060546875, 0.0158843994140625, 0.0277099609375, 0.0060882568359375, 0.04644775390625, -0.05126953125, 0.019866943359375, -0.0433349609375, 0.0299835205078125, -0.009307861328125, -0.00942230224609375, -0.03302001953125, -0.01971435546875, 0.0186614990234375, -0.041748046875, 0.040496826171875, -0.00005125999450683594, 0.11541748046875, 0.0095062255859375, -0.01515960693359375, 0.00989532470703125, -0.051849365234375, 0.053497314453125, -0.033599853515625, 0.027984619140625, 0.019317626953125, 0.035400390625, 0.0103912353515625, -0.0849609375, -0.0296783447265625, -0.01163482666015625, -0.0121612548828125, 0.0102386474609375, -0.0279388427734375, 0.0142974853515625, 0.03759765625, 0.02874755859375, -0.02520751953125, 0.012969970703125, -0.049468994140625, -0.0345458984375, 0.0291290283203125, -0.01229095458984375, 0.00786590576171875, 0.003200531005859375, -0.04583740234375, -0.0227813720703125, -0.036895751953125, 0.01486968994140625, 0.023651123046875, 0.0038089752197265625, -0.00794219970703125, 0.03277587890625, 0.0009675025939941406, 0.05364990234375, 0.0184173583984375, 0.0005831718444824219, 0.041351318359375, -0.020782470703125, -0.0290374755859375, 0.01218414306640625, 
0.057037353515625, 0.01947021484375, 0.022918701171875, -0.00286865234375, -0.02392578125, 0.00788116455078125, 0.0300750732421875, -0.076171875, -0.0209197998046875, -0.01043701171875, -0.05169677734375, -0.0276336669921875, 0.017913818359375, -0.043609619140625, -0.0042266845703125, -0.0302276611328125, 0.049652099609375, -0.018341064453125, -0.0162353515625, -0.010589599609375, 0.0059814453125, 0.03369140625, 0.0265655517578125, -0.047698974609375, 0.0261688232421875, 0.01934814453125, 0.0738525390625, -0.006252288818359375, -0.026580810546875, -0.01546478271484375, -0.01873779296875, -0.0419921875, 0.05059814453125, -0.0137176513671875, -0.0017242431640625, 0.00630950927734375, 0.02484130859375, 0.0018758773803710938, -0.04425048828125, 0.0263519287109375, -0.0621337890625, -0.005031585693359375, -0.0222625732421875, -0.02099609375, -0.0168304443359375, 0.01555633544921875, -0.057464599609375, 0.07818603515625, 0.0126953125, -0.0726318359375, 0.038482666015625, -0.035888671875, -0.0068817138671875, -0.005855560302734375, -0.004817962646484375, -0.048187255859375, -0.004119873046875, 0.0200042724609375, 0.03717041015625, -0.0062713623046875, 0.0029621124267578125, -0.018951416015625, -0.045257568359375, 0.01424407958984375, -0.0238189697265625, 0.05828857421875, 0.021484375, -0.0272369384765625, 0.020477294921875, -0.050140380859375, 0.002109527587890625, 0.0167083740234375, -0.017242431640625, -0.0018367767333984375, -0.02056884765625, 0.0034084320068359375, 0.0274505615234375, 0.02569580078125, -0.058135986328125, 0.00788116455078125, -0.023101806640625, 0.025634765625, 0.06414794921875, -0.0174560546875, 0.0301513671875, -0.0191497802734375, 0.0277862548828125, 0.0300750732421875, 0.031280517578125, -0.0171966552734375, -0.0343017578125, -0.0716552734375, -0.01544189453125, 0.041107177734375, 0.02532958984375, -0.04888916015625, 0.05157470703125, -0.0260772705078125, -0.047332763671875, -0.03851318359375, -0.003749847412109375, 0.03192138671875, 
0.04254150390625, 0.0372314453125, -0.047698974609375, -0.04833984375, -0.0782470703125, 0.0221099853515625, 0.0006470680236816406, 0.016510009765625, 0.007205963134765625, 0.050262451171875, -0.01535797119140625, 0.07110595703125, -0.02618408203125, -0.0230560302734375, -0.00879669189453125, 0.0079345703125, 0.0191192626953125, 0.048065185546875, 0.04205322265625, -0.05108642578125, -0.02960205078125, -0.003421783447265625, -0.059814453125, 0.0174713134765625, 0.0007157325744628906, -0.02587890625, 0.0201416015625, 0.0452880859375, -0.041107177734375, 0.06304931640625, 0.0263519287109375, -0.0019102096557617188, 0.0498046875, -0.018035888671875, 0.00731658935546875, -0.07672119140625, -0.00428009033203125, 0.01316070556640625, -0.0141143798828125, -0.031890869140625, 0.004718780517578125, 0.007221221923828125, -0.01313018798828125, -0.03802490234375, 0.0171051025390625, -0.0361328125, -0.027801513671875, -0.0163116455078125, -0.0242919921875, -0.0032501220703125, 0.049591064453125, 0.0002903938293457031, 0.046295166015625, 0.061737060546875, -0.038543701171875, 0.032806396484375, 0.0126953125, -0.0465087890625, 0.02032470703125, -0.062042236328125, 0.01479339599609375, -0.00811767578125, 0.0253143310546875, -0.07275390625, -0.005229949951171875, 0.0176849365234375, -0.0252838134765625, 0.04974365234375, -0.022186279296875, -0.04547119140625, -0.049224853515625, -0.017242431640625, 0.0379638671875, 0.053497314453125, -0.050811767578125, 0.03704833984375, 0.01085662841796875, 0.03387451171875, -0.061279296875, -0.0653076171875, -0.006687164306640625, -0.006679534912109375, -0.0400390625, 0.041259765625, 0.006862640380859375, 0.0226593017578125, 0.026092529296875, -0.00699615478515625, -0.00691986083984375, -0.0157928466796875, 0.02911376953125, 0.034210205078125, -0.0301513671875, 0.00901031494140625, -0.01904296875, -0.019744873046875, -0.0028934478759765625, -0.037017822265625, 0.047393798828125, -0.032867431640625, -0.038726806640625, -0.045013427734375, 
0.0039215087890625, 0.041412353515625, -0.027435302734375, 0.052154541015625, 0.0784912109375, -0.04156494140625, 0.01143646240234375, -0.03936767578125, -0.01145172119140625, -0.039794921875, 0.040313720703125, -0.0230712890625, -0.04296875, 0.053924560546875, -0.00339508056640625, 0.0012149810791015625, 0.04705810546875, 0.027984619140625, -0.009735107421875, 0.07305908203125, 0.04669189453125, -0.01026153564453125, 0.0523681640625, -0.059844970703125, 0.00574493408203125, -0.06256103515625, -0.0213470458984375, -0.02838134765625, -0.044158935546875, -0.0517578125, -0.0145721435546875, 0.020751953125, 0.013641357421875, -0.044830322265625, 0.033111572265625, -0.05511474609375, 0.0289154052734375, 0.06768798828125, 0.04583740234375, -0.017669677734375, 0.0244293212890625, -0.0165557861328125, 0.0042724609375, -0.045989990234375, -0.0223541259765625, 0.0728759765625, 0.034912109375, 0.048492431640625, -0.0177154541015625, 0.05950927734375, 0.0058441162109375, 0.01178741455078125, -0.060821533203125, 0.050811767578125, -0.0149993896484375, -0.043792724609375, -0.0111236572265625, -0.0207977294921875, -0.0953369140625, -0.0033435821533203125, -0.02154541015625, -0.058380126953125, 0.02520751953125, 0.0198974609375, -0.0133209228515625, 0.049072265625, -0.05877685546875, 0.07000732421875, -0.016876220703125, -0.0172882080078125, -0.0033702850341796875, -0.058380126953125, 0.006580352783203125, -0.00885772705078125, -0.0095062255859375, 0.0095062255859375, 0.0156402587890625, 0.07354736328125, -0.0517578125, 0.07354736328125, -0.027374267578125, 0.03485107421875, 0.04095458984375, -0.015716552734375, 0.02264404296875, -0.0369873046875, 0.01258087158203125, 0.03350830078125, 0.00540924072265625, -0.041351318359375, -0.040740966796875, 0.035888671875, -0.09051513671875, -0.035736083984375, -0.032562255859375, -0.024505615234375, 0.003082275390625, 0.0214691162109375, 0.054107666015625, 0.0565185546875, 0.0195159912109375, 0.027984619140625, 0.04754638671875, 
-0.032470703125, 0.031463623046875, -0.010040283203125, -0.01715087890625, -0.0179443359375, 0.06524658203125, 0.038330078125, 0.01386260986328125, 0.0183868408203125, 0.02313232421875, -0.0190887451171875, -0.048126220703125, -0.027557373046875, 0.01009368896484375, -0.07574462890625, -0.032684326171875, -0.031463623046875, -0.057281494140625, -0.021575927734375, -0.0094757080078125, -0.031585693359375, -0.0016469955444335938, -0.03997802734375, -0.00836944580078125, 0.0309295654296875, 0.056304931640625, -0.0038509368896484375, 0.044830322265625, -0.05377197265625, -0.0006098747253417969, 0.0285797119140625, 0.0372314453125, 0.01548004150390625, -0.06036376953125, -0.03759765625, -0.004177093505859375, -0.024139404296875, -0.051544189453125, 0.0255584716796875, 0.013763427734375, 0.05218505859375, 0.03973388671875, -0.0145263671875, 0.0653076171875, -0.0343017578125, 0.05401611328125, 0.041717529296875, -0.046661376953125, 0.040802001953125, 0.0022487640380859375, 0.01500701904296875, 0.025054931640625, 0.039825439453125, -0.003265380859375, 0.00972747802734375, -0.06427001953125, -0.0567626953125, 0.056549072265625, -0.0005497932434082031, 0.0178985595703125, 0.0169219970703125, 0.032257080078125, -0.00566864013671875, 0.0032806396484375, -0.0595703125, -0.02532958984375, -0.042694091796875, -0.01270294189453125, -0.0038089752197265625, -0.020538330078125, 0.0002605915069580078, -0.0517578125, 0.04315185546875, 0.0064849853515625, 0.06256103515625, 0.01532745361328125, -0.0088348388671875, -0.00958251953125, -0.02325439453125, 0.021392822265625, 0.0372314453125, -0.0227203369140625, 0.0167388916015625, 0.0048980712890625, -0.0567626953125, 0.00467681884765625, 0.004146575927734375, -0.0120849609375, 0.001312255859375, 0.03717041015625, 0.08465576171875, 0.0022220611572265625, -0.005218505859375, 0.0467529296875, 0.009368896484375, -0.0280303955078125, -0.0289764404296875, -0.000049233436584472656, -0.016571044921875, 0.0211944580078125, 0.02490234375, 
0.03509521484375, -0.0001208186149597168, -0.0290374755859375, 0.0275115966796875, 0.0167083740234375, -0.039794921875, -0.027557373046875, 0.03424072265625, -0.018585205078125, -0.0071258544921875, 0.062347412109375, -0.01549530029296875, -0.04791259765625, 0.06756591796875, 0.0423583984375, 0.058868408203125, -0.00139617919921875, 0.0122528076171875, 0.05126953125, 0.022552490234375, -0.002716064453125, 0.0005583763122558594, 0.004207611083984375, -0.07550048828125, -0.024993896484375, -0.04766845703125, 0.00261688232421875, 0.0212249755859375, -0.052337646484375, 0.0267333984375, -0.04248046875, -0.03271484375, 0.0167999267578125, 0.00701904296875, -0.08319091796875, 0.0224761962890625, 0.01544189453125, 0.0728759765625, -0.058746337890625, 0.0687255859375, 0.056121826171875, -0.048736572265625, -0.07904052734375, -0.0183868408203125, -0.0229949951171875, -0.07489013671875, 0.069580078125, 0.0316162109375, 0.00039315223693847656, 0.01273345947265625, -0.063232421875, -0.0718994140625, 0.08770751953125, 0.026092529296875, -0.0261993408203125, 0.0043487548828125, 0.002231597900390625, 0.034088134765625, -0.0245208740234375, 0.02880859375, 0.0085296630859375, 0.01535797119140625, 0.03778076171875, -0.05096435546875, -0.003292083740234375, -0.028228759765625, 0.00936126708984375, 0.00833892822265625, -0.038055419921875, 0.07086181640625, -0.01103973388671875, -0.0015964508056640625, -0.0003421306610107422, 0.047698974609375, 0.0022068023681640625, 0.00023925304412841797, 0.056304931640625, 0.058563232421875, 0.03558349609375, -0.0191192626953125, 0.0740966796875, -0.0124664306640625, 0.024505615234375, 0.055267333984375, 0.0141754150390625, 0.04986572265625, 0.0214996337890625, -0.021575927734375, 0.045318603515625, 0.08819580078125, -0.03192138671875, 0.05206298828125, 0.0140380859375, 0.00213623046875, -0.01461029052734375, -0.005718231201171875, -0.037628173828125, 0.03851318359375, 0.0188751220703125, -0.043609619140625, -0.016082763671875, 0.0102386474609375, 
-0.0230255126953125, -0.03192138671875, -0.044586181640625, 0.039642333984375, -0.00449371337890625, -0.03741455078125, 0.051544189453125, -0.00899505615234375, 0.048431396484375, -0.04449462890625, -0.003856658935546875, -0.00403594970703125, 0.017425537109375, -0.026031494140625, -0.053497314453125, 0.01206207275390625, -0.01488494873046875, 0.0012264251708984375, 0.00666046142578125, 0.07000732421875, -0.007007598876953125, -0.046722412109375, 0.0178680419921875, 0.01453399658203125, 0.033416748046875, 0.0039520263671875, -0.06475830078125, 0.000339508056640625, -0.00888824462890625, -0.0269622802734375, 0.034210205078125, 0.0307769775390625, -0.010711669921875, 0.0307159423828125, 0.0489501953125, 0.0102386474609375, 0.027374267578125, 0.003391265869140625, 0.076416015625, -0.0266876220703125, -0.0295867919921875, -0.04705810546875, 0.03594970703125, -0.01055908203125, -0.0227508544921875, 0.036224365234375, 0.03436279296875, 0.0833740234375, -0.01532745361328125, 0.039794921875, 0.00064849853515625, -0.0032806396484375, -0.0292816162109375, 0.037933349609375, -0.047393798828125, -0.01265716552734375, -0.02593994140625, -0.062225341796875, -0.0181884765625, 0.06219482421875, -0.0205230712890625, 0.0292816162109375, 0.034393310546875, 0.0628662109375, -0.0187225341796875, -0.01253509521484375, 0.0296783447265625, 0.00835418701171875, 0.0020904541015625, 0.02484130859375, 0.05755615234375, -0.043121337890625, 0.039031982421875, -0.044677734375, -0.0196380615234375, -0.01096343994140625, -0.05865478515625, -0.06866455078125, -0.057464599609375, -0.03106689453125, -0.03558349609375, -0.01226043701171875, 0.06494140625, 0.0843505859375, -0.06524658203125, -0.0026874542236328125, -0.0150909423828125, -0.023651123046875, -0.02099609375, -0.0102081298828125, 0.05426025390625, -0.0184478759765625, -0.04052734375, -0.0257720947265625, -0.0029010772705078125, 0.0181732177734375, -0.022186279296875, -0.0027027130126953125, -0.015869140625, -0.0203857421875, 
0.0433349609375, 0.01401519775390625, -0.04510498046875, -0.037445068359375, 0.000377655029296875, -0.00638580322265625, 0.025543212890625, 0.044891357421875, -0.061187744140625, 0.0401611328125, 0.024932861328125, 0.048248291015625, 0.06671142578125, -0.01094818115234375, 0.00060272216796875, -0.060638427734375, 0.0281219482421875, 0.0058135986328125, 0.0479736328125, 0.01309967041015625, -0.022186279296875, 0.032379150390625, 0.0297698974609375, -0.039886474609375, -0.051422119140625, 0.007442474365234375, -0.08966064453125, -0.01372528076171875, 0.06298828125, -0.033447265625, -0.02587890625, 0.0078125, -0.01250457763671875, 0.05316162109375, -0.0035343170166015625, 0.040313720703125, 0.035491943359375, 0.00554656982421875, -0.031524658203125, -0.044891357421875, 0.0197906494140625, 0.0033721923828125, -0.041046142578125, -0.038238525390625, 0.01212310791015625, 0.015228271484375, 0.0333251953125, 0.0322265625, -0.0212554931640625, 0.01012420654296875, 0.007518768310546875, 0.0300140380859375, -0.0186920166015625, -0.032318115234375, -0.01087188720703125, 0.00270843505859375, -0.0014791488647460938, -0.056243896484375 ] ]
stablediffusionapi/realistic-vision-v51
2023-10-09T11:08:00.000Z
[ "diffusers", "stablediffusionapi.com", "stable-diffusion-api", "text-to-image", "ultra-realistic", "license:creativeml-openrail-m", "endpoints_compatible", "has_space", "diffusers:StableDiffusionPipeline", "region:us" ]
text-to-image
stablediffusionapi
null
null
stablediffusionapi/realistic-vision-v51
3
109,420
diffusers
2023-10-09T11:06:33
--- license: creativeml-openrail-m tags: - stablediffusionapi.com - stable-diffusion-api - text-to-image - ultra-realistic pinned: true --- # Realistic Vision V5.1 API Inference ![generated from stablediffusionapi.com](https://cdn.stablediffusionapi.com/generations/8112328501690811758.png) ## Get API Key Get API key from [Stable Diffusion API](http://stablediffusionapi.com/), No Payment needed. Replace Key in below code, change **model_id** to "realistic-vision-v51" Coding in PHP/Node/Java etc? Have a look at docs for more code examples: [View docs](https://stablediffusionapi.com/docs) Try model for free: [Generate Images](https://stablediffusionapi.com/models/realistic-vision-v51) Model link: [View model](https://stablediffusionapi.com/models/realistic-vision-v51) Credits: [View credits](https://civitai.com/?query=Realistic%20Vision%20V5.1) View all models: [View Models](https://stablediffusionapi.com/models) import requests import json url = "https://stablediffusionapi.com/api/v4/dreambooth" payload = json.dumps({ "key": "your_api_key", "model_id": "realistic-vision-v51", "prompt": "ultra realistic close up portrait ((beautiful pale cyberpunk female with heavy black eyeliner)), blue eyes, shaved side haircut, hyper detail, cinematic lighting, magic neon, dark red city, Canon EOS R3, nikon, f/1.4, ISO 200, 1/160s, 8K, RAW, unedited, symmetrical balance, in-frame, 8K", "negative_prompt": "painting, extra fingers, mutated hands, poorly drawn hands, poorly drawn face, deformed, ugly, blurry, bad anatomy, bad proportions, extra limbs, cloned face, skinny, glitchy, double torso, extra arms, extra hands, mangled fingers, missing lips, ugly face, distorted face, extra legs, anime", "width": "512", "height": "512", "samples": "1", "num_inference_steps": "30", "safety_checker": "no", "enhance_prompt": "yes", "seed": None, "guidance_scale": 7.5, "multi_lingual": "no", "panorama": "no", "self_attention": "no", "upscale": "no", "embeddings": "embeddings_model_id", "lora": 
"lora_model_id", "webhook": None, "track_id": None }) headers = { 'Content-Type': 'application/json' } response = requests.request("POST", url, headers=headers, data=payload) print(response.text) > Use this coupon code to get 25% off **DMGG0RBN**
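The request above is easier to adapt as a small helper. This sketch only builds and serializes a trimmed version of the same payload; the actual POST (commented out) needs network access and a real key, and `"your_api_key"` is a placeholder you must replace, as in the card's own example.

```python
import json

# Trimmed payload using fields shown in the card; "your_api_key" is a placeholder.
payload = {
    "key": "your_api_key",
    "model_id": "realistic-vision-v51",
    "prompt": "ultra realistic close up portrait, cinematic lighting, 8K",
    "negative_prompt": "blurry, bad anatomy, extra limbs, deformed",
    "width": "512",
    "height": "512",
    "samples": "1",
    "num_inference_steps": "30",
    "guidance_scale": 7.5,
    "seed": None,  # None lets the service choose a random seed
}

body = json.dumps(payload)  # json.dumps maps Python None to JSON null

# The real call (requires the `requests` package, a key, and network access):
# requests.post("https://stablediffusionapi.com/api/v4/dreambooth",
#               headers={"Content-Type": "application/json"}, data=body)

print(json.loads(body)["model_id"])  # realistic-vision-v51
```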
2,500
[ [ -0.038482666015625, -0.051025390625, 0.04150390625, 0.0131072998046875, -0.03887939453125, 0.003940582275390625, 0.023834228515625, -0.04541015625, 0.037994384765625, 0.045379638671875, -0.06524658203125, -0.0631103515625, -0.02703857421875, -0.001544952392578125, -0.013885498046875, 0.037017822265625, 0.0030307769775390625, -0.007587432861328125, -0.011138916015625, 0.00658416748046875, -0.0174407958984375, -0.0090484619140625, -0.04296875, -0.0028362274169921875, 0.01395416259765625, -0.0049285888671875, 0.0433349609375, 0.047149658203125, 0.039581298828125, 0.022552490234375, -0.01102447509765625, -0.0037631988525390625, -0.028839111328125, -0.026092529296875, -0.01074981689453125, -0.05133056640625, -0.04095458984375, 0.0009469985961914062, 0.0296173095703125, 0.0267181396484375, -0.003200531005859375, 0.039947509765625, -0.01025390625, 0.04656982421875, -0.051910400390625, 0.01467132568359375, -0.0249481201171875, 0.0240020751953125, 0.003173828125, -0.0011615753173828125, -0.01546478271484375, -0.016876220703125, -0.0007381439208984375, -0.07830810546875, 0.0289154052734375, 0.01219940185546875, 0.108154296875, 0.0237884521484375, -0.01435089111328125, 0.00429534912109375, -0.04364013671875, 0.060821533203125, -0.06854248046875, 0.028167724609375, 0.02215576171875, 0.0270233154296875, 0.0003654956817626953, -0.0714111328125, -0.0496826171875, 0.0128173828125, 0.0276031494140625, 0.021728515625, -0.0307464599609375, 0.0034198760986328125, 0.027099609375, 0.02301025390625, -0.038238525390625, -0.0186767578125, -0.038970947265625, -0.01004791259765625, 0.040557861328125, 0.0208282470703125, 0.0088653564453125, -0.01934814453125, -0.035064697265625, -0.0243682861328125, -0.033416748046875, 0.023101806640625, 0.04425048828125, 0.0161285400390625, -0.045196533203125, 0.03314208984375, -0.03521728515625, 0.06732177734375, 0.0198211669921875, -0.01971435546875, 0.04779052734375, -0.0171966552734375, -0.023712158203125, -0.021728515625, 0.059783935546875, 
0.058929443359375, -0.0015277862548828125, 0.0249481201171875, -0.01161956787109375, 0.0088958740234375, 0.0186614990234375, -0.08013916015625, -0.01317596435546875, 0.0560302734375, -0.049835205078125, -0.048583984375, 0.0062713623046875, -0.0711669921875, -0.01287841796875, -0.005229949951171875, 0.0316162109375, -0.02752685546875, -0.028900146484375, 0.0279541015625, -0.01494598388671875, 0.017242431640625, 0.0280609130859375, -0.05181884765625, 0.006744384765625, 0.02984619140625, 0.06402587890625, 0.01508331298828125, -0.0004627704620361328, 0.0179443359375, 0.009735107421875, -0.029052734375, 0.06292724609375, -0.007701873779296875, -0.0269012451171875, -0.00997161865234375, 0.023681640625, 0.00707244873046875, -0.0447998046875, 0.04876708984375, -0.03424072265625, -0.00980377197265625, -0.01190948486328125, -0.02935791015625, -0.03369140625, 0.026153564453125, -0.044036865234375, 0.038238525390625, 0.005443572998046875, -0.055908203125, 0.0312042236328125, -0.050384521484375, -0.0026397705078125, 0.0062713623046875, -0.003910064697265625, -0.03619384765625, -0.01099395751953125, 0.005779266357421875, 0.0201873779296875, 0.0035381317138671875, -0.0022029876708984375, -0.05804443359375, -0.03289794921875, 0.0149993896484375, -0.02362060546875, 0.08203125, 0.02740478515625, -0.02166748046875, 0.005558013916015625, -0.07061767578125, 0.0016469955444335938, 0.048065185546875, -0.01389312744140625, -0.00826263427734375, -0.0162200927734375, 0.007904052734375, 0.0001666545867919922, 0.0240478515625, -0.05023193359375, 0.0107421875, -0.032562255859375, 0.017608642578125, 0.044677734375, 0.01332855224609375, 0.014739990234375, -0.0231170654296875, 0.05181884765625, 0.0172271728515625, 0.03936767578125, 0.004398345947265625, -0.0452880859375, -0.042724609375, -0.035003662109375, 0.0103912353515625, 0.04473876953125, -0.043792724609375, 0.0218048095703125, -0.01525115966796875, -0.048248291015625, -0.04595947265625, -0.01403045654296875, 0.0279083251953125, 
0.03961181640625, 0.00875091552734375, -0.023468017578125, -0.050628662109375, -0.07501220703125, -0.0009722709655761719, -0.005023956298828125, -0.0091400146484375, 0.0271759033203125, 0.03948974609375, -0.0226287841796875, 0.06463623046875, -0.06414794921875, -0.011444091796875, -0.00371551513671875, -0.0150909423828125, 0.057220458984375, 0.04803466796875, 0.06793212890625, -0.069580078125, -0.0203857421875, -0.0292205810546875, -0.059722900390625, 0.01690673828125, 0.01190185546875, -0.01371002197265625, -0.00820159912109375, 0.002132415771484375, -0.059967041015625, 0.048095703125, 0.03607177734375, -0.04095458984375, 0.0516357421875, -0.01788330078125, 0.03656005859375, -0.08984375, 0.005367279052734375, 0.018707275390625, -0.029571533203125, -0.02960205078125, 0.038787841796875, -0.0029964447021484375, -0.006771087646484375, -0.06298828125, 0.044281005859375, -0.02301025390625, 0.01175689697265625, -0.00951385498046875, 0.0017404556274414062, 0.0154266357421875, 0.02667236328125, 0.005157470703125, 0.026702880859375, 0.046234130859375, -0.0338134765625, 0.049835205078125, 0.0141143798828125, -0.033721923828125, 0.043182373046875, -0.05145263671875, -0.00018930435180664062, 0.002712249755859375, 0.02783203125, -0.07586669921875, -0.04644775390625, 0.042449951171875, -0.045867919921875, -0.00716400146484375, -0.041656494140625, -0.0328369140625, -0.05157470703125, -0.029296875, 0.0255126953125, 0.058502197265625, -0.032745361328125, 0.04522705078125, 0.015777587890625, 0.0238037109375, -0.0478515625, -0.06988525390625, -0.0142059326171875, -0.0237579345703125, -0.0478515625, 0.0312347412109375, -0.0034637451171875, -0.0309906005859375, 0.0074920654296875, 0.00885009765625, -0.01537322998046875, -0.01514434814453125, 0.03143310546875, 0.045257568359375, -0.0171661376953125, -0.03411865234375, 0.004817962646484375, -0.0008959770202636719, 0.004367828369140625, -0.0247955322265625, 0.046875, -0.0091552734375, -0.0300445556640625, -0.06976318359375, 
-0.003040313720703125, 0.0511474609375, 0.005748748779296875, 0.04705810546875, 0.044586181640625, -0.054351806640625, 0.004817962646484375, -0.0419921875, -0.0209197998046875, -0.03802490234375, 0.0222930908203125, -0.03985595703125, -0.0227813720703125, 0.071044921875, -0.0037078857421875, -0.0004470348358154297, 0.03875732421875, 0.02655029296875, -0.0185546875, 0.08880615234375, 0.0260772705078125, 0.0265960693359375, 0.032684326171875, -0.06451416015625, -0.00873565673828125, -0.057159423828125, -0.0200347900390625, -0.006977081298828125, -0.023406982421875, -0.0186767578125, -0.040771484375, 0.01049041748046875, 0.0113677978515625, -0.0213775634765625, 0.0221099853515625, -0.044921875, 0.0290374755859375, 0.038970947265625, 0.036712646484375, 0.00885009765625, 0.0069580078125, -0.0141143798828125, -0.00948333740234375, -0.03204345703125, -0.035736083984375, 0.07769775390625, 0.018096923828125, 0.05242919921875, -0.0013637542724609375, 0.0433349609375, 0.0084381103515625, -0.00749969482421875, -0.038055419921875, 0.045623779296875, 0.0030059814453125, -0.076171875, 0.0169525146484375, -0.0166168212890625, -0.06494140625, 0.0226287841796875, -0.03509521484375, -0.054962158203125, 0.04705810546875, 0.0284881591796875, -0.05743408203125, 0.041656494140625, -0.05291748046875, 0.0654296875, -0.0052642822265625, -0.046905517578125, -0.0165557861328125, -0.041656494140625, 0.04974365234375, 0.00044655799865722656, 0.035369873046875, -0.027740478515625, -0.01299285888671875, 0.04736328125, -0.032501220703125, 0.08099365234375, -0.03399658203125, 0.011962890625, 0.045867919921875, 0.004608154296875, 0.0245208740234375, 0.0222015380859375, -0.0014715194702148438, 0.0218048095703125, 0.027587890625, -0.0455322265625, -0.02508544921875, 0.054290771484375, -0.049896240234375, -0.0350341796875, -0.0235595703125, -0.0269012451171875, 0.00585174560546875, 0.028656005859375, 0.043548583984375, 0.032196044921875, -0.0033626556396484375, -0.0013246536254882812, 0.04803466796875, 
-0.0099945068359375, 0.0287017822265625, 0.016815185546875, -0.04913330078125, -0.05230712890625, 0.058258056640625, -0.001983642578125, 0.033447265625, 0.0029926300048828125, 0.0128173828125, -0.02752685546875, -0.045989990234375, -0.04522705078125, 0.0198974609375, -0.0528564453125, -0.0231781005859375, -0.042510986328125, 0.007587432861328125, -0.053253173828125, -0.0216064453125, -0.05645751953125, -0.0230865478515625, -0.0380859375, -0.007602691650390625, 0.040618896484375, 0.021484375, -0.0048675537109375, 0.01361846923828125, -0.053192138671875, 0.032257080078125, 0.007965087890625, 0.0141754150390625, -0.0011005401611328125, -0.041229248046875, -0.00621795654296875, 0.019073486328125, -0.03271484375, -0.0599365234375, 0.046722412109375, -0.01068115234375, 0.0268402099609375, 0.06256103515625, 0.01267242431640625, 0.0748291015625, -0.0036411285400390625, 0.07061767578125, 0.0248260498046875, -0.052215576171875, 0.055938720703125, -0.045440673828125, 0.0211029052734375, 0.04437255859375, 0.020538330078125, -0.01047515869140625, -0.0217742919921875, -0.062469482421875, -0.07647705078125, 0.03857421875, 0.0079345703125, 0.0166168212890625, 0.0152740478515625, 0.041717529296875, -0.0063018798828125, 0.0151214599609375, -0.06903076171875, -0.0199127197265625, -0.0233306884765625, -0.01372528076171875, 0.0247955322265625, 0.005886077880859375, -0.019927978515625, -0.038726806640625, 0.051544189453125, -0.005229949951171875, 0.023345947265625, 0.03594970703125, 0.024139404296875, -0.01267242431640625, -0.0172119140625, 0.0391845703125, 0.053009033203125, -0.0460205078125, -0.01049041748046875, 0.0022411346435546875, -0.043853759765625, 0.0019330978393554688, 0.00370025634765625, -0.0308074951171875, 0.00243377685546875, 0.0263519287109375, 0.0670166015625, -0.0016107559204101562, -0.046630859375, 0.044921875, -0.0106201171875, -0.033416748046875, -0.044403076171875, 0.006038665771484375, 0.0213775634765625, 0.043792724609375, 0.028167724609375, 0.0355224609375, 
0.0189971923828125, -0.03192138671875, -0.00656890869140625, 0.02288818359375, -0.0249176025390625, -0.03314208984375, 0.08056640625, 0.0007615089416503906, -0.0289306640625, 0.032928466796875, -0.0289306640625, -0.011871337890625, 0.06390380859375, 0.050445556640625, 0.057464599609375, -0.0025310516357421875, 0.01439666748046875, 0.059539794921875, 0.00583648681640625, -0.00710296630859375, 0.050811767578125, 0.01558685302734375, -0.049407958984375, -0.0150299072265625, -0.0628662109375, -0.0192413330078125, 0.01369476318359375, -0.045562744140625, 0.041961669921875, -0.06011962890625, -0.03179931640625, -0.0130462646484375, -0.01258087158203125, -0.04522705078125, 0.03680419921875, 0.014129638671875, 0.06585693359375, -0.048431396484375, 0.052703857421875, 0.048492431640625, -0.044677734375, -0.07073974609375, -0.01277923583984375, 0.0168304443359375, -0.04937744140625, 0.0250244140625, 0.004756927490234375, -0.00769805908203125, 0.01371002197265625, -0.056427001953125, -0.06158447265625, 0.0869140625, 0.037506103515625, -0.044219970703125, -0.0073089599609375, -0.008026123046875, 0.02911376953125, -0.035736083984375, 0.0113677978515625, 0.0085601806640625, 0.028900146484375, 0.0257720947265625, -0.0377197265625, 0.004741668701171875, -0.0189208984375, -0.00266265869140625, -0.015838623046875, -0.0533447265625, 0.058349609375, -0.042022705078125, -0.0007443428039550781, 0.0208740234375, 0.04351806640625, 0.049652099609375, 0.03448486328125, 0.047271728515625, 0.07122802734375, 0.0318603515625, -0.007617950439453125, 0.08197021484375, -0.0282135009765625, 0.0496826171875, 0.04736328125, -0.0007953643798828125, 0.07244873046875, 0.039581298828125, -0.0309295654296875, 0.04522705078125, 0.0814208984375, -0.0263214111328125, 0.0560302734375, 0.00608062744140625, -0.0325927734375, -0.0168304443359375, 0.0091552734375, -0.03704833984375, 0.025634765625, 0.016998291015625, -0.0296630859375, 0.0167083740234375, 0.0102996826171875, -0.0155181884765625, -0.016021728515625, 
-0.0181884765625, 0.0274810791015625, 0.0012607574462890625, -0.017242431640625, 0.058349609375, -0.00991058349609375, 0.0718994140625, -0.04290771484375, -0.0066375732421875, -0.00653839111328125, 0.0289154052734375, -0.0223846435546875, -0.04461669921875, 0.0179901123046875, -0.00939178466796875, -0.0087432861328125, 0.00522613525390625, 0.03961181640625, 0.005252838134765625, -0.06158447265625, 0.020782470703125, 0.0079345703125, 0.0274810791015625, -0.00594329833984375, -0.0703125, 0.0171966552734375, 0.0151214599609375, -0.02947998046875, -0.0018663406372070312, 0.01129913330078125, 0.0309295654296875, 0.05230712890625, 0.058319091796875, 0.020263671875, 0.00994873046875, -0.0088653564453125, 0.0438232421875, -0.0443115234375, -0.03936767578125, -0.058837890625, 0.046417236328125, -0.0194854736328125, -0.02484130859375, 0.038726806640625, 0.062255859375, 0.06781005859375, -0.048309326171875, 0.04974365234375, -0.0160980224609375, 0.0290374755859375, -0.0283050537109375, 0.06011962890625, -0.06463623046875, 0.00339508056640625, -0.032684326171875, -0.06011962890625, -0.023193359375, 0.048187255859375, -0.00955963134765625, 0.004657745361328125, 0.0287322998046875, 0.05963134765625, -0.0295257568359375, -0.0191192626953125, 0.0225830078125, 0.01496124267578125, 0.01015472412109375, 0.0233612060546875, 0.042266845703125, -0.0430908203125, 0.03594970703125, -0.054595947265625, -0.016571044921875, -0.0024356842041015625, -0.0511474609375, -0.04132080078125, -0.0254364013671875, -0.04248046875, -0.04736328125, -0.0003886222839355469, 0.060211181640625, 0.07989501953125, -0.054962158203125, -0.0099029541015625, -0.0089111328125, -0.00841522216796875, -0.0212554931640625, -0.0224151611328125, 0.013458251953125, 0.0258941650390625, -0.08477783203125, 0.0120086669921875, 0.0048065185546875, 0.035064697265625, -0.01198577880859375, 0.005046844482421875, -0.0132293701171875, 0.006130218505859375, 0.0229949951171875, 0.0229339599609375, -0.0723876953125, 
-0.01427459716796875, -0.00608062744140625, 0.005809783935546875, 0.0151214599609375, 0.01111602783203125, -0.0472412109375, 0.033966064453125, 0.040771484375, 0.009796142578125, 0.04656982421875, -0.0017242431640625, 0.0155181884765625, -0.024139404296875, 0.042510986328125, 0.01183319091796875, 0.040802001953125, 0.01690673828125, -0.040191650390625, 0.03607177734375, 0.0531005859375, -0.024993896484375, -0.0699462890625, 0.006755828857421875, -0.09283447265625, -0.0243377685546875, 0.065185546875, -0.0271759033203125, -0.04913330078125, 0.01358795166015625, -0.0237579345703125, 0.023193359375, -0.0162811279296875, 0.047607421875, 0.033416748046875, -0.0164947509765625, -0.017303466796875, -0.0562744140625, 0.01435089111328125, 0.008758544921875, -0.0555419921875, -0.0213775634765625, 0.037353515625, 0.04315185546875, 0.034759521484375, 0.038238525390625, -0.032440185546875, 0.0185546875, 0.0391845703125, 0.03778076171875, 0.002742767333984375, 0.015045166015625, -0.01873779296875, 0.0088958740234375, -0.0090789794921875, -0.04461669921875 ] ]
facebook/wav2vec2-large-xlsr-53
2022-03-18T16:11:44.000Z
[ "transformers", "pytorch", "jax", "wav2vec2", "pretraining", "speech", "multilingual", "dataset:common_voice", "arxiv:2006.13979", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
null
facebook
null
null
facebook/wav2vec2-large-xlsr-53
59
108,294
transformers
2022-03-02T23:29:05
--- language: multilingual datasets: - common_voice tags: - speech license: apache-2.0 --- # Wav2Vec2-XLSR-53 [Facebook's XLSR-Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/) This is the base model, pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz. Note that this model should be fine-tuned on a downstream task, like Automatic Speech Recognition. Check out [this blog](https://huggingface.co/blog/fine-tune-wav2vec2-english) for more information. [Paper](https://arxiv.org/abs/2006.13979) Authors: Alexis Conneau, Alexei Baevski, Ronan Collobert, Abdelrahman Mohamed, Michael Auli **Abstract** This paper presents XLSR which learns cross-lingual speech representations by pretraining a single model from the raw waveform of speech in multiple languages. We build on wav2vec 2.0 which is trained by solving a contrastive task over masked latent speech representations and jointly learns a quantization of the latents shared across languages. The resulting model is fine-tuned on labeled data and experiments show that cross-lingual pretraining significantly outperforms monolingual pretraining. On the CommonVoice benchmark, XLSR shows a relative phoneme error rate reduction of 72% compared to the best known results. On BABEL, our approach improves word error rate by 16% relative compared to a comparable system. Our approach enables a single multilingual speech recognition model which is competitive with strong individual models. Analysis shows that the latent discrete speech representations are shared across languages with increased sharing for related languages. We hope to catalyze research in low-resource speech understanding by releasing XLSR-53, a large model pretrained in 53 languages. The original model can be found under https://github.com/pytorch/fairseq/tree/master/examples/wav2vec#wav2vec-20. 
# Usage See [this notebook](https://colab.research.google.com/github/patrickvonplaten/notebooks/blob/master/Fine_Tune_XLSR_Wav2Vec2_on_Turkish_ASR_with_%F0%9F%A4%97_Transformers.ipynb) for more information on how to fine-tune the model. ![model image](https://raw.githubusercontent.com/patrickvonplaten/scientific_images/master/xlsr_wav2vec2.png)
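Since this checkpoint is pretrained only (it has no CTC head), one direct way to use it without fine-tuning is to extract contextual speech representations. A minimal sketch, assuming the `transformers` library's `Wav2Vec2FeatureExtractor` and `Wav2Vec2Model` classes and a dummy silent waveform in place of real 16kHz mono audio:

```python
import numpy as np
import torch
from transformers import Wav2Vec2FeatureExtractor, Wav2Vec2Model

feature_extractor = Wav2Vec2FeatureExtractor.from_pretrained("facebook/wav2vec2-large-xlsr-53")
model = Wav2Vec2Model.from_pretrained("facebook/wav2vec2-large-xlsr-53")

# one second of silence as a stand-in for real 16kHz mono audio
waveform = np.zeros(16000, dtype=np.float32)

inputs = feature_extractor(waveform, sampling_rate=16000, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# (batch, frames, hidden_size) -- one vector per ~20 ms of audio
hidden_states = outputs.last_hidden_state
print(hidden_states.shape)
```

For actual speech recognition, a linear CTC head and fine-tuning on labeled audio (as in the notebook above) are still required.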
2,288
[ [ -0.015655517578125, -0.0168609619140625, -0.0081634521484375, 0.00695037841796875, -0.021087646484375, 0.009979248046875, -0.0199737548828125, -0.054779052734375, -0.00714874267578125, 0.010986328125, -0.048675537109375, -0.0225677490234375, -0.03436279296875, -0.006084442138671875, -0.00952911376953125, 0.05828857421875, 0.004528045654296875, 0.047515869140625, 0.01384735107421875, -0.02313232421875, -0.0253143310546875, -0.03656005859375, -0.0609130859375, -0.030670166015625, 0.03997802734375, 0.0269927978515625, 0.0240325927734375, 0.0238494873046875, 0.02166748046875, 0.02142333984375, -0.02032470703125, 0.00385284423828125, -0.05450439453125, -0.01296234130859375, -0.007785797119140625, -0.0290069580078125, -0.0262298583984375, -0.0009512901306152344, 0.059906005859375, 0.052642822265625, -0.0261077880859375, 0.01284027099609375, 0.004970550537109375, 0.03656005859375, -0.03363037109375, 0.0195159912109375, -0.05279541015625, 0.01172637939453125, -0.02044677734375, 0.00626373291015625, -0.0308685302734375, 0.0064849853515625, 0.01025390625, -0.036041259765625, 0.006671905517578125, -0.006137847900390625, 0.057769775390625, 0.00949859619140625, -0.032989501953125, -0.01334381103515625, -0.0675048828125, 0.0814208984375, -0.026641845703125, 0.08154296875, 0.042388916015625, 0.0266876220703125, 0.0260772705078125, -0.07659912109375, -0.0271453857421875, -0.0003666877746582031, 0.00982666015625, 0.0224609375, -0.0295562744140625, -0.01522064208984375, 0.00920867919921875, 0.0211944580078125, -0.06109619140625, 0.01546478271484375, -0.057769775390625, -0.03521728515625, 0.029876708984375, -0.01052093505859375, 0.0126190185546875, -0.0023517608642578125, -0.0230255126953125, -0.0305328369140625, -0.03717041015625, 0.0369873046875, 0.01267242431640625, 0.036285400390625, -0.04376220703125, 0.025115966796875, 0.006244659423828125, 0.038604736328125, -0.004283905029296875, -0.0010156631469726562, 0.050872802734375, -0.044891357421875, 0.01010894775390625, 
0.0303497314453125, 0.057281494140625, -0.007663726806640625, 0.01233673095703125, 0.0007424354553222656, -0.01861572265625, 0.021697998046875, -0.0033893585205078125, -0.0517578125, -0.0272216796875, -0.0014009475708007812, -0.0228424072265625, 0.0166015625, 0.0052947998046875, -0.006229400634765625, 0.0199127197265625, -0.049072265625, 0.032501220703125, -0.039947509765625, -0.0252227783203125, -0.0028285980224609375, 0.0034961700439453125, 0.02301025390625, 0.0041046142578125, -0.0625, 0.04071044921875, 0.0384521484375, 0.059234619140625, 0.0008840560913085938, -0.01474761962890625, -0.05731201171875, 0.0002903938293457031, -0.036041259765625, 0.051116943359375, -0.026824951171875, -0.0287017822265625, -0.0208892822265625, 0.0008640289306640625, 0.0194244384765625, -0.039459228515625, 0.03643798828125, -0.036590576171875, 0.007534027099609375, -0.0190582275390625, -0.022491455078125, 0.0015869140625, -0.0211334228515625, -0.06072998046875, 0.08154296875, 0.006526947021484375, -0.054962158203125, 0.01727294921875, -0.030609130859375, -0.03973388671875, -0.005695343017578125, -0.0199432373046875, -0.032196044921875, -0.01186370849609375, 0.005947113037109375, 0.011627197265625, -0.008026123046875, -0.012939453125, -0.00235748291015625, -0.0173187255859375, 0.0155487060546875, -0.022491455078125, 0.06390380859375, 0.03948974609375, -0.0168304443359375, 0.00507354736328125, -0.06939697265625, 0.02337646484375, -0.006786346435546875, -0.028350830078125, -0.0195159912109375, -0.018310546875, 0.0220794677734375, 0.0277557373046875, 0.003505706787109375, -0.04608154296875, -0.032928466796875, -0.04302978515625, 0.0504150390625, 0.0289764404296875, -0.023468017578125, 0.03497314453125, -0.01678466796875, 0.016571044921875, 0.004375457763671875, 0.0007596015930175781, -0.00384521484375, -0.0226593017578125, -0.035614013671875, -0.017822265625, 0.039764404296875, 0.04498291015625, -0.0273590087890625, 0.021087646484375, -0.0131988525390625, -0.045806884765625, 
-0.05633544921875, 0.006740570068359375, 0.05877685546875, 0.0291595458984375, 0.052642822265625, -0.0313720703125, -0.07049560546875, -0.044342041015625, -0.01236724853515625, -0.0035266876220703125, -0.003162384033203125, 0.0225677490234375, 0.0143280029296875, -0.0261993408203125, 0.053375244140625, -0.00907135009765625, -0.0278778076171875, -0.0252227783203125, 0.0269012451171875, -0.0035457611083984375, 0.047332763671875, 0.03173828125, -0.077880859375, -0.050872802734375, 0.003292083740234375, -0.009521484375, 0.003910064697265625, 0.0212860107421875, -0.0021381378173828125, 0.0355224609375, 0.041229248046875, -0.019744873046875, 0.0146636962890625, 0.057952880859375, -0.00620269775390625, 0.0257110595703125, -0.00902557373046875, -0.00836944580078125, -0.0875244140625, -0.0009212493896484375, 0.01148223876953125, -0.031890869140625, -0.039306640625, -0.035400390625, 0.0106353759765625, -0.01317596435546875, -0.053375244140625, 0.0308685302734375, -0.048065185546875, -0.025177001953125, -0.0115509033203125, 0.01384735107421875, -0.01090240478515625, 0.0287322998046875, 0.0206298828125, 0.05914306640625, 0.040802001953125, -0.0443115234375, 0.019866943359375, 0.033203125, -0.03643798828125, 0.01165008544921875, -0.059417724609375, 0.0275726318359375, 0.004085540771484375, 0.026031494140625, -0.0743408203125, 0.007434844970703125, 0.00267791748046875, -0.06427001953125, 0.033966064453125, -0.005062103271484375, -0.01485443115234375, -0.0091705322265625, 0.00696563720703125, 0.048309326171875, 0.060546875, -0.040802001953125, 0.048492431640625, 0.048614501953125, -0.0241546630859375, -0.054229736328125, -0.0823974609375, -0.0192108154296875, 0.0234222412109375, -0.03924560546875, 0.053924560546875, -0.00875091552734375, -0.0092315673828125, 0.0035724639892578125, -0.03240966796875, -0.0142059326171875, -0.0382080078125, 0.021453857421875, 0.005939483642578125, -0.02496337890625, 0.004528045654296875, -0.00696563720703125, -0.0169219970703125, 
-0.003940582275390625, -0.032806396484375, 0.0462646484375, -0.00971221923828125, -0.0078125, -0.0545654296875, 0.0272369384765625, 0.019378662109375, -0.040252685546875, 0.0231475830078125, 0.068359375, -0.029388427734375, -0.018951416015625, -0.0299530029296875, -0.0160980224609375, -0.035980224609375, 0.061431884765625, -0.044158935546875, -0.0697021484375, 0.03399658203125, 0.0061798095703125, -0.01084136962890625, 0.033294677734375, 0.041595458984375, 0.0019855499267578125, 0.0762939453125, 0.055938720703125, -0.0107421875, 0.045745849609375, -0.0406494140625, 0.00736236572265625, -0.05218505859375, -0.04217529296875, -0.058319091796875, 0.0005364418029785156, -0.043304443359375, -0.04779052734375, 0.015960693359375, -0.0037784576416015625, -0.00861358642578125, 0.0304718017578125, -0.020111083984375, 0.0249176025390625, 0.042388916015625, 0.01617431640625, 0.0011997222900390625, 0.0219268798828125, -0.0335693359375, -0.0142059326171875, -0.056182861328125, -0.01116943359375, 0.053863525390625, 0.021697998046875, 0.06304931640625, 0.0078582763671875, 0.039093017578125, 0.00826263427734375, -0.0287628173828125, -0.06439208984375, 0.0207977294921875, -0.03363037109375, -0.038848876953125, -0.0310211181640625, -0.0462646484375, -0.06219482421875, 0.0160064697265625, -0.0174102783203125, -0.036285400390625, 0.00716400146484375, 0.011138916015625, -0.0245513916015625, 0.01232147216796875, -0.04339599609375, 0.05084228515625, -0.0209503173828125, -0.0237579345703125, -0.048187255859375, -0.0682373046875, -0.001689910888671875, -0.0169830322265625, 0.028411865234375, 0.00685882568359375, 0.0229034423828125, 0.0706787109375, -0.0247802734375, 0.05682373046875, -0.0162811279296875, -0.006778717041015625, 0.0207977294921875, -0.0173492431640625, 0.03240966796875, -0.0057830810546875, -0.005153656005859375, 0.0296173095703125, 0.01448822021484375, -0.023468017578125, -0.01910400390625, 0.0447998046875, -0.09625244140625, 0.00044655799865722656, -0.023895263671875, 
-0.0284881591796875, -0.01641845703125, 0.0116424560546875, 0.04217529296875, 0.06427001953125, -0.0237884521484375, 0.02850341796875, 0.0570068359375, -0.01055908203125, 0.0193939208984375, 0.048065185546875, 0.006824493408203125, -0.04144287109375, 0.06170654296875, 0.024993896484375, 0.0191802978515625, 0.0504150390625, 0.0130615234375, -0.03741455078125, -0.033294677734375, -0.0029315948486328125, 0.0174407958984375, -0.060516357421875, 0.0017833709716796875, -0.058013916015625, -0.032501220703125, -0.0626220703125, 0.01555633544921875, -0.061737060546875, -0.040771484375, -0.038055419921875, -0.00482940673828125, 0.0150909423828125, 0.05010986328125, -0.050689697265625, 0.0173187255859375, -0.06597900390625, 0.044158935546875, 0.0306243896484375, 0.012237548828125, -0.00896453857421875, -0.07366943359375, -0.035064697265625, 0.0225830078125, 0.0108489990234375, -0.0341796875, 0.0301361083984375, 0.032867431640625, 0.0540771484375, 0.02349853515625, 0.0010929107666015625, 0.053680419921875, -0.057769775390625, 0.0477294921875, 0.039337158203125, -0.086669921875, 0.043182373046875, 0.01097869873046875, 0.024078369140625, 0.0299530029296875, 0.049163818359375, -0.049591064453125, 0.002414703369140625, -0.0218353271484375, -0.0677490234375, 0.07733154296875, 0.00634765625, 0.0121307373046875, 0.0268707275390625, 0.01457977294921875, 0.01343536376953125, -0.0211334228515625, -0.05255126953125, -0.031585693359375, -0.040069580078125, -0.019683837890625, -0.0254669189453125, -0.034088134765625, -0.0006623268127441406, -0.036285400390625, 0.065185546875, 0.022857666015625, 0.0146942138671875, 0.0129547119140625, -0.0234222412109375, -0.0139007568359375, 0.01276397705078125, 0.041900634765625, 0.043914794921875, -0.01296234130859375, 0.0180206298828125, 0.01445770263671875, -0.049468994140625, 0.01355743408203125, 0.030364990234375, 0.01708984375, 0.00635528564453125, 0.0259857177734375, 0.08990478515625, 0.025665283203125, -0.032501220703125, 0.031494140625, 
-0.0271148681640625, -0.0382080078125, -0.04608154296875, -0.0030879974365234375, 0.00688934326171875, 0.0272369384765625, 0.03515625, 0.006496429443359375, -0.0003771781921386719, -0.03533935546875, 0.0313720703125, 0.021331787109375, -0.06121826171875, -0.048004150390625, 0.049163818359375, 0.01568603515625, -0.0270538330078125, 0.041656494140625, -0.01099395751953125, -0.03558349609375, 0.0279541015625, 0.050262451171875, 0.047515869140625, -0.044464111328125, -0.005229949951171875, 0.04339599609375, 0.01355743408203125, -0.005016326904296875, 0.0443115234375, -0.024261474609375, -0.056640625, -0.049102783203125, -0.03973388671875, -0.03564453125, 0.0098724365234375, -0.056549072265625, 0.02996826171875, -0.00817108154296875, -0.007904052734375, 0.0256195068359375, 0.0164947509765625, -0.043609619140625, 0.0232086181640625, 0.0325927734375, 0.0631103515625, -0.05889892578125, 0.08953857421875, 0.0543212890625, -0.021759033203125, -0.09814453125, -0.015960693359375, -0.0003066062927246094, -0.048919677734375, 0.0738525390625, 0.00351715087890625, -0.041778564453125, 0.007282257080078125, -0.03729248046875, -0.088623046875, 0.07440185546875, 0.0164031982421875, -0.07635498046875, 0.00830841064453125, 0.010711669921875, 0.0219573974609375, -0.01532745361328125, 0.006114959716796875, 0.014007568359375, 0.01399993896484375, 0.025238037109375, -0.08544921875, -0.0139617919921875, -0.02667236328125, -0.00176239013671875, -0.01340484619140625, -0.045440673828125, 0.0784912109375, -0.031463623046875, -0.0193328857421875, -0.00445556640625, 0.06170654296875, 0.0223236083984375, 0.00933837890625, 0.0482177734375, 0.0369873046875, 0.060882568359375, 0.003993988037109375, 0.041778564453125, -0.0223846435546875, 0.0157623291015625, 0.0970458984375, -0.0293731689453125, 0.07293701171875, 0.0217437744140625, -0.0200042724609375, 0.037750244140625, 0.040435791015625, 0.006671905517578125, 0.071533203125, 0.007144927978515625, -0.01708984375, -0.0138092041015625, 
-0.004306793212890625, -0.04803466796875, 0.05810546875, 0.02655029296875, -0.00591278076171875, 0.006866455078125, 0.0115509033203125, -0.01001739501953125, -0.00832366943359375, -0.03094482421875, 0.05438232421875, 0.030548095703125, -0.023284912109375, 0.054290771484375, 0.007534027099609375, 0.0478515625, -0.0701904296875, 0.0009145736694335938, 0.002666473388671875, 0.03216552734375, -0.01393890380859375, -0.0294647216796875, 0.0162811279296875, 0.0018157958984375, -0.0218963623046875, -0.0110626220703125, 0.046722412109375, -0.05169677734375, -0.03240966796875, 0.0517578125, 0.017791748046875, 0.0174407958984375, 0.0127410888671875, -0.057525634765625, 0.02301025390625, 0.00621795654296875, -0.01025390625, 0.01454925537109375, 0.034820556640625, 0.0033893585205078125, 0.0236968994140625, 0.04998779296875, 0.021453857421875, 0.00482940673828125, 0.0279693603515625, 0.05523681640625, -0.04547119140625, -0.063720703125, -0.0308380126953125, 0.029205322265625, 0.0146636962890625, -0.003803253173828125, 0.037872314453125, 0.049102783203125, 0.10882568359375, 0.005077362060546875, 0.05877685546875, 0.015655517578125, 0.06964111328125, -0.048309326171875, 0.057952880859375, -0.043365478515625, -0.0034618377685546875, -0.00893402099609375, -0.045318603515625, -0.0166473388671875, 0.0667724609375, 0.0015888214111328125, 0.0175018310546875, 0.05224609375, 0.067138671875, -0.01080322265625, -0.00537109375, 0.0234222412109375, 0.0220794677734375, 0.0035343170166015625, 0.0284576416015625, 0.0628662109375, -0.040863037109375, 0.07757568359375, -0.03118896484375, -0.01259613037109375, 0.0020904541015625, -0.039154052734375, -0.05511474609375, -0.0589599609375, -0.030303955078125, -0.0240325927734375, 0.017578125, 0.074951171875, 0.0828857421875, -0.07611083984375, -0.033233642578125, 0.00600433349609375, -0.02008056640625, -0.0226898193359375, -0.0087432861328125, 0.0257568359375, -0.0206146240234375, -0.04901123046875, 0.04412841796875, -0.0008211135864257812, 
0.019073486328125, -0.0030689239501953125, -0.034271240234375, -0.00981903076171875, -0.008453369140625, 0.0396728515625, 0.0286865234375, -0.044891357421875, 0.0088348388671875, 0.0009164810180664062, -0.01433563232421875, 0.0180206298828125, 0.057525634765625, -0.054290771484375, 0.01532745361328125, 0.0357666015625, 0.0218048095703125, 0.0672607421875, -0.02301025390625, 0.02545166015625, -0.05914306640625, 0.030181884765625, 0.0228729248046875, 0.0301361083984375, 0.0335693359375, -0.0007476806640625, 0.01416778564453125, 0.01035308837890625, -0.03668212890625, -0.06365966796875, 0.0169677734375, -0.09161376953125, -0.03167724609375, 0.08953857421875, -0.006824493408203125, -0.01512908935546875, -0.00380706787109375, -0.01497650146484375, 0.061431884765625, -0.037109375, 0.01983642578125, 0.0186920166015625, 0.0094757080078125, -0.021636962890625, -0.0215606689453125, 0.040008544921875, 0.0298919677734375, -0.022003173828125, 0.0038776397705078125, 0.030853271484375, 0.034088134765625, 0.005413055419921875, 0.055572509765625, -0.0250396728515625, 0.0211029052734375, 0.0022525787353515625, 0.0209197998046875, -0.0023193359375, -0.0284576416015625, -0.041015625, -0.007389068603515625, -0.0009517669677734375, -0.0224761962890625 ] ]
microsoft/swin-tiny-patch4-window7-224
2023-09-15T19:59:37.000Z
[ "transformers", "pytorch", "tf", "safetensors", "swin", "image-classification", "vision", "dataset:imagenet-1k", "arxiv:2103.14030", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
image-classification
microsoft
null
null
microsoft/swin-tiny-patch4-window7-224
19
106,273
transformers
2022-03-02T23:29:05
--- license: apache-2.0 tags: - vision - image-classification datasets: - imagenet-1k widget: - src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/tiger.jpg example_title: Tiger - src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/teapot.jpg example_title: Teapot - src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/palace.jpg example_title: Palace --- # Swin Transformer (tiny-sized model) Swin Transformer model trained on ImageNet-1k at resolution 224x224. It was introduced in the paper [Swin Transformer: Hierarchical Vision Transformer using Shifted Windows](https://arxiv.org/abs/2103.14030) by Liu et al. and first released in [this repository](https://github.com/microsoft/Swin-Transformer). Disclaimer: The team releasing Swin Transformer did not write a model card for this model so this model card has been written by the Hugging Face team. ## Model description The Swin Transformer is a type of Vision Transformer. It builds hierarchical feature maps by merging image patches (shown in gray) in deeper layers, and has linear computational complexity with respect to input image size because self-attention is computed only within each local window (shown in red). It can thus serve as a general-purpose backbone for both image classification and dense recognition tasks. In contrast, previous vision Transformers produce feature maps of a single low resolution and have quadratic computational complexity with respect to input image size because self-attention is computed globally. ![model image](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/swin_transformer_architecture.png) [Source](https://paperswithcode.com/method/swin-transformer) ## Intended uses & limitations You can use the raw model for image classification. See the [model hub](https://huggingface.co/models?search=swin) to look for fine-tuned versions on a task that interests you. 
### How to use Here is how to use this model to classify an image of the COCO 2017 dataset into one of the 1,000 ImageNet classes: ```python from transformers import AutoImageProcessor, AutoModelForImageClassification from PIL import Image import requests url = "http://images.cocodataset.org/val2017/000000039769.jpg" image = Image.open(requests.get(url, stream=True).raw) processor = AutoImageProcessor.from_pretrained("microsoft/swin-tiny-patch4-window7-224") model = AutoModelForImageClassification.from_pretrained("microsoft/swin-tiny-patch4-window7-224") inputs = processor(images=image, return_tensors="pt") outputs = model(**inputs) logits = outputs.logits # model predicts one of the 1000 ImageNet classes predicted_class_idx = logits.argmax(-1).item() print("Predicted class:", model.config.id2label[predicted_class_idx]) ``` For more code examples, we refer to the [documentation](https://huggingface.co/transformers/model_doc/swin.html#). ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2103-14030, author = {Ze Liu and Yutong Lin and Yue Cao and Han Hu and Yixuan Wei and Zheng Zhang and Stephen Lin and Baining Guo}, title = {Swin Transformer: Hierarchical Vision Transformer using Shifted Windows}, journal = {CoRR}, volume = {abs/2103.14030}, year = {2021}, url = {https://arxiv.org/abs/2103.14030}, eprinttype = {arXiv}, eprint = {2103.14030}, timestamp = {Thu, 08 Apr 2021 07:53:26 +0200}, biburl = {https://dblp.org/rec/journals/corr/abs-2103-14030.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ```
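For a quicker start, the same classification steps can be wrapped by the high-level `pipeline` API. A sketch (the exact number of returned predictions depends on the pipeline's top-k default in your `transformers` version):

```python
from transformers import pipeline

classifier = pipeline("image-classification", model="microsoft/swin-tiny-patch4-window7-224")
predictions = classifier("http://images.cocodataset.org/val2017/000000039769.jpg")

# each prediction is a dict with a human-readable label and a softmax score
for pred in predictions:
    print(f"{pred['label']}: {pred['score']:.3f}")
```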
3,684
[ [ -0.0467529296875, -0.0283660888671875, -0.0100555419921875, 0.006923675537109375, -0.005573272705078125, -0.02435302734375, -0.0018491744995117188, -0.058837890625, 0.006748199462890625, 0.0212860107421875, -0.0447998046875, -0.01221466064453125, -0.040130615234375, -0.00531005859375, -0.0323486328125, 0.06573486328125, 0.0014829635620117188, -0.01261138916015625, -0.019073486328125, -0.0162811279296875, -0.01517486572265625, -0.011077880859375, -0.04241943359375, -0.0186309814453125, 0.038360595703125, 0.0167999267578125, 0.059051513671875, 0.040924072265625, 0.06524658203125, 0.035919189453125, -0.0025920867919921875, -0.00626373291015625, -0.02099609375, -0.0260009765625, 0.00855255126953125, -0.036376953125, -0.036529541015625, 0.01088714599609375, 0.037353515625, 0.03302001953125, 0.0152435302734375, 0.036376953125, 0.00920867919921875, 0.036376953125, -0.0279541015625, 0.0101470947265625, -0.04693603515625, 0.019073486328125, -0.0026912689208984375, -0.0015745162963867188, -0.023712158203125, -0.0069580078125, 0.01523590087890625, -0.03485107421875, 0.059661865234375, 0.0248260498046875, 0.105712890625, 0.01001739501953125, -0.017303466796875, 0.01435089111328125, -0.036529541015625, 0.0650634765625, -0.052978515625, 0.01360321044921875, -0.0005331039428710938, 0.039794921875, 0.0136871337890625, -0.06695556640625, -0.03619384765625, -0.0015916824340820312, -0.0307464599609375, 0.00782012939453125, -0.03448486328125, 0.00665283203125, 0.031982421875, 0.032257080078125, -0.036773681640625, 0.01430511474609375, -0.05426025390625, -0.02197265625, 0.056671142578125, 0.00907135009765625, 0.0258026123046875, -0.0276336669921875, -0.04730224609375, -0.028167724609375, -0.0218963623046875, 0.01605224609375, -0.004711151123046875, 0.01407623291015625, -0.0361328125, 0.039886474609375, 0.01532745361328125, 0.044586181640625, 0.0303192138671875, -0.0229034423828125, 0.035400390625, -0.0193634033203125, -0.02667236328125, -0.00730133056640625, 0.06890869140625, 
0.038665771484375, 0.0031833648681640625, 0.01366424560546875, -0.0225067138671875, 0.0007128715515136719, 0.0228424072265625, -0.07354736328125, -0.0171051025390625, 0.00252532958984375, -0.051239013671875, -0.04547119140625, 0.0044403076171875, -0.04986572265625, -0.002887725830078125, -0.015655517578125, 0.028656005859375, -0.013916015625, -0.0214996337890625, -0.034515380859375, -0.01313018798828125, 0.042999267578125, 0.030792236328125, -0.052886962890625, 0.00977325439453125, 0.0174407958984375, 0.0743408203125, -0.023345947265625, -0.0340576171875, 0.007251739501953125, -0.017578125, -0.02679443359375, 0.045135498046875, -0.00382232666015625, -0.01047515869140625, -0.00948333740234375, 0.0347900390625, -0.018280029296875, -0.03778076171875, 0.0218505859375, -0.03271484375, 0.01160430908203125, 0.006275177001953125, -0.00656890869140625, -0.0160369873046875, 0.0174713134765625, -0.050445556640625, 0.08551025390625, 0.031280517578125, -0.07537841796875, 0.01549530029296875, -0.039276123046875, -0.0201568603515625, 0.01189422607421875, -0.0005679130554199219, -0.0484619140625, 0.00484466552734375, 0.01317596435546875, 0.038665771484375, -0.01317596435546875, 0.01953125, -0.031402587890625, -0.0221405029296875, 0.004791259765625, -0.0261688232421875, 0.0753173828125, 0.01287078857421875, -0.036712646484375, 0.02056884765625, -0.04205322265625, -0.002838134765625, 0.036651611328125, 0.008026123046875, -0.016815185546875, -0.034454345703125, 0.026458740234375, 0.04010009765625, 0.0270233154296875, -0.045745849609375, 0.0231170654296875, -0.0217132568359375, 0.035675048828125, 0.05010986328125, -0.006771087646484375, 0.04864501953125, -0.0218963623046875, 0.0290069580078125, 0.0215911865234375, 0.0496826171875, -0.01104736328125, -0.047515869140625, -0.07379150390625, -0.01183319091796875, 0.0088958740234375, 0.0304718017578125, -0.050445556640625, 0.038299560546875, -0.031951904296875, -0.05181884765625, -0.04583740234375, -0.005157470703125, 
0.0028553009033203125, 0.044342041015625, 0.036956787109375, -0.01284027099609375, -0.05987548828125, -0.0855712890625, 0.010498046875, -0.0007114410400390625, -0.0029125213623046875, 0.026763916015625, 0.06231689453125, -0.043701171875, 0.07012939453125, -0.025299072265625, -0.0304107666015625, -0.0203094482421875, 0.00626373291015625, 0.0214385986328125, 0.043060302734375, 0.0556640625, -0.06292724609375, -0.036468505859375, -0.0024356842041015625, -0.058990478515625, 0.007740020751953125, -0.01129913330078125, -0.0138397216796875, 0.028656005859375, 0.018829345703125, -0.034698486328125, 0.061248779296875, 0.045989990234375, -0.020721435546875, 0.04931640625, 0.00235748291015625, 0.01088714599609375, -0.06842041015625, 0.005062103271484375, 0.019989013671875, -0.01296234130859375, -0.0386962890625, 0.00078582763671875, 0.0203857421875, -0.014251708984375, -0.042236328125, 0.0428466796875, -0.0294647216796875, -0.007007598876953125, -0.01593017578125, 0.001728057861328125, 0.003887176513671875, 0.051177978515625, 0.0098876953125, 0.0254974365234375, 0.054473876953125, -0.033935546875, 0.02862548828125, 0.02374267578125, -0.02392578125, 0.033843994140625, -0.06341552734375, -0.01171875, 0.00391387939453125, 0.0189208984375, -0.0709228515625, -0.003902435302734375, 0.0016202926635742188, -0.030364990234375, 0.03765869140625, -0.024932861328125, -0.019989013671875, -0.06561279296875, -0.0257568359375, 0.038818359375, 0.047454833984375, -0.060028076171875, 0.04730224609375, 0.01107025146484375, 0.010650634765625, -0.046112060546875, -0.083984375, -0.0096435546875, -0.0016183853149414062, -0.0682373046875, 0.039215087890625, 0.0012731552124023438, 0.0013513565063476562, 0.0168304443359375, -0.01374053955078125, 0.004749298095703125, -0.0198822021484375, 0.0372314453125, 0.0672607421875, -0.02178955078125, -0.021881103515625, -0.0007276535034179688, -0.01044464111328125, -0.0006313323974609375, -0.002223968505859375, 0.0202484130859375, -0.04400634765625, 
-0.0119476318359375, -0.033966064453125, 0.0022792816162109375, 0.0537109375, 0.0020656585693359375, 0.045806884765625, 0.0743408203125, -0.026763916015625, -0.00814056396484375, -0.0411376953125, -0.025177001953125, -0.04180908203125, 0.02032470703125, -0.0237884521484375, -0.04833984375, 0.04852294921875, 0.00891876220703125, 0.01305389404296875, 0.0692138671875, 0.0215606689453125, -0.02215576171875, 0.07843017578125, 0.03973388671875, -0.00005793571472167969, 0.04681396484375, -0.06512451171875, 0.01470947265625, -0.0648193359375, -0.02655029296875, -0.020416259765625, -0.053192138671875, -0.052581787109375, -0.03619384765625, 0.0264892578125, 0.0013055801391601562, -0.0307464599609375, 0.0506591796875, -0.06524658203125, 0.0048828125, 0.04461669921875, 0.010986328125, -0.007198333740234375, 0.01422119140625, -0.01261138916015625, -0.009429931640625, -0.0616455078125, -0.00307464599609375, 0.0455322265625, 0.040802001953125, 0.064697265625, -0.0222625732421875, 0.038665771484375, 0.0135650634765625, 0.0215606689453125, -0.05963134765625, 0.051116943359375, -0.001068115234375, -0.052276611328125, -0.0186767578125, -0.0243072509765625, -0.074462890625, 0.02215576171875, -0.0254669189453125, -0.046783447265625, 0.047027587890625, 0.00913238525390625, 0.005390167236328125, 0.04901123046875, -0.04986572265625, 0.0693359375, -0.0272979736328125, -0.025054931640625, 0.0037403106689453125, -0.063720703125, 0.02264404296875, 0.01493072509765625, 0.0030956268310546875, -0.00047016143798828125, 0.0117034912109375, 0.061737060546875, -0.043304443359375, 0.08306884765625, -0.0252685546875, 0.0195465087890625, 0.0377197265625, -0.0016565322875976562, 0.033935546875, -0.02093505859375, 0.0209503173828125, 0.04937744140625, 0.010040283203125, -0.034027099609375, -0.04827880859375, 0.0531005859375, -0.0740966796875, -0.032012939453125, -0.037689208984375, -0.031829833984375, 0.0131683349609375, 0.0211639404296875, 0.051177978515625, 0.0386962890625, 0.0035858154296875, 
0.0252227783203125, 0.032806396484375, -0.00997161865234375, 0.04461669921875, 0.0128936767578125, -0.0240936279296875, -0.013214111328125, 0.0572509765625, 0.01556396484375, 0.01158905029296875, 0.0267181396484375, 0.0286407470703125, -0.01340484619140625, -0.0169677734375, -0.0297393798828125, 0.0197296142578125, -0.049530029296875, -0.045867919921875, -0.03521728515625, -0.05133056640625, -0.042083740234375, -0.031707763671875, -0.03656005859375, -0.02911376953125, -0.0210113525390625, 0.004344940185546875, 0.02459716796875, 0.041778564453125, 0.0014514923095703125, 0.009552001953125, -0.043212890625, 0.01457977294921875, 0.028289794921875, 0.022430419921875, 0.01800537109375, -0.0662841796875, -0.0158843994140625, 0.0005331039428710938, -0.031158447265625, -0.042205810546875, 0.04241943359375, 0.01442718505859375, 0.039764404296875, 0.0419921875, 0.00669097900390625, 0.0609130859375, -0.024505615234375, 0.061248779296875, 0.047332763671875, -0.047149658203125, 0.05291748046875, -0.00360870361328125, 0.02783203125, 0.0136260986328125, 0.02886962890625, -0.0271148681640625, -0.0159759521484375, -0.07122802734375, -0.06500244140625, 0.056884765625, 0.005039215087890625, -0.0038967132568359375, 0.021728515625, 0.0126495361328125, 0.0019893646240234375, -0.0052337646484375, -0.0616455078125, -0.039794921875, -0.0462646484375, -0.01422882080078125, -0.001811981201171875, -0.0120849609375, -0.0122528076171875, -0.06280517578125, 0.0458984375, -0.01233673095703125, 0.0572509765625, 0.02264404296875, -0.019073486328125, -0.0130767822265625, -0.007396697998046875, 0.020538330078125, 0.019287109375, -0.01171875, 0.0064544677734375, 0.0156402587890625, -0.04937744140625, -0.01163482666015625, -0.0014677047729492188, -0.022308349609375, -0.0026035308837890625, 0.04132080078125, 0.08251953125, 0.0279541015625, -0.0095977783203125, 0.0653076171875, 0.0029392242431640625, -0.039947509765625, -0.0321044921875, 0.00656890869140625, -0.0106201171875, 0.02215576171875, 
0.034637451171875, 0.052520751953125, 0.0088043212890625, -0.0221710205078125, 0.0028514862060546875, 0.0194549560546875, -0.0341796875, -0.023345947265625, 0.048004150390625, 0.0043487548828125, -0.01015472412109375, 0.06256103515625, 0.00498199462890625, -0.0396728515625, 0.06402587890625, 0.0528564453125, 0.061676025390625, -0.0035247802734375, 0.0112152099609375, 0.0577392578125, 0.03173828125, 0.00046753883361816406, -0.0116119384765625, -0.002834320068359375, -0.06280517578125, -0.003459930419921875, -0.041778564453125, -0.01029205322265625, 0.0096435546875, -0.056365966796875, 0.0311279296875, -0.02667236328125, -0.0208587646484375, 0.008392333984375, 0.0173797607421875, -0.080322265625, 0.01580810546875, 0.019195556640625, 0.0869140625, -0.06597900390625, 0.062042236328125, 0.04913330078125, -0.0309600830078125, -0.066650390625, -0.038818359375, -0.0026798248291015625, -0.06878662109375, 0.03887939453125, 0.026885986328125, -0.00443267822265625, -0.00829315185546875, -0.0770263671875, -0.055633544921875, 0.11566162109375, 0.0006647109985351562, -0.0474853515625, -0.01059722900390625, -0.0092010498046875, 0.035125732421875, -0.03521728515625, 0.03070068359375, 0.021270751953125, 0.0419921875, 0.02679443359375, -0.048858642578125, 0.01290130615234375, -0.0416259765625, 0.025421142578125, -0.00458526611328125, -0.048614501953125, 0.050048828125, -0.0394287109375, -0.0110015869140625, -0.0017232894897460938, 0.050537109375, 0.002559661865234375, 0.023223876953125, 0.052978515625, 0.03436279296875, 0.03582763671875, -0.027374267578125, 0.07708740234375, -0.009033203125, 0.045562744140625, 0.061553955078125, 0.02081298828125, 0.051666259765625, 0.031585693359375, -0.0238037109375, 0.054229736328125, 0.054595947265625, -0.05096435546875, 0.0264892578125, -0.01279449462890625, 0.0184173583984375, -0.0018758773803710938, 0.01284027099609375, -0.032684326171875, 0.022735595703125, 0.0220489501953125, -0.037811279296875, 0.00963592529296875, 0.0238037109375, 
-0.027862548828125, -0.0333251953125, -0.03662109375, 0.0323486328125, 0.0021648406982421875, -0.03363037109375, 0.05499267578125, -0.01181793212890625, 0.0780029296875, -0.042236328125, 0.0131683349609375, -0.0136566162109375, 0.0113372802734375, -0.039886474609375, -0.054595947265625, 0.0183258056640625, -0.0236663818359375, -0.01384735107421875, -0.002346038818359375, 0.08856201171875, -0.0236053466796875, -0.042388916015625, 0.02685546875, 0.0087890625, 0.01313018798828125, 0.01039886474609375, -0.079345703125, 0.006412506103515625, -0.0007867813110351562, -0.036895751953125, 0.0265045166015625, 0.01788330078125, -0.00267791748046875, 0.05718994140625, 0.03314208984375, -0.005523681640625, 0.0169830322265625, 0.0005397796630859375, 0.0628662109375, -0.04681396484375, -0.0287017822265625, -0.0236358642578125, 0.044097900390625, -0.0185699462890625, -0.022064208984375, 0.06201171875, 0.0355224609375, 0.055511474609375, -0.0273284912109375, 0.049346923828125, -0.032562255859375, 0.0019321441650390625, 0.00980377197265625, 0.046173095703125, -0.0577392578125, -0.00908660888671875, -0.0218963623046875, -0.05517578125, -0.019866943359375, 0.05792236328125, -0.0251312255859375, 0.0176544189453125, 0.04986572265625, 0.06610107421875, -0.016357421875, 0.0013589859008789062, 0.033294677734375, 0.0202789306640625, -0.004833221435546875, 0.0189666748046875, 0.0372314453125, -0.06658935546875, 0.041595458984375, -0.058685302734375, -0.0223388671875, -0.039459228515625, -0.04620361328125, -0.06640625, -0.060516357421875, -0.03741455078125, -0.043060302734375, -0.033050537109375, 0.0484619140625, 0.08111572265625, -0.07342529296875, -0.0021114349365234375, -0.001071929931640625, 0.004993438720703125, -0.040435791015625, -0.0255889892578125, 0.033050537109375, -0.0089111328125, -0.05218505859375, -0.007762908935546875, 0.007781982421875, 0.0239715576171875, -0.0262908935546875, -0.0178070068359375, -0.00418853759765625, -0.01439666748046875, 0.0499267578125, 0.02825927734375, 
-0.045623779296875, -0.0196685791015625, 0.00826263427734375, -0.024566650390625, 0.01800537109375, 0.047515869140625, -0.045318603515625, 0.0215606689453125, 0.043365478515625, 0.02008056640625, 0.0635986328125, -0.01313018798828125, 0.0006260871887207031, -0.047119140625, 0.024078369140625, 0.01291656494140625, 0.041168212890625, 0.02069091796875, -0.0269927978515625, 0.0406494140625, 0.0350341796875, -0.04766845703125, -0.0535888671875, -0.005725860595703125, -0.1058349609375, -0.0157928466796875, 0.07403564453125, -0.005229949951171875, -0.0411376953125, 0.0079345703125, -0.01082611083984375, 0.015869140625, -0.015289306640625, 0.03619384765625, 0.01013946533203125, -0.0035858154296875, -0.040771484375, -0.0261688232421875, 0.0168609619140625, -0.01064300537109375, -0.033447265625, -0.0267333984375, 0.007965087890625, 0.03350830078125, 0.035125732421875, 0.01690673828125, -0.027587890625, 0.0156402587890625, 0.024566650390625, 0.03753662109375, -0.0078125, -0.024505615234375, -0.0210418701171875, 0.004383087158203125, -0.0232696533203125, -0.0184783935546875 ] ]
Einmalumdiewelt/T5-Base_GNAD
2022-08-26T15:55:55.000Z
[ "transformers", "pytorch", "t5", "text2text-generation", "generated_from_trainer", "summarization", "de", "autotrain_compatible", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
summarization
Einmalumdiewelt
null
null
Einmalumdiewelt/T5-Base_GNAD
14
105,443
transformers
2022-03-02T23:29:04
--- language: - de tags: - generated_from_trainer - summarization metrics: - rouge model-index: - name: T5-Base_GNAD results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # T5-Base_GNAD This model is a fine-tuned version of [Einmalumdiewelt/T5-Base_GNAD](https://huggingface.co/Einmalumdiewelt/T5-Base_GNAD) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 2.1025 - Rouge1: 27.5357 - Rouge2: 8.5623 - Rougel: 19.1508 - Rougelsum: 23.9029 - Gen Len: 52.7253 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10.0 ### Training results ### Framework versions - Transformers 4.22.0.dev0 - Pytorch 1.12.0+cu113 - Datasets 2.4.0 - Tokenizers 0.12.1
1,242
[ [ -0.032196044921875, -0.03662109375, 0.00975799560546875, 0.01120758056640625, -0.035186767578125, -0.030120849609375, -0.00801849365234375, -0.0189208984375, 0.00739288330078125, 0.0174560546875, -0.05230712890625, -0.056396484375, -0.057708740234375, -0.002544403076171875, -0.0195465087890625, 0.08978271484375, 0.005863189697265625, 0.035369873046875, -0.00041294097900390625, -0.016937255859375, -0.033355712890625, -0.04010009765625, -0.053741455078125, -0.05029296875, 0.033843994140625, 0.015625, 0.05242919921875, 0.07574462890625, 0.046051025390625, 0.01708984375, -0.0263824462890625, -0.01450347900390625, -0.05999755859375, -0.02728271484375, -0.0183868408203125, -0.036468505859375, -0.066650390625, -0.00795745849609375, 0.0528564453125, 0.023681640625, -0.01861572265625, 0.043609619140625, -0.0009665489196777344, 0.0221099853515625, -0.0457763671875, 0.01580810546875, -0.03668212890625, 0.0224151611328125, -0.0143585205078125, -0.0236053466796875, -0.024017333984375, -0.00467681884765625, -0.0042877197265625, -0.0298309326171875, 0.052154541015625, 0.0024471282958984375, 0.09027099609375, 0.02435302734375, -0.01027679443359375, 0.01186370849609375, -0.06488037109375, 0.044464111328125, -0.0401611328125, 0.03369140625, 0.02593994140625, 0.03912353515625, 0.006824493408203125, -0.052520751953125, -0.0178375244140625, -0.016571044921875, 0.00432586669921875, 0.00936126708984375, -0.0183563232421875, 0.005725860595703125, 0.0596923828125, 0.0302581787109375, -0.039031982421875, 0.006717681884765625, -0.056182861328125, -0.01450347900390625, 0.0458984375, 0.037017822265625, -0.004978179931640625, -0.00356292724609375, -0.03509521484375, -0.0080108642578125, -0.03961181640625, -0.0054779052734375, 0.0521240234375, 0.012176513671875, -0.032318115234375, 0.05084228515625, -0.017303466796875, 0.052520751953125, 0.0139312744140625, -0.0062408447265625, 0.04046630859375, -0.00304412841796875, -0.0340576171875, -0.0085906982421875, 0.05035400390625, 0.04217529296875, 
0.027069091796875, -0.009796142578125, -0.03271484375, 0.0007495880126953125, 0.0261077880859375, -0.07635498046875, -0.0242919921875, 0.006748199462890625, -0.042724609375, -0.051177978515625, 0.004688262939453125, -0.0357666015625, 0.00628662109375, -0.031097412109375, 0.048553466796875, -0.0311126708984375, -0.0180206298828125, 0.0016679763793945312, -0.01385498046875, 0.020477294921875, 0.0169525146484375, -0.066162109375, 0.032073974609375, 0.02374267578125, 0.043609619140625, 0.0078887939453125, -0.0236968994140625, -0.0237274169921875, 0.005855560302734375, -0.0199432373046875, 0.0265350341796875, -0.014862060546875, -0.04156494140625, -0.0167236328125, 0.01079559326171875, -0.00494384765625, -0.038421630859375, 0.0672607421875, -0.0249481201171875, 0.0181427001953125, -0.0147857666015625, -0.043731689453125, -0.0055999755859375, 0.033782958984375, -0.054473876953125, 0.0665283203125, 0.0083465576171875, -0.054595947265625, 0.047637939453125, -0.048919677734375, 0.0034332275390625, 0.00439453125, -0.00644683837890625, -0.057403564453125, 0.005584716796875, 0.005870819091796875, 0.0308990478515625, -0.0234375, 0.0201568603515625, -0.038818359375, -0.050750732421875, -0.0121002197265625, -0.0450439453125, 0.048797607421875, 0.00356292724609375, -0.0299072265625, 0.0210113525390625, -0.0882568359375, 0.0203857421875, 0.015228271484375, -0.0304107666015625, 0.01224517822265625, -0.0304107666015625, 0.032623291015625, 0.0153350830078125, 0.0133209228515625, -0.0399169921875, 0.0177459716796875, -0.0197601318359375, 0.00740814208984375, 0.04443359375, 0.00342559814453125, 0.004955291748046875, -0.034210205078125, 0.0268707275390625, 0.01509857177734375, 0.0298309326171875, 0.03765869140625, -0.030120849609375, -0.06463623046875, -0.0138092041015625, 0.03411865234375, 0.0181427001953125, -0.018463134765625, 0.04302978515625, -0.01461029052734375, -0.0555419921875, -0.0306243896484375, 0.0032787322998046875, 0.03662109375, 0.043853759765625, 0.04205322265625, 
-0.0112457275390625, -0.03839111328125, -0.0950927734375, -0.00278472900390625, 0.004108428955078125, 0.00930023193359375, 0.0010051727294921875, 0.05340576171875, -0.0164031982421875, 0.0496826171875, -0.0294647216796875, -0.00852203369140625, -0.02325439453125, 0.006412506103515625, 0.0240325927734375, 0.060302734375, 0.0594482421875, -0.03302001953125, -0.0164947509765625, -0.0177001953125, -0.057403564453125, 0.0166778564453125, 0.0057830810546875, -0.02154541015625, -0.00879669189453125, 0.01541900634765625, -0.0445556640625, 0.060302734375, 0.012451171875, -0.0228118896484375, 0.050933837890625, -0.032958984375, -0.01611328125, -0.08428955078125, 0.02203369140625, 0.0254364013671875, -0.007266998291015625, -0.02532958984375, 0.003063201904296875, 0.01190185546875, -0.00659942626953125, -0.0333251953125, 0.052398681640625, -0.01290130615234375, 0.00133514404296875, -0.00952911376953125, -0.0377197265625, -0.0072174072265625, 0.053466796875, 0.0005035400390625, 0.042205810546875, 0.039642333984375, -0.042938232421875, 0.033294677734375, 0.03363037109375, -0.00879669189453125, 0.03985595703125, -0.070556640625, 0.007656097412109375, -0.0097808837890625, -0.00701904296875, -0.04034423828125, -0.0205230712890625, 0.04266357421875, -0.03424072265625, 0.02667236328125, -0.0230865478515625, -0.021575927734375, -0.0214385986328125, -0.010040283203125, 0.0248565673828125, 0.03839111328125, -0.047698974609375, 0.036895751953125, -0.0200042724609375, 0.03515625, -0.0533447265625, -0.0472412109375, -0.020660400390625, -0.020660400390625, -0.03045654296875, 0.019500732421875, -0.005970001220703125, 0.005615234375, 0.00940704345703125, -0.00812530517578125, -0.002471923828125, -0.006015777587890625, 0.02301025390625, 0.0228729248046875, -0.0164642333984375, 0.004230499267578125, -0.005550384521484375, -0.0194549560546875, 0.02447509765625, -0.0011930465698242188, 0.039031982421875, -0.017059326171875, -0.0214691162109375, -0.05523681640625, -0.003475189208984375, 
0.0419921875, -0.007740020751953125, 0.061859130859375, 0.07421875, -0.04388427734375, -0.01239013671875, -0.03271484375, -0.006649017333984375, -0.0296783447265625, 0.051910400390625, -0.045074462890625, -0.005573272705078125, 0.046051025390625, 0.0179443359375, -0.0018520355224609375, 0.07708740234375, 0.047027587890625, 0.01885986328125, 0.0908203125, 0.034088134765625, -0.00054931640625, 0.0237579345703125, -0.06341552734375, -0.0206298828125, -0.058624267578125, -0.036041259765625, -0.035980224609375, -0.03271484375, -0.051055908203125, 0.00569915771484375, 0.016448974609375, 0.01166534423828125, -0.054595947265625, 0.017974853515625, -0.0289306640625, 0.03143310546875, 0.050872802734375, 0.038299560546875, 0.0015544891357421875, 0.0122833251953125, -0.0222015380859375, -0.0055999755859375, -0.05999755859375, -0.0322265625, 0.10040283203125, 0.037628173828125, 0.059783935546875, 0.0009379386901855469, 0.055206298828125, 0.00502777099609375, 0.0280914306640625, -0.047607421875, 0.03125, 0.008148193359375, -0.06060791015625, -0.01033782958984375, -0.0328369140625, -0.05169677734375, 0.0138092041015625, -0.034881591796875, -0.051239013671875, 0.0066986083984375, 0.0224151611328125, -0.016693115234375, 0.02972412109375, -0.045989990234375, 0.08587646484375, -0.024993896484375, -0.046783447265625, -0.0197601318359375, -0.033599853515625, 0.015228271484375, 0.016357421875, -0.032623291015625, -0.00005447864532470703, 0.01375579833984375, 0.0609130859375, -0.04541015625, 0.04254150390625, -0.034149169921875, 0.036834716796875, 0.00919342041015625, -0.0178375244140625, 0.0372314453125, 0.018798828125, -0.016815185546875, 0.01263427734375, -0.01357269287109375, -0.0469970703125, -0.035552978515625, 0.050018310546875, -0.0980224609375, -0.0230712890625, -0.0340576171875, -0.0330810546875, -0.01520538330078125, 0.0142974853515625, 0.052764892578125, 0.0638427734375, -0.020751953125, 0.020782470703125, 0.03204345703125, 0.006633758544921875, 0.0240631103515625, 
0.00870513916015625, -0.001728057861328125, -0.047576904296875, 0.05987548828125, -0.0044708251953125, 0.018798828125, -0.002849578857421875, 0.0100860595703125, -0.03509521484375, -0.047454833984375, -0.05084228515625, 0.0252685546875, -0.058502197265625, -0.015838623046875, -0.02581787109375, -0.03839111328125, -0.00803375244140625, 0.01090240478515625, -0.024871826171875, -0.008392333984375, -0.039031982421875, -0.024810791015625, 0.01654052734375, 0.051483154296875, -0.002048492431640625, 0.044464111328125, -0.0491943359375, -0.0015630722045898438, 0.0064239501953125, 0.03863525390625, 0.00592041015625, -0.05718994140625, -0.031402587890625, 0.0067596435546875, -0.029571533203125, -0.046295166015625, 0.03741455078125, 0.006122589111328125, 0.0538330078125, 0.02935791015625, -0.0264739990234375, 0.06365966796875, -0.0160369873046875, 0.04095458984375, 0.020233154296875, -0.04071044921875, 0.020782470703125, -0.0269927978515625, 0.0289764404296875, 0.03546142578125, 0.037933349609375, 0.00510406494140625, -0.0018758773803710938, -0.1031494140625, -0.04925537109375, 0.0631103515625, 0.033294677734375, 0.00782012939453125, 0.0249481201171875, 0.0307769775390625, -0.0005517005920410156, 0.026519775390625, -0.0504150390625, -0.0300445556640625, -0.0188140869140625, -0.01044464111328125, -0.00969696044921875, -0.0266571044921875, -0.02099609375, -0.04522705078125, 0.07904052734375, 0.0030956268310546875, 0.034881591796875, 0.0007443428039550781, -0.0035610198974609375, -0.0256195068359375, -0.0022640228271484375, 0.0548095703125, 0.058502197265625, -0.060089111328125, -0.0170745849609375, 0.0218048095703125, -0.04071044921875, 0.0019626617431640625, 0.01473236083984375, -0.0087738037109375, 0.017608642578125, 0.0291595458984375, 0.089599609375, 0.0016345977783203125, 0.004367828369140625, 0.029937744140625, -0.016845703125, -0.033447265625, -0.0286712646484375, 0.022064208984375, -0.0082550048828125, 0.01323699951171875, 0.0091705322265625, 0.042938232421875, 
0.002529144287109375, -0.01218414306640625, 0.015228271484375, 0.005523681640625, -0.03228759765625, -0.031707763671875, 0.06634521484375, 0.00733184814453125, -0.0237579345703125, 0.057220458984375, -0.01148223876953125, -0.0166473388671875, 0.065673828125, 0.040740966796875, 0.07318115234375, -0.006389617919921875, -0.0036334991455078125, 0.06243896484375, 0.00551605224609375, -0.0137481689453125, 0.037445068359375, 0.01364898681640625, -0.04376220703125, -0.01123046875, -0.0548095703125, -0.0220489501953125, 0.04998779296875, -0.07861328125, 0.05194091796875, -0.049468994140625, -0.02630615234375, 0.0208740234375, 0.00830078125, -0.086669921875, 0.05084228515625, 0.00972747802734375, 0.08062744140625, -0.0677490234375, 0.062408447265625, 0.053955078125, -0.04150390625, -0.07867431640625, -0.01029205322265625, -0.0162811279296875, -0.063720703125, 0.0556640625, 0.0016222000122070312, 0.0189666748046875, 0.01132965087890625, -0.045745849609375, -0.056243896484375, 0.077392578125, 0.0312042236328125, -0.042877197265625, -0.00647735595703125, 0.01507568359375, 0.05511474609375, -0.0202178955078125, 0.056793212890625, 0.01544189453125, 0.0149993896484375, 0.030792236328125, -0.0672607421875, -0.01169586181640625, -0.032135009765625, 0.016693115234375, 0.01116943359375, -0.0482177734375, 0.07568359375, -0.003742218017578125, 0.034332275390625, 0.014862060546875, 0.045074462890625, 0.0146026611328125, 0.004505157470703125, 0.0235748291015625, 0.0648193359375, 0.034637451171875, -0.00499725341796875, 0.076171875, -0.033935546875, 0.061370849609375, 0.077880859375, 0.01171875, 0.03814697265625, 0.0234527587890625, -0.0016117095947265625, 0.0108795166015625, 0.05841064453125, -0.037872314453125, 0.037567138671875, 0.0201568603515625, -0.0028514862060546875, -0.0309295654296875, 0.016143798828125, -0.05340576171875, 0.03204345703125, 0.0146026611328125, -0.056488037109375, -0.041046142578125, -0.024505615234375, -0.010772705078125, -0.03009033203125, -0.04168701171875, 
0.029998779296875, -0.0177001953125, -0.02392578125, 0.061553955078125, 0.016082763671875, 0.0237274169921875, -0.041046142578125, -0.00794219970703125, -0.00794219970703125, 0.0391845703125, -0.0235443115234375, -0.0268707275390625, 0.0206756591796875, -0.00278472900390625, -0.0133514404296875, 0.0029125213623046875, 0.047760009765625, -0.02606201171875, -0.060089111328125, 0.00855255126953125, 0.03814697265625, 0.0117034912109375, 0.01128387451171875, -0.070068359375, -0.0038738250732421875, -0.006877899169921875, -0.0309906005859375, 0.01491546630859375, 0.016693115234375, -0.005237579345703125, 0.0386962890625, 0.033233642578125, 0.00555419921875, -0.0036907196044921875, 0.01300811767578125, 0.06854248046875, -0.043975830078125, -0.04083251953125, -0.060302734375, 0.041595458984375, -0.016357421875, -0.074462890625, 0.0357666015625, 0.07135009765625, 0.07464599609375, -0.028350830078125, 0.04827880859375, 0.003330230712890625, 0.020477294921875, -0.03228759765625, 0.049468994140625, -0.03790283203125, -0.0025844573974609375, -0.0112762451171875, -0.07501220703125, -0.0044403076171875, 0.045166015625, -0.0271759033203125, 0.020233154296875, 0.037322998046875, 0.05743408203125, -0.029571533203125, 0.005809783935546875, 0.0290374755859375, 0.00455474853515625, 0.01482391357421875, 0.025115966796875, 0.034210205078125, -0.0601806640625, 0.029571533203125, -0.03973388671875, 0.00315093994140625, -0.0160980224609375, -0.042938232421875, -0.08575439453125, -0.011993408203125, -0.028350830078125, -0.03924560546875, 0.019195556640625, 0.088623046875, 0.053497314453125, -0.053558349609375, -0.025421142578125, -0.0034198760986328125, -0.0219573974609375, -0.0107421875, -0.01148223876953125, 0.0311431884765625, -0.006256103515625, -0.060516357421875, -0.01297760009765625, -0.0262908935546875, 0.022003173828125, -0.0131683349609375, -0.0168609619140625, -0.01148223876953125, -0.03863525390625, 0.0059661865234375, -0.0005359649658203125, -0.032012939453125, -0.0401611328125, 
-0.007511138916015625, -0.00936126708984375, 0.016571044921875, 0.014404296875, -0.037017822265625, 0.0238189697265625, 0.0179290771484375, 0.022552490234375, 0.0594482421875, 0.011077880859375, 0.043487548828125, -0.0594482421875, 0.026092529296875, 0.023406982421875, 0.03253173828125, 0.0093536376953125, -0.0273590087890625, 0.04461669921875, 0.04052734375, -0.0291595458984375, -0.0484619140625, -0.01448822021484375, -0.07391357421875, 0.03131103515625, 0.0909423828125, 0.0113525390625, -0.039794921875, 0.031402587890625, -0.0166168212890625, 0.03271484375, -0.0244598388671875, 0.035369873046875, 0.04937744140625, 0.0012159347534179688, 0.0037689208984375, -0.042083740234375, 0.03521728515625, 0.0262603759765625, -0.038360595703125, -0.034515380859375, 0.025421142578125, 0.05230712890625, -0.007030487060546875, 0.0202178955078125, 0.0002598762512207031, 0.0299072265625, 0.008941650390625, 0.03265380859375, -0.0399169921875, -0.0250091552734375, -0.0263214111328125, 0.01446533203125, 0.01146697998046875, -0.04266357421875 ] ]
BAAI/bge-base-en-v1.5
2023-10-12T03:37:11.000Z
[ "sentence-transformers", "pytorch", "bert", "feature-extraction", "sentence-similarity", "transformers", "mteb", "en", "arxiv:2310.07554", "arxiv:2309.07597", "license:mit", "model-index", "endpoints_compatible", "has_space", "region:us" ]
feature-extraction
BAAI
null
null
BAAI/bge-base-en-v1.5
49
105,003
sentence-transformers
2023-09-11T15:04:22
--- tags: - sentence-transformers - feature-extraction - sentence-similarity - transformers - mteb model-index: - name: bge-base-en-v1.5 results: - task: type: Classification dataset: type: mteb/amazon_counterfactual name: MTEB AmazonCounterfactualClassification (en) config: en split: test revision: e8379541af4e31359cca9fbcf4b00f2671dba205 metrics: - type: accuracy value: 76.14925373134328 - type: ap value: 39.32336517995478 - type: f1 value: 70.16902252611425 - task: type: Classification dataset: type: mteb/amazon_polarity name: MTEB AmazonPolarityClassification config: default split: test revision: e2d317d38cd51312af73b3d32a06d1a08b442046 metrics: - type: accuracy value: 93.386825 - type: ap value: 90.21276917991995 - type: f1 value: 93.37741030006174 - task: type: Classification dataset: type: mteb/amazon_reviews_multi name: MTEB AmazonReviewsClassification (en) config: en split: test revision: 1399c76144fd37290681b995c656ef9b2e06e26d metrics: - type: accuracy value: 48.846000000000004 - type: f1 value: 48.14646269778261 - task: type: Retrieval dataset: type: arguana name: MTEB ArguAna config: default split: test revision: None metrics: - type: map_at_1 value: 40.754000000000005 - type: map_at_10 value: 55.761 - type: map_at_100 value: 56.330999999999996 - type: map_at_1000 value: 56.333999999999996 - type: map_at_3 value: 51.92 - type: map_at_5 value: 54.010999999999996 - type: mrr_at_1 value: 41.181 - type: mrr_at_10 value: 55.967999999999996 - type: mrr_at_100 value: 56.538 - type: mrr_at_1000 value: 56.542 - type: mrr_at_3 value: 51.980000000000004 - type: mrr_at_5 value: 54.208999999999996 - type: ndcg_at_1 value: 40.754000000000005 - type: ndcg_at_10 value: 63.605000000000004 - type: ndcg_at_100 value: 66.05199999999999 - type: ndcg_at_1000 value: 66.12 - type: ndcg_at_3 value: 55.708 - type: ndcg_at_5 value: 59.452000000000005 - type: precision_at_1 value: 40.754000000000005 - type: precision_at_10 value: 8.841000000000001 - type: precision_at_100 value: 
0.991 - type: precision_at_1000 value: 0.1 - type: precision_at_3 value: 22.238 - type: precision_at_5 value: 15.149000000000001 - type: recall_at_1 value: 40.754000000000005 - type: recall_at_10 value: 88.407 - type: recall_at_100 value: 99.14699999999999 - type: recall_at_1000 value: 99.644 - type: recall_at_3 value: 66.714 - type: recall_at_5 value: 75.747 - task: type: Clustering dataset: type: mteb/arxiv-clustering-p2p name: MTEB ArxivClusteringP2P config: default split: test revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d metrics: - type: v_measure value: 48.74884539679369 - task: type: Clustering dataset: type: mteb/arxiv-clustering-s2s name: MTEB ArxivClusteringS2S config: default split: test revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53 metrics: - type: v_measure value: 42.8075893810716 - task: type: Reranking dataset: type: mteb/askubuntudupquestions-reranking name: MTEB AskUbuntuDupQuestions config: default split: test revision: 2000358ca161889fa9c082cb41daa8dcfb161a54 metrics: - type: map value: 62.128470519187736 - type: mrr value: 74.28065778481289 - task: type: STS dataset: type: mteb/biosses-sts name: MTEB BIOSSES config: default split: test revision: d3fb88f8f02e40887cd149695127462bbcf29b4a metrics: - type: cos_sim_pearson value: 89.24629081484655 - type: cos_sim_spearman value: 86.93752309911496 - type: euclidean_pearson value: 87.58589628573816 - type: euclidean_spearman value: 88.05622328825284 - type: manhattan_pearson value: 87.5594959805773 - type: manhattan_spearman value: 88.19658793233961 - task: type: Classification dataset: type: mteb/banking77 name: MTEB Banking77Classification config: default split: test revision: 0fd18e25b25c072e09e0d92ab615fda904d66300 metrics: - type: accuracy value: 86.9512987012987 - type: f1 value: 86.92515357973708 - task: type: Clustering dataset: type: mteb/biorxiv-clustering-p2p name: MTEB BiorxivClusteringP2P config: default split: test revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40 metrics: - 
type: v_measure value: 39.10263762928872 - task: type: Clustering dataset: type: mteb/biorxiv-clustering-s2s name: MTEB BiorxivClusteringS2S config: default split: test revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908 metrics: - type: v_measure value: 36.69711517426737 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackAndroidRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 32.327 - type: map_at_10 value: 44.099 - type: map_at_100 value: 45.525 - type: map_at_1000 value: 45.641999999999996 - type: map_at_3 value: 40.47 - type: map_at_5 value: 42.36 - type: mrr_at_1 value: 39.199 - type: mrr_at_10 value: 49.651 - type: mrr_at_100 value: 50.29 - type: mrr_at_1000 value: 50.329 - type: mrr_at_3 value: 46.924 - type: mrr_at_5 value: 48.548 - type: ndcg_at_1 value: 39.199 - type: ndcg_at_10 value: 50.773 - type: ndcg_at_100 value: 55.67999999999999 - type: ndcg_at_1000 value: 57.495 - type: ndcg_at_3 value: 45.513999999999996 - type: ndcg_at_5 value: 47.703 - type: precision_at_1 value: 39.199 - type: precision_at_10 value: 9.914000000000001 - type: precision_at_100 value: 1.5310000000000001 - type: precision_at_1000 value: 0.198 - type: precision_at_3 value: 21.984 - type: precision_at_5 value: 15.737000000000002 - type: recall_at_1 value: 32.327 - type: recall_at_10 value: 63.743 - type: recall_at_100 value: 84.538 - type: recall_at_1000 value: 96.089 - type: recall_at_3 value: 48.065000000000005 - type: recall_at_5 value: 54.519 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackEnglishRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 32.671 - type: map_at_10 value: 42.954 - type: map_at_100 value: 44.151 - type: map_at_1000 value: 44.287 - type: map_at_3 value: 39.912 - type: map_at_5 value: 41.798 - type: mrr_at_1 value: 41.465 - type: mrr_at_10 value: 49.351 - type: mrr_at_100 value: 49.980000000000004 - type: mrr_at_1000 value: 
50.016000000000005 - type: mrr_at_3 value: 47.144000000000005 - type: mrr_at_5 value: 48.592999999999996 - type: ndcg_at_1 value: 41.465 - type: ndcg_at_10 value: 48.565999999999995 - type: ndcg_at_100 value: 52.76499999999999 - type: ndcg_at_1000 value: 54.749 - type: ndcg_at_3 value: 44.57 - type: ndcg_at_5 value: 46.759 - type: precision_at_1 value: 41.465 - type: precision_at_10 value: 9.107999999999999 - type: precision_at_100 value: 1.433 - type: precision_at_1000 value: 0.191 - type: precision_at_3 value: 21.423000000000002 - type: precision_at_5 value: 15.414 - type: recall_at_1 value: 32.671 - type: recall_at_10 value: 57.738 - type: recall_at_100 value: 75.86500000000001 - type: recall_at_1000 value: 88.36 - type: recall_at_3 value: 45.626 - type: recall_at_5 value: 51.812000000000005 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackGamingRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 41.185 - type: map_at_10 value: 53.929 - type: map_at_100 value: 54.92 - type: map_at_1000 value: 54.967999999999996 - type: map_at_3 value: 50.70400000000001 - type: map_at_5 value: 52.673 - type: mrr_at_1 value: 47.398 - type: mrr_at_10 value: 57.303000000000004 - type: mrr_at_100 value: 57.959 - type: mrr_at_1000 value: 57.985 - type: mrr_at_3 value: 54.932 - type: mrr_at_5 value: 56.464999999999996 - type: ndcg_at_1 value: 47.398 - type: ndcg_at_10 value: 59.653 - type: ndcg_at_100 value: 63.627 - type: ndcg_at_1000 value: 64.596 - type: ndcg_at_3 value: 54.455 - type: ndcg_at_5 value: 57.245000000000005 - type: precision_at_1 value: 47.398 - type: precision_at_10 value: 9.524000000000001 - type: precision_at_100 value: 1.243 - type: precision_at_1000 value: 0.13699999999999998 - type: precision_at_3 value: 24.389 - type: precision_at_5 value: 16.752 - type: recall_at_1 value: 41.185 - type: recall_at_10 value: 73.193 - type: recall_at_100 value: 90.357 - type: recall_at_1000 value: 97.253 - type: 
recall_at_3 value: 59.199999999999996 - type: recall_at_5 value: 66.118 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackGisRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 27.27 - type: map_at_10 value: 36.223 - type: map_at_100 value: 37.218 - type: map_at_1000 value: 37.293 - type: map_at_3 value: 33.503 - type: map_at_5 value: 35.097 - type: mrr_at_1 value: 29.492 - type: mrr_at_10 value: 38.352000000000004 - type: mrr_at_100 value: 39.188 - type: mrr_at_1000 value: 39.247 - type: mrr_at_3 value: 35.876000000000005 - type: mrr_at_5 value: 37.401 - type: ndcg_at_1 value: 29.492 - type: ndcg_at_10 value: 41.239 - type: ndcg_at_100 value: 46.066 - type: ndcg_at_1000 value: 47.992000000000004 - type: ndcg_at_3 value: 36.11 - type: ndcg_at_5 value: 38.772 - type: precision_at_1 value: 29.492 - type: precision_at_10 value: 6.260000000000001 - type: precision_at_100 value: 0.914 - type: precision_at_1000 value: 0.11100000000000002 - type: precision_at_3 value: 15.104000000000001 - type: precision_at_5 value: 10.644 - type: recall_at_1 value: 27.27 - type: recall_at_10 value: 54.589 - type: recall_at_100 value: 76.70700000000001 - type: recall_at_1000 value: 91.158 - type: recall_at_3 value: 40.974 - type: recall_at_5 value: 47.327000000000005 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackMathematicaRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 17.848 - type: map_at_10 value: 26.207 - type: map_at_100 value: 27.478 - type: map_at_1000 value: 27.602 - type: map_at_3 value: 23.405 - type: map_at_5 value: 24.98 - type: mrr_at_1 value: 21.891 - type: mrr_at_10 value: 31.041999999999998 - type: mrr_at_100 value: 32.092 - type: mrr_at_1000 value: 32.151999999999994 - type: mrr_at_3 value: 28.358 - type: mrr_at_5 value: 29.969 - type: ndcg_at_1 value: 21.891 - type: ndcg_at_10 value: 31.585 - type: ndcg_at_100 value: 37.531 - type: 
ndcg_at_1000 value: 40.256 - type: ndcg_at_3 value: 26.508 - type: ndcg_at_5 value: 28.894 - type: precision_at_1 value: 21.891 - type: precision_at_10 value: 5.795999999999999 - type: precision_at_100 value: 0.9990000000000001 - type: precision_at_1000 value: 0.13799999999999998 - type: precision_at_3 value: 12.769 - type: precision_at_5 value: 9.279 - type: recall_at_1 value: 17.848 - type: recall_at_10 value: 43.452 - type: recall_at_100 value: 69.216 - type: recall_at_1000 value: 88.102 - type: recall_at_3 value: 29.18 - type: recall_at_5 value: 35.347 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackPhysicsRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 30.94 - type: map_at_10 value: 41.248000000000005 - type: map_at_100 value: 42.495 - type: map_at_1000 value: 42.602000000000004 - type: map_at_3 value: 37.939 - type: map_at_5 value: 39.924 - type: mrr_at_1 value: 37.824999999999996 - type: mrr_at_10 value: 47.041 - type: mrr_at_100 value: 47.83 - type: mrr_at_1000 value: 47.878 - type: mrr_at_3 value: 44.466 - type: mrr_at_5 value: 46.111999999999995 - type: ndcg_at_1 value: 37.824999999999996 - type: ndcg_at_10 value: 47.223 - type: ndcg_at_100 value: 52.394 - type: ndcg_at_1000 value: 54.432 - type: ndcg_at_3 value: 42.032000000000004 - type: ndcg_at_5 value: 44.772 - type: precision_at_1 value: 37.824999999999996 - type: precision_at_10 value: 8.393 - type: precision_at_100 value: 1.2890000000000001 - type: precision_at_1000 value: 0.164 - type: precision_at_3 value: 19.698 - type: precision_at_5 value: 14.013 - type: recall_at_1 value: 30.94 - type: recall_at_10 value: 59.316 - type: recall_at_100 value: 80.783 - type: recall_at_1000 value: 94.15400000000001 - type: recall_at_3 value: 44.712 - type: recall_at_5 value: 51.932 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackProgrammersRetrieval config: default split: test revision: None metrics: - type: map_at_1 
value: 27.104 - type: map_at_10 value: 36.675999999999995 - type: map_at_100 value: 38.076 - type: map_at_1000 value: 38.189 - type: map_at_3 value: 33.733999999999995 - type: map_at_5 value: 35.287 - type: mrr_at_1 value: 33.904 - type: mrr_at_10 value: 42.55 - type: mrr_at_100 value: 43.434 - type: mrr_at_1000 value: 43.494 - type: mrr_at_3 value: 40.126 - type: mrr_at_5 value: 41.473 - type: ndcg_at_1 value: 33.904 - type: ndcg_at_10 value: 42.414 - type: ndcg_at_100 value: 48.203 - type: ndcg_at_1000 value: 50.437 - type: ndcg_at_3 value: 37.633 - type: ndcg_at_5 value: 39.67 - type: precision_at_1 value: 33.904 - type: precision_at_10 value: 7.82 - type: precision_at_100 value: 1.2409999999999999 - type: precision_at_1000 value: 0.159 - type: precision_at_3 value: 17.884 - type: precision_at_5 value: 12.648000000000001 - type: recall_at_1 value: 27.104 - type: recall_at_10 value: 53.563 - type: recall_at_100 value: 78.557 - type: recall_at_1000 value: 93.533 - type: recall_at_3 value: 39.92 - type: recall_at_5 value: 45.457 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 27.707749999999997 - type: map_at_10 value: 36.961 - type: map_at_100 value: 38.158833333333334 - type: map_at_1000 value: 38.270333333333326 - type: map_at_3 value: 34.07183333333334 - type: map_at_5 value: 35.69533333333334 - type: mrr_at_1 value: 32.81875 - type: mrr_at_10 value: 41.293 - type: mrr_at_100 value: 42.116499999999995 - type: mrr_at_1000 value: 42.170249999999996 - type: mrr_at_3 value: 38.83983333333333 - type: mrr_at_5 value: 40.29775 - type: ndcg_at_1 value: 32.81875 - type: ndcg_at_10 value: 42.355 - type: ndcg_at_100 value: 47.41374999999999 - type: ndcg_at_1000 value: 49.5805 - type: ndcg_at_3 value: 37.52825 - type: ndcg_at_5 value: 39.83266666666667 - type: precision_at_1 value: 32.81875 - type: precision_at_10 value: 7.382416666666666 - type: 
precision_at_100 value: 1.1640833333333334 - type: precision_at_1000 value: 0.15383333333333335 - type: precision_at_3 value: 17.134166666666665 - type: precision_at_5 value: 12.174833333333336 - type: recall_at_1 value: 27.707749999999997 - type: recall_at_10 value: 53.945 - type: recall_at_100 value: 76.191 - type: recall_at_1000 value: 91.101 - type: recall_at_3 value: 40.39083333333334 - type: recall_at_5 value: 46.40083333333333 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackStatsRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 26.482 - type: map_at_10 value: 33.201 - type: map_at_100 value: 34.107 - type: map_at_1000 value: 34.197 - type: map_at_3 value: 31.174000000000003 - type: map_at_5 value: 32.279 - type: mrr_at_1 value: 29.908 - type: mrr_at_10 value: 36.235 - type: mrr_at_100 value: 37.04 - type: mrr_at_1000 value: 37.105 - type: mrr_at_3 value: 34.355999999999995 - type: mrr_at_5 value: 35.382999999999996 - type: ndcg_at_1 value: 29.908 - type: ndcg_at_10 value: 37.325 - type: ndcg_at_100 value: 41.795 - type: ndcg_at_1000 value: 44.105 - type: ndcg_at_3 value: 33.555 - type: ndcg_at_5 value: 35.266999999999996 - type: precision_at_1 value: 29.908 - type: precision_at_10 value: 5.721 - type: precision_at_100 value: 0.8630000000000001 - type: precision_at_1000 value: 0.11299999999999999 - type: precision_at_3 value: 14.008000000000001 - type: precision_at_5 value: 9.754999999999999 - type: recall_at_1 value: 26.482 - type: recall_at_10 value: 47.072 - type: recall_at_100 value: 67.27 - type: recall_at_1000 value: 84.371 - type: recall_at_3 value: 36.65 - type: recall_at_5 value: 40.774 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackTexRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 18.815 - type: map_at_10 value: 26.369999999999997 - type: map_at_100 value: 27.458 - type: map_at_1000 value: 27.588 - type: map_at_3 
value: 23.990000000000002 - type: map_at_5 value: 25.345000000000002 - type: mrr_at_1 value: 22.953000000000003 - type: mrr_at_10 value: 30.342999999999996 - type: mrr_at_100 value: 31.241000000000003 - type: mrr_at_1000 value: 31.319000000000003 - type: mrr_at_3 value: 28.16 - type: mrr_at_5 value: 29.406 - type: ndcg_at_1 value: 22.953000000000003 - type: ndcg_at_10 value: 31.151 - type: ndcg_at_100 value: 36.309000000000005 - type: ndcg_at_1000 value: 39.227000000000004 - type: ndcg_at_3 value: 26.921 - type: ndcg_at_5 value: 28.938000000000002 - type: precision_at_1 value: 22.953000000000003 - type: precision_at_10 value: 5.602 - type: precision_at_100 value: 0.9530000000000001 - type: precision_at_1000 value: 0.13899999999999998 - type: precision_at_3 value: 12.606 - type: precision_at_5 value: 9.119 - type: recall_at_1 value: 18.815 - type: recall_at_10 value: 41.574 - type: recall_at_100 value: 64.84400000000001 - type: recall_at_1000 value: 85.406 - type: recall_at_3 value: 29.694 - type: recall_at_5 value: 34.935 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackUnixRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 27.840999999999998 - type: map_at_10 value: 36.797999999999995 - type: map_at_100 value: 37.993 - type: map_at_1000 value: 38.086999999999996 - type: map_at_3 value: 34.050999999999995 - type: map_at_5 value: 35.379 - type: mrr_at_1 value: 32.649 - type: mrr_at_10 value: 41.025 - type: mrr_at_100 value: 41.878 - type: mrr_at_1000 value: 41.929 - type: mrr_at_3 value: 38.573 - type: mrr_at_5 value: 39.715 - type: ndcg_at_1 value: 32.649 - type: ndcg_at_10 value: 42.142 - type: ndcg_at_100 value: 47.558 - type: ndcg_at_1000 value: 49.643 - type: ndcg_at_3 value: 37.12 - type: ndcg_at_5 value: 38.983000000000004 - type: precision_at_1 value: 32.649 - type: precision_at_10 value: 7.08 - type: precision_at_100 value: 1.1039999999999999 - type: precision_at_1000 value: 0.13899999999999998 
- type: precision_at_3 value: 16.698 - type: precision_at_5 value: 11.511000000000001 - type: recall_at_1 value: 27.840999999999998 - type: recall_at_10 value: 54.245 - type: recall_at_100 value: 77.947 - type: recall_at_1000 value: 92.36999999999999 - type: recall_at_3 value: 40.146 - type: recall_at_5 value: 44.951 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackWebmastersRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 26.529000000000003 - type: map_at_10 value: 35.010000000000005 - type: map_at_100 value: 36.647 - type: map_at_1000 value: 36.857 - type: map_at_3 value: 31.968000000000004 - type: map_at_5 value: 33.554 - type: mrr_at_1 value: 31.818 - type: mrr_at_10 value: 39.550999999999995 - type: mrr_at_100 value: 40.54 - type: mrr_at_1000 value: 40.596 - type: mrr_at_3 value: 36.726 - type: mrr_at_5 value: 38.416 - type: ndcg_at_1 value: 31.818 - type: ndcg_at_10 value: 40.675 - type: ndcg_at_100 value: 46.548 - type: ndcg_at_1000 value: 49.126 - type: ndcg_at_3 value: 35.829 - type: ndcg_at_5 value: 38.0 - type: precision_at_1 value: 31.818 - type: precision_at_10 value: 7.826 - type: precision_at_100 value: 1.538 - type: precision_at_1000 value: 0.24 - type: precision_at_3 value: 16.601 - type: precision_at_5 value: 12.095 - type: recall_at_1 value: 26.529000000000003 - type: recall_at_10 value: 51.03 - type: recall_at_100 value: 77.556 - type: recall_at_1000 value: 93.804 - type: recall_at_3 value: 36.986000000000004 - type: recall_at_5 value: 43.096000000000004 - task: type: Retrieval dataset: type: BeIR/cqadupstack name: MTEB CQADupstackWordpressRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 23.480999999999998 - type: map_at_10 value: 30.817 - type: map_at_100 value: 31.838 - type: map_at_1000 value: 31.932 - type: map_at_3 value: 28.011999999999997 - type: map_at_5 value: 29.668 - type: mrr_at_1 value: 25.323 - type: mrr_at_10 value: 33.072 - 
type: mrr_at_100 value: 33.926 - type: mrr_at_1000 value: 33.993 - type: mrr_at_3 value: 30.436999999999998 - type: mrr_at_5 value: 32.092 - type: ndcg_at_1 value: 25.323 - type: ndcg_at_10 value: 35.514 - type: ndcg_at_100 value: 40.489000000000004 - type: ndcg_at_1000 value: 42.908 - type: ndcg_at_3 value: 30.092000000000002 - type: ndcg_at_5 value: 32.989000000000004 - type: precision_at_1 value: 25.323 - type: precision_at_10 value: 5.545 - type: precision_at_100 value: 0.861 - type: precision_at_1000 value: 0.117 - type: precision_at_3 value: 12.446 - type: precision_at_5 value: 9.131 - type: recall_at_1 value: 23.480999999999998 - type: recall_at_10 value: 47.825 - type: recall_at_100 value: 70.652 - type: recall_at_1000 value: 88.612 - type: recall_at_3 value: 33.537 - type: recall_at_5 value: 40.542 - task: type: Retrieval dataset: type: climate-fever name: MTEB ClimateFEVER config: default split: test revision: None metrics: - type: map_at_1 value: 13.333999999999998 - type: map_at_10 value: 22.524 - type: map_at_100 value: 24.506 - type: map_at_1000 value: 24.715 - type: map_at_3 value: 19.022 - type: map_at_5 value: 20.693 - type: mrr_at_1 value: 29.186 - type: mrr_at_10 value: 41.22 - type: mrr_at_100 value: 42.16 - type: mrr_at_1000 value: 42.192 - type: mrr_at_3 value: 38.013000000000005 - type: mrr_at_5 value: 39.704 - type: ndcg_at_1 value: 29.186 - type: ndcg_at_10 value: 31.167 - type: ndcg_at_100 value: 38.879000000000005 - type: ndcg_at_1000 value: 42.376000000000005 - type: ndcg_at_3 value: 25.817 - type: ndcg_at_5 value: 27.377000000000002 - type: precision_at_1 value: 29.186 - type: precision_at_10 value: 9.693999999999999 - type: precision_at_100 value: 1.8030000000000002 - type: precision_at_1000 value: 0.246 - type: precision_at_3 value: 19.11 - type: precision_at_5 value: 14.344999999999999 - type: recall_at_1 value: 13.333999999999998 - type: recall_at_10 value: 37.092000000000006 - type: recall_at_100 value: 63.651 - type: 
recall_at_1000 value: 83.05 - type: recall_at_3 value: 23.74 - type: recall_at_5 value: 28.655 - task: type: Retrieval dataset: type: dbpedia-entity name: MTEB DBPedia config: default split: test revision: None metrics: - type: map_at_1 value: 9.151 - type: map_at_10 value: 19.653000000000002 - type: map_at_100 value: 28.053 - type: map_at_1000 value: 29.709000000000003 - type: map_at_3 value: 14.191 - type: map_at_5 value: 16.456 - type: mrr_at_1 value: 66.25 - type: mrr_at_10 value: 74.4 - type: mrr_at_100 value: 74.715 - type: mrr_at_1000 value: 74.726 - type: mrr_at_3 value: 72.417 - type: mrr_at_5 value: 73.667 - type: ndcg_at_1 value: 54.25 - type: ndcg_at_10 value: 40.77 - type: ndcg_at_100 value: 46.359 - type: ndcg_at_1000 value: 54.193000000000005 - type: ndcg_at_3 value: 44.832 - type: ndcg_at_5 value: 42.63 - type: precision_at_1 value: 66.25 - type: precision_at_10 value: 32.175 - type: precision_at_100 value: 10.668 - type: precision_at_1000 value: 2.067 - type: precision_at_3 value: 47.667 - type: precision_at_5 value: 41.3 - type: recall_at_1 value: 9.151 - type: recall_at_10 value: 25.003999999999998 - type: recall_at_100 value: 52.976 - type: recall_at_1000 value: 78.315 - type: recall_at_3 value: 15.487 - type: recall_at_5 value: 18.999 - task: type: Classification dataset: type: mteb/emotion name: MTEB EmotionClassification config: default split: test revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37 metrics: - type: accuracy value: 51.89999999999999 - type: f1 value: 46.47777925067403 - task: type: Retrieval dataset: type: fever name: MTEB FEVER config: default split: test revision: None metrics: - type: map_at_1 value: 73.706 - type: map_at_10 value: 82.423 - type: map_at_100 value: 82.67999999999999 - type: map_at_1000 value: 82.694 - type: map_at_3 value: 81.328 - type: map_at_5 value: 82.001 - type: mrr_at_1 value: 79.613 - type: mrr_at_10 value: 87.07000000000001 - type: mrr_at_100 value: 87.169 - type: mrr_at_1000 value: 87.17 - type: 
mrr_at_3 value: 86.404 - type: mrr_at_5 value: 86.856 - type: ndcg_at_1 value: 79.613 - type: ndcg_at_10 value: 86.289 - type: ndcg_at_100 value: 87.201 - type: ndcg_at_1000 value: 87.428 - type: ndcg_at_3 value: 84.625 - type: ndcg_at_5 value: 85.53699999999999 - type: precision_at_1 value: 79.613 - type: precision_at_10 value: 10.399 - type: precision_at_100 value: 1.1079999999999999 - type: precision_at_1000 value: 0.11499999999999999 - type: precision_at_3 value: 32.473 - type: precision_at_5 value: 20.132 - type: recall_at_1 value: 73.706 - type: recall_at_10 value: 93.559 - type: recall_at_100 value: 97.188 - type: recall_at_1000 value: 98.555 - type: recall_at_3 value: 88.98700000000001 - type: recall_at_5 value: 91.373 - task: type: Retrieval dataset: type: fiqa name: MTEB FiQA2018 config: default split: test revision: None metrics: - type: map_at_1 value: 19.841 - type: map_at_10 value: 32.643 - type: map_at_100 value: 34.575 - type: map_at_1000 value: 34.736 - type: map_at_3 value: 28.317999999999998 - type: map_at_5 value: 30.964000000000002 - type: mrr_at_1 value: 39.660000000000004 - type: mrr_at_10 value: 48.620000000000005 - type: mrr_at_100 value: 49.384 - type: mrr_at_1000 value: 49.415 - type: mrr_at_3 value: 45.988 - type: mrr_at_5 value: 47.361 - type: ndcg_at_1 value: 39.660000000000004 - type: ndcg_at_10 value: 40.646 - type: ndcg_at_100 value: 47.657 - type: ndcg_at_1000 value: 50.428 - type: ndcg_at_3 value: 36.689 - type: ndcg_at_5 value: 38.211 - type: precision_at_1 value: 39.660000000000004 - type: precision_at_10 value: 11.235000000000001 - type: precision_at_100 value: 1.8530000000000002 - type: precision_at_1000 value: 0.23600000000000002 - type: precision_at_3 value: 24.587999999999997 - type: precision_at_5 value: 18.395 - type: recall_at_1 value: 19.841 - type: recall_at_10 value: 48.135 - type: recall_at_100 value: 74.224 - type: recall_at_1000 value: 90.826 - type: recall_at_3 value: 33.536 - type: recall_at_5 value: 40.311 - 
task: type: Retrieval dataset: type: hotpotqa name: MTEB HotpotQA config: default split: test revision: None metrics: - type: map_at_1 value: 40.358 - type: map_at_10 value: 64.497 - type: map_at_100 value: 65.362 - type: map_at_1000 value: 65.41900000000001 - type: map_at_3 value: 61.06700000000001 - type: map_at_5 value: 63.317 - type: mrr_at_1 value: 80.716 - type: mrr_at_10 value: 86.10799999999999 - type: mrr_at_100 value: 86.265 - type: mrr_at_1000 value: 86.27 - type: mrr_at_3 value: 85.271 - type: mrr_at_5 value: 85.82499999999999 - type: ndcg_at_1 value: 80.716 - type: ndcg_at_10 value: 72.597 - type: ndcg_at_100 value: 75.549 - type: ndcg_at_1000 value: 76.61 - type: ndcg_at_3 value: 67.874 - type: ndcg_at_5 value: 70.655 - type: precision_at_1 value: 80.716 - type: precision_at_10 value: 15.148 - type: precision_at_100 value: 1.745 - type: precision_at_1000 value: 0.188 - type: precision_at_3 value: 43.597 - type: precision_at_5 value: 28.351 - type: recall_at_1 value: 40.358 - type: recall_at_10 value: 75.739 - type: recall_at_100 value: 87.259 - type: recall_at_1000 value: 94.234 - type: recall_at_3 value: 65.39500000000001 - type: recall_at_5 value: 70.878 - task: type: Classification dataset: type: mteb/imdb name: MTEB ImdbClassification config: default split: test revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7 metrics: - type: accuracy value: 90.80799999999998 - type: ap value: 86.81350378180757 - type: f1 value: 90.79901248314215 - task: type: Retrieval dataset: type: msmarco name: MTEB MSMARCO config: default split: dev revision: None metrics: - type: map_at_1 value: 22.096 - type: map_at_10 value: 34.384 - type: map_at_100 value: 35.541 - type: map_at_1000 value: 35.589999999999996 - type: map_at_3 value: 30.496000000000002 - type: map_at_5 value: 32.718 - type: mrr_at_1 value: 22.750999999999998 - type: mrr_at_10 value: 35.024 - type: mrr_at_100 value: 36.125 - type: mrr_at_1000 value: 36.168 - type: mrr_at_3 value: 31.225 - type: mrr_at_5 
value: 33.416000000000004 - type: ndcg_at_1 value: 22.750999999999998 - type: ndcg_at_10 value: 41.351 - type: ndcg_at_100 value: 46.92 - type: ndcg_at_1000 value: 48.111 - type: ndcg_at_3 value: 33.439 - type: ndcg_at_5 value: 37.407000000000004 - type: precision_at_1 value: 22.750999999999998 - type: precision_at_10 value: 6.564 - type: precision_at_100 value: 0.935 - type: precision_at_1000 value: 0.104 - type: precision_at_3 value: 14.288 - type: precision_at_5 value: 10.581999999999999 - type: recall_at_1 value: 22.096 - type: recall_at_10 value: 62.771 - type: recall_at_100 value: 88.529 - type: recall_at_1000 value: 97.55 - type: recall_at_3 value: 41.245 - type: recall_at_5 value: 50.788 - task: type: Classification dataset: type: mteb/mtop_domain name: MTEB MTOPDomainClassification (en) config: en split: test revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf metrics: - type: accuracy value: 94.16780665754673 - type: f1 value: 93.96331194859894 - task: type: Classification dataset: type: mteb/mtop_intent name: MTEB MTOPIntentClassification (en) config: en split: test revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba metrics: - type: accuracy value: 76.90606475148198 - type: f1 value: 58.58344986604187 - task: type: Classification dataset: type: mteb/amazon_massive_intent name: MTEB MassiveIntentClassification (en) config: en split: test revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7 metrics: - type: accuracy value: 76.14660390047075 - type: f1 value: 74.31533923533614 - task: type: Classification dataset: type: mteb/amazon_massive_scenario name: MTEB MassiveScenarioClassification (en) config: en split: test revision: 7d571f92784cd94a019292a1f45445077d0ef634 metrics: - type: accuracy value: 80.16139878950908 - type: f1 value: 80.18532656824924 - task: type: Clustering dataset: type: mteb/medrxiv-clustering-p2p name: MTEB MedrxivClusteringP2P config: default split: test revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73 metrics: - type: v_measure value: 
32.949880906135085 - task: type: Clustering dataset: type: mteb/medrxiv-clustering-s2s name: MTEB MedrxivClusteringS2S config: default split: test revision: 35191c8c0dca72d8ff3efcd72aa802307d469663 metrics: - type: v_measure value: 31.56300351524862 - task: type: Reranking dataset: type: mteb/mind_small name: MTEB MindSmallReranking config: default split: test revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69 metrics: - type: map value: 31.196521894371315 - type: mrr value: 32.22644231694389 - task: type: Retrieval dataset: type: nfcorpus name: MTEB NFCorpus config: default split: test revision: None metrics: - type: map_at_1 value: 6.783 - type: map_at_10 value: 14.549000000000001 - type: map_at_100 value: 18.433 - type: map_at_1000 value: 19.949 - type: map_at_3 value: 10.936 - type: map_at_5 value: 12.514 - type: mrr_at_1 value: 47.368 - type: mrr_at_10 value: 56.42 - type: mrr_at_100 value: 56.908 - type: mrr_at_1000 value: 56.95 - type: mrr_at_3 value: 54.283 - type: mrr_at_5 value: 55.568 - type: ndcg_at_1 value: 45.666000000000004 - type: ndcg_at_10 value: 37.389 - type: ndcg_at_100 value: 34.253 - type: ndcg_at_1000 value: 43.059999999999995 - type: ndcg_at_3 value: 42.725 - type: ndcg_at_5 value: 40.193 - type: precision_at_1 value: 47.368 - type: precision_at_10 value: 27.988000000000003 - type: precision_at_100 value: 8.672 - type: precision_at_1000 value: 2.164 - type: precision_at_3 value: 40.248 - type: precision_at_5 value: 34.737 - type: recall_at_1 value: 6.783 - type: recall_at_10 value: 17.838 - type: recall_at_100 value: 33.672000000000004 - type: recall_at_1000 value: 66.166 - type: recall_at_3 value: 11.849 - type: recall_at_5 value: 14.205000000000002 - task: type: Retrieval dataset: type: nq name: MTEB NQ config: default split: test revision: None metrics: - type: map_at_1 value: 31.698999999999998 - type: map_at_10 value: 46.556 - type: map_at_100 value: 47.652 - type: map_at_1000 value: 47.68 - type: map_at_3 value: 42.492000000000004 - 
type: map_at_5 value: 44.763999999999996 - type: mrr_at_1 value: 35.747 - type: mrr_at_10 value: 49.242999999999995 - type: mrr_at_100 value: 50.052 - type: mrr_at_1000 value: 50.068 - type: mrr_at_3 value: 45.867000000000004 - type: mrr_at_5 value: 47.778999999999996 - type: ndcg_at_1 value: 35.717999999999996 - type: ndcg_at_10 value: 54.14600000000001 - type: ndcg_at_100 value: 58.672999999999995 - type: ndcg_at_1000 value: 59.279 - type: ndcg_at_3 value: 46.407 - type: ndcg_at_5 value: 50.181 - type: precision_at_1 value: 35.717999999999996 - type: precision_at_10 value: 8.844000000000001 - type: precision_at_100 value: 1.139 - type: precision_at_1000 value: 0.12 - type: precision_at_3 value: 20.993000000000002 - type: precision_at_5 value: 14.791000000000002 - type: recall_at_1 value: 31.698999999999998 - type: recall_at_10 value: 74.693 - type: recall_at_100 value: 94.15299999999999 - type: recall_at_1000 value: 98.585 - type: recall_at_3 value: 54.388999999999996 - type: recall_at_5 value: 63.08200000000001 - task: type: Retrieval dataset: type: quora name: MTEB QuoraRetrieval config: default split: test revision: None metrics: - type: map_at_1 value: 71.283 - type: map_at_10 value: 85.24000000000001 - type: map_at_100 value: 85.882 - type: map_at_1000 value: 85.897 - type: map_at_3 value: 82.326 - type: map_at_5 value: 84.177 - type: mrr_at_1 value: 82.21000000000001 - type: mrr_at_10 value: 88.228 - type: mrr_at_100 value: 88.32 - type: mrr_at_1000 value: 88.32 - type: mrr_at_3 value: 87.323 - type: mrr_at_5 value: 87.94800000000001 - type: ndcg_at_1 value: 82.17999999999999 - type: ndcg_at_10 value: 88.9 - type: ndcg_at_100 value: 90.079 - type: ndcg_at_1000 value: 90.158 - type: ndcg_at_3 value: 86.18299999999999 - type: ndcg_at_5 value: 87.71799999999999 - type: precision_at_1 value: 82.17999999999999 - type: precision_at_10 value: 13.464 - type: precision_at_100 value: 1.533 - type: precision_at_1000 value: 0.157 - type: precision_at_3 value: 37.693 - 
type: precision_at_5 value: 24.792 - type: recall_at_1 value: 71.283 - type: recall_at_10 value: 95.742 - type: recall_at_100 value: 99.67200000000001 - type: recall_at_1000 value: 99.981 - type: recall_at_3 value: 87.888 - type: recall_at_5 value: 92.24 - task: type: Clustering dataset: type: mteb/reddit-clustering name: MTEB RedditClustering config: default split: test revision: 24640382cdbf8abc73003fb0fa6d111a705499eb metrics: - type: v_measure value: 56.24267063669042 - task: type: Clustering dataset: type: mteb/reddit-clustering-p2p name: MTEB RedditClusteringP2P config: default split: test revision: 282350215ef01743dc01b456c7f5241fa8937f16 metrics: - type: v_measure value: 62.88056988932578 - task: type: Retrieval dataset: type: scidocs name: MTEB SCIDOCS config: default split: test revision: None metrics: - type: map_at_1 value: 4.903 - type: map_at_10 value: 13.202 - type: map_at_100 value: 15.5 - type: map_at_1000 value: 15.870999999999999 - type: map_at_3 value: 9.407 - type: map_at_5 value: 11.238 - type: mrr_at_1 value: 24.2 - type: mrr_at_10 value: 35.867 - type: mrr_at_100 value: 37.001 - type: mrr_at_1000 value: 37.043 - type: mrr_at_3 value: 32.5 - type: mrr_at_5 value: 34.35 - type: ndcg_at_1 value: 24.2 - type: ndcg_at_10 value: 21.731 - type: ndcg_at_100 value: 30.7 - type: ndcg_at_1000 value: 36.618 - type: ndcg_at_3 value: 20.72 - type: ndcg_at_5 value: 17.954 - type: precision_at_1 value: 24.2 - type: precision_at_10 value: 11.33 - type: precision_at_100 value: 2.4410000000000003 - type: precision_at_1000 value: 0.386 - type: precision_at_3 value: 19.667 - type: precision_at_5 value: 15.86 - type: recall_at_1 value: 4.903 - type: recall_at_10 value: 22.962 - type: recall_at_100 value: 49.563 - type: recall_at_1000 value: 78.238 - type: recall_at_3 value: 11.953 - type: recall_at_5 value: 16.067999999999998 - task: type: STS dataset: type: mteb/sickr-sts name: MTEB SICK-R config: default split: test revision: 
a6ea5a8cab320b040a23452cc28066d9beae2cee metrics: - type: cos_sim_pearson value: 84.12694254604078 - type: cos_sim_spearman value: 80.30141815181918 - type: euclidean_pearson value: 81.34015449877128 - type: euclidean_spearman value: 80.13984197010849 - type: manhattan_pearson value: 81.31767068124086 - type: manhattan_spearman value: 80.11720513114103 - task: type: STS dataset: type: mteb/sts12-sts name: MTEB STS12 config: default split: test revision: a0d554a64d88156834ff5ae9920b964011b16384 metrics: - type: cos_sim_pearson value: 86.13112984010417 - type: cos_sim_spearman value: 78.03063573402875 - type: euclidean_pearson value: 83.51928418844804 - type: euclidean_spearman value: 78.4045235411144 - type: manhattan_pearson value: 83.49981637388689 - type: manhattan_spearman value: 78.4042575139372 - task: type: STS dataset: type: mteb/sts13-sts name: MTEB STS13 config: default split: test revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca metrics: - type: cos_sim_pearson value: 82.50327987379504 - type: cos_sim_spearman value: 84.18556767756205 - type: euclidean_pearson value: 82.69684424327679 - type: euclidean_spearman value: 83.5368106038335 - type: manhattan_pearson value: 82.57967581007374 - type: manhattan_spearman value: 83.43009053133697 - task: type: STS dataset: type: mteb/sts14-sts name: MTEB STS14 config: default split: test revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375 metrics: - type: cos_sim_pearson value: 82.50756863007814 - type: cos_sim_spearman value: 82.27204331279108 - type: euclidean_pearson value: 81.39535251429741 - type: euclidean_spearman value: 81.84386626336239 - type: manhattan_pearson value: 81.34281737280695 - type: manhattan_spearman value: 81.81149375673166 - task: type: STS dataset: type: mteb/sts15-sts name: MTEB STS15 config: default split: test revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3 metrics: - type: cos_sim_pearson value: 86.8727714856726 - type: cos_sim_spearman value: 87.95738287792312 - type: 
euclidean_pearson value: 86.62920602795887 - type: euclidean_spearman value: 87.05207355381243 - type: manhattan_pearson value: 86.53587918472225 - type: manhattan_spearman value: 86.95382961029586 - task: type: STS dataset: type: mteb/sts16-sts name: MTEB STS16 config: default split: test revision: 4d8694f8f0e0100860b497b999b3dbed754a0513 metrics: - type: cos_sim_pearson value: 83.52240359769479 - type: cos_sim_spearman value: 85.47685776238286 - type: euclidean_pearson value: 84.25815333483058 - type: euclidean_spearman value: 85.27415639683198 - type: manhattan_pearson value: 84.29127757025637 - type: manhattan_spearman value: 85.30226224917351 - task: type: STS dataset: type: mteb/sts17-crosslingual-sts name: MTEB STS17 (en-en) config: en-en split: test revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d metrics: - type: cos_sim_pearson value: 86.42501708915708 - type: cos_sim_spearman value: 86.42276182795041 - type: euclidean_pearson value: 86.5408207354761 - type: euclidean_spearman value: 85.46096321750838 - type: manhattan_pearson value: 86.54177303026881 - type: manhattan_spearman value: 85.50313151916117 - task: type: STS dataset: type: mteb/sts22-crosslingual-sts name: MTEB STS22 (en) config: en split: test revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80 metrics: - type: cos_sim_pearson value: 64.86521089250766 - type: cos_sim_spearman value: 65.94868540323003 - type: euclidean_pearson value: 67.16569626533084 - type: euclidean_spearman value: 66.37667004134917 - type: manhattan_pearson value: 67.1482365102333 - type: manhattan_spearman value: 66.53240122580029 - task: type: STS dataset: type: mteb/stsbenchmark-sts name: MTEB STSBenchmark config: default split: test revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831 metrics: - type: cos_sim_pearson value: 84.64746265365318 - type: cos_sim_spearman value: 86.41888825906786 - type: euclidean_pearson value: 85.27453642725811 - type: euclidean_spearman value: 85.94095796602544 - type: manhattan_pearson 
value: 85.28643660505334 - type: manhattan_spearman value: 85.95028003260744 - task: type: Reranking dataset: type: mteb/scidocs-reranking name: MTEB SciDocsRR config: default split: test revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab metrics: - type: map value: 87.48903153618527 - type: mrr value: 96.41081503826601 - task: type: Retrieval dataset: type: scifact name: MTEB SciFact config: default split: test revision: None metrics: - type: map_at_1 value: 58.594 - type: map_at_10 value: 69.296 - type: map_at_100 value: 69.782 - type: map_at_1000 value: 69.795 - type: map_at_3 value: 66.23 - type: map_at_5 value: 68.293 - type: mrr_at_1 value: 61.667 - type: mrr_at_10 value: 70.339 - type: mrr_at_100 value: 70.708 - type: mrr_at_1000 value: 70.722 - type: mrr_at_3 value: 68.0 - type: mrr_at_5 value: 69.56700000000001 - type: ndcg_at_1 value: 61.667 - type: ndcg_at_10 value: 74.039 - type: ndcg_at_100 value: 76.103 - type: ndcg_at_1000 value: 76.47800000000001 - type: ndcg_at_3 value: 68.967 - type: ndcg_at_5 value: 71.96900000000001 - type: precision_at_1 value: 61.667 - type: precision_at_10 value: 9.866999999999999 - type: precision_at_100 value: 1.097 - type: precision_at_1000 value: 0.11299999999999999 - type: precision_at_3 value: 27.111 - type: precision_at_5 value: 18.2 - type: recall_at_1 value: 58.594 - type: recall_at_10 value: 87.422 - type: recall_at_100 value: 96.667 - type: recall_at_1000 value: 99.667 - type: recall_at_3 value: 74.217 - type: recall_at_5 value: 81.539 - task: type: PairClassification dataset: type: mteb/sprintduplicatequestions-pairclassification name: MTEB SprintDuplicateQuestions config: default split: test revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46 metrics: - type: cos_sim_accuracy value: 99.85049504950496 - type: cos_sim_ap value: 96.33111544137081 - type: cos_sim_f1 value: 92.35443037974684 - type: cos_sim_precision value: 93.53846153846153 - type: cos_sim_recall value: 91.2 - type: dot_accuracy value: 
99.82376237623762 - type: dot_ap value: 95.38082527310888 - type: dot_f1 value: 90.90909090909092 - type: dot_precision value: 92.90187891440502 - type: dot_recall value: 89.0 - type: euclidean_accuracy value: 99.84851485148515 - type: euclidean_ap value: 96.32316003996347 - type: euclidean_f1 value: 92.2071392659628 - type: euclidean_precision value: 92.71991911021233 - type: euclidean_recall value: 91.7 - type: manhattan_accuracy value: 99.84851485148515 - type: manhattan_ap value: 96.3655668249217 - type: manhattan_f1 value: 92.18356026222895 - type: manhattan_precision value: 92.98067141403867 - type: manhattan_recall value: 91.4 - type: max_accuracy value: 99.85049504950496 - type: max_ap value: 96.3655668249217 - type: max_f1 value: 92.35443037974684 - task: type: Clustering dataset: type: mteb/stackexchange-clustering name: MTEB StackExchangeClustering config: default split: test revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259 metrics: - type: v_measure value: 65.94861371629051 - task: type: Clustering dataset: type: mteb/stackexchange-clustering-p2p name: MTEB StackExchangeClusteringP2P config: default split: test revision: 815ca46b2622cec33ccafc3735d572c266efdb44 metrics: - type: v_measure value: 35.009430451385 - task: type: Reranking dataset: type: mteb/stackoverflowdupquestions-reranking name: MTEB StackOverflowDupQuestions config: default split: test revision: e185fbe320c72810689fc5848eb6114e1ef5ec69 metrics: - type: map value: 54.61164066427969 - type: mrr value: 55.49710603938544 - task: type: Summarization dataset: type: mteb/summeval name: MTEB SummEval config: default split: test revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c metrics: - type: cos_sim_pearson value: 30.622620124907662 - type: cos_sim_spearman value: 31.0678351356163 - type: dot_pearson value: 30.863727693306814 - type: dot_spearman value: 31.230306567021255 - task: type: Retrieval dataset: type: trec-covid name: MTEB TRECCOVID config: default split: test revision: None 
metrics: - type: map_at_1 value: 0.22 - type: map_at_10 value: 2.011 - type: map_at_100 value: 10.974 - type: map_at_1000 value: 25.819 - type: map_at_3 value: 0.6649999999999999 - type: map_at_5 value: 1.076 - type: mrr_at_1 value: 86.0 - type: mrr_at_10 value: 91.8 - type: mrr_at_100 value: 91.8 - type: mrr_at_1000 value: 91.8 - type: mrr_at_3 value: 91.0 - type: mrr_at_5 value: 91.8 - type: ndcg_at_1 value: 82.0 - type: ndcg_at_10 value: 78.07300000000001 - type: ndcg_at_100 value: 58.231 - type: ndcg_at_1000 value: 51.153000000000006 - type: ndcg_at_3 value: 81.123 - type: ndcg_at_5 value: 81.059 - type: precision_at_1 value: 86.0 - type: precision_at_10 value: 83.0 - type: precision_at_100 value: 59.38 - type: precision_at_1000 value: 22.55 - type: precision_at_3 value: 87.333 - type: precision_at_5 value: 86.8 - type: recall_at_1 value: 0.22 - type: recall_at_10 value: 2.2079999999999997 - type: recall_at_100 value: 14.069 - type: recall_at_1000 value: 47.678 - type: recall_at_3 value: 0.7040000000000001 - type: recall_at_5 value: 1.161 - task: type: Retrieval dataset: type: webis-touche2020 name: MTEB Touche2020 config: default split: test revision: None metrics: - type: map_at_1 value: 2.809 - type: map_at_10 value: 10.394 - type: map_at_100 value: 16.598 - type: map_at_1000 value: 18.142 - type: map_at_3 value: 5.572 - type: map_at_5 value: 7.1370000000000005 - type: mrr_at_1 value: 32.653 - type: mrr_at_10 value: 46.564 - type: mrr_at_100 value: 47.469 - type: mrr_at_1000 value: 47.469 - type: mrr_at_3 value: 42.177 - type: mrr_at_5 value: 44.524 - type: ndcg_at_1 value: 30.612000000000002 - type: ndcg_at_10 value: 25.701 - type: ndcg_at_100 value: 37.532 - type: ndcg_at_1000 value: 48.757 - type: ndcg_at_3 value: 28.199999999999996 - type: ndcg_at_5 value: 25.987 - type: precision_at_1 value: 32.653 - type: precision_at_10 value: 23.469 - type: precision_at_100 value: 7.9799999999999995 - type: precision_at_1000 value: 1.5350000000000001 - type: 
precision_at_3 value: 29.932 - type: precision_at_5 value: 26.122 - type: recall_at_1 value: 2.809 - type: recall_at_10 value: 16.887 - type: recall_at_100 value: 48.67 - type: recall_at_1000 value: 82.89699999999999 - type: recall_at_3 value: 6.521000000000001 - type: recall_at_5 value: 9.609 - task: type: Classification dataset: type: mteb/toxic_conversations_50k name: MTEB ToxicConversationsClassification config: default split: test revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c metrics: - type: accuracy value: 71.57860000000001 - type: ap value: 13.82629211536393 - type: f1 value: 54.59860966183956 - task: type: Classification dataset: type: mteb/tweet_sentiment_extraction name: MTEB TweetSentimentExtractionClassification config: default split: test revision: d604517c81ca91fe16a244d1248fc021f9ecee7a metrics: - type: accuracy value: 59.38030560271647 - type: f1 value: 59.69685552567865 - task: type: Clustering dataset: type: mteb/twentynewsgroups-clustering name: MTEB TwentyNewsgroupsClustering config: default split: test revision: 6125ec4e24fa026cec8a478383ee943acfbd5449 metrics: - type: v_measure value: 51.4736717043405 - task: type: PairClassification dataset: type: mteb/twittersemeval2015-pairclassification name: MTEB TwitterSemEval2015 config: default split: test revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1 metrics: - type: cos_sim_accuracy value: 86.92853311080646 - type: cos_sim_ap value: 77.67872502591382 - type: cos_sim_f1 value: 70.33941236068895 - type: cos_sim_precision value: 67.63273258645884 - type: cos_sim_recall value: 73.27176781002639 - type: dot_accuracy value: 85.79603027954938 - type: dot_ap value: 73.73786190233379 - type: dot_f1 value: 67.3437901774235 - type: dot_precision value: 65.67201604814443 - type: dot_recall value: 69.10290237467018 - type: euclidean_accuracy value: 86.94045419324074 - type: euclidean_ap value: 77.6687791535167 - type: euclidean_f1 value: 70.47209214023542 - type: euclidean_precision value: 
67.7207492094381 - type: euclidean_recall value: 73.45646437994723 - type: manhattan_accuracy value: 86.87488823985218 - type: manhattan_ap value: 77.63373392430728 - type: manhattan_f1 value: 70.40920716112532 - type: manhattan_precision value: 68.31265508684864 - type: manhattan_recall value: 72.63852242744063 - type: max_accuracy value: 86.94045419324074 - type: max_ap value: 77.67872502591382 - type: max_f1 value: 70.47209214023542 - task: type: PairClassification dataset: type: mteb/twitterurlcorpus-pairclassification name: MTEB TwitterURLCorpus config: default split: test revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf metrics: - type: cos_sim_accuracy value: 88.67155664221679 - type: cos_sim_ap value: 85.64591703003417 - type: cos_sim_f1 value: 77.59531005352656 - type: cos_sim_precision value: 73.60967184801382 - type: cos_sim_recall value: 82.03726516784724 - type: dot_accuracy value: 88.41541506578181 - type: dot_ap value: 84.6482788957769 - type: dot_f1 value: 77.04748541466657 - type: dot_precision value: 74.02440754931176 - type: dot_recall value: 80.3279950723745 - type: euclidean_accuracy value: 88.63080684596576 - type: euclidean_ap value: 85.44570045321562 - type: euclidean_f1 value: 77.28769403336106 - type: euclidean_precision value: 72.90600040958427 - type: euclidean_recall value: 82.22975053895904 - type: manhattan_accuracy value: 88.59393798269105 - type: manhattan_ap value: 85.40271361038187 - type: manhattan_f1 value: 77.17606419344392 - type: manhattan_precision value: 72.4447747078295 - type: manhattan_recall value: 82.5685247921158 - type: max_accuracy value: 88.67155664221679 - type: max_ap value: 85.64591703003417 - type: max_f1 value: 77.59531005352656 license: mit language: - en --- <h1 align="center">FlagEmbedding</h1> <h4 align="center"> <p> <a href=#model-list>Model List</a> | <a href=#frequently-asked-questions>FAQ</a> | <a href=#usage>Usage</a> | <a href="#evaluation">Evaluation</a> | <a href="#train">Train</a> | <a 
href="#contact">Contact</a> |
        <a href="#citation">Citation</a> |
        <a href="#license">License</a> 
    <p>
</h4>

For more details, please refer to our GitHub repo: [FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding).

[English](README.md) | [中文](https://github.com/FlagOpen/FlagEmbedding/blob/master/README_zh.md)

FlagEmbedding can map any text to a low-dimensional dense vector, which can be used for tasks like retrieval, classification, clustering, or semantic search. It can also be used in vector databases for LLMs.

************* 🌟**Updates**🌟 *************
- 10/12/2023: Release [LLM-Embedder](./FlagEmbedding/llm_embedder/README.md), a unified embedding model to support diverse retrieval augmentation needs for LLMs. [Paper](https://arxiv.org/pdf/2310.07554.pdf) :fire:
- 09/15/2023: The [technical report](https://arxiv.org/pdf/2309.07597.pdf) of BGE has been released 
- 09/15/2023: The [massive training data](https://data.baai.ac.cn/details/BAAI-MTP) of BGE has been released 
- 09/12/2023: New models: 
    - **New reranker models**: release cross-encoder models `BAAI/bge-reranker-base` and `BAAI/bge-reranker-large`, which are more powerful than embedding models. We recommend using/fine-tuning them to re-rank the top-k documents returned by embedding models. 
    - **Updated embedding models**: release `bge-*-v1.5` embedding models to alleviate the issue of the similarity distribution and enhance retrieval ability without an instruction. 


<details>
  <summary>More</summary>
<!-- ### More -->
- 09/07/2023: Update [fine-tune code](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md): add a script to mine hard negatives and support adding an instruction during fine-tuning. 
- 08/09/2023: BGE models are integrated into **Langchain**; you can use them like [this](#using-langchain). The C-MTEB **leaderboard** is [available](https://huggingface.co/spaces/mteb/leaderboard). 
- 08/05/2023: Release base-scale and small-scale models, **best performance among the models of the same size 🤗**
- 08/02/2023: Release `bge-large-*` (short for BAAI General Embedding) models, **rank 1st on the MTEB and C-MTEB benchmarks!** :tada: :tada: 
- 08/01/2023: We release the [Chinese Massive Text Embedding Benchmark](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB) (**C-MTEB**), consisting of 31 test datasets.   
</details>


## Model List

`bge` is short for `BAAI general embedding`.

| Model | Language | | Description | query instruction for retrieval [1] |
|:-------------------------------|:--------:| :--------:| :--------:|:--------:|
| [BAAI/llm-embedder](https://huggingface.co/BAAI/llm-embedder) | English | [Inference](./FlagEmbedding/llm_embedder/README.md) [Fine-tune](./FlagEmbedding/llm_embedder/README.md) | a unified embedding model to support diverse retrieval augmentation needs for LLMs | See [README](./FlagEmbedding/llm_embedder/README.md) |
| [BAAI/bge-reranker-large](https://huggingface.co/BAAI/bge-reranker-large) | Chinese and English | [Inference](#usage-for-reranker) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker) | a cross-encoder model which is more accurate but less efficient [2] | |
| [BAAI/bge-reranker-base](https://huggingface.co/BAAI/bge-reranker-base) | Chinese and English | [Inference](#usage-for-reranker) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker) | a cross-encoder model which is more accurate but less efficient [2] | |
| [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-large-zh-v1.5](https://huggingface.co/BAAI/bge-large-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-base-zh-v1.5](https://huggingface.co/BAAI/bge-base-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-small-zh-v1.5](https://huggingface.co/BAAI/bge-small-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-large-en](https://huggingface.co/BAAI/bge-large-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | :trophy: rank **1st** in [MTEB](https://huggingface.co/spaces/mteb/leaderboard) leaderboard | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-base-en](https://huggingface.co/BAAI/bge-base-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a base-scale model but with similar ability to `bge-large-en` | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-small-en](https://huggingface.co/BAAI/bge-small-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a small-scale model but with competitive performance | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-large-zh](https://huggingface.co/BAAI/bge-large-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | :trophy: rank **1st** in [C-MTEB](https://github.com/FlagOpen/FlagEmbedding/tree/master/C_MTEB) benchmark | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-base-zh](https://huggingface.co/BAAI/bge-base-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a base-scale model but with similar ability to `bge-large-zh` | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-small-zh](https://huggingface.co/BAAI/bge-small-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a small-scale model but with competitive performance | `为这个句子生成表示以用于检索相关文章:` |


[1\]: If you need to search for relevant passages for a query, we suggest adding the instruction to the query; in other cases, no instruction is needed, just use the original query directly. In all cases, **no instruction** needs to be added to passages.

[2\]: Different from the embedding model, the reranker uses question and document as input and directly outputs a similarity score instead of an embedding. To balance accuracy and time cost, cross-encoders are widely used to re-rank the top-k documents retrieved by other simpler models.
For example, use the bge embedding model to retrieve the top 100 relevant documents, and then use the bge reranker to re-rank those 100 documents and get the final top-3 results.

All models have been uploaded to the Hugging Face Hub, and you can see them at https://huggingface.co/BAAI. 
If you cannot open the Hugging Face Hub, you can also download the models at https://model.baai.ac.cn/models .

## Frequently asked questions

<details>
  <summary>1. How to fine-tune the bge embedding model?</summary>

<!-- ### How to fine-tune bge embedding model? -->
Follow this [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) to prepare data and fine-tune your model. 
Some suggestions:
- Mine hard negatives following this [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune#hard-negatives), which can improve retrieval performance.
- If you pre-train bge on your data, the pre-trained model cannot be used directly to calculate similarity; it must be fine-tuned with contrastive learning before computing similarity.
- If the accuracy of the fine-tuned model is still not high, it is recommended to use or fine-tune the cross-encoder model (bge-reranker) to re-rank the top-k results. Hard negatives are also needed to fine-tune the reranker.

</details>

<details>
  <summary>2. The similarity score between two dissimilar sentences is higher than 0.5</summary>

<!-- ### The similarity score between two dissimilar sentences is higher than 0.5 -->
**We suggest using bge v1.5, which alleviates the issue of the similarity distribution.** 

Since we fine-tune the models by contrastive learning with a temperature of 0.01, 
the similarity distribution of the current BGE model lies roughly in the interval \[0.6, 1\]. 
So a similarity score greater than 0.5 does not indicate that the two sentences are similar.
For downstream tasks, such as passage retrieval or semantic similarity, 
**what matters is the relative order of the scores, not the absolute value.**
If you need to filter similar sentences based on a similarity threshold, 
please select an appropriate similarity threshold based on the similarity distribution on your data (such as 0.8, 0.85, or even 0.9).

</details>

<details>
  <summary>3. When does the query instruction need to be used</summary>

<!-- ### When does the query instruction need to be used -->

For the `bge-*-v1.5` models, we improved their retrieval ability when no instruction is used. 
Omitting the instruction causes only a slight degradation in retrieval performance compared with using it, 
so for convenience you can generate embeddings without an instruction in all cases.

For a retrieval task that uses short queries to find long related documents, 
it is recommended to add instructions to these short queries. 
**The best way to decide whether to add instructions to queries is to choose the setting that achieves better performance on your task.**
In all cases, no instruction needs to be added to the documents/passages.

</details>


## Usage 

### Usage for Embedding Model

Here are some examples of using the `bge` models with 
[FlagEmbedding](#using-flagembedding), [Sentence-Transformers](#using-sentence-transformers), [Langchain](#using-langchain), or [Huggingface Transformers](#using-huggingface-transformers).

#### Using FlagEmbedding
```
pip install -U FlagEmbedding
```
If it doesn't work for you, see [FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md) for more ways to install FlagEmbedding.
```python
from FlagEmbedding import FlagModel
sentences_1 = ["样例数据-1", "样例数据-2"]
sentences_2 = ["样例数据-3", "样例数据-4"]
model = FlagModel('BAAI/bge-large-zh-v1.5', 
                  query_instruction_for_retrieval="为这个句子生成表示以用于检索相关文章:",
                  use_fp16=True) # Setting use_fp16 to True speeds up computation with a slight performance degradation
embeddings_1 = model.encode(sentences_1)
embeddings_2 = model.encode(sentences_2)
similarity = embeddings_1 @ embeddings_2.T
print(similarity)

# for the s2p (short query to long passage) retrieval task, use encode_queries(), which automatically adds the instruction to each query
# the corpus in a retrieval task can still use encode() or encode_corpus(), since passages don't need the instruction
queries = ['query_1', 'query_2']
passages = ["样例文档-1", "样例文档-2"]
q_embeddings = model.encode_queries(queries)
p_embeddings = model.encode(passages)
scores = q_embeddings @ p_embeddings.T
```
For the value of the argument `query_instruction_for_retrieval`, see the [Model List](https://github.com/FlagOpen/FlagEmbedding/tree/master#model-list). 

By default, FlagModel will use all available GPUs when encoding. Please set `os.environ["CUDA_VISIBLE_DEVICES"]` to select specific GPUs.
You can also set `os.environ["CUDA_VISIBLE_DEVICES"]=""` to make all GPUs unavailable.
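The score matrix computed above (`q_embeddings @ p_embeddings.T`) can be turned directly into a top-k retrieval step. Here is a minimal NumPy sketch; it uses random unit vectors as stand-ins for real model outputs so that it runs without downloading a model, and the shapes are illustrative only:

```python
import numpy as np

# Stand-ins for q_embeddings / p_embeddings from the example above
# (2 queries, 5 passages, illustrative 1024-dim vectors).
rng = np.random.default_rng(0)
q_embeddings = rng.normal(size=(2, 1024))
p_embeddings = rng.normal(size=(5, 1024))
# Normalize so the inner product equals cosine similarity.
q_embeddings /= np.linalg.norm(q_embeddings, axis=1, keepdims=True)
p_embeddings /= np.linalg.norm(p_embeddings, axis=1, keepdims=True)

scores = q_embeddings @ p_embeddings.T   # shape (2, 5): one row of passage scores per query
top_k = 3
# Indices of the top-k passages for each query, best first.
top_idx = np.argsort(-scores, axis=1)[:, :top_k]
print(top_idx)
```

The same ranking logic applies unchanged to real embeddings from `encode_queries`/`encode`, since both return arrays of the same layout.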
#### Using Sentence-Transformers

You can also use the `bge` models with [sentence-transformers](https://www.SBERT.net):
```
pip install -U sentence-transformers
```
```python
from sentence_transformers import SentenceTransformer
sentences_1 = ["样例数据-1", "样例数据-2"]
sentences_2 = ["样例数据-3", "样例数据-4"]
model = SentenceTransformer('BAAI/bge-large-zh-v1.5')
embeddings_1 = model.encode(sentences_1, normalize_embeddings=True)
embeddings_2 = model.encode(sentences_2, normalize_embeddings=True)
similarity = embeddings_1 @ embeddings_2.T
print(similarity)
```
For the s2p (short query to long passage) retrieval task, 
each short query should start with an instruction (instructions are listed in the [Model List](https://github.com/FlagOpen/FlagEmbedding/tree/master#model-list)). 
But the instruction is not needed for passages.
```python
from sentence_transformers import SentenceTransformer
queries = ['query_1', 'query_2']
passages = ["样例文档-1", "样例文档-2"]
instruction = "为这个句子生成表示以用于检索相关文章:"

model = SentenceTransformer('BAAI/bge-large-zh-v1.5')
q_embeddings = model.encode([instruction+q for q in queries], normalize_embeddings=True)
p_embeddings = model.encode(passages, normalize_embeddings=True)
scores = q_embeddings @ p_embeddings.T
```

#### Using Langchain 

You can use `bge` in langchain like this:
```python
from langchain.embeddings import HuggingFaceBgeEmbeddings
model_name = "BAAI/bge-large-en-v1.5"
model_kwargs = {'device': 'cuda'}
encode_kwargs = {'normalize_embeddings': True} # set True to compute cosine similarity
model = HuggingFaceBgeEmbeddings(
    model_name=model_name,
    model_kwargs=model_kwargs,
    encode_kwargs=encode_kwargs,
    query_instruction="为这个句子生成表示以用于检索相关文章:"
)
model.query_instruction = "为这个句子生成表示以用于检索相关文章:"
```

#### Using HuggingFace Transformers

With the transformers package, you can use the model like this: first, pass your input through the transformer model; then, select the last hidden state of the first token (i.e., [CLS]) as the sentence embedding.
```python
from transformers import AutoTokenizer, AutoModel
import torch
# Sentences we want sentence embeddings for
sentences = ["样例数据-1", "样例数据-2"]

# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-large-zh-v1.5')
model = AutoModel.from_pretrained('BAAI/bge-large-zh-v1.5')
model.eval()

# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# for the s2p (short query to long passage) retrieval task, add an instruction to each query (no instruction for passages)
# encoded_input = tokenizer([instruction + q for q in queries], padding=True, truncation=True, return_tensors='pt')

# Compute token embeddings
with torch.no_grad():
    model_output = model(**encoded_input)
    # Perform pooling. In this case, cls pooling.
    sentence_embeddings = model_output[0][:, 0]
# normalize embeddings
sentence_embeddings = torch.nn.functional.normalize(sentence_embeddings, p=2, dim=1)
print("Sentence embeddings:", sentence_embeddings)
```

### Usage for Reranker

Different from the embedding model, the reranker uses question and document as input and directly outputs a similarity score instead of an embedding. 
You can get a relevance score by inputting a query and a passage to the reranker. 
The reranker is optimized with cross-entropy loss, so the relevance score is not bounded to a specific range.
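Because the reranker's score is an unbounded logit, you can map it to the interval (0, 1) yourself when a probability-like value is more convenient. This sigmoid step is not part of the FlagEmbedding API, just a common post-processing sketch; since the sigmoid is monotonic, the ranking order of the documents is unchanged:

```python
import math

def sigmoid(x: float) -> float:
    # map an unbounded logit to (0, 1); monotonic, so relative ranking is preserved
    return 1.0 / (1.0 + math.exp(-x))

# stand-in logits, illustrating the kind of values a reranker might return
logits = [5.2, -1.3, 0.0]
probs = [sigmoid(x) for x in logits]
print(probs)  # values in (0, 1), in the same relative order as the logits
```

Whether to apply such a mapping depends on your pipeline; for pure re-ranking, the raw scores are sufficient.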
#### Using FlagEmbedding
```
pip install -U FlagEmbedding
```

Get relevance scores (higher scores indicate more relevance):
```python
from FlagEmbedding import FlagReranker
reranker = FlagReranker('BAAI/bge-reranker-large', use_fp16=True) # Setting use_fp16 to True speeds up computation with a slight performance degradation

score = reranker.compute_score(['query', 'passage'])
print(score)

scores = reranker.compute_score([['what is panda?', 'hi'], ['what is panda?', 'The giant panda (Ailuropoda melanoleuca), sometimes called a panda bear or simply panda, is a bear species endemic to China.']])
print(scores)
```

#### Using Huggingface transformers

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-reranker-large')
model = AutoModelForSequenceClassification.from_pretrained('BAAI/bge-reranker-large')
model.eval()

pairs = [['what is panda?', 'hi'], ['what is panda?', 'The giant panda (Ailuropoda melanoleuca), sometimes called a panda bear or simply panda, is a bear species endemic to China.']]
with torch.no_grad():
    inputs = tokenizer(pairs, padding=True, truncation=True, return_tensors='pt', max_length=512)
    scores = model(**inputs, return_dict=True).logits.view(-1, ).float()
    print(scores)
```

## Evaluation  

`baai-general-embedding` models achieve **state-of-the-art performance on both the MTEB and C-MTEB leaderboards!**
For more details and evaluation tools, see our [scripts](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/README.md). 
- **MTEB**:   

| Model Name | Dimension | Sequence Length | Average (56) | Retrieval (15) | Clustering (11) | Pair Classification (3) | Reranking (4) | STS (10) | Summarization (1) | Classification (12) |
|:----:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | 1024 | 512 | **64.23** | **54.29** | 46.08 | 87.12 | 60.03 | 83.11 | 31.61 | 75.97 |
| [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | 768 | 512 | 63.55 | 53.25 | 45.77 | 86.55 | 58.86 | 82.4 | 31.07 | 75.53 |
| [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) | 384 | 512 | 62.17 | 51.68 | 43.82 | 84.92 | 58.36 | 81.59 | 30.12 | 74.14 |
| [bge-large-en](https://huggingface.co/BAAI/bge-large-en) | 1024 | 512 | 63.98 | 53.9 | 46.98 | 85.8 | 59.48 | 81.56 | 32.06 | 76.21 |
| [bge-base-en](https://huggingface.co/BAAI/bge-base-en) | 768 | 512 | 63.36 | 53.0 | 46.32 | 85.86 | 58.7 | 81.84 | 29.27 | 75.27 |
| [gte-large](https://huggingface.co/thenlper/gte-large) | 1024 | 512 | 63.13 | 52.22 | 46.84 | 85.00 | 59.13 | 83.35 | 31.66 | 73.33 |
| [gte-base](https://huggingface.co/thenlper/gte-base) | 768 | 512 | 62.39 | 51.14 | 46.2 | 84.57 | 58.61 | 82.3 | 31.17 | 73.01 |
| [e5-large-v2](https://huggingface.co/intfloat/e5-large-v2) | 1024 | 512 | 62.25 | 50.56 | 44.49 | 86.03 | 56.61 | 82.05 | 30.19 | 75.24 |
| [bge-small-en](https://huggingface.co/BAAI/bge-small-en) | 384 | 512 | 62.11 | 51.82 | 44.31 | 83.78 | 57.97 | 80.72 | 30.53 | 74.37 |
| [instructor-xl](https://huggingface.co/hkunlp/instructor-xl) | 768 | 512 | 61.79 | 49.26 | 44.74 | 86.62 | 57.29 | 83.06 | 32.32 | 61.79 |
| [e5-base-v2](https://huggingface.co/intfloat/e5-base-v2) | 768 | 512 | 61.5 | 50.29 | 43.80 | 85.73 | 55.91 | 81.05 | 30.28 | 73.84 |
| [gte-small](https://huggingface.co/thenlper/gte-small) | 384 | 512 | 61.36 | 49.46 | 44.89 | 83.54 | 57.7 | 82.07 | 30.42 | 72.31 |
| [text-embedding-ada-002](https://platform.openai.com/docs/guides/embeddings) | 1536 | 8192 | 60.99 | 49.25 | 45.9 | 84.89 | 56.32 | 80.97 | 30.8 | 70.93 |
| [e5-small-v2](https://huggingface.co/intfloat/e5-base-v2) | 384 | 512 | 59.93 | 49.04 | 39.92 | 84.67 | 54.32 | 80.39 | 31.16 | 72.94 |
| [sentence-t5-xxl](https://huggingface.co/sentence-transformers/sentence-t5-xxl) | 768 | 512 | 59.51 | 42.24 | 43.72 | 85.06 | 56.42 | 82.63 | 30.08 | 73.42 |
| [all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2) | 768 | 514 | 57.78 | 43.81 | 43.69 | 83.04 | 59.36 | 80.28 | 27.49 | 65.07 |
| [sgpt-bloom-7b1-msmarco](https://huggingface.co/bigscience/sgpt-bloom-7b1-msmarco) | 4096 | 2048 | 57.59 | 48.22 | 38.93 | 81.9 | 55.65 | 77.74 | 33.6 | 66.19 |


- **C-MTEB**:  
We create the benchmark C-MTEB for Chinese text embedding, which consists of 31 datasets from 6 tasks. 
Please refer to [C_MTEB](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/README.md) for a detailed introduction.
| Model | Embedding dimension | Avg | Retrieval | STS | PairClassification | Classification | Reranking | Clustering |
|:-------------------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|
| [**BAAI/bge-large-zh-v1.5**](https://huggingface.co/BAAI/bge-large-zh-v1.5) | 1024 | **64.53** | 70.46 | 56.25 | 81.6 | 69.13 | 65.84 | 48.99 |
| [BAAI/bge-base-zh-v1.5](https://huggingface.co/BAAI/bge-base-zh-v1.5) | 768 | 63.13 | 69.49 | 53.72 | 79.75 | 68.07 | 65.39 | 47.53 |
| [BAAI/bge-small-zh-v1.5](https://huggingface.co/BAAI/bge-small-zh-v1.5) | 512 | 57.82 | 61.77 | 49.11 | 70.41 | 63.96 | 60.92 | 44.18 |
| [BAAI/bge-large-zh](https://huggingface.co/BAAI/bge-large-zh) | 1024 | 64.20 | 71.53 | 54.98 | 78.94 | 68.32 | 65.11 | 48.39 |
| [bge-large-zh-noinstruct](https://huggingface.co/BAAI/bge-large-zh-noinstruct) | 1024 | 63.53 | 70.55 | 53 | 76.77 | 68.58 | 64.91 | 50.01 |
| [BAAI/bge-base-zh](https://huggingface.co/BAAI/bge-base-zh) | 768 | 62.96 | 69.53 | 54.12 | 77.5 | 67.07 | 64.91 | 47.63 |
| [multilingual-e5-large](https://huggingface.co/intfloat/multilingual-e5-large) | 1024 | 58.79 | 63.66 | 48.44 | 69.89 | 67.34 | 56.00 | 48.23 |
| [BAAI/bge-small-zh](https://huggingface.co/BAAI/bge-small-zh) | 512 | 58.27 | 63.07 | 49.45 | 70.35 | 63.64 | 61.48 | 45.09 |
| [m3e-base](https://huggingface.co/moka-ai/m3e-base) | 768 | 57.10 | 56.91 | 50.47 | 63.99 | 67.52 | 59.34 | 47.68 |
| [m3e-large](https://huggingface.co/moka-ai/m3e-large) | 1024 | 57.05 | 54.75 | 50.42 | 64.3 | 68.2 | 59.66 | 48.88 |
| [multilingual-e5-base](https://huggingface.co/intfloat/multilingual-e5-base) | 768 | 55.48 | 61.63 | 46.49 | 67.07 | 65.35 | 54.35 | 40.68 |
| [multilingual-e5-small](https://huggingface.co/intfloat/multilingual-e5-small) | 384 | 55.38 | 59.95 | 45.27 | 66.45 | 65.85 | 53.86 | 45.26 |
| [text-embedding-ada-002(OpenAI)](https://platform.openai.com/docs/guides/embeddings/what-are-embeddings) | 1536 | 53.02 | 52.0 | 43.35 | 69.56 | 64.31 | 54.28 | 45.68 |
| [luotuo](https://huggingface.co/silk-road/luotuo-bert-medium) | 1024 | 49.37 | 44.4 | 42.78 | 66.62 | 61 | 49.25 | 44.39 |
| [text2vec-base](https://huggingface.co/shibing624/text2vec-base-chinese) | 768 | 47.63 | 38.79 | 43.41 | 67.41 | 62.19 | 49.45 | 37.66 |
| [text2vec-large](https://huggingface.co/GanymedeNil/text2vec-large-chinese) | 1024 | 47.36 | 41.94 | 44.97 | 70.86 | 60.66 | 49.16 | 30.02 |

- **Reranking**: See [C_MTEB](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/) for the evaluation script.

| Model | T2Reranking | T2RerankingZh2En\* | T2RerankingEn2Zh\* | MMarcoReranking | CMedQAv1 | CMedQAv2 | Avg |
|:-------------------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|
| text2vec-base-multilingual | 64.66 | 62.94 | 62.51 | 14.37 | 48.46 | 48.6 | 50.26 |
| multilingual-e5-small | 65.62 | 60.94 | 56.41 | 29.91 | 67.26 | 66.54 | 57.78 |
| multilingual-e5-large | 64.55 | 61.61 | 54.28 | 28.6 | 67.42 | 67.92 | 57.4 |
| multilingual-e5-base | 64.21 | 62.13 | 54.68 | 29.5 | 66.23 | 66.98 | 57.29 |
| m3e-base | 66.03 | 62.74 | 56.07 | 17.51 | 77.05 | 76.76 | 59.36 |
| m3e-large | 66.13 | 62.72 | 56.1 | 16.46 | 77.76 | 78.27 | 59.57 |
| bge-base-zh-v1.5 | 66.49 | 63.25 | 57.02 | 29.74 | 80.47 | 84.88 | 63.64 |
| bge-large-zh-v1.5 | 65.74 | 63.39 | 57.03 | 28.74 | 83.45 | 85.44 | 63.97 |
| [BAAI/bge-reranker-base](https://huggingface.co/BAAI/bge-reranker-base) | 67.28 | 63.95 | 60.45 | 35.46 | 81.26 | 84.1 | 65.42 |
| [BAAI/bge-reranker-large](https://huggingface.co/BAAI/bge-reranker-large) | 67.6 | 64.03 | 61.44 | 37.16 | 82.15 | 84.18 | 66.09 |

\* : T2RerankingZh2En and T2RerankingEn2Zh are cross-language retrieval tasks.

## Train

### BAAI Embedding

We pre-train the models using [RetroMAE](https://github.com/staoxiao/RetroMAE) and then train them on large-scale paired data with contrastive learning.
**You can fine-tune the embedding model on your data following our [examples](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune).**
We also provide a [pre-train example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/pretrain).
Note that the goal of pre-training is to reconstruct the text; the pre-trained model cannot be used for similarity calculation directly and needs to be fine-tuned first.
For more training details for BGE, see [baai_general_embedding](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md).

### BGE Reranker

A cross-encoder performs full attention over the input pair, which makes it more accurate than an embedding model (i.e., a bi-encoder) but also more time-consuming. It is therefore well suited to re-ranking the top-k documents returned by an embedding model.
We train the cross-encoder on multilingual pair data; the data format is the same as for the embedding model, so you can fine-tune it easily following our [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker).
For more details, please refer to [./FlagEmbedding/reranker/README.md](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/reranker).

## Contact

If you have any questions or suggestions related to this project, feel free to open an issue or pull request. You can also email Shitao Xiao (stxiao@baai.ac.cn) and Zheng Liu (liuzheng@baai.ac.cn).

## Citation

If you find this repository useful, please consider giving it a star :star: and a citation:

```
@misc{bge_embedding,
      title={C-Pack: Packaged Resources To Advance General Chinese Embedding},
      author={Shitao Xiao and Zheng Liu and Peitian Zhang and Niklas Muennighoff},
      year={2023},
      eprint={2309.07597},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```

## License

FlagEmbedding is licensed under the [MIT License](https://github.com/FlagOpen/FlagEmbedding/blob/master/LICENSE).
The released models can be used for commercial purposes free of charge.
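The retrieve-then-rerank pattern described in the reranker section (fast bi-encoder recall, then a slower but more accurate cross-encoder re-scoring only the top-k candidates) can be sketched end to end. Both scoring functions below are toy stand-ins for the real BGE embedding and reranker models, chosen only so the pipeline runs without downloading weights:

```python
# Sketch of a retrieve-then-rerank pipeline. `embed_score` stands in for a
# cheap bi-encoder similarity; `cross_score` stands in for an expensive
# cross-encoder. Both are hypothetical toy scorers for illustration only.

def embed_score(query, doc):
    # Cheap proxy: fraction of query words that appear in the document.
    words = query.lower().split()
    return sum(w in doc.lower() for w in words) / len(words)

def cross_score(query, doc):
    # Pretend-expensive scorer: size of the exact word overlap.
    q, d = set(query.lower().split()), set(doc.lower().split())
    return len(q & d)

def search(query, corpus, k=2):
    # Stage 1: recall the top-k candidates with the cheap scorer.
    candidates = sorted(corpus, key=lambda d: embed_score(query, d), reverse=True)[:k]
    # Stage 2: re-rank only those k documents with the accurate scorer.
    return max(candidates, key=lambda d: cross_score(query, d))

corpus = [
    "BGE embedding models for retrieval",
    "a recipe for tomato soup",
    "cross-encoder reranker models for retrieval",
]
best = search("reranker models for retrieval", corpus)
print(best)
```

The point of the two-stage design is cost: the expensive scorer touches only k documents instead of the whole corpus.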
timm/resnet101.a1h_in1k
2023-04-05T18:20:31.000Z
[ "timm", "pytorch", "safetensors", "image-classification", "arxiv:2110.00476", "arxiv:1512.03385", "license:apache-2.0", "region:us" ]
image-classification
timm
null
null
timm/resnet101.a1h_in1k
0
104,983
timm
2023-04-05T18:19:44
---
tags:
- image-classification
- timm
library_tag: timm
license: apache-2.0
---
# Model card for resnet101.a1h_in1k

A ResNet-B image classification model. This model features:
* ReLU activations
* single layer 7x7 convolution with pooling
* 1x1 convolution shortcut downsample

Trained on ImageNet-1k in `timm` using the recipe template described below.

Recipe details:
* Based on [ResNet Strikes Back](https://arxiv.org/abs/2110.00476) `A1` recipe
* LAMB optimizer
* Stronger dropout, stochastic depth, and RandAugment than paper `A1` recipe
* Cosine LR schedule with warmup

## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
  - Params (M): 44.5
  - GMACs: 7.8
  - Activations (M): 16.2
  - Image size: train = 224 x 224, test = 288 x 288
- **Papers:**
  - ResNet strikes back: An improved training procedure in timm: https://arxiv.org/abs/2110.00476
  - Deep Residual Learning for Image Recognition: https://arxiv.org/abs/1512.03385
- **Original:** https://github.com/huggingface/pytorch-image-models

## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch  # needed for torch.topk below

img = Image.open(urlopen(
    'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))

model = timm.create_model('resnet101.a1h_in1k', pretrained=True)
model = model.eval()

# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)

output = model(transforms(img).unsqueeze(0))  # unsqueeze single image into batch of 1

top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```

### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm

img = Image.open(urlopen(
    'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))

model = timm.create_model(
    'resnet101.a1h_in1k',
    pretrained=True,
    features_only=True,
)
model = model.eval()

# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)

output = model(transforms(img).unsqueeze(0))  # unsqueeze single image into batch of 1

for o in output:
    # print shape of each feature map in output
    # e.g.:
    #  torch.Size([1, 64, 112, 112])
    #  torch.Size([1, 256, 56, 56])
    #  torch.Size([1, 512, 28, 28])
    #  torch.Size([1, 1024, 14, 14])
    #  torch.Size([1, 2048, 7, 7])
    print(o.shape)
```

### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm

img = Image.open(urlopen(
    'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))

model = timm.create_model(
    'resnet101.a1h_in1k',
    pretrained=True,
    num_classes=0,  # remove classifier nn.Linear
)
model = model.eval()

# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)

output = model(transforms(img).unsqueeze(0))  # output is (batch_size, num_features) shaped tensor

# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 2048, 7, 7) shaped tensor

output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```

## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
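The classification snippet above turns logits into percentages with `softmax` and then keeps the five largest with `torch.topk`. For readers who want to check that arithmetic without downloading weights, the same two steps in dependency-free Python on made-up logits:

```python
import math

# Hand-made logits for a 6-class toy problem (stand-ins for the
# 1000-class ImageNet logits the model would produce).
logits = [2.0, 1.0, 0.5, -1.0, 3.0, 0.0]

# softmax: exponentiate (shifted by the max for numerical stability),
# then normalize so the probabilities sum to 1.
m = max(logits)
exps = [math.exp(x - m) for x in logits]
total = sum(exps)
probs = [e / total for e in exps]

# top-k: indices of the k largest probabilities, like torch.topk.
k = 5
topk = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:k]
print(topk[0])  # index 4 has the largest logit, hence the largest probability
```

Because softmax is monotonic, ranking by probability gives the same order as ranking by raw logit; the softmax only matters when you report confidences.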
|model |img_size|top1 |top5 |param_count|gmacs|macts|img/sec|
|------------------------------------------|--------|-----|-----|-----------|-----|-----|-------|
|[seresnextaa101d_32x8d.sw_in12k_ft_in1k_288](https://huggingface.co/timm/seresnextaa101d_32x8d.sw_in12k_ft_in1k_288)|320 |86.72|98.17|93.6 |35.2 |69.7 |451 |
|[seresnextaa101d_32x8d.sw_in12k_ft_in1k_288](https://huggingface.co/timm/seresnextaa101d_32x8d.sw_in12k_ft_in1k_288)|288 |86.51|98.08|93.6 |28.5 |56.4 |560 |
|[seresnextaa101d_32x8d.sw_in12k_ft_in1k](https://huggingface.co/timm/seresnextaa101d_32x8d.sw_in12k_ft_in1k)|288 |86.49|98.03|93.6 |28.5 |56.4 |557 |
|[seresnextaa101d_32x8d.sw_in12k_ft_in1k](https://huggingface.co/timm/seresnextaa101d_32x8d.sw_in12k_ft_in1k)|224 |85.96|97.82|93.6 |17.2 |34.2 |923 |
|[resnext101_32x32d.fb_wsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x32d.fb_wsl_ig1b_ft_in1k)|224 |85.11|97.44|468.5 |87.3 |91.1 |254 |
|[resnetrs420.tf_in1k](https://huggingface.co/timm/resnetrs420.tf_in1k)|416 |85.0 |97.12|191.9 |108.4|213.8|134 |
|[ecaresnet269d.ra2_in1k](https://huggingface.co/timm/ecaresnet269d.ra2_in1k)|352 |84.96|97.22|102.1 |50.2 |101.2|291 |
|[ecaresnet269d.ra2_in1k](https://huggingface.co/timm/ecaresnet269d.ra2_in1k)|320 |84.73|97.18|102.1 |41.5 |83.7 |353 |
|[resnetrs350.tf_in1k](https://huggingface.co/timm/resnetrs350.tf_in1k)|384 |84.71|96.99|164.0 |77.6 |154.7|183 |
|[seresnextaa101d_32x8d.ah_in1k](https://huggingface.co/timm/seresnextaa101d_32x8d.ah_in1k)|288 |84.57|97.08|93.6 |28.5 |56.4 |557 |
|[resnetrs200.tf_in1k](https://huggingface.co/timm/resnetrs200.tf_in1k)|320 |84.45|97.08|93.2 |31.5 |67.8 |446 |
|[resnetrs270.tf_in1k](https://huggingface.co/timm/resnetrs270.tf_in1k)|352 |84.43|96.97|129.9 |51.1 |105.5|280 |
|[seresnext101d_32x8d.ah_in1k](https://huggingface.co/timm/seresnext101d_32x8d.ah_in1k)|288 |84.36|96.92|93.6 |27.6 |53.0 |595 |
|[seresnet152d.ra2_in1k](https://huggingface.co/timm/seresnet152d.ra2_in1k)|320 |84.35|97.04|66.8 |24.1 |47.7 |610 |
|[resnetrs350.tf_in1k](https://huggingface.co/timm/resnetrs350.tf_in1k)|288 |84.3 |96.94|164.0 |43.7 |87.1 |333 |
|[resnext101_32x8d.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x8d.fb_swsl_ig1b_ft_in1k)|224 |84.28|97.17|88.8 |16.5 |31.2 |1100 |
|[resnetrs420.tf_in1k](https://huggingface.co/timm/resnetrs420.tf_in1k)|320 |84.24|96.86|191.9 |64.2 |126.6|228 |
|[seresnext101_32x8d.ah_in1k](https://huggingface.co/timm/seresnext101_32x8d.ah_in1k)|288 |84.19|96.87|93.6 |27.2 |51.6 |613 |
|[resnext101_32x16d.fb_wsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x16d.fb_wsl_ig1b_ft_in1k)|224 |84.18|97.19|194.0 |36.3 |51.2 |581 |
|[resnetaa101d.sw_in12k_ft_in1k](https://huggingface.co/timm/resnetaa101d.sw_in12k_ft_in1k)|288 |84.11|97.11|44.6 |15.1 |29.0 |1144 |
|[resnet200d.ra2_in1k](https://huggingface.co/timm/resnet200d.ra2_in1k)|320 |83.97|96.82|64.7 |31.2 |67.3 |518 |
|[resnetrs200.tf_in1k](https://huggingface.co/timm/resnetrs200.tf_in1k)|256 |83.87|96.75|93.2 |20.2 |43.4 |692 |
|[seresnextaa101d_32x8d.ah_in1k](https://huggingface.co/timm/seresnextaa101d_32x8d.ah_in1k)|224 |83.86|96.65|93.6 |17.2 |34.2 |923 |
|[resnetrs152.tf_in1k](https://huggingface.co/timm/resnetrs152.tf_in1k)|320 |83.72|96.61|86.6 |24.3 |48.1 |617 |
|[seresnet152d.ra2_in1k](https://huggingface.co/timm/seresnet152d.ra2_in1k)|256 |83.69|96.78|66.8 |15.4 |30.6 |943 |
|[seresnext101d_32x8d.ah_in1k](https://huggingface.co/timm/seresnext101d_32x8d.ah_in1k)|224 |83.68|96.61|93.6 |16.7 |32.0 |986 |
|[resnet152d.ra2_in1k](https://huggingface.co/timm/resnet152d.ra2_in1k)|320 |83.67|96.74|60.2 |24.1 |47.7 |706 |
|[resnetrs270.tf_in1k](https://huggingface.co/timm/resnetrs270.tf_in1k)|256 |83.59|96.61|129.9 |27.1 |55.8 |526 |
|[seresnext101_32x8d.ah_in1k](https://huggingface.co/timm/seresnext101_32x8d.ah_in1k)|224 |83.58|96.4 |93.6 |16.5 |31.2 |1013 |
|[resnetaa101d.sw_in12k_ft_in1k](https://huggingface.co/timm/resnetaa101d.sw_in12k_ft_in1k)|224 |83.54|96.83|44.6 |9.1 |17.6 |1864 |
|[resnet152.a1h_in1k](https://huggingface.co/timm/resnet152.a1h_in1k)|288 |83.46|96.54|60.2 |19.1 |37.3 |904 |
|[resnext101_32x16d.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x16d.fb_swsl_ig1b_ft_in1k)|224 |83.35|96.85|194.0 |36.3 |51.2 |582 |
|[resnet200d.ra2_in1k](https://huggingface.co/timm/resnet200d.ra2_in1k)|256 |83.23|96.53|64.7 |20.0 |43.1 |809 |
|[resnext101_32x4d.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x4d.fb_swsl_ig1b_ft_in1k)|224 |83.22|96.75|44.2 |8.0 |21.2 |1814 |
|[resnext101_64x4d.c1_in1k](https://huggingface.co/timm/resnext101_64x4d.c1_in1k)|288 |83.16|96.38|83.5 |25.7 |51.6 |590 |
|[resnet152d.ra2_in1k](https://huggingface.co/timm/resnet152d.ra2_in1k)|256 |83.14|96.38|60.2 |15.4 |30.5 |1096 |
|[resnet101d.ra2_in1k](https://huggingface.co/timm/resnet101d.ra2_in1k)|320 |83.02|96.45|44.6 |16.5 |34.8 |992 |
|[ecaresnet101d.miil_in1k](https://huggingface.co/timm/ecaresnet101d.miil_in1k)|288 |82.98|96.54|44.6 |13.4 |28.2 |1077 |
|[resnext101_64x4d.tv_in1k](https://huggingface.co/timm/resnext101_64x4d.tv_in1k)|224 |82.98|96.25|83.5 |15.5 |31.2 |989 |
|[resnetrs152.tf_in1k](https://huggingface.co/timm/resnetrs152.tf_in1k)|256 |82.86|96.28|86.6 |15.6 |30.8 |951 |
|[resnext101_32x8d.tv2_in1k](https://huggingface.co/timm/resnext101_32x8d.tv2_in1k)|224 |82.83|96.22|88.8 |16.5 |31.2 |1099 |
|[resnet152.a1h_in1k](https://huggingface.co/timm/resnet152.a1h_in1k)|224 |82.8 |96.13|60.2 |11.6 |22.6 |1486 |
|[resnet101.a1h_in1k](https://huggingface.co/timm/resnet101.a1h_in1k)|288 |82.8 |96.32|44.6 |13.0 |26.8 |1291 |
|[resnet152.a1_in1k](https://huggingface.co/timm/resnet152.a1_in1k)|288 |82.74|95.71|60.2 |19.1 |37.3 |905 |
|[resnext101_32x8d.fb_wsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x8d.fb_wsl_ig1b_ft_in1k)|224 |82.69|96.63|88.8 |16.5 |31.2 |1100 |
|[resnet152.a2_in1k](https://huggingface.co/timm/resnet152.a2_in1k)|288 |82.62|95.75|60.2 |19.1 |37.3 |904 |
|[resnetaa50d.sw_in12k_ft_in1k](https://huggingface.co/timm/resnetaa50d.sw_in12k_ft_in1k)|288 |82.61|96.49|25.6 |8.9 |20.6 |1729 |
|[resnet61q.ra2_in1k](https://huggingface.co/timm/resnet61q.ra2_in1k)|288 |82.53|96.13|36.8 |9.9 |21.5 |1773 |
|[wide_resnet101_2.tv2_in1k](https://huggingface.co/timm/wide_resnet101_2.tv2_in1k)|224 |82.5 |96.02|126.9 |22.8 |21.2 |1078 |
|[resnext101_64x4d.c1_in1k](https://huggingface.co/timm/resnext101_64x4d.c1_in1k)|224 |82.46|95.92|83.5 |15.5 |31.2 |987 |
|[resnet51q.ra2_in1k](https://huggingface.co/timm/resnet51q.ra2_in1k)|288 |82.36|96.18|35.7 |8.1 |20.9 |1964 |
|[ecaresnet50t.ra2_in1k](https://huggingface.co/timm/ecaresnet50t.ra2_in1k)|320 |82.35|96.14|25.6 |8.8 |24.1 |1386 |
|[resnet101.a1_in1k](https://huggingface.co/timm/resnet101.a1_in1k)|288 |82.31|95.63|44.6 |13.0 |26.8 |1291 |
|[resnetrs101.tf_in1k](https://huggingface.co/timm/resnetrs101.tf_in1k)|288 |82.29|96.01|63.6 |13.6 |28.5 |1078 |
|[resnet152.tv2_in1k](https://huggingface.co/timm/resnet152.tv2_in1k)|224 |82.29|96.0 |60.2 |11.6 |22.6 |1484 |
|[wide_resnet50_2.racm_in1k](https://huggingface.co/timm/wide_resnet50_2.racm_in1k)|288 |82.27|96.06|68.9 |18.9 |23.8 |1176 |
|[resnet101d.ra2_in1k](https://huggingface.co/timm/resnet101d.ra2_in1k)|256 |82.26|96.07|44.6 |10.6 |22.2 |1542 |
|[resnet101.a2_in1k](https://huggingface.co/timm/resnet101.a2_in1k)|288 |82.24|95.73|44.6 |13.0 |26.8 |1290 |
|[seresnext50_32x4d.racm_in1k](https://huggingface.co/timm/seresnext50_32x4d.racm_in1k)|288 |82.2 |96.14|27.6 |7.0 |23.8 |1547 |
|[ecaresnet101d.miil_in1k](https://huggingface.co/timm/ecaresnet101d.miil_in1k)|224 |82.18|96.05|44.6 |8.1 |17.1 |1771 |
|[resnext50_32x4d.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext50_32x4d.fb_swsl_ig1b_ft_in1k)|224 |82.17|96.22|25.0 |4.3 |14.4 |2943 |
|[ecaresnet50t.a1_in1k](https://huggingface.co/timm/ecaresnet50t.a1_in1k)|288 |82.12|95.65|25.6 |7.1 |19.6 |1704 |
|[resnext50_32x4d.a1h_in1k](https://huggingface.co/timm/resnext50_32x4d.a1h_in1k)|288 |82.03|95.94|25.0 |7.0 |23.8 |1745 |
|[ecaresnet101d_pruned.miil_in1k](https://huggingface.co/timm/ecaresnet101d_pruned.miil_in1k)|288 |82.0 |96.15|24.9 |5.8 |12.7 |1787 |
|[resnet61q.ra2_in1k](https://huggingface.co/timm/resnet61q.ra2_in1k)|256 |81.99|95.85|36.8 |7.8 |17.0 |2230 |
|[resnext101_32x8d.tv2_in1k](https://huggingface.co/timm/resnext101_32x8d.tv2_in1k)|176 |81.98|95.72|88.8 |10.3 |19.4 |1768 |
|[resnet152.a1_in1k](https://huggingface.co/timm/resnet152.a1_in1k)|224 |81.97|95.24|60.2 |11.6 |22.6 |1486 |
|[resnet101.a1h_in1k](https://huggingface.co/timm/resnet101.a1h_in1k)|224 |81.93|95.75|44.6 |7.8 |16.2 |2122 |
|[resnet101.tv2_in1k](https://huggingface.co/timm/resnet101.tv2_in1k)|224 |81.9 |95.77|44.6 |7.8 |16.2 |2118 |
|[resnext101_32x16d.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnext101_32x16d.fb_ssl_yfcc100m_ft_in1k)|224 |81.84|96.1 |194.0 |36.3 |51.2 |583 |
|[resnet51q.ra2_in1k](https://huggingface.co/timm/resnet51q.ra2_in1k)|256 |81.78|95.94|35.7 |6.4 |16.6 |2471 |
|[resnet152.a2_in1k](https://huggingface.co/timm/resnet152.a2_in1k)|224 |81.77|95.22|60.2 |11.6 |22.6 |1485 |
|[resnetaa50d.sw_in12k_ft_in1k](https://huggingface.co/timm/resnetaa50d.sw_in12k_ft_in1k)|224 |81.74|96.06|25.6 |5.4 |12.4 |2813 |
|[ecaresnet50t.a2_in1k](https://huggingface.co/timm/ecaresnet50t.a2_in1k)|288 |81.65|95.54|25.6 |7.1 |19.6 |1703 |
|[ecaresnet50d.miil_in1k](https://huggingface.co/timm/ecaresnet50d.miil_in1k)|288 |81.64|95.88|25.6 |7.2 |19.7 |1694 |
|[resnext101_32x8d.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnext101_32x8d.fb_ssl_yfcc100m_ft_in1k)|224 |81.62|96.04|88.8 |16.5 |31.2 |1101 |
|[wide_resnet50_2.tv2_in1k](https://huggingface.co/timm/wide_resnet50_2.tv2_in1k)|224 |81.61|95.76|68.9 |11.4 |14.4 |1930 |
|[resnetaa50.a1h_in1k](https://huggingface.co/timm/resnetaa50.a1h_in1k)|288 |81.61|95.83|25.6 |8.5 |19.2 |1868 |
|[resnet101.a1_in1k](https://huggingface.co/timm/resnet101.a1_in1k)|224 |81.5 |95.16|44.6 |7.8 |16.2 |2125 |
|[resnext50_32x4d.a1_in1k](https://huggingface.co/timm/resnext50_32x4d.a1_in1k)|288 |81.48|95.16|25.0 |7.0 |23.8 |1745 |
|[gcresnet50t.ra2_in1k](https://huggingface.co/timm/gcresnet50t.ra2_in1k)|288 |81.47|95.71|25.9 |6.9 |18.6 |2071 |
|[wide_resnet50_2.racm_in1k](https://huggingface.co/timm/wide_resnet50_2.racm_in1k)|224 |81.45|95.53|68.9 |11.4 |14.4 |1929 |
|[resnet50d.a1_in1k](https://huggingface.co/timm/resnet50d.a1_in1k)|288 |81.44|95.22|25.6 |7.2 |19.7 |1908 |
|[ecaresnet50t.ra2_in1k](https://huggingface.co/timm/ecaresnet50t.ra2_in1k)|256 |81.44|95.67|25.6 |5.6 |15.4 |2168 |
|[ecaresnetlight.miil_in1k](https://huggingface.co/timm/ecaresnetlight.miil_in1k)|288 |81.4 |95.82|30.2 |6.8 |13.9 |2132 |
|[resnet50d.ra2_in1k](https://huggingface.co/timm/resnet50d.ra2_in1k)|288 |81.37|95.74|25.6 |7.2 |19.7 |1910 |
|[resnet101.a2_in1k](https://huggingface.co/timm/resnet101.a2_in1k)|224 |81.32|95.19|44.6 |7.8 |16.2 |2125 |
|[seresnet50.ra2_in1k](https://huggingface.co/timm/seresnet50.ra2_in1k)|288 |81.3 |95.65|28.1 |6.8 |18.4 |1803 |
|[resnext50_32x4d.a2_in1k](https://huggingface.co/timm/resnext50_32x4d.a2_in1k)|288 |81.3 |95.11|25.0 |7.0 |23.8 |1746 |
|[seresnext50_32x4d.racm_in1k](https://huggingface.co/timm/seresnext50_32x4d.racm_in1k)|224 |81.27|95.62|27.6 |4.3 |14.4 |2591 |
|[ecaresnet50t.a1_in1k](https://huggingface.co/timm/ecaresnet50t.a1_in1k)|224 |81.26|95.16|25.6 |4.3 |11.8 |2823 |
|[gcresnext50ts.ch_in1k](https://huggingface.co/timm/gcresnext50ts.ch_in1k)|288 |81.23|95.54|15.7 |4.8 |19.6 |2117 |
|[senet154.gluon_in1k](https://huggingface.co/timm/senet154.gluon_in1k)|224 |81.23|95.35|115.1 |20.8 |38.7 |545 |
|[resnet50.a1_in1k](https://huggingface.co/timm/resnet50.a1_in1k)|288 |81.22|95.11|25.6 |6.8 |18.4 |2089 |
|[resnet50_gn.a1h_in1k](https://huggingface.co/timm/resnet50_gn.a1h_in1k)|288 |81.22|95.63|25.6 |6.8 |18.4 |676 |
|[resnet50d.a2_in1k](https://huggingface.co/timm/resnet50d.a2_in1k)|288 |81.18|95.09|25.6 |7.2 |19.7 |1908 | |[resnet50.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnet50.fb_swsl_ig1b_ft_in1k)|224 |81.18|95.98|25.6 |4.1 |11.1 |3455 | |[resnext50_32x4d.tv2_in1k](https://huggingface.co/timm/resnext50_32x4d.tv2_in1k)|224 |81.17|95.34|25.0 |4.3 |14.4 |2933 | |[resnext50_32x4d.a1h_in1k](https://huggingface.co/timm/resnext50_32x4d.a1h_in1k)|224 |81.1 |95.33|25.0 |4.3 |14.4 |2934 | |[seresnet50.a2_in1k](https://huggingface.co/timm/seresnet50.a2_in1k)|288 |81.1 |95.23|28.1 |6.8 |18.4 |1801 | |[seresnet50.a1_in1k](https://huggingface.co/timm/seresnet50.a1_in1k)|288 |81.1 |95.12|28.1 |6.8 |18.4 |1799 | |[resnet152s.gluon_in1k](https://huggingface.co/timm/resnet152s.gluon_in1k)|224 |81.02|95.41|60.3 |12.9 |25.0 |1347 | |[resnet50.d_in1k](https://huggingface.co/timm/resnet50.d_in1k)|288 |80.97|95.44|25.6 |6.8 |18.4 |2085 | |[gcresnet50t.ra2_in1k](https://huggingface.co/timm/gcresnet50t.ra2_in1k)|256 |80.94|95.45|25.9 |5.4 |14.7 |2571 | |[resnext101_32x4d.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnext101_32x4d.fb_ssl_yfcc100m_ft_in1k)|224 |80.93|95.73|44.2 |8.0 |21.2 |1814 | |[resnet50.c1_in1k](https://huggingface.co/timm/resnet50.c1_in1k)|288 |80.91|95.55|25.6 |6.8 |18.4 |2084 | |[seresnext101_32x4d.gluon_in1k](https://huggingface.co/timm/seresnext101_32x4d.gluon_in1k)|224 |80.9 |95.31|49.0 |8.0 |21.3 |1585 | |[seresnext101_64x4d.gluon_in1k](https://huggingface.co/timm/seresnext101_64x4d.gluon_in1k)|224 |80.9 |95.3 |88.2 |15.5 |31.2 |918 | |[resnet50.c2_in1k](https://huggingface.co/timm/resnet50.c2_in1k)|288 |80.86|95.52|25.6 |6.8 |18.4 |2085 | |[resnet50.tv2_in1k](https://huggingface.co/timm/resnet50.tv2_in1k)|224 |80.85|95.43|25.6 |4.1 |11.1 |3450 | |[ecaresnet50t.a2_in1k](https://huggingface.co/timm/ecaresnet50t.a2_in1k)|224 |80.84|95.02|25.6 |4.3 |11.8 |2821 | 
|[ecaresnet101d_pruned.miil_in1k](https://huggingface.co/timm/ecaresnet101d_pruned.miil_in1k)|224 |80.79|95.62|24.9 |3.5 |7.7 |2961 | |[seresnet33ts.ra2_in1k](https://huggingface.co/timm/seresnet33ts.ra2_in1k)|288 |80.79|95.36|19.8 |6.0 |14.8 |2506 | |[ecaresnet50d_pruned.miil_in1k](https://huggingface.co/timm/ecaresnet50d_pruned.miil_in1k)|288 |80.79|95.58|19.9 |4.2 |10.6 |2349 | |[resnet50.a2_in1k](https://huggingface.co/timm/resnet50.a2_in1k)|288 |80.78|94.99|25.6 |6.8 |18.4 |2088 | |[resnet50.b1k_in1k](https://huggingface.co/timm/resnet50.b1k_in1k)|288 |80.71|95.43|25.6 |6.8 |18.4 |2087 | |[resnext50_32x4d.ra_in1k](https://huggingface.co/timm/resnext50_32x4d.ra_in1k)|288 |80.7 |95.39|25.0 |7.0 |23.8 |1749 | |[resnetrs101.tf_in1k](https://huggingface.co/timm/resnetrs101.tf_in1k)|192 |80.69|95.24|63.6 |6.0 |12.7 |2270 | |[resnet50d.a1_in1k](https://huggingface.co/timm/resnet50d.a1_in1k)|224 |80.68|94.71|25.6 |4.4 |11.9 |3162 | |[eca_resnet33ts.ra2_in1k](https://huggingface.co/timm/eca_resnet33ts.ra2_in1k)|288 |80.68|95.36|19.7 |6.0 |14.8 |2637 | |[resnet50.a1h_in1k](https://huggingface.co/timm/resnet50.a1h_in1k)|224 |80.67|95.3 |25.6 |4.1 |11.1 |3452 | |[resnext50d_32x4d.bt_in1k](https://huggingface.co/timm/resnext50d_32x4d.bt_in1k)|288 |80.67|95.42|25.0 |7.4 |25.1 |1626 | |[resnetaa50.a1h_in1k](https://huggingface.co/timm/resnetaa50.a1h_in1k)|224 |80.63|95.21|25.6 |5.2 |11.6 |3034 | |[ecaresnet50d.miil_in1k](https://huggingface.co/timm/ecaresnet50d.miil_in1k)|224 |80.61|95.32|25.6 |4.4 |11.9 |2813 | |[resnext101_64x4d.gluon_in1k](https://huggingface.co/timm/resnext101_64x4d.gluon_in1k)|224 |80.61|94.99|83.5 |15.5 |31.2 |989 | |[gcresnet33ts.ra2_in1k](https://huggingface.co/timm/gcresnet33ts.ra2_in1k)|288 |80.6 |95.31|19.9 |6.0 |14.8 |2578 | |[gcresnext50ts.ch_in1k](https://huggingface.co/timm/gcresnext50ts.ch_in1k)|256 |80.57|95.17|15.7 |3.8 |15.5 |2710 | |[resnet152.a3_in1k](https://huggingface.co/timm/resnet152.a3_in1k)|224 |80.56|95.0 |60.2 |11.6 |22.6 |1483 
| |[resnet50d.ra2_in1k](https://huggingface.co/timm/resnet50d.ra2_in1k)|224 |80.53|95.16|25.6 |4.4 |11.9 |3164 | |[resnext50_32x4d.a1_in1k](https://huggingface.co/timm/resnext50_32x4d.a1_in1k)|224 |80.53|94.46|25.0 |4.3 |14.4 |2930 | |[wide_resnet101_2.tv2_in1k](https://huggingface.co/timm/wide_resnet101_2.tv2_in1k)|176 |80.48|94.98|126.9 |14.3 |13.2 |1719 | |[resnet152d.gluon_in1k](https://huggingface.co/timm/resnet152d.gluon_in1k)|224 |80.47|95.2 |60.2 |11.8 |23.4 |1428 | |[resnet50.b2k_in1k](https://huggingface.co/timm/resnet50.b2k_in1k)|288 |80.45|95.32|25.6 |6.8 |18.4 |2086 | |[ecaresnetlight.miil_in1k](https://huggingface.co/timm/ecaresnetlight.miil_in1k)|224 |80.45|95.24|30.2 |4.1 |8.4 |3530 | |[resnext50_32x4d.a2_in1k](https://huggingface.co/timm/resnext50_32x4d.a2_in1k)|224 |80.45|94.63|25.0 |4.3 |14.4 |2936 | |[wide_resnet50_2.tv2_in1k](https://huggingface.co/timm/wide_resnet50_2.tv2_in1k)|176 |80.43|95.09|68.9 |7.3 |9.0 |3015 | |[resnet101d.gluon_in1k](https://huggingface.co/timm/resnet101d.gluon_in1k)|224 |80.42|95.01|44.6 |8.1 |17.0 |2007 | |[resnet50.a1_in1k](https://huggingface.co/timm/resnet50.a1_in1k)|224 |80.38|94.6 |25.6 |4.1 |11.1 |3461 | |[seresnet33ts.ra2_in1k](https://huggingface.co/timm/seresnet33ts.ra2_in1k)|256 |80.36|95.1 |19.8 |4.8 |11.7 |3267 | |[resnext101_32x4d.gluon_in1k](https://huggingface.co/timm/resnext101_32x4d.gluon_in1k)|224 |80.34|94.93|44.2 |8.0 |21.2 |1814 | |[resnext50_32x4d.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnext50_32x4d.fb_ssl_yfcc100m_ft_in1k)|224 |80.32|95.4 |25.0 |4.3 |14.4 |2941 | |[resnet101s.gluon_in1k](https://huggingface.co/timm/resnet101s.gluon_in1k)|224 |80.28|95.16|44.7 |9.2 |18.6 |1851 | |[seresnet50.ra2_in1k](https://huggingface.co/timm/seresnet50.ra2_in1k)|224 |80.26|95.08|28.1 |4.1 |11.1 |2972 | |[resnetblur50.bt_in1k](https://huggingface.co/timm/resnetblur50.bt_in1k)|288 |80.24|95.24|25.6 |8.5 |19.9 |1523 | |[resnet50d.a2_in1k](https://huggingface.co/timm/resnet50d.a2_in1k)|224 
|80.22|94.63|25.6 |4.4 |11.9 |3162 | |[resnet152.tv2_in1k](https://huggingface.co/timm/resnet152.tv2_in1k)|176 |80.2 |94.64|60.2 |7.2 |14.0 |2346 | |[seresnet50.a2_in1k](https://huggingface.co/timm/seresnet50.a2_in1k)|224 |80.08|94.74|28.1 |4.1 |11.1 |2969 | |[eca_resnet33ts.ra2_in1k](https://huggingface.co/timm/eca_resnet33ts.ra2_in1k)|256 |80.08|94.97|19.7 |4.8 |11.7 |3284 | |[gcresnet33ts.ra2_in1k](https://huggingface.co/timm/gcresnet33ts.ra2_in1k)|256 |80.06|94.99|19.9 |4.8 |11.7 |3216 | |[resnet50_gn.a1h_in1k](https://huggingface.co/timm/resnet50_gn.a1h_in1k)|224 |80.06|94.95|25.6 |4.1 |11.1 |1109 | |[seresnet50.a1_in1k](https://huggingface.co/timm/seresnet50.a1_in1k)|224 |80.02|94.71|28.1 |4.1 |11.1 |2962 | |[resnet50.ram_in1k](https://huggingface.co/timm/resnet50.ram_in1k)|288 |79.97|95.05|25.6 |6.8 |18.4 |2086 | |[resnet152c.gluon_in1k](https://huggingface.co/timm/resnet152c.gluon_in1k)|224 |79.92|94.84|60.2 |11.8 |23.4 |1455 | |[seresnext50_32x4d.gluon_in1k](https://huggingface.co/timm/seresnext50_32x4d.gluon_in1k)|224 |79.91|94.82|27.6 |4.3 |14.4 |2591 | |[resnet50.d_in1k](https://huggingface.co/timm/resnet50.d_in1k)|224 |79.91|94.67|25.6 |4.1 |11.1 |3456 | |[resnet101.tv2_in1k](https://huggingface.co/timm/resnet101.tv2_in1k)|176 |79.9 |94.6 |44.6 |4.9 |10.1 |3341 | |[resnetrs50.tf_in1k](https://huggingface.co/timm/resnetrs50.tf_in1k)|224 |79.89|94.97|35.7 |4.5 |12.1 |2774 | |[resnet50.c2_in1k](https://huggingface.co/timm/resnet50.c2_in1k)|224 |79.88|94.87|25.6 |4.1 |11.1 |3455 | |[ecaresnet26t.ra2_in1k](https://huggingface.co/timm/ecaresnet26t.ra2_in1k)|320 |79.86|95.07|16.0 |5.2 |16.4 |2168 | |[resnet50.a2_in1k](https://huggingface.co/timm/resnet50.a2_in1k)|224 |79.85|94.56|25.6 |4.1 |11.1 |3460 | |[resnet50.ra_in1k](https://huggingface.co/timm/resnet50.ra_in1k)|288 |79.83|94.97|25.6 |6.8 |18.4 |2087 | |[resnet101.a3_in1k](https://huggingface.co/timm/resnet101.a3_in1k)|224 |79.82|94.62|44.6 |7.8 |16.2 |2114 | 
|[resnext50_32x4d.ra_in1k](https://huggingface.co/timm/resnext50_32x4d.ra_in1k)|224 |79.76|94.6 |25.0 |4.3 |14.4 |2943 | |[resnet50.c1_in1k](https://huggingface.co/timm/resnet50.c1_in1k)|224 |79.74|94.95|25.6 |4.1 |11.1 |3455 | |[ecaresnet50d_pruned.miil_in1k](https://huggingface.co/timm/ecaresnet50d_pruned.miil_in1k)|224 |79.74|94.87|19.9 |2.5 |6.4 |3929 | |[resnet33ts.ra2_in1k](https://huggingface.co/timm/resnet33ts.ra2_in1k)|288 |79.71|94.83|19.7 |6.0 |14.8 |2710 | |[resnet152.gluon_in1k](https://huggingface.co/timm/resnet152.gluon_in1k)|224 |79.68|94.74|60.2 |11.6 |22.6 |1486 | |[resnext50d_32x4d.bt_in1k](https://huggingface.co/timm/resnext50d_32x4d.bt_in1k)|224 |79.67|94.87|25.0 |4.5 |15.2 |2729 | |[resnet50.bt_in1k](https://huggingface.co/timm/resnet50.bt_in1k)|288 |79.63|94.91|25.6 |6.8 |18.4 |2086 | |[ecaresnet50t.a3_in1k](https://huggingface.co/timm/ecaresnet50t.a3_in1k)|224 |79.56|94.72|25.6 |4.3 |11.8 |2805 | |[resnet101c.gluon_in1k](https://huggingface.co/timm/resnet101c.gluon_in1k)|224 |79.53|94.58|44.6 |8.1 |17.0 |2062 | |[resnet50.b1k_in1k](https://huggingface.co/timm/resnet50.b1k_in1k)|224 |79.52|94.61|25.6 |4.1 |11.1 |3459 | |[resnet50.tv2_in1k](https://huggingface.co/timm/resnet50.tv2_in1k)|176 |79.42|94.64|25.6 |2.6 |6.9 |5397 | |[resnet32ts.ra2_in1k](https://huggingface.co/timm/resnet32ts.ra2_in1k)|288 |79.4 |94.66|18.0 |5.9 |14.6 |2752 | |[resnet50.b2k_in1k](https://huggingface.co/timm/resnet50.b2k_in1k)|224 |79.38|94.57|25.6 |4.1 |11.1 |3459 | |[resnext50_32x4d.tv2_in1k](https://huggingface.co/timm/resnext50_32x4d.tv2_in1k)|176 |79.37|94.3 |25.0 |2.7 |9.0 |4577 | |[resnext50_32x4d.gluon_in1k](https://huggingface.co/timm/resnext50_32x4d.gluon_in1k)|224 |79.36|94.43|25.0 |4.3 |14.4 |2942 | |[resnext101_32x8d.tv_in1k](https://huggingface.co/timm/resnext101_32x8d.tv_in1k)|224 |79.31|94.52|88.8 |16.5 |31.2 |1100 | |[resnet101.gluon_in1k](https://huggingface.co/timm/resnet101.gluon_in1k)|224 |79.31|94.53|44.6 |7.8 |16.2 |2125 | 
|[resnetblur50.bt_in1k](https://huggingface.co/timm/resnetblur50.bt_in1k)|224 |79.31|94.63|25.6 |5.2 |12.0 |2524 | |[resnet50.a1h_in1k](https://huggingface.co/timm/resnet50.a1h_in1k)|176 |79.27|94.49|25.6 |2.6 |6.9 |5404 | |[resnext50_32x4d.a3_in1k](https://huggingface.co/timm/resnext50_32x4d.a3_in1k)|224 |79.25|94.31|25.0 |4.3 |14.4 |2931 | |[resnet50.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnet50.fb_ssl_yfcc100m_ft_in1k)|224 |79.22|94.84|25.6 |4.1 |11.1 |3451 | |[resnet33ts.ra2_in1k](https://huggingface.co/timm/resnet33ts.ra2_in1k)|256 |79.21|94.56|19.7 |4.8 |11.7 |3392 | |[resnet50d.gluon_in1k](https://huggingface.co/timm/resnet50d.gluon_in1k)|224 |79.07|94.48|25.6 |4.4 |11.9 |3162 | |[resnet50.ram_in1k](https://huggingface.co/timm/resnet50.ram_in1k)|224 |79.03|94.38|25.6 |4.1 |11.1 |3453 | |[resnet50.am_in1k](https://huggingface.co/timm/resnet50.am_in1k)|224 |79.01|94.39|25.6 |4.1 |11.1 |3461 | |[resnet32ts.ra2_in1k](https://huggingface.co/timm/resnet32ts.ra2_in1k)|256 |79.01|94.37|18.0 |4.6 |11.6 |3440 | |[ecaresnet26t.ra2_in1k](https://huggingface.co/timm/ecaresnet26t.ra2_in1k)|256 |78.9 |94.54|16.0 |3.4 |10.5 |3421 | |[resnet152.a3_in1k](https://huggingface.co/timm/resnet152.a3_in1k)|160 |78.89|94.11|60.2 |5.9 |11.5 |2745 | |[wide_resnet101_2.tv_in1k](https://huggingface.co/timm/wide_resnet101_2.tv_in1k)|224 |78.84|94.28|126.9 |22.8 |21.2 |1079 | |[seresnext26d_32x4d.bt_in1k](https://huggingface.co/timm/seresnext26d_32x4d.bt_in1k)|288 |78.83|94.24|16.8 |4.5 |16.8 |2251 | |[resnet50.ra_in1k](https://huggingface.co/timm/resnet50.ra_in1k)|224 |78.81|94.32|25.6 |4.1 |11.1 |3454 | |[seresnext26t_32x4d.bt_in1k](https://huggingface.co/timm/seresnext26t_32x4d.bt_in1k)|288 |78.74|94.33|16.8 |4.5 |16.7 |2264 | |[resnet50s.gluon_in1k](https://huggingface.co/timm/resnet50s.gluon_in1k)|224 |78.72|94.23|25.7 |5.5 |13.5 |2796 | |[resnet50d.a3_in1k](https://huggingface.co/timm/resnet50d.a3_in1k)|224 |78.71|94.24|25.6 |4.4 |11.9 |3154 | 
|[wide_resnet50_2.tv_in1k](https://huggingface.co/timm/wide_resnet50_2.tv_in1k)|224 |78.47|94.09|68.9 |11.4 |14.4 |1934 | |[resnet50.bt_in1k](https://huggingface.co/timm/resnet50.bt_in1k)|224 |78.46|94.27|25.6 |4.1 |11.1 |3454 | |[resnet34d.ra2_in1k](https://huggingface.co/timm/resnet34d.ra2_in1k)|288 |78.43|94.35|21.8 |6.5 |7.5 |3291 | |[gcresnext26ts.ch_in1k](https://huggingface.co/timm/gcresnext26ts.ch_in1k)|288 |78.42|94.04|10.5 |3.1 |13.3 |3226 | |[resnet26t.ra2_in1k](https://huggingface.co/timm/resnet26t.ra2_in1k)|320 |78.33|94.13|16.0 |5.2 |16.4 |2391 | |[resnet152.tv_in1k](https://huggingface.co/timm/resnet152.tv_in1k)|224 |78.32|94.04|60.2 |11.6 |22.6 |1487 | |[seresnext26ts.ch_in1k](https://huggingface.co/timm/seresnext26ts.ch_in1k)|288 |78.28|94.1 |10.4 |3.1 |13.3 |3062 | |[bat_resnext26ts.ch_in1k](https://huggingface.co/timm/bat_resnext26ts.ch_in1k)|256 |78.25|94.1 |10.7 |2.5 |12.5 |3393 | |[resnet50.a3_in1k](https://huggingface.co/timm/resnet50.a3_in1k)|224 |78.06|93.78|25.6 |4.1 |11.1 |3450 | |[resnet50c.gluon_in1k](https://huggingface.co/timm/resnet50c.gluon_in1k)|224 |78.0 |93.99|25.6 |4.4 |11.9 |3286 | |[eca_resnext26ts.ch_in1k](https://huggingface.co/timm/eca_resnext26ts.ch_in1k)|288 |78.0 |93.91|10.3 |3.1 |13.3 |3297 | |[seresnext26t_32x4d.bt_in1k](https://huggingface.co/timm/seresnext26t_32x4d.bt_in1k)|224 |77.98|93.75|16.8 |2.7 |10.1 |3841 | |[resnet34.a1_in1k](https://huggingface.co/timm/resnet34.a1_in1k)|288 |77.92|93.77|21.8 |6.1 |6.2 |3609 | |[resnet101.a3_in1k](https://huggingface.co/timm/resnet101.a3_in1k)|160 |77.88|93.71|44.6 |4.0 |8.3 |3926 | |[resnet26t.ra2_in1k](https://huggingface.co/timm/resnet26t.ra2_in1k)|256 |77.87|93.84|16.0 |3.4 |10.5 |3772 | |[seresnext26ts.ch_in1k](https://huggingface.co/timm/seresnext26ts.ch_in1k)|256 |77.86|93.79|10.4 |2.4 |10.5 |4263 | |[resnetrs50.tf_in1k](https://huggingface.co/timm/resnetrs50.tf_in1k)|160 |77.82|93.81|35.7 |2.3 |6.2 |5238 | 
|[gcresnext26ts.ch_in1k](https://huggingface.co/timm/gcresnext26ts.ch_in1k)|256 |77.81|93.82|10.5 |2.4 |10.5 |4183 | |[ecaresnet50t.a3_in1k](https://huggingface.co/timm/ecaresnet50t.a3_in1k)|160 |77.79|93.6 |25.6 |2.2 |6.0 |5329 | |[resnext50_32x4d.a3_in1k](https://huggingface.co/timm/resnext50_32x4d.a3_in1k)|160 |77.73|93.32|25.0 |2.2 |7.4 |5576 | |[resnext50_32x4d.tv_in1k](https://huggingface.co/timm/resnext50_32x4d.tv_in1k)|224 |77.61|93.7 |25.0 |4.3 |14.4 |2944 | |[seresnext26d_32x4d.bt_in1k](https://huggingface.co/timm/seresnext26d_32x4d.bt_in1k)|224 |77.59|93.61|16.8 |2.7 |10.2 |3807 | |[resnet50.gluon_in1k](https://huggingface.co/timm/resnet50.gluon_in1k)|224 |77.58|93.72|25.6 |4.1 |11.1 |3455 | |[eca_resnext26ts.ch_in1k](https://huggingface.co/timm/eca_resnext26ts.ch_in1k)|256 |77.44|93.56|10.3 |2.4 |10.5 |4284 | |[resnet26d.bt_in1k](https://huggingface.co/timm/resnet26d.bt_in1k)|288 |77.41|93.63|16.0 |4.3 |13.5 |2907 | |[resnet101.tv_in1k](https://huggingface.co/timm/resnet101.tv_in1k)|224 |77.38|93.54|44.6 |7.8 |16.2 |2125 | |[resnet50d.a3_in1k](https://huggingface.co/timm/resnet50d.a3_in1k)|160 |77.22|93.27|25.6 |2.2 |6.1 |5982 | |[resnext26ts.ra2_in1k](https://huggingface.co/timm/resnext26ts.ra2_in1k)|288 |77.17|93.47|10.3 |3.1 |13.3 |3392 | |[resnet34.a2_in1k](https://huggingface.co/timm/resnet34.a2_in1k)|288 |77.15|93.27|21.8 |6.1 |6.2 |3615 | |[resnet34d.ra2_in1k](https://huggingface.co/timm/resnet34d.ra2_in1k)|224 |77.1 |93.37|21.8 |3.9 |4.5 |5436 | |[seresnet50.a3_in1k](https://huggingface.co/timm/seresnet50.a3_in1k)|224 |77.02|93.07|28.1 |4.1 |11.1 |2952 | |[resnext26ts.ra2_in1k](https://huggingface.co/timm/resnext26ts.ra2_in1k)|256 |76.78|93.13|10.3 |2.4 |10.5 |4410 | |[resnet26d.bt_in1k](https://huggingface.co/timm/resnet26d.bt_in1k)|224 |76.7 |93.17|16.0 |2.6 |8.2 |4859 | |[resnet34.bt_in1k](https://huggingface.co/timm/resnet34.bt_in1k)|288 |76.5 |93.35|21.8 |6.1 |6.2 |3617 | |[resnet34.a1_in1k](https://huggingface.co/timm/resnet34.a1_in1k)|224 
|76.42|92.87|21.8 |3.7 |3.7 |5984 | |[resnet26.bt_in1k](https://huggingface.co/timm/resnet26.bt_in1k)|288 |76.35|93.18|16.0 |3.9 |12.2 |3331 | |[resnet50.tv_in1k](https://huggingface.co/timm/resnet50.tv_in1k)|224 |76.13|92.86|25.6 |4.1 |11.1 |3457 | |[resnet50.a3_in1k](https://huggingface.co/timm/resnet50.a3_in1k)|160 |75.96|92.5 |25.6 |2.1 |5.7 |6490 | |[resnet34.a2_in1k](https://huggingface.co/timm/resnet34.a2_in1k)|224 |75.52|92.44|21.8 |3.7 |3.7 |5991 | |[resnet26.bt_in1k](https://huggingface.co/timm/resnet26.bt_in1k)|224 |75.3 |92.58|16.0 |2.4 |7.4 |5583 | |[resnet34.bt_in1k](https://huggingface.co/timm/resnet34.bt_in1k)|224 |75.16|92.18|21.8 |3.7 |3.7 |5994 | |[seresnet50.a3_in1k](https://huggingface.co/timm/seresnet50.a3_in1k)|160 |75.1 |92.08|28.1 |2.1 |5.7 |5513 | |[resnet34.gluon_in1k](https://huggingface.co/timm/resnet34.gluon_in1k)|224 |74.57|91.98|21.8 |3.7 |3.7 |5984 | |[resnet18d.ra2_in1k](https://huggingface.co/timm/resnet18d.ra2_in1k)|288 |73.81|91.83|11.7 |3.4 |5.4 |5196 | |[resnet34.tv_in1k](https://huggingface.co/timm/resnet34.tv_in1k)|224 |73.32|91.42|21.8 |3.7 |3.7 |5979 | |[resnet18.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnet18.fb_swsl_ig1b_ft_in1k)|224 |73.28|91.73|11.7 |1.8 |2.5 |10213 | |[resnet18.a1_in1k](https://huggingface.co/timm/resnet18.a1_in1k)|288 |73.16|91.03|11.7 |3.0 |4.1 |6050 | |[resnet34.a3_in1k](https://huggingface.co/timm/resnet34.a3_in1k)|224 |72.98|91.11|21.8 |3.7 |3.7 |5967 | |[resnet18.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnet18.fb_ssl_yfcc100m_ft_in1k)|224 |72.6 |91.42|11.7 |1.8 |2.5 |10213 | |[resnet18.a2_in1k](https://huggingface.co/timm/resnet18.a2_in1k)|288 |72.37|90.59|11.7 |3.0 |4.1 |6051 | |[resnet14t.c3_in1k](https://huggingface.co/timm/resnet14t.c3_in1k)|224 |72.26|90.31|10.1 |1.7 |5.8 |7026 | |[resnet18d.ra2_in1k](https://huggingface.co/timm/resnet18d.ra2_in1k)|224 |72.26|90.68|11.7 |2.1 |3.3 |8707 | |[resnet18.a1_in1k](https://huggingface.co/timm/resnet18.a1_in1k)|224 
|71.49|90.07|11.7 |1.8 |2.5 |10187 | |[resnet14t.c3_in1k](https://huggingface.co/timm/resnet14t.c3_in1k)|176 |71.31|89.69|10.1 |1.1 |3.6 |10970 | |[resnet18.gluon_in1k](https://huggingface.co/timm/resnet18.gluon_in1k)|224 |70.84|89.76|11.7 |1.8 |2.5 |10210 | |[resnet18.a2_in1k](https://huggingface.co/timm/resnet18.a2_in1k)|224 |70.64|89.47|11.7 |1.8 |2.5 |10194 | |[resnet34.a3_in1k](https://huggingface.co/timm/resnet34.a3_in1k)|160 |70.56|89.52|21.8 |1.9 |1.9 |10737 | |[resnet18.tv_in1k](https://huggingface.co/timm/resnet18.tv_in1k)|224 |69.76|89.07|11.7 |1.8 |2.5 |10205 | |[resnet10t.c3_in1k](https://huggingface.co/timm/resnet10t.c3_in1k)|224 |68.34|88.03|5.4 |1.1 |2.4 |13079 | |[resnet18.a3_in1k](https://huggingface.co/timm/resnet18.a3_in1k)|224 |68.25|88.17|11.7 |1.8 |2.5 |10167 | |[resnet10t.c3_in1k](https://huggingface.co/timm/resnet10t.c3_in1k)|176 |66.71|86.96|5.4 |0.7 |1.5 |20327 | |[resnet18.a3_in1k](https://huggingface.co/timm/resnet18.a3_in1k)|160 |65.66|86.26|11.7 |0.9 |1.3 |18229 | ## Citation ```bibtex @inproceedings{wightman2021resnet, title={ResNet strikes back: An improved training procedure in timm}, author={Wightman, Ross and Touvron, Hugo and Jegou, Herve}, booktitle={NeurIPS 2021 Workshop on ImageNet: Past, Present, and Future} } ``` ```bibtex @misc{rw2019timm, author = {Ross Wightman}, title = {PyTorch Image Models}, year = {2019}, publisher = {GitHub}, journal = {GitHub repository}, doi = {10.5281/zenodo.4414861}, howpublished = {\url{https://github.com/huggingface/pytorch-image-models}} } ``` ```bibtex @article{He2015, author = {Kaiming He and Xiangyu Zhang and Shaoqing Ren and Jian Sun}, title = {Deep Residual Learning for Image Recognition}, journal = {arXiv preprint arXiv:1512.03385}, year = {2015} } ```
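The results table reports, per checkpoint and image size, top-1/top-5 accuracy, parameter count (M), GMACs, activations (M), and evaluation throughput (images/sec). One common way to read it is as an accuracy-vs-compute trade-off. The snippet below is an illustrative helper (not part of timm) that ranks a few rows transcribed from the table above by top-1 accuracy per GMAC; the `Row` container and the three sample entries are assumptions for the example, copied from the table.

```python
from dataclasses import dataclass


@dataclass
class Row:
    """One row of the results table (subset of its columns)."""
    model: str
    img_size: int
    top1: float    # ImageNet-1k top-1 accuracy (%)
    gmacs: float   # multiply-accumulate operations, in billions

# A few rows transcribed from the table above.
ROWS = [
    Row("resnet50.a1_in1k", 224, 80.38, 4.1),
    Row("resnet152.a1h_in1k", 288, 83.46, 19.1),
    Row("resnet18.tv_in1k", 224, 69.76, 1.8),
]


def rank_by_accuracy_per_gmac(rows):
    """Sort rows by top-1 accuracy per GMAC, most compute-efficient first."""
    return sorted(rows, key=lambda r: r.top1 / r.gmacs, reverse=True)


if __name__ == "__main__":
    for r in rank_by_accuracy_per_gmac(ROWS):
        print(f"{r.model:>22} @ {r.img_size}: {r.top1 / r.gmacs:.2f} top-1/GMAC")
```

As the table suggests, small models dominate on this metric (resnet18 first, resnet152 last); it is only one axis of the trade-off, so pair it with the throughput and activation columns when picking a checkpoint.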
-0.00679779052734375, 0.0247955322265625, 0.06298828125, -0.05975341796875, -0.021392822265625, 0.0007357597351074219, -0.035919189453125, 0.00982666015625, 0.0218505859375, -0.03460693359375, -0.00836944580078125, 0.03619384765625, 0.031219482421875, 0.053497314453125, 0.0058135986328125, 0.011566162109375, -0.034210205078125, 0.042266845703125, -0.0025501251220703125, 0.025604248046875, 0.016845703125, -0.0224609375, 0.0577392578125, 0.0404052734375, -0.0296173095703125, -0.076904296875, -0.01340484619140625, -0.097412109375, -0.00423431396484375, 0.04876708984375, -0.00624847412109375, -0.03387451171875, 0.032623291015625, -0.032196044921875, 0.038360595703125, -0.0165557861328125, 0.0211944580078125, 0.017608642578125, -0.0260009765625, -0.027740478515625, -0.040496826171875, 0.045684814453125, 0.027130126953125, -0.0513916015625, -0.0303192138671875, 0.00011897087097167969, 0.0232391357421875, 0.01473236083984375, 0.056549072265625, -0.02862548828125, 0.0101470947265625, -0.008544921875, 0.01873779296875, -0.0036334991455078125, 0.0113372802734375, -0.024200439453125, -0.00963592529296875, -0.01739501953125, -0.04827880859375 ] ]
bigscience/bloom-7b1
2023-02-10T16:36:00.000Z
[ "transformers", "pytorch", "jax", "bloom", "text-generation", "ak", "ar", "as", "bm", "bn", "ca", "code", "en", "es", "eu", "fon", "fr", "gu", "hi", "id", "ig", "ki", "kn", "lg", "ln", "ml", "mr", "ne", "nso", "ny", "or", "pa", "pt", "rn", "rw", "sn", "st", "sw", "ta", "te", "tn", "ts", "tum", "tw", "ur", "vi", "wo", "xh", "yo", "zh", "zhs", "zht", "zu", "arxiv:1909.08053", "arxiv:2110.02861", "arxiv:2108.12409", "license:bigscience-bloom-rail-1.0", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
text-generation
bigscience
null
null
bigscience/bloom-7b1
138
104,541
transformers
2022-05-19T11:53:18
--- license: bigscience-bloom-rail-1.0 language: - ak - ar - as - bm - bn - ca - code - en - es - eu - fon - fr - gu - hi - id - ig - ki - kn - lg - ln - ml - mr - ne - nso - ny - or - pa - pt - rn - rw - sn - st - sw - ta - te - tn - ts - tum - tw - ur - vi - wo - xh - yo - zh - zhs - zht - zu pipeline_tag: text-generation --- <h1 style='text-align: center '>BLOOM LM</h1> <h2 style='text-align: center '><em>BigScience Large Open-science Open-access Multilingual Language Model</em> </h2> <h3 style='text-align: center '>Model Card</h3> <img src="https://s3.amazonaws.com/moonup/production/uploads/1657124309515-5f17f0a0925b9863e28ad517.png" alt="BigScience Logo" width="800" style="margin-left:'auto' margin-right:'auto' display:'block'"/> Version 1.0 / 26.May.2022 ## Table of Contents 1. [Model Details](#model-details) 2. [Uses](#uses) 3. [Training Data](#training-data) 4. [Risks and Limitations](#risks-and-limitations) 5. [Evaluation](#evaluation) 6. [Recommendations](#recommendations) 7. [Glossary and Calculations](#glossary-and-calculations) 8. [More Information](#more-information) 9. [Model Card Authors](#model-card-authors) ## Model Details ### Basics *This section provides information for anyone who wants to know about the model.* <details> <summary>Click to expand</summary> <br/> **Developed by:** BigScience ([website](https://bigscience.huggingface.co)) * All collaborators are either volunteers or have an agreement with their employer. *(Further breakdown of participants forthcoming.)* **Model Type:** Transformer-based Language Model **Version:** 1.0.0 **Languages:** Multiple; see [training data](#training-data) **License:** RAIL License v1.0 ([link](https://huggingface.co/spaces/bigscience/license)) **Release Date Estimate:** Monday, 11.July.2022 **Send Questions to:** bigscience-contact@googlegroups.com **Cite as:** BigScience, _BigScience Language Open-science Open-access Multilingual (BLOOM) Language Model_. 
International, May 2021-May 2022 **Funded by:** * The French government. * Hugging Face ([website](https://huggingface.co)). * Organizations of contributors. *(Further breakdown of organizations forthcoming.)* </details> ### Technical Specifications *This section provides information for people who work on model development.* <details> <summary>Click to expand</summary><br/> Please see [the BLOOM training README](https://github.com/bigscience-workshop/bigscience/tree/master/train/tr11-176B-ml#readme) for full details on replicating training. **Model Architecture:** Modified from Megatron-LM GPT2 (see [paper](https://arxiv.org/abs/1909.08053), [BLOOM Megatron code](https://github.com/bigscience-workshop/Megatron-DeepSpeed)): * Decoder-only architecture * Layer normalization applied to word embeddings layer (`StableEmbedding`; see [code](https://github.com/facebookresearch/bitsandbytes), [paper](https://arxiv.org/pdf/2110.02861.pdf)) * ALiBI positional encodings (see [paper](https://arxiv.org/pdf/2108.12409.pdf)), with GeLU activation functions * 7,069,016,064 parameters: * 1,027,604,480 embedding parameters * 30 layers, 32 attention heads * Hidden layers are 4096-dimensional * Sequence length of 2048 tokens used (see [BLOOM tokenizer](https://huggingface.co/bigscience/tokenizer), [tokenizer description](#tokenization)) **Objective Function:** Cross Entropy with mean reduction (see [API documentation](https://pytorch.org/docs/stable/generated/torch.nn.CrossEntropyLoss.html#torch.nn.CrossEntropyLoss)). **Compute infrastructure:** Jean Zay Public Supercomputer, provided by the French government (see [announcement](https://www.enseignementsup-recherche.gouv.fr/fr/signature-du-marche-d-acquisition-de-l-un-des-supercalculateurs-les-plus-puissants-d-europe-46733)). 
* Hardware: 384 A100 80GB GPUs (48 nodes): * Additional 32 A100 80GB GPUs (4 nodes) in reserve * 8 GPUs per node Using NVLink 4 inter-gpu connects, 4 OmniPath links * CPU: AMD * CPU memory: 512GB per node * GPU memory: 640GB per node * Inter-node connect: Omni-Path Architecture (OPA) * NCCL-communications network: a fully dedicated subnet * Disc IO network: shared network with other types of nodes * Software: * Megatron-DeepSpeed ([Github link](https://github.com/bigscience-workshop/Megatron-DeepSpeed)) * DeepSpeed ([Github link](https://github.com/microsoft/DeepSpeed)) * PyTorch (pytorch-1.11 w/ CUDA-11.5; see [Github link](https://github.com/pytorch/pytorch)) * apex ([Github link](https://github.com/NVIDIA/apex)) #### **Training** Training logs: [Tensorboard link](https://huggingface.co/tensorboard/bigscience/tr11c-2B5-logs) - Number of epochs: 1 (*current target*) - Dates: - Started 11th March, 2022 11:42am PST - Ended 5th July, 2022 - Estimated cost of training: Equivalent of $2-5M in cloud computing (including preliminary experiments) - Server training location: Île-de-France, France #### **Tokenization** The BLOOM tokenizer ([link](https://huggingface.co/bigscience/tokenizer)) is a learned subword tokenizer trained using: - A byte-level Byte Pair Encoding (BPE) algorithm - A simple pre-tokenization rule, no normalization - A vocabulary size of 250,680 It was trained on a subset of a preliminary version of the corpus using alpha-weighting per language. </details> ### Environmental Impact <details> <summary>Click to expand</summary><br/> The training supercomputer, Jean Zay ([website](http://www.idris.fr/eng/jean-zay/jean-zay-presentation-eng.html)), uses mostly nuclear energy. The heat generated by it is reused for heating campus housing. 
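Returning to the tokenizer just described: a byte-level BPE learner repeatedly merges the most frequent adjacent symbol pair into a new vocabulary entry. A miniature sketch with a toy character-level corpus (illustrative only — the data and function names are not from the BLOOM tokenizer code, which operates on bytes and a far larger corpus):

```python
from collections import Counter

def most_frequent_pair(words):
    """Count adjacent symbol pairs across the corpus and return the most frequent one."""
    pairs = Counter()
    for word, freq in words.items():
        for a, b in zip(word, word[1:]):
            pairs[(a, b)] += freq
    return pairs.most_common(1)[0][0]

def merge(words, pair):
    """Apply one BPE merge: replace every occurrence of `pair` with its concatenation."""
    merged = {}
    for word, freq in words.items():
        out, i = [], 0
        while i < len(word):
            if tuple(word[i:i + 2]) == pair:
                out.append(word[i] + word[i + 1])
                i += 2
            else:
                out.append(word[i])
                i += 1
        merged[tuple(out)] = merged.get(tuple(out), 0) + freq
    return merged

# Words start as symbol sequences; each merge grows the subword vocabulary.
corpus = {("l", "o", "w"): 5, ("l", "o", "w", "e", "r"): 2, ("l", "o", "w", "e", "s", "t"): 3}
for _ in range(2):
    corpus = merge(corpus, most_frequent_pair(corpus))
print(sorted(corpus))  # → [('low',), ('low', 'e', 'r'), ('low', 'e', 's', 't')]
```

The real tokenizer runs this merge loop until the vocabulary reaches its target size (250,680 entries for BLOOM).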
**Estimated carbon emissions:** *(Forthcoming upon completion of training.)* **Estimated electricity usage:** *(Forthcoming upon completion of training.)* </details> <p>&nbsp;</p> ## Uses *This section addresses questions around how the model is intended to be used, discusses the foreseeable users of the model (including those affected by the model), and describes uses that are considered out of scope or misuse of the model. It provides information for anyone considering using the model or who is affected by the model.* <details> <summary>Click to expand</summary><br/> ### Intended Use This model is being created in order to enable public research on large language models (LLMs). LLMs are intended to be used for language generation or as a pretrained base model that can be further fine-tuned for specific tasks. Use cases below are not exhaustive. #### **Direct Use** - Text generation - Exploring characteristics of language generated by a language model - Examples: Cloze tests, counterfactuals, generations with reframings #### **Downstream Use** - Tasks that leverage language models include: Information Extraction, Question Answering, Summarization ### Misuse and Out-of-scope Use *This section addresses what users ought not do with the model.* See the [BLOOM License](https://huggingface.co/spaces/bigscience/license), Attachment A, for detailed usage restrictions. The below list is non-exhaustive, but lists some easily foreseeable problematic use cases. #### **Out-of-scope Uses** Using the model in [high-stakes](#high-stakes) settings is out of scope for this model.  The model is not designed for [critical decisions](#critical-decisions) nor uses with any material consequences on an individual's livelihood or wellbeing. The model outputs content that appears factual but is not correct. 
##### Out-of-scope Uses Include: - Usage in biomedical domains, political and legal domains, or finance domains - Usage for evaluating or scoring individuals, such as for employment, education, or credit - Applying the model for critical automatic decisions, generating factual content, creating reliable summaries, or generating predictions that must be correct #### **Misuse** Intentionally using the model for harm, violating [human rights](#human-rights), or other kinds of malicious activities, is a misuse of this model. This includes: - Spam generation - Disinformation and influence operations - Disparagement and defamation - Harassment and abuse - [Deception](#deception) - Unconsented impersonation and imitation - Unconsented surveillance - Generating content without attribution to the model, as specified in the [RAIL License, Use Restrictions](https://huggingface.co/spaces/bigscience/license) ### Intended Users #### **Direct Users** - General Public - Researchers - Students - Educators - Engineers/developers - Non-commercial entities - Community advocates, including human and civil rights groups #### Indirect Users - Users of derivatives created by Direct Users, such as those using software with an [intended use](#intended-use) - Users of [Derivatives of the Model, as described in the License](https://huggingface.co/spaces/bigscience/license) #### Others Affected (Parties Prenantes) - People and groups referred to by the LLM - People and groups exposed to outputs of, or decisions based on, the LLM - People and groups whose original work is included in the LLM </details> <p>&nbsp;</p> ## Training Data *This section provides a high-level overview of the training data. It is relevant for anyone who wants to know the basics of what the model is learning.* <details> <summary>Click to expand</summary><br/> Details for each dataset are provided in individual [Data Cards](https://huggingface.co/spaces/bigscience/BigScienceCorpus). 
Training data includes: - 45 natural languages - 12 programming languages - In 1.5TB of pre-processed text, converted into 350B unique tokens (see [the tokenizer section](#tokenization) for more.) #### **Languages** The pie chart shows the distribution of languages in training data. ![pie chart showing the distribution of languages in training data](https://github.com/bigscience-workshop/model_card/blob/main/assets/data/pie_chart.svg?raw=true) The following table shows the further distribution of Niger-Congo and Indic languages in the training data. <details> <summary>Click to expand</summary><br/> | Niger Congo | Percentage | | Indic | Percentage | |----------------|------------ |------ |-----------|------------| | Chi Tumbuka | 0.00002 | | Assamese | 0.01 | | Kikuyu | 0.00004 | | Odia | 0.04 | | Bambara | 0.00004 | | Gujarati | 0.04 | | Akan | 0.00007 | | Marathi | 0.05 | | Xitsonga | 0.00007 | | Punjabi | 0.05 | | Sesotho | 0.00007 | | Kannada | 0.06 | | Chi Chewa | 0.0001 | | Nepali | 0.07 | | Setswana | 0.0002 | | Telugu | 0.09 | | Northern Sotho | 0.0002 | | Malayalam | 0.10 | | Fon | 0.0002 | | Urdu | 0.10 | | Kirundi | 0.0003 | | Tamil | 0.20 | | Wolof | 0.0004 | | Bengali | 0.50 | | Kuganda | 0.0004 | | Hindi | 0.70 | | Chi Shona | 0.001 | | Isi Zulu | 0.001 | | Igbo | 0.001 | | Xhosa | 0.001 | | Kinyarwanda | 0.003 | | Yoruba | 0.006 | | Swahili | 0.02 | </details> The following table shows the distribution of programming languages. 
<details> <summary>Click to expand</summary><br/> | Extension | Language | Number of files | |----------------|------------|-----------------| | java | Java | 5,407,724 | | php | PHP | 4,942,186 | | cpp | C++ | 2,503,930 | | py | Python | 2,435,072 | | js | JavaScript | 1,905,518 | | cs | C# | 1,577,347 | | rb | Ruby | 678,413 | | cc | C++ | 443,054 | | hpp | C++ | 391,048 | | lua | Lua | 352,317 | | go | GO | 227,763 | | ts | TypeScript | 195,254 | | C | C | 134,537 | | scala | Scala | 92,052 | | hh | C++ | 67,161 | | H | C++ | 55,899 | | tsx | TypeScript | 33,107 | | rs | Rust | 29,693 | | phpt | PHP | 9,702 | | c++ | C++ | 1,342 | | h++ | C++ | 791 | | php3 | PHP | 540 | | phps | PHP | 270 | | php5 | PHP | 166 | | php4 | PHP | 29 | </details> </details> <p>&nbsp;</p> ## Risks and Limitations *This section identifies foreseeable harms and misunderstandings.* <details> <summary>Click to expand</summary><br/> Model may: - Overrepresent some viewpoints and underrepresent others - Contain stereotypes - Contain [personal information](#personal-data-and-information) - Generate: - Hateful, abusive, or violent language - Discriminatory or prejudicial language - Content that may not be appropriate for all settings, including sexual content - Make errors, including producing incorrect information as if it were factual - Generate irrelevant or repetitive outputs </details> <p>&nbsp;</p> ## Evaluation *This section describes the evaluation protocols and provides the results.* <details> <summary>Click to expand</summary><br/> ### Metrics *This section describes the different ways performance is calculated and why.* Includes: | Metric | Why chosen | |--------------------|--------------------------------------------------------------------| | [Perplexity](#perplexity) | Standard metric for quantifying model improvements during training | | Cross Entropy [Loss](#loss) | Standard objective for language models. | And multiple different metrics for specific tasks. 
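The two metrics in the table above are tightly linked: perplexity is the exponential of the mean cross-entropy loss, so a model that assigns probability 1 to every correct token has loss 0 and perplexity 1. A minimal sketch of that relationship (toy probabilities for illustration; not part of the original card):

```python
import math

def cross_entropy(probs_of_correct_token):
    """Mean negative log-likelihood of the correct token at each position."""
    return -sum(math.log(p) for p in probs_of_correct_token) / len(probs_of_correct_token)

def perplexity(loss):
    """Perplexity is exp(mean cross-entropy loss)."""
    return math.exp(loss)

# A model that is always 100% confident in the correct token: loss 0, perplexity 1.
print(perplexity(cross_entropy([1.0, 1.0, 1.0])))  # → 1.0

# Toy probabilities assigned to the correct tokens of a 3-token sequence;
# perplexity is the inverse geometric mean of those probabilities.
loss = cross_entropy([0.5, 0.25, 0.125])
print(round(perplexity(loss), 6))  # → 4.0
```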
_(More evaluation metrics forthcoming upon completion of evaluation protocol.)_ ### Factors *This section lists some different aspects of BLOOM models. Its focus is on those aspects that are likely to give rise to high variance in model behavior.* - Language, such as English or Yoruba - Domain, such as newswire or stories - Demographic characteristics, such as gender or nationality ### Results *Results are based on the [Factors](#factors) and [Metrics](#metrics).* **Train-time Evaluation:** As of 25.May.2022, 15:00 PST: - Training Loss: 2.3 - Validation Loss: 2.9 - Perplexity: 16 </details> <p>&nbsp;</p> ## Recommendations *This section provides information on warnings and potential mitigations.* <details> <summary>Click to expand</summary><br/> - Indirect users should be made aware when the content they're working with is created by the LLM. - Users should be aware of [Risks and Limitations](#risks-and-limitations), and include an appropriate age disclaimer or blocking interface as necessary. - Models pretrained with the LLM should include an updated Model Card. - Users of the model should provide mechanisms for those affected to provide feedback, such as an email address for comments. </details> <p>&nbsp;</p> ## Glossary and Calculations *This section defines common terms and how metrics are calculated.* <details> <summary>Click to expand</summary><br/> - <a name="loss">**Loss:**</a> A calculation of the difference between what the model has learned and what the data shows ("groundtruth"). The lower the loss, the better. The training process aims to minimize the loss. - <a name="perplexity">**Perplexity:**</a> This is based on what the model estimates the probability of new data is. The lower the perplexity, the better. If the model is 100% correct at predicting the next token it will see, then the perplexity is 1. Mathematically this is calculated using entropy. 
- <a name="high-stakes">**High-stakes settings:**</a> Such as those identified as "high-risk AI systems" and "unacceptable risk AI systems" in the European Union's proposed [Artificial Intelligence (AI) Act](https://artificialintelligenceact.eu/annexes/). - <a name="critical-decisions">**Critical decisions:**</a> Such as those defined in [the United States' proposed Algorithmic Accountability Act](https://www.congress.gov/117/bills/s3572/BILLS-117s3572is.pdf). - <a name="human-rights">**Human rights:**</a> Includes those rights defined in the [Universal Declaration of Human Rights](https://www.un.org/sites/un2.un.org/files/2021/03/udhr.pdf). - <a name="personal-data-and-information">**Personal Data and Personal Information:**</a> Personal data and information is defined in multiple data protection regulations, such as "[personal data](https://gdpr-info.eu/issues/personal-data/)" in the [European Union's General Data Protection Regulation](https://gdpr-info.eu); and "personal information" in the Republic of South Africa's [Protection of Personal Information Act](https://www.gov.za/sites/default/files/gcis_document/201409/3706726-11act4of2013popi.pdf), The People's Republic of China's [Personal information protection law](http://en.npc.gov.cn.cdurl.cn/2021-12/29/c_694559.htm). 
- <a name="sensitive-characteristics">**Sensitive characteristics:**</a> This includes specifically protected categories in human rights (see [UHDR, Article 2](https://www.un.org/sites/un2.un.org/files/2021/03/udhr.pdf)) and personal information regulation (see GDPR, [Article 9; Protection of Personal Information Act, Chapter 1](https://www.gov.za/sites/default/files/gcis_document/201409/3706726-11act4of2013popi.pdf)) - <a name="deception">**Deception:**</a> Doing something to intentionally mislead individuals to believe something that is false, such as by creating deadbots or chatbots on social media posing as real people, or generating text documents without making consumers aware that the text is machine generated. </details> <p>&nbsp;</p> ## More Information <details> <summary>Click to expand</summary><br/> ### Dataset Creation Blog post detailing the design choices during the dataset creation: https://bigscience.huggingface.co/blog/building-a-tb-scale-multilingual-dataset-for-language-modeling ### Technical Specifications Blog post summarizing how the architecture, size, shape, and pre-training duration were selected: https://bigscience.huggingface.co/blog/what-language-model-to-train-if-you-have-two-million-gpu-hours More details on the architecture/optimizer: https://github.com/bigscience-workshop/bigscience/tree/master/train/tr11-176B-ml Blog post on the hardware/engineering side: https://bigscience.huggingface.co/blog/which-hardware-to-train-a-176b-parameters-model Details on the distributed setup used for the training: https://github.com/bigscience-workshop/bigscience/tree/master/train/tr11-176B-ml Tensorboard updated during the training: https://huggingface.co/bigscience/tr11-176B-ml-logs/tensorboard#scalars&tagFilter=loss Insights on how to approach training, negative results: https://github.com/bigscience-workshop/bigscience/blob/master/train/lessons-learned.md Details on the obstacles overcome during the preparation on the engineering side 
(instabilities, optimization of training throughput, so many technical tricks and questions): https://github.com/bigscience-workshop/bigscience/blob/master/train/tr11-176B-ml/chronicles.md ### Initial Results Initial prompting experiments using interim checkpoints: https://huggingface.co/spaces/bigscience/bloom-book </details> <p>&nbsp;</p> ## Model Card Authors *Ordered roughly chronologically and by amount of time spent.* Margaret Mitchell, Giada Pistilli, Yacine Jernite, Ezinwanne Ozoani, Marissa Gerchick, Nazneen Rajani, Sasha Luccioni, Irene Solaiman, Maraim Masoud, Somaieh Nikpoor, Carlos Muñoz Ferrandis, Stas Bekman, Christopher Akiki, Danish Contractor, David Lansky, Angelina McMillan-Major, Tristan Thrush, Suzana Ilić, Gérard Dupont, Shayne Longpre, Manan Dey, Stella Biderman, Douwe Kiela, Emi Baylor, Teven Le Scao, Aaron Gokaslan, Julien Launay, Niklas Muennighoff
20,561
[ [ -0.0291900634765625, -0.0543212890625, 0.033203125, 0.0086822509765625, -0.006114959716796875, -0.01422882080078125, -0.03656005859375, -0.042205810546875, 0.00682830810546875, 0.0172882080078125, -0.028228759765625, -0.037750244140625, -0.051544189453125, 0.004779815673828125, -0.02569580078125, 0.0770263671875, 0.0037441253662109375, 0.01287078857421875, 0.00040149688720703125, 0.00882720947265625, -0.010833740234375, -0.049560546875, -0.041961669921875, -0.02874755859375, 0.044189453125, 0.0243988037109375, 0.052642822265625, 0.050323486328125, 0.04815673828125, 0.0201873779296875, -0.0330810546875, -0.0061492919921875, -0.04144287109375, -0.028656005859375, -0.02020263671875, -0.0146636962890625, -0.04913330078125, -0.0037384033203125, 0.06719970703125, 0.048675537109375, 0.0019702911376953125, 0.024749755859375, -0.001117706298828125, 0.03997802734375, -0.042694091796875, 0.025054931640625, -0.039764404296875, 0.00550079345703125, -0.0172882080078125, 0.0222625732421875, -0.0244903564453125, 0.00008565187454223633, 0.004306793212890625, -0.03607177734375, 0.01253509521484375, 0.0004982948303222656, 0.071533203125, -0.002964019775390625, -0.00818634033203125, -0.00634002685546875, -0.058807373046875, 0.064208984375, -0.065185546875, 0.043609619140625, 0.026397705078125, 0.0179443359375, 0.003265380859375, -0.07037353515625, -0.058441162109375, -0.0145263671875, 0.005107879638671875, 0.0282745361328125, -0.01497650146484375, 0.0047607421875, 0.0277557373046875, 0.044921875, -0.044769287109375, 0.0186309814453125, -0.0384521484375, -0.006000518798828125, 0.04949951171875, 0.00249481201171875, 0.012420654296875, -0.01788330078125, -0.01459503173828125, -0.02764892578125, -0.046783447265625, -0.0159912109375, 0.0167083740234375, 0.0345458984375, -0.0277557373046875, 0.050872802734375, 0.01319122314453125, 0.0302886962890625, -0.0209808349609375, 0.0009899139404296875, 0.03631591796875, -0.0411376953125, -0.028411865234375, -0.0153656005859375, 0.0750732421875, 
0.01486968994140625, -0.0020503997802734375, -0.00775909423828125, -0.00634002685546875, -0.0157928466796875, 0.0021877288818359375, -0.071533203125, -0.00943756103515625, 0.03143310546875, -0.02337646484375, -0.00859832763671875, 0.0011053085327148438, -0.0782470703125, -0.00585174560546875, -0.01103973388671875, 0.034393310546875, -0.04974365234375, -0.034393310546875, 0.0209808349609375, -0.0062713623046875, 0.01142120361328125, 0.0073394775390625, -0.06390380859375, 0.01995849609375, 0.0439453125, 0.078125, -0.00679779052734375, -0.046173095703125, -0.0013246536254882812, 0.004825592041015625, -0.002788543701171875, 0.0164031982421875, -0.0165863037109375, -0.0418701171875, -0.00547027587890625, 0.0125885009765625, -0.004718780517578125, -0.0159149169921875, 0.04345703125, -0.0305938720703125, 0.02642822265625, -0.022796630859375, -0.04437255859375, -0.002964019775390625, 0.00238037109375, -0.046630859375, 0.0799560546875, 0.0016565322875976562, -0.065185546875, 0.0122222900390625, -0.07586669921875, -0.013580322265625, 0.0038089752197265625, 0.0103912353515625, -0.04461669921875, -0.00980377197265625, 0.00820159912109375, 0.0300140380859375, -0.021575927734375, 0.0217132568359375, -0.00916290283203125, -0.01055908203125, 0.007476806640625, -0.0272064208984375, 0.060943603515625, 0.028717041015625, -0.042999267578125, 0.0042724609375, -0.049224853515625, -0.0089874267578125, 0.0276641845703125, -0.029754638671875, 0.01323699951171875, -0.00942230224609375, 0.0303192138671875, 0.01617431640625, 0.0213470458984375, -0.0531005859375, 0.022705078125, -0.046844482421875, 0.05401611328125, 0.045318603515625, -0.01050567626953125, 0.024169921875, -0.01512908935546875, 0.025909423828125, 0.0057220458984375, 0.024383544921875, -0.018768310546875, -0.047576904296875, -0.059417724609375, -0.035308837890625, 0.0312042236328125, 0.036102294921875, -0.03466796875, 0.047454833984375, -0.03619384765625, -0.058258056640625, -0.03179931640625, -0.0052337646484375, 
0.042816162109375, 0.0251312255859375, 0.0543212890625, -0.0055999755859375, -0.04144287109375, -0.057281494140625, 0.0052642822265625, 0.005970001220703125, 0.01538848876953125, 0.0281982421875, 0.07843017578125, -0.036773681640625, 0.0623779296875, -0.0404052734375, -0.004764556884765625, -0.018157958984375, -0.0007719993591308594, 0.02386474609375, 0.040283203125, 0.03326416015625, -0.0562744140625, -0.019683837890625, 0.0026874542236328125, -0.051055908203125, 0.025634765625, 0.0188446044921875, 0.00016129016876220703, 0.020111083984375, 0.037750244140625, -0.06414794921875, 0.021820068359375, 0.05291748046875, -0.00998687744140625, 0.054229736328125, -0.0189056396484375, -0.00751495361328125, -0.10333251953125, 0.037445068359375, 0.0018014907836914062, 0.0021190643310546875, -0.03778076171875, 0.018280029296875, -0.005397796630859375, -0.031341552734375, -0.0477294921875, 0.05859375, -0.030487060546875, 0.005397796630859375, -0.0117340087890625, -0.0017499923706054688, -0.009246826171875, 0.0250244140625, 0.0085906982421875, 0.068359375, 0.052490234375, -0.046142578125, 0.01097869873046875, 0.01105499267578125, -0.012908935546875, 0.004932403564453125, -0.0657958984375, 0.00799560546875, -0.00843048095703125, 0.021575927734375, -0.055328369140625, -0.023345947265625, 0.0157623291015625, -0.042083740234375, 0.03240966796875, -0.000629425048828125, -0.060791015625, -0.0518798828125, -0.016021728515625, 0.02935791015625, 0.045257568359375, -0.0341796875, 0.0283660888671875, 0.023468017578125, 0.0122833251953125, -0.03759765625, -0.06768798828125, 0.00814056396484375, -0.01087188720703125, -0.045257568359375, 0.03973388671875, -0.01067352294921875, -0.005428314208984375, 0.00501251220703125, 0.0183258056640625, 0.004512786865234375, 0.0021076202392578125, 0.02423095703125, 0.00943756103515625, -0.012908935546875, 0.0260162353515625, -0.0211181640625, -0.0015535354614257812, -0.005970001220703125, -0.04388427734375, 0.052398681640625, -0.0205078125, 
-0.0249176025390625, -0.036529541015625, 0.02142333984375, 0.050933837890625, -0.0200958251953125, 0.0811767578125, 0.06085205078125, -0.04498291015625, 0.00426483154296875, -0.035888671875, -0.026519775390625, -0.035552978515625, 0.050811767578125, -0.0027027130126953125, -0.07110595703125, 0.03094482421875, 0.007244110107421875, 0.0134429931640625, 0.05596923828125, 0.0533447265625, 0.0095977783203125, 0.06097412109375, 0.04937744140625, -0.016815185546875, 0.041351318359375, -0.04833984375, 0.02392578125, -0.06744384765625, -0.00958251953125, -0.036956787109375, -0.00728607177734375, -0.045318603515625, -0.049224853515625, 0.022705078125, 0.0144195556640625, -0.038970947265625, 0.033203125, -0.0408935546875, 0.0204315185546875, 0.0440673828125, 0.0029239654541015625, 0.004669189453125, 0.00562286376953125, -0.0132904052734375, -0.005016326904296875, -0.05743408203125, -0.046142578125, 0.0970458984375, 0.0506591796875, 0.03192138671875, 0.004512786865234375, 0.04827880859375, -0.00010472536087036133, 0.01523590087890625, -0.050323486328125, 0.04083251953125, -0.007122039794921875, -0.06072998046875, -0.0286865234375, -0.043182373046875, -0.08636474609375, 0.013916015625, -0.0220794677734375, -0.0740966796875, 0.0012989044189453125, 0.0181732177734375, -0.019195556640625, 0.052032470703125, -0.05859375, 0.0723876953125, -0.021026611328125, -0.034881591796875, -0.019195556640625, -0.042236328125, 0.0204315185546875, 0.0012159347534179688, 0.0295257568359375, 0.01543426513671875, 0.0139007568359375, 0.0589599609375, -0.038726806640625, 0.0701904296875, -0.01074981689453125, 0.005817413330078125, 0.0214385986328125, -0.0224151611328125, 0.0340576171875, 0.0018310546875, -0.01342010498046875, 0.04437255859375, -0.0036411285400390625, -0.0263824462890625, -0.0091552734375, 0.059112548828125, -0.08013916015625, -0.0292816162109375, -0.043487548828125, -0.040069580078125, -0.00333404541015625, 0.033599853515625, 0.040069580078125, 0.0157623291015625, -0.0193328857421875, 
0.01788330078125, 0.051361083984375, -0.043731689453125, 0.0299530029296875, 0.025054931640625, -0.0298614501953125, -0.04620361328125, 0.0819091796875, 0.01055145263671875, 0.022796630859375, 0.0261688232421875, 0.0282745361328125, -0.02142333984375, -0.044189453125, -0.024932861328125, 0.038421630859375, -0.04034423828125, -0.00860595703125, -0.0634765625, -0.035552978515625, -0.055511474609375, 0.0097808837890625, -0.042388916015625, -0.018280029296875, -0.040252685546875, -0.01399993896484375, 0.031494140625, 0.0421142578125, -0.0137481689453125, 0.031494140625, -0.048492431640625, 0.0035533905029296875, 0.01531219482421875, 0.0194549560546875, 0.0044708251953125, -0.04541015625, -0.034881591796875, 0.023223876953125, -0.037994384765625, -0.046478271484375, 0.025360107421875, 0.01319122314453125, 0.0305023193359375, 0.0081024169921875, -0.031280517578125, 0.0322265625, -0.039337158203125, 0.08270263671875, 0.0275726318359375, -0.0689697265625, 0.039794921875, -0.0323486328125, 0.0299530029296875, 0.030487060546875, 0.03973388671875, -0.04083251953125, -0.01027679443359375, -0.060546875, -0.08392333984375, 0.04736328125, 0.017974853515625, 0.016326904296875, -0.00868988037109375, 0.030731201171875, -0.0207366943359375, 0.01497650146484375, -0.07159423828125, -0.0232391357421875, -0.025360107421875, -0.01025390625, -0.02655029296875, -0.0192718505859375, -0.0113525390625, -0.0307769775390625, 0.06439208984375, 0.0007357597351074219, 0.04400634765625, 0.01995849609375, -0.00943756103515625, -0.0026760101318359375, 0.01337432861328125, 0.05169677734375, 0.045928955078125, -0.0180816650390625, -0.0024013519287109375, 0.0234527587890625, -0.05902099609375, -0.01216888427734375, 0.0215911865234375, -0.0120849609375, -0.01081085205078125, 0.0268402099609375, 0.06304931640625, 0.0165863037109375, -0.0460205078125, 0.03973388671875, 0.01242828369140625, -0.0297393798828125, -0.0235748291015625, -0.006195068359375, 0.027587890625, 0.0112762451171875, 0.01629638671875, 
-0.012237548828125, -0.0021648406982421875, -0.041290283203125, 0.00556182861328125, 0.034027099609375, -0.0217437744140625, -0.034454345703125, 0.057952880859375, 0.002315521240234375, -0.0205841064453125, 0.037933349609375, -0.025604248046875, -0.039276123046875, 0.0418701171875, 0.0452880859375, 0.0567626953125, -0.00421905517578125, 0.00412750244140625, 0.0570068359375, 0.031982421875, -0.00818634033203125, 0.019073486328125, 0.019195556640625, -0.04498291015625, -0.031005859375, -0.05950927734375, -0.02655029296875, 0.0256805419921875, -0.027587890625, 0.0248260498046875, -0.04449462890625, -0.0203399658203125, 0.007152557373046875, 0.005313873291015625, -0.0604248046875, 0.014678955078125, 0.0240325927734375, 0.083251953125, -0.059112548828125, 0.06524658203125, 0.052032470703125, -0.048431396484375, -0.0648193359375, -0.006168365478515625, -0.0013589859008789062, -0.05377197265625, 0.054412841796875, 0.00814056396484375, 0.00238800048828125, 0.005321502685546875, -0.0518798828125, -0.08099365234375, 0.09234619140625, 0.0253143310546875, -0.04888916015625, 0.001285552978515625, 0.0146026611328125, 0.05352783203125, -0.0015153884887695312, 0.0291900634765625, 0.023101806640625, 0.044403076171875, 0.01558685302734375, -0.07659912109375, -0.00008654594421386719, -0.0171661376953125, -0.002559661865234375, 0.00553131103515625, -0.0660400390625, 0.08477783203125, -0.0114288330078125, -0.01580810546875, 0.0012865066528320312, 0.052764892578125, 0.01041412353515625, 0.00411224365234375, 0.00948333740234375, 0.06756591796875, 0.0599365234375, -0.01226043701171875, 0.08319091796875, -0.038055419921875, 0.04852294921875, 0.0714111328125, -0.00649261474609375, 0.06103515625, 0.0276641845703125, -0.0345458984375, 0.0198974609375, 0.03448486328125, -0.01444244384765625, 0.0186920166015625, 0.021209716796875, -0.00902557373046875, 0.004344940185546875, -0.002956390380859375, -0.052947998046875, 0.0175018310546875, 0.0294036865234375, -0.042694091796875, 
-0.005146026611328125, 0.00833892822265625, 0.016937255859375, -0.019378662109375, -0.0225677490234375, 0.03533935546875, 0.01001739501953125, -0.0380859375, 0.038909912109375, 0.01605224609375, 0.05841064453125, -0.05474853515625, 0.007244110107421875, 0.0008845329284667969, 0.024383544921875, -0.0192108154296875, -0.06085205078125, 0.0113983154296875, -0.00037479400634765625, -0.0191802978515625, -0.01140594482421875, 0.0276947021484375, -0.0195159912109375, -0.05120849609375, 0.0290985107421875, 0.02166748046875, 0.016448974609375, -0.00653076171875, -0.0653076171875, 0.013671875, -0.0193634033203125, -0.0255126953125, 0.023040771484375, 0.018585205078125, 0.018157958984375, 0.040130615234375, 0.051544189453125, 0.017059326171875, 0.007015228271484375, 0.0011749267578125, 0.0704345703125, -0.05279541015625, -0.0151824951171875, -0.0653076171875, 0.03521728515625, -0.00504302978515625, -0.043487548828125, 0.0693359375, 0.062255859375, 0.06695556640625, 0.0023593902587890625, 0.062408447265625, -0.01029205322265625, 0.0263519287109375, -0.032684326171875, 0.0443115234375, -0.049224853515625, -0.004547119140625, -0.038238525390625, -0.0704345703125, -0.0271148681640625, 0.037994384765625, -0.033660888671875, 0.021820068359375, 0.04876708984375, 0.061553955078125, -0.012054443359375, 0.00553131103515625, 0.01544952392578125, 0.034942626953125, 0.035400390625, 0.0399169921875, 0.039642333984375, -0.0418701171875, 0.030853271484375, -0.020172119140625, -0.0155181884765625, -0.025360107421875, -0.0654296875, -0.056549072265625, -0.046600341796875, -0.0262908935546875, -0.037689208984375, 0.0072174072265625, 0.0731201171875, 0.059478759765625, -0.0626220703125, -0.02239990234375, -0.0252838134765625, -0.01248931884765625, -0.00856781005859375, -0.016448974609375, 0.042999267578125, -0.0146484375, -0.054412841796875, 0.018798828125, 0.01346588134765625, 0.0107879638671875, -0.035552978515625, -0.009063720703125, -0.034942626953125, -0.005451202392578125, 
0.047515869140625, 0.04193115234375, -0.04302978515625, -0.01556396484375, 0.01123046875, -0.01531982421875, -0.0009474754333496094, 0.03472900390625, -0.0220947265625, 0.0273895263671875, 0.024688720703125, 0.041961669921875, 0.054443359375, -0.019256591796875, 0.015472412109375, -0.03564453125, 0.0118560791015625, 0.0274200439453125, 0.038543701171875, 0.02783203125, -0.0247802734375, 0.0295867919921875, 0.03131103515625, -0.0557861328125, -0.059814453125, 0.0105133056640625, -0.07293701171875, -0.0230255126953125, 0.11309814453125, -0.006649017333984375, -0.02783203125, 0.01910400390625, -0.01265716552734375, 0.0190887451171875, -0.017852783203125, 0.044647216796875, 0.055328369140625, 0.00890350341796875, -0.00400543212890625, -0.05029296875, 0.026397705078125, 0.02716064453125, -0.06201171875, 0.005847930908203125, 0.0294952392578125, 0.0263671875, 0.0305633544921875, 0.03594970703125, -0.0173492431640625, 0.006587982177734375, -0.0029850006103515625, 0.032806396484375, -0.008544921875, -0.019256591796875, -0.030181884765625, -0.002475738525390625, 0.005519866943359375, -0.00428009033203125 ] ]
sentence-transformers/paraphrase-mpnet-base-v2
2022-06-15T19:23:23.000Z
[ "sentence-transformers", "pytorch", "tf", "mpnet", "feature-extraction", "sentence-similarity", "transformers", "arxiv:1908.10084", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
sentence-similarity
sentence-transformers
null
null
sentence-transformers/paraphrase-mpnet-base-v2
28
104,419
sentence-transformers
2022-03-02T23:29:05
--- pipeline_tag: sentence-similarity license: apache-2.0 tags: - sentence-transformers - feature-extraction - sentence-similarity - transformers --- # sentence-transformers/paraphrase-mpnet-base-v2 This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for tasks like clustering or semantic search. ## Usage (Sentence-Transformers) Using this model is straightforward once you have [sentence-transformers](https://www.SBERT.net) installed: ``` pip install -U sentence-transformers ``` Then you can use the model like this: ```python from sentence_transformers import SentenceTransformer sentences = ["This is an example sentence", "Each sentence is converted"] model = SentenceTransformer('sentence-transformers/paraphrase-mpnet-base-v2') embeddings = model.encode(sentences) print(embeddings) ``` ## Usage (HuggingFace Transformers) Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first, pass your input through the transformer model, then apply the right pooling operation on top of the contextualized word embeddings. 
```python from transformers import AutoTokenizer, AutoModel import torch #Mean Pooling - Take attention mask into account for correct averaging def mean_pooling(model_output, attention_mask): token_embeddings = model_output[0] #First element of model_output contains all token embeddings input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float() return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9) # Sentences we want sentence embeddings for sentences = ['This is an example sentence', 'Each sentence is converted'] # Load model from HuggingFace Hub tokenizer = AutoTokenizer.from_pretrained('sentence-transformers/paraphrase-mpnet-base-v2') model = AutoModel.from_pretrained('sentence-transformers/paraphrase-mpnet-base-v2') # Tokenize sentences encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt') # Compute token embeddings with torch.no_grad(): model_output = model(**encoded_input) # Perform pooling. In this case, mean pooling. sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask']) print("Sentence embeddings:") print(sentence_embeddings) ``` ## Evaluation Results For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=sentence-transformers/paraphrase-mpnet-base-v2) ## Full Model Architecture ``` SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: MPNetModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False}) ) ``` ## Citing & Authors This model was trained by [sentence-transformers](https://www.sbert.net/). 
If you find this model helpful, feel free to cite our publication [Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks](https://arxiv.org/abs/1908.10084): ```bibtex @inproceedings{reimers-2019-sentence-bert, title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks", author = "Reimers, Nils and Gurevych, Iryna", booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing", month = "11", year = "2019", publisher = "Association for Computational Linguistics", url = "http://arxiv.org/abs/1908.10084", } ```
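The mean-pooled sentence embeddings produced above are typically compared with cosine similarity. Below is a minimal, self-contained sketch of that comparison; it uses small stand-in vectors instead of real model outputs so it runs without downloading the model:

```python
import torch
import torch.nn.functional as F

def cosine_scores(embeddings_a, embeddings_b):
    # L2-normalize each row, then take dot products; entry [i, j] is the
    # cosine similarity between sentence i of embeddings_a and
    # sentence j of embeddings_b.
    a = F.normalize(embeddings_a, p=2, dim=1)
    b = F.normalize(embeddings_b, p=2, dim=1)
    return a @ b.T

# Stand-in embeddings; in practice, pass the `sentence_embeddings`
# tensor computed by `mean_pooling` above.
emb = torch.tensor([[1.0, 0.0, 0.0],
                    [0.0, 1.0, 0.0],
                    [1.0, 1.0, 0.0]])
print(cosine_scores(emb, emb))
```

The diagonal entries are 1 (each sentence compared with itself); off-diagonal entries lie in [-1, 1], with higher values indicating more similar sentences.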
3,699
[ [ -0.017425537109375, -0.042449951171875, 0.032928466796875, 0.0282135009765625, -0.031585693359375, -0.0372314453125, -0.005584716796875, 0.0142059326171875, 0.00824737548828125, 0.0406494140625, -0.031005859375, -0.019561767578125, -0.057037353515625, 0.0053253173828125, -0.03741455078125, 0.061004638671875, -0.01218414306640625, -0.004428863525390625, -0.02740478515625, -0.01396942138671875, -0.00251007080078125, -0.03314208984375, -0.0305023193359375, -0.0241851806640625, 0.0236968994140625, 0.0166778564453125, 0.039093017578125, 0.04278564453125, 0.0318603515625, 0.032501220703125, -0.0072174072265625, 0.0038089752197265625, -0.01354217529296875, -0.00836181640625, -0.00798797607421875, -0.0235748291015625, 0.00003826618194580078, 0.018585205078125, 0.048370361328125, 0.0297698974609375, -0.0116119384765625, 0.0128326416015625, 0.01091766357421875, 0.0166015625, -0.0390625, 0.03643798828125, -0.054351806640625, 0.01401519775390625, 0.004413604736328125, -0.0031585693359375, -0.034027099609375, -0.00225830078125, 0.0195159912109375, -0.0233917236328125, 0.0088958740234375, 0.005443572998046875, 0.07977294921875, 0.0249481201171875, -0.023895263671875, -0.01435089111328125, -0.01428985595703125, 0.06085205078125, -0.07159423828125, 0.004177093505859375, 0.0305938720703125, 0.00601959228515625, 0.0084381103515625, -0.09490966796875, -0.060455322265625, -0.0138397216796875, -0.03814697265625, 0.00879669189453125, -0.02691650390625, -0.000766754150390625, 0.01067352294921875, 0.0120391845703125, -0.04241943359375, -0.0224456787109375, -0.03436279296875, -0.01441192626953125, 0.0246124267578125, 0.0035915374755859375, 0.0278778076171875, -0.053070068359375, -0.0361328125, -0.02471923828125, -0.01165008544921875, -0.007442474365234375, 0.00530242919921875, 0.01081085205078125, -0.0218048095703125, 0.06787109375, -0.01482391357421875, 0.0386962890625, -0.005069732666015625, 0.0082550048828125, 0.04132080078125, -0.031219482421875, -0.018463134765625, 
-0.01165771484375, 0.08709716796875, 0.03778076171875, 0.023101806640625, -0.0089263916015625, -0.01096343994140625, -0.01177215576171875, 0.00881195068359375, -0.065185546875, -0.03436279296875, 0.006160736083984375, -0.038970947265625, -0.028076171875, 0.0155029296875, -0.05987548828125, -0.01209259033203125, -0.0019073486328125, 0.05377197265625, -0.045501708984375, 0.00762176513671875, 0.005950927734375, -0.03326416015625, 0.0240936279296875, -0.033447265625, -0.049102783203125, 0.0147705078125, 0.035919189453125, 0.09124755859375, 0.005527496337890625, -0.0528564453125, -0.0211334228515625, -0.0021953582763671875, 0.0169219970703125, 0.047760009765625, -0.032257080078125, -0.0009593963623046875, -0.0092010498046875, 0.01497650146484375, -0.054595947265625, -0.032958984375, 0.05291748046875, -0.0112457275390625, 0.04473876953125, 0.003032684326171875, -0.05133056640625, -0.00959014892578125, 0.004650115966796875, -0.0270233154296875, 0.07086181640625, 0.00975799560546875, -0.0748291015625, -0.007671356201171875, -0.05157470703125, -0.0204925537109375, -0.01421356201171875, 0.005710601806640625, -0.051361083984375, 0.0087432861328125, 0.0361328125, 0.05780029296875, -0.00412750244140625, 0.0245361328125, -0.01953125, -0.0318603515625, 0.034393310546875, -0.033721923828125, 0.0831298828125, 0.01702880859375, -0.03668212890625, 0.00426483154296875, -0.03314208984375, -0.0072174072265625, 0.018157958984375, -0.01143646240234375, -0.002048492431640625, 0.006229400634765625, 0.015045166015625, 0.0380859375, 0.023406982421875, -0.048370361328125, -0.0032939910888671875, -0.0352783203125, 0.08270263671875, 0.04119873046875, 0.01488494873046875, 0.043792724609375, -0.033447265625, 0.0146026611328125, 0.0245819091796875, 0.00799560546875, -0.011199951171875, -0.03582763671875, -0.0697021484375, -0.01459503173828125, 0.0194244384765625, 0.05084228515625, -0.068603515625, 0.0615234375, -0.035797119140625, -0.037078857421875, -0.049591064453125, -0.0008668899536132812, 
0.0142059326171875, 0.041656494140625, 0.05169677734375, 0.0008025169372558594, -0.044769287109375, -0.08111572265625, -0.007038116455078125, -0.003971099853515625, -0.0038814544677734375, 0.0225982666015625, 0.054962158203125, -0.0194244384765625, 0.0631103515625, -0.03619384765625, -0.019073486328125, -0.031707763671875, 0.0191497802734375, 0.01580810546875, 0.04779052734375, 0.036285400390625, -0.06329345703125, -0.0361328125, -0.03704833984375, -0.0538330078125, -0.00986480712890625, -0.0249481201171875, -0.0169219970703125, -0.00218963623046875, 0.042327880859375, -0.07110595703125, 0.015899658203125, 0.040191650390625, -0.036224365234375, 0.025177001953125, -0.0171051025390625, -0.01033782958984375, -0.10919189453125, 0.01183319091796875, -0.0016117095947265625, -0.00933837890625, -0.0247650146484375, 0.0143585205078125, 0.01280975341796875, -0.017669677734375, -0.035980224609375, 0.0254058837890625, -0.028594970703125, 0.0120086669921875, -0.007427215576171875, 0.0240478515625, -0.003948211669921875, 0.05755615234375, -0.00959014892578125, 0.06488037109375, 0.043365478515625, -0.037506103515625, 0.0277557373046875, 0.037506103515625, -0.0352783203125, 0.01678466796875, -0.0699462890625, -0.0016336441040039062, -0.0010786056518554688, 0.0268707275390625, -0.08465576171875, -0.00849151611328125, 0.03289794921875, -0.041839599609375, -0.006626129150390625, 0.00949859619140625, -0.050750732421875, -0.043670654296875, -0.0271148681640625, 0.00019311904907226562, 0.050445556640625, -0.035430908203125, 0.0377197265625, 0.01519012451171875, -0.01207733154296875, -0.038543701171875, -0.0831298828125, 0.0218048095703125, -0.0206756591796875, -0.041778564453125, 0.03875732421875, -0.002292633056640625, 0.012542724609375, 0.01029205322265625, 0.01471710205078125, -0.0151824951171875, -0.0035953521728515625, -0.0004038810729980469, 0.0021820068359375, -0.0095367431640625, -0.0006155967712402344, 0.01279449462890625, -0.00786590576171875, 0.01397705078125, 
-0.0240020751953125, 0.055877685546875, -0.00997161865234375, -0.00688934326171875, -0.04217529296875, 0.018280029296875, 0.0506591796875, -0.0233154296875, 0.09149169921875, 0.07012939453125, -0.0177001953125, 0.005962371826171875, -0.0311431884765625, -0.01474761962890625, -0.035064697265625, 0.041107177734375, -0.0206451416015625, -0.058319091796875, 0.0260009765625, 0.019317626953125, -0.0045166015625, 0.049835205078125, 0.041412353515625, -0.0126800537109375, 0.060302734375, 0.0258331298828125, -0.0037021636962890625, 0.036224365234375, -0.0322265625, 0.01520538330078125, -0.071044921875, -0.01209259033203125, -0.0229339599609375, -0.0288543701171875, -0.0477294921875, -0.0396728515625, 0.01395416259765625, 0.00039887428283691406, -0.018798828125, 0.054168701171875, -0.036895751953125, 0.02178955078125, 0.05804443359375, 0.0244140625, -0.015655517578125, 0.006771087646484375, -0.041046142578125, -0.001552581787109375, -0.05255126953125, -0.03839111328125, 0.065673828125, 0.0197906494140625, 0.0160675048828125, -0.005825042724609375, 0.053680419921875, -0.0035991668701171875, -0.00780487060546875, -0.04132080078125, 0.0552978515625, -0.0236358642578125, -0.0248870849609375, -0.01593017578125, -0.035003662109375, -0.053802490234375, 0.033416748046875, -0.00679779052734375, -0.059661865234375, 0.0114898681640625, -0.0149688720703125, -0.0162506103515625, 0.0266571044921875, -0.06317138671875, 0.08416748046875, 0.0150146484375, -0.0021381378173828125, -0.0006341934204101562, -0.06573486328125, 0.01511383056640625, 0.00876617431640625, -0.006534576416015625, -0.006565093994140625, -0.016876220703125, 0.06280517578125, -0.031036376953125, 0.061553955078125, -0.0166015625, 0.03265380859375, 0.02764892578125, -0.0204315185546875, 0.027313232421875, -0.010101318359375, -0.01326751708984375, -0.00559234619140625, -0.00534820556640625, -0.036895751953125, -0.046783447265625, 0.050933837890625, -0.06689453125, -0.0277862548828125, -0.027862548828125, -0.0484619140625, 
-0.00047516822814941406, 0.01184844970703125, 0.03265380859375, 0.0257568359375, 0.00762939453125, 0.05426025390625, 0.033782958984375, -0.02099609375, 0.058837890625, -0.00183868408203125, 0.0092010498046875, -0.043212890625, 0.05572509765625, 0.006542205810546875, 0.01105499267578125, 0.044158935546875, 0.0261383056640625, -0.02716064453125, -0.028533935546875, -0.0167083740234375, 0.027313232421875, -0.046142578125, -0.01021575927734375, -0.08648681640625, -0.039642333984375, -0.049102783203125, 0.005855560302734375, -0.0060882568359375, -0.0386962890625, -0.035491943359375, -0.01528167724609375, 0.02294921875, 0.021240234375, -0.0004954338073730469, 0.041961669921875, -0.047393798828125, 0.0178375244140625, 0.0229034423828125, -0.0004723072052001953, -0.003551483154296875, -0.06280517578125, -0.01318359375, 0.0056610107421875, -0.0277557373046875, -0.06585693359375, 0.044525146484375, 0.0233154296875, 0.04656982421875, 0.004558563232421875, 0.0043182373046875, 0.041656494140625, -0.044677734375, 0.07122802734375, 0.002796173095703125, -0.07781982421875, 0.0260772705078125, -0.01432037353515625, 0.036041259765625, 0.0439453125, 0.0251007080078125, -0.037933349609375, -0.023712158203125, -0.05621337890625, -0.07489013671875, 0.051605224609375, 0.0489501953125, 0.04345703125, -0.022491455078125, 0.03045654296875, -0.02880859375, 0.0154266357421875, -0.0875244140625, -0.031494140625, -0.03167724609375, -0.04095458984375, -0.029083251953125, -0.0286712646484375, 0.0146484375, -0.036376953125, 0.053253173828125, 0.00774383544921875, 0.06317138671875, 0.021026611328125, -0.03314208984375, 0.0290069580078125, 0.0145263671875, 0.043975830078125, 0.03106689453125, -0.00531768798828125, 0.0287933349609375, 0.022216796875, -0.024993896484375, 0.002532958984375, 0.038543701171875, -0.0137176513671875, 0.0148773193359375, 0.0271148681640625, 0.071533203125, 0.033966064453125, -0.031219482421875, 0.0587158203125, 0.0026111602783203125, -0.0225067138671875, -0.017669677734375, 
-0.0080413818359375, 0.03204345703125, 0.0291900634765625, 0.0175018310546875, 0.0090484619140625, 0.0029296875, -0.02545166015625, 0.036468505859375, 0.00914764404296875, -0.0251007080078125, -0.0002111196517944336, 0.051361083984375, -0.0029296875, -0.021087646484375, 0.06634521484375, -0.0167999267578125, -0.054534912109375, 0.0377197265625, 0.044403076171875, 0.07904052734375, 0.004062652587890625, 0.02862548828125, 0.0285797119140625, 0.0292816162109375, -0.00994110107421875, -0.006473541259765625, -0.006458282470703125, -0.053314208984375, -0.01554107666015625, -0.05499267578125, 0.0014743804931640625, 0.0003094673156738281, -0.0382080078125, 0.0114288330078125, -0.0138397216796875, 0.0033168792724609375, -0.0048370361328125, -0.015869140625, -0.0343017578125, 0.0002263784408569336, 0.0019464492797851562, 0.060791015625, -0.06524658203125, 0.06573486328125, 0.046966552734375, -0.048919677734375, -0.052276611328125, 0.00957489013671875, -0.028228759765625, -0.06005859375, 0.0325927734375, 0.0291595458984375, 0.0201263427734375, 0.0189208984375, -0.038909912109375, -0.05914306640625, 0.1058349609375, 0.0287322998046875, -0.01739501953125, -0.0216217041015625, 0.0140380859375, 0.033111572265625, -0.03076171875, 0.0218353271484375, 0.03643798828125, 0.0250091552734375, 0.0015125274658203125, -0.052520751953125, 0.017852783203125, -0.02105712890625, 0.01514434814453125, -0.007110595703125, -0.040008544921875, 0.08380126953125, 0.01221466064453125, 0.00018286705017089844, 0.0266265869140625, 0.0584716796875, 0.0179595947265625, -0.00514984130859375, 0.0248870849609375, 0.05780029296875, 0.0301971435546875, -0.0034580230712890625, 0.080810546875, -0.0253753662109375, 0.057586669921875, 0.08050537109375, 0.0134429931640625, 0.08612060546875, 0.04498291015625, -0.0107269287109375, 0.049285888671875, 0.032379150390625, -0.020050048828125, 0.055755615234375, 0.00982666015625, -0.01013946533203125, -0.003498077392578125, 0.010467529296875, -0.01045989990234375, 
0.03759765625, 0.00579071044921875, -0.068603515625, -0.017852783203125, 0.00803375244140625, 0.003787994384765625, -0.0029964447021484375, -0.002811431884765625, 0.0443115234375, 0.0178680419921875, -0.0460205078125, 0.0256805419921875, 0.0208740234375, 0.0675048828125, -0.032379150390625, 0.01332855224609375, -0.0078887939453125, 0.0261383056640625, 0.00968170166015625, -0.04864501953125, 0.035980224609375, -0.0136566162109375, -0.004405975341796875, -0.02294921875, 0.04925537109375, -0.043060302734375, -0.04949951171875, 0.0171661376953125, 0.034759521484375, 0.009735107421875, -0.01334381103515625, -0.09588623046875, -0.0169219970703125, 0.0053558349609375, -0.031707763671875, 0.0196685791015625, 0.0248565673828125, 0.03741455078125, 0.044586181640625, 0.0246734619140625, -0.0172576904296875, 0.0170135498046875, -0.005702972412109375, 0.05218505859375, -0.041351318359375, -0.037933349609375, -0.08599853515625, 0.043853759765625, -0.0218353271484375, -0.029815673828125, 0.0650634765625, 0.037261962890625, 0.053497314453125, -0.0178070068359375, 0.03875732421875, -0.01715087890625, 0.01325225830078125, -0.037994384765625, 0.069580078125, -0.0279388427734375, -0.01380157470703125, -0.0200958251953125, -0.076171875, -0.0183258056640625, 0.0789794921875, -0.0262451171875, 0.00684356689453125, 0.0733642578125, 0.06732177734375, -0.0232086181640625, -0.01202392578125, 0.0094451904296875, 0.026031494140625, 0.005733489990234375, 0.039337158203125, 0.030975341796875, -0.06170654296875, 0.066162109375, -0.0517578125, 0.004207611083984375, -0.00026607513427734375, -0.0625, -0.0716552734375, -0.06591796875, -0.02838134765625, -0.0271148681640625, 0.0008935928344726562, 0.069091796875, 0.04437255859375, -0.05572509765625, -0.00827789306640625, -0.034515380859375, -0.01268768310546875, -0.00833892822265625, -0.023773193359375, 0.035614013671875, -0.03448486328125, -0.0596923828125, 0.01323699951171875, -0.006038665771484375, 0.00634002685546875, -0.01268768310546875, 
0.01151275634765625, -0.061798095703125, 0.01053619384765625, 0.0379638671875, -0.014312744140625, -0.054901123046875, -0.0211334228515625, -0.00603485107421875, -0.0306854248046875, -0.00417327880859375, 0.034820556640625, -0.0438232421875, 0.006988525390625, 0.03802490234375, 0.04949951171875, 0.0518798828125, -0.013397216796875, 0.036651611328125, -0.06103515625, 0.0205841064453125, 0.00601959228515625, 0.054962158203125, 0.030670166015625, -0.0118865966796875, 0.03411865234375, 0.032440185546875, -0.03814697265625, -0.04840087890625, -0.01141357421875, -0.07464599609375, -0.0258331298828125, 0.09161376953125, -0.03204345703125, -0.02545166015625, 0.01541900634765625, -0.016876220703125, 0.037322998046875, -0.01192474365234375, 0.04046630859375, 0.0645751953125, 0.00765228271484375, -0.033660888671875, -0.0266265869140625, 0.0100250244140625, 0.04345703125, -0.040985107421875, -0.01493072509765625, 0.00988006591796875, 0.02972412109375, 0.01139068603515625, 0.0226593017578125, -0.00226593017578125, 0.006618499755859375, 0.01329803466796875, -0.005100250244140625, -0.0218048095703125, 0.00037741661071777344, -0.03216552734375, 0.02032470703125, -0.03271484375, -0.02911376953125 ] ]
cl-tohoku/bert-large-japanese
2021-09-23T13:45:41.000Z
[ "transformers", "pytorch", "tf", "jax", "bert", "fill-mask", "ja", "dataset:wikipedia", "license:cc-by-sa-4.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
fill-mask
cl-tohoku
null
null
cl-tohoku/bert-large-japanese
6
104,154
transformers
2022-03-02T23:29:05
--- language: ja license: cc-by-sa-4.0 datasets: - wikipedia widget: - text: 東北大学で[MASK]の研究をしています。 --- # BERT large Japanese (unidic-lite with whole word masking, jawiki-20200831) This is a [BERT](https://github.com/google-research/bert) model pretrained on texts in the Japanese language. This version of the model processes input texts with word-level tokenization based on the Unidic 2.1.2 dictionary (available in the [unidic-lite](https://pypi.org/project/unidic-lite/) package), followed by WordPiece subword tokenization. Additionally, the model is trained with whole word masking enabled for the masked language modeling (MLM) objective. The code for pretraining is available at [cl-tohoku/bert-japanese](https://github.com/cl-tohoku/bert-japanese/tree/v2.0). ## Model architecture The model architecture is the same as the original BERT large model: 24 layers, 1024 dimensions of hidden states, and 16 attention heads. ## Training Data The models are trained on the Japanese version of Wikipedia. The training corpus is generated from the Wikipedia Cirrussearch dump file as of August 31, 2020. The generated corpus files are 4.0GB in total, containing approximately 30M sentences. We used the [MeCab](https://taku910.github.io/mecab/) morphological parser with the [mecab-ipadic-NEologd](https://github.com/neologd/mecab-ipadic-neologd) dictionary to split texts into sentences. ## Tokenization The texts are first tokenized by MeCab with the Unidic 2.1.2 dictionary and then split into subwords by the WordPiece algorithm. The vocabulary size is 32768. We used the [`fugashi`](https://github.com/polm/fugashi) and [`unidic-lite`](https://github.com/polm/unidic-lite) packages for the tokenization. ## Training The models are trained with the same configuration as the original BERT: 512 tokens per instance, 256 instances per batch, and 1M training steps. 
For training of the MLM (masked language modeling) objective, we introduced whole word masking, in which all of the subword tokens corresponding to a single word (as tokenized by MeCab) are masked at once. For training of each model, we used a v3-8 instance of Cloud TPUs provided by the [TensorFlow Research Cloud program](https://www.tensorflow.org/tfrc/). The training took about 5 days to finish. ## Licenses The pretrained models are distributed under the terms of the [Creative Commons Attribution-ShareAlike 3.0](https://creativecommons.org/licenses/by-sa/3.0/) license. ## Acknowledgments This model is trained with Cloud TPUs provided by the [TensorFlow Research Cloud](https://www.tensorflow.org/tfrc/) program.
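The whole word masking described above can be sketched in a few lines: subword tokens are grouped by the word they came from (here, the MeCab/Unidic word segmentation), and when a word is selected for masking, every one of its subwords is masked together. This is a simplified illustration, not the actual pretraining code; the example sentence segmentation and masking probability are only for demonstration:

```python
import random

MASK = "[MASK]"

def whole_word_mask(word_subtokens, mask_prob, rng):
    # word_subtokens: one list of WordPiece subtokens per source word.
    # A word is selected with probability mask_prob, and every subtoken of
    # a selected word is replaced by [MASK] at once, so the model cannot
    # recover a word from its remaining, unmasked pieces.
    masked = []
    for subtokens in word_subtokens:
        if rng.random() < mask_prob:
            masked.extend([MASK] * len(subtokens))
        else:
            masked.extend(list(subtokens))
    return masked

# Hypothetical segmentation of a short Japanese phrase; the last word is
# split into two subtokens and is masked (or kept) only as a unit.
words = [["東北"], ["大学"], ["で"], ["自然"], ["言語", "##処理"]]
print(whole_word_mask(words, mask_prob=0.15, rng=random.Random(0)))
```

With per-subtoken masking, the model can often guess a masked piece from the rest of the word; masking whole words removes that shortcut and makes the MLM task harder and more useful.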
2,590
[ [ -0.031524658203125, -0.0635986328125, 0.0179290771484375, 0.01306915283203125, -0.0521240234375, 0.003231048583984375, -0.0297698974609375, -0.0333251953125, 0.040679931640625, 0.0396728515625, -0.050201416015625, -0.037384033203125, -0.043792724609375, -0.0009140968322753906, -0.0211639404296875, 0.0810546875, -0.0009036064147949219, 0.0170135498046875, 0.021728515625, 0.01128387451171875, -0.0190582275390625, -0.04095458984375, -0.05718994140625, -0.0294647216796875, 0.035736083984375, 0.00885009765625, 0.03564453125, 0.0255584716796875, 0.01397705078125, 0.0177764892578125, -0.0015392303466796875, -0.003017425537109375, -0.039764404296875, -0.020599365234375, -0.0001291036605834961, -0.027008056640625, -0.01259613037109375, -0.0002321004867553711, 0.047210693359375, 0.055450439453125, 0.00201416015625, 0.006900787353515625, -0.004436492919921875, 0.0297698974609375, -0.0384521484375, 0.0032520294189453125, -0.056243896484375, 0.004505157470703125, -0.0253143310546875, 0.0167236328125, -0.0234375, 0.007904052734375, 0.01230621337890625, -0.053466796875, 0.0180511474609375, 0.00019633769989013672, 0.083740234375, -0.0006961822509765625, -0.008575439453125, -0.0182037353515625, -0.037994384765625, 0.055694580078125, -0.0738525390625, 0.0305328369140625, 0.037322998046875, 0.0002980232238769531, -0.01409912109375, -0.07086181640625, -0.0537109375, -0.00897216796875, 0.00862884521484375, 0.00896453857421875, -0.0035400390625, 0.01324462890625, 0.0233154296875, 0.0184783935546875, -0.048675537109375, 0.03350830078125, -0.033111572265625, -0.023590087890625, 0.0384521484375, -0.00815582275390625, 0.0191497802734375, -0.0234527587890625, -0.034759521484375, -0.01227569580078125, -0.037567138671875, 0.005794525146484375, 0.0267181396484375, 0.01120758056640625, -0.004253387451171875, 0.041473388671875, -0.00027680397033691406, 0.037261962890625, -0.0238494873046875, -0.0254364013671875, 0.02349853515625, -0.0215911865234375, -0.0261077880859375, 0.01076507568359375, 
0.0758056640625, 0.0111236572265625, 0.0310211181640625, -0.01432037353515625, -0.0284271240234375, -0.00720977783203125, 0.01593017578125, -0.0704345703125, -0.014556884765625, 0.01043701171875, -0.040802001953125, -0.0177764892578125, 0.00560760498046875, -0.040557861328125, -0.0055084228515625, -0.00661468505859375, 0.06280517578125, -0.042236328125, -0.01502227783203125, 0.01837158203125, -0.005954742431640625, 0.01212310791015625, 0.00562286376953125, -0.064208984375, 0.00862884521484375, 0.040679931640625, 0.062225341796875, 0.0019130706787109375, -0.0164947509765625, -0.0018510818481445312, 0.00463104248046875, -0.031341552734375, 0.02801513671875, -0.020599365234375, -0.0252532958984375, 0.00974273681640625, 0.0142364501953125, -0.0147857666015625, -0.01605224609375, 0.035003662109375, -0.042999267578125, 0.0311737060546875, 0.0011205673217773438, -0.060943603515625, -0.0178985595703125, 0.01030731201171875, -0.03289794921875, 0.07550048828125, 0.0053863525390625, -0.0556640625, 0.0162506103515625, -0.0621337890625, -0.030181884765625, 0.024505615234375, 0.00762939453125, -0.036651611328125, 0.005771636962890625, 0.01531982421875, 0.0396728515625, -0.0012760162353515625, 0.02972412109375, -0.015838623046875, -0.03125, 0.0267181396484375, -0.0215301513671875, 0.0849609375, 0.0142974853515625, -0.043731689453125, 0.00437164306640625, -0.05426025390625, 0.0007290840148925781, 0.018707275390625, -0.0211639404296875, -0.0237274169921875, -0.025146484375, 0.0204010009765625, 0.0177154541015625, 0.03564453125, -0.06231689453125, 0.01125335693359375, -0.038299560546875, 0.042755126953125, 0.0584716796875, -0.010955810546875, 0.0109100341796875, -0.006214141845703125, 0.0272369384765625, -0.00042128562927246094, 0.0210418701171875, -0.0227508544921875, -0.04156494140625, -0.08026123046875, -0.032684326171875, 0.063720703125, 0.0283660888671875, -0.06402587890625, 0.06353759765625, -0.0290069580078125, -0.03192138671875, -0.062255859375, 0.007049560546875, 
0.03082275390625, 0.02069091796875, 0.0165252685546875, -0.0369873046875, -0.044342041015625, -0.0721435546875, 0.0158233642578125, 0.001964569091796875, -0.0201568603515625, 0.0004279613494873047, 0.058624267578125, -0.0360107421875, 0.05108642578125, -0.02935791015625, -0.03985595703125, -0.01959228515625, 0.0189666748046875, 0.0238037109375, 0.03851318359375, 0.03167724609375, -0.041168212890625, -0.0343017578125, -0.0204925537109375, -0.03466796875, 0.00201416015625, -0.00585174560546875, -0.013275146484375, 0.0269622802734375, 0.05426025390625, -0.04248046875, 0.027801513671875, 0.04315185546875, -0.0222320556640625, 0.02886962890625, -0.0203704833984375, -0.0238800048828125, -0.107177734375, 0.0294647216796875, -0.01369476318359375, -0.019256591796875, -0.046234130859375, 0.0009398460388183594, -0.00530242919921875, -0.01535797119140625, -0.0297088623046875, 0.04949951171875, -0.040679931640625, -0.0021209716796875, -0.0231475830078125, 0.00728607177734375, -0.005252838134765625, 0.055938720703125, 0.006275177001953125, 0.054046630859375, 0.03521728515625, -0.044677734375, 0.0172576904296875, 0.00572967529296875, -0.065185546875, -0.0082244873046875, -0.05767822265625, 0.0054931640625, -0.002315521240234375, 0.01806640625, -0.07183837890625, -0.014129638671875, 0.0207366943359375, -0.038238525390625, 0.041748046875, 0.019073486328125, -0.0662841796875, -0.035308837890625, -0.034454345703125, 0.00858306884765625, 0.04791259765625, -0.039764404296875, 0.032012939453125, 0.0287628173828125, -0.00388336181640625, -0.0494384765625, -0.06036376953125, 0.0207672119140625, 0.00919342041015625, -0.03143310546875, 0.037933349609375, -0.00794219970703125, 0.00933074951171875, 0.007663726806640625, 0.0090789794921875, -0.0021877288818359375, 0.01316070556640625, 0.006107330322265625, 0.0263519287109375, -0.01125335693359375, 0.01155853271484375, -0.0029468536376953125, -0.0009784698486328125, -0.005573272705078125, -0.017425537109375, 0.07659912109375, 
0.01366424560546875, -0.004749298095703125, -0.03546142578125, 0.014862060546875, 0.0272369384765625, -0.008636474609375, 0.0848388671875, 0.07366943359375, -0.03387451171875, -0.006992340087890625, -0.05157470703125, -0.0182037353515625, -0.032989501953125, 0.04400634765625, -0.025909423828125, -0.07720947265625, 0.0411376953125, 0.0172576904296875, 0.024810791015625, 0.042755126953125, 0.03955078125, -0.012969970703125, 0.072265625, 0.055328369140625, -0.027557373046875, 0.04864501953125, -0.03240966796875, 0.027191162109375, -0.070068359375, -0.0308685302734375, -0.0241546630859375, -0.0283050537109375, -0.046173095703125, -0.032379150390625, 0.0172271728515625, 0.0183868408203125, -0.02008056640625, 0.017913818359375, -0.029815673828125, 0.031768798828125, 0.05108642578125, 0.0117034912109375, 0.003223419189453125, 0.0226898193359375, -0.0220794677734375, -0.008636474609375, -0.05157470703125, -0.031341552734375, 0.09002685546875, 0.04547119140625, 0.040618896484375, -0.00405120849609375, 0.042205810546875, 0.0033168792724609375, 0.018157958984375, -0.05706787109375, 0.043731689453125, -0.037200927734375, -0.07635498046875, -0.026336669921875, -0.0265350341796875, -0.0819091796875, 0.0224151611328125, -0.0282135009765625, -0.056610107421875, -0.00738525390625, -0.026824951171875, -0.00885772705078125, 0.041839599609375, -0.045745849609375, 0.061309814453125, -0.02252197265625, 0.0016317367553710938, -0.014739990234375, -0.064453125, 0.03106689453125, -0.0266571044921875, 0.01245880126953125, -0.0007758140563964844, -0.00859832763671875, 0.08746337890625, -0.03076171875, 0.0712890625, -0.0125579833984375, -0.0032291412353515625, 0.004276275634765625, -0.01983642578125, 0.01555633544921875, -0.0100860595703125, 0.01261138916015625, 0.0390625, -0.01236724853515625, -0.03802490234375, -0.01502227783203125, 0.0472412109375, -0.08087158203125, -0.02679443359375, -0.03570556640625, -0.0276031494140625, -0.004314422607421875, 0.03875732421875, 0.0440673828125, 
0.030303955078125, -0.01380157470703125, 0.03662109375, 0.06964111328125, -0.029541015625, 0.047027587890625, 0.04669189453125, -0.029052734375, -0.0302276611328125, 0.061737060546875, 0.01415252685546875, 0.0094146728515625, 0.03985595703125, 0.005859375, -0.0295257568359375, -0.040130615234375, -0.028656005859375, 0.042144775390625, -0.031463623046875, -0.005886077880859375, -0.062042236328125, -0.041534423828125, -0.047149658203125, 0.001979827880859375, -0.026824951171875, -0.025665283203125, -0.0310211181640625, 0.00380706787109375, 0.006710052490234375, 0.04736328125, 0.007091522216796875, 0.042327880859375, -0.044219970703125, 0.0188751220703125, 0.0205230712890625, 0.02362060546875, -0.002025604248046875, -0.062103271484375, -0.02679443359375, 0.0178070068359375, -0.01247406005859375, -0.04296875, 0.025665283203125, 0.007747650146484375, 0.044464111328125, 0.047760009765625, -0.0067291259765625, 0.0450439453125, -0.030517578125, 0.07135009765625, 0.03179931640625, -0.07159423828125, 0.028839111328125, -0.0230255126953125, 0.034332275390625, 0.044189453125, 0.040252685546875, -0.044281005859375, -0.027191162109375, -0.0609130859375, -0.06329345703125, 0.058502197265625, 0.0163116455078125, 0.028594970703125, 0.00919342041015625, 0.030975341796875, -0.00482177734375, 0.0255126953125, -0.08367919921875, -0.0240325927734375, -0.020294189453125, -0.01727294921875, -0.034942626953125, -0.045013427734375, 0.00203704833984375, -0.023284912109375, 0.0797119140625, 0.0035800933837890625, 0.0282440185546875, 0.006435394287109375, -0.0200042724609375, 0.00518035888671875, 0.0085601806640625, 0.05511474609375, 0.044586181640625, -0.023345947265625, -0.0015125274658203125, 0.001331329345703125, -0.0498046875, -0.0182647705078125, 0.01280975341796875, -0.0250244140625, 0.035247802734375, 0.040740966796875, 0.09197998046875, 0.02593994140625, -0.05438232421875, 0.050201416015625, -0.0024242401123046875, -0.034515380859375, -0.0264892578125, -0.006076812744140625, 
0.0117645263671875, -0.0036182403564453125, 0.0280609130859375, -0.033233642578125, -0.0122528076171875, -0.0438232421875, -0.005157470703125, 0.039642333984375, -0.01727294921875, -0.01751708984375, 0.041473388671875, 0.01226043701171875, -0.01212310791015625, 0.058685302734375, -0.014923095703125, -0.06390380859375, 0.0328369140625, 0.049041748046875, 0.0625, -0.0205078125, 0.0188140869140625, 0.04248046875, 0.0433349609375, 0.00015211105346679688, -0.0037212371826171875, -0.0014247894287109375, -0.05987548828125, -0.0298309326171875, -0.06805419921875, -0.0109100341796875, 0.05096435546875, -0.028839111328125, 0.01397705078125, -0.046661376953125, -0.004467010498046875, 0.0013713836669921875, 0.032989501953125, -0.03179931640625, 0.0246124267578125, 0.01535797119140625, 0.06524658203125, -0.059967041015625, 0.08441162109375, 0.052978515625, -0.0279388427734375, -0.0643310546875, -0.00975799560546875, -0.031707763671875, -0.08148193359375, 0.036956787109375, 0.0277252197265625, 0.018402099609375, 0.0008082389831542969, -0.04681396484375, -0.056640625, 0.0706787109375, 0.00827789306640625, -0.04864501953125, -0.007015228271484375, 0.01324462890625, 0.052642822265625, -0.028839111328125, 0.00759124755859375, 0.0248260498046875, 0.0213623046875, 0.0080413818359375, -0.07080078125, -0.01325225830078125, -0.018707275390625, 0.027801513671875, 0.01491546630859375, -0.04095458984375, 0.067626953125, 0.004482269287109375, -0.01488494873046875, 0.012359619140625, 0.03570556640625, 0.00722503662109375, -0.0016965866088867188, 0.04669189453125, 0.0626220703125, 0.04644775390625, 0.00601959228515625, 0.0594482421875, -0.0238189697265625, 0.027069091796875, 0.0650634765625, 0.01026153564453125, 0.061859130859375, 0.031341552734375, -0.00499725341796875, 0.0433349609375, 0.057281494140625, -0.01369476318359375, 0.047454833984375, 0.0125274658203125, -0.0193023681640625, -0.0144195556640625, -0.01971435546875, -0.0294342041015625, 0.03216552734375, 0.02630615234375, 
-0.037506103515625, -0.007328033447265625, 0.0139007568359375, 0.0231475830078125, -0.0249481201171875, -0.03289794921875, 0.060028076171875, 0.01383209228515625, -0.048187255859375, 0.04302978515625, 0.0261993408203125, 0.0767822265625, -0.0738525390625, 0.0291748046875, -0.01104736328125, 0.0180816650390625, 0.005352020263671875, -0.04730224609375, -0.001941680908203125, 0.005794525146484375, -0.00785064697265625, -0.0178070068359375, 0.042724609375, -0.0452880859375, -0.051025390625, 0.024627685546875, 0.00704193115234375, 0.0321044921875, 0.0021820068359375, -0.06842041015625, 0.00751495361328125, 0.0211639404296875, -0.0236968994140625, 0.035491943359375, 0.0183258056640625, -0.00405120849609375, 0.0396728515625, 0.06390380859375, 0.01104736328125, 0.01267242431640625, 0.0218658447265625, 0.06109619140625, -0.037261962890625, -0.03778076171875, -0.059326171875, 0.0203094482421875, -0.01227569580078125, -0.0298309326171875, 0.04840087890625, 0.04296875, 0.07025146484375, -0.011260986328125, 0.03472900390625, -0.006130218505859375, 0.03759765625, -0.046234130859375, 0.05499267578125, -0.058624267578125, 0.0013866424560546875, -0.0152587890625, -0.08270263671875, -0.0143890380859375, 0.05975341796875, 0.0055694580078125, 0.00713348388671875, 0.0467529296875, 0.05426025390625, 0.003437042236328125, -0.01235198974609375, 0.0255584716796875, 0.02569580078125, 0.0224456787109375, 0.04412841796875, 0.034698486328125, -0.04559326171875, 0.02886962890625, -0.033050537109375, -0.01123809814453125, -0.018768310546875, -0.0538330078125, -0.09100341796875, -0.0474853515625, -0.0148773193359375, -0.02484130859375, -0.00891876220703125, 0.06353759765625, 0.05029296875, -0.05322265625, -0.01739501953125, -0.0009784698486328125, -0.0166778564453125, 0.0202178955078125, -0.0175628662109375, 0.0338134765625, -0.023590087890625, -0.0537109375, 0.00693511962890625, 0.004940032958984375, 0.01204681396484375, -0.0311737060546875, 0.00811004638671875, -0.039215087890625, 
-0.001819610595703125, 0.04425048828125, 0.00618743896484375, -0.052398681640625, -0.00925445556640625, -0.0033397674560546875, -0.018157958984375, -0.0171051025390625, 0.04937744140625, -0.030609130859375, 0.043548583984375, 0.0194854736328125, 0.044464111328125, 0.0701904296875, -0.0221099853515625, 0.020538330078125, -0.06671142578125, 0.0301361083984375, 0.011260986328125, 0.031463623046875, 0.029205322265625, -0.0096893310546875, 0.034271240234375, 0.0237274169921875, -0.0251617431640625, -0.0672607421875, -0.0060272216796875, -0.07080078125, -0.05078125, 0.08111572265625, -0.0110321044921875, -0.02911376953125, -0.0019006729125976562, -0.00978851318359375, 0.0305328369140625, -0.008575439453125, 0.0538330078125, 0.072509765625, 0.03106689453125, -0.0121612548828125, -0.033721923828125, 0.02410888671875, 0.0377197265625, -0.04296875, -0.0242156982421875, 0.01326751708984375, 0.0418701171875, 0.024932861328125, 0.060455322265625, -0.002655029296875, -0.0011091232299804688, -0.01300811767578125, 0.0239105224609375, -0.005863189697265625, -0.01473236083984375, -0.0215301513671875, -0.006473541259765625, -0.016448974609375, -0.031280517578125 ] ]
openai/whisper-small
2023-09-08T13:08:05.000Z
[ "transformers", "pytorch", "tf", "jax", "safetensors", "whisper", "automatic-speech-recognition", "audio", "hf-asr-leaderboard", "en", "zh", "de", "es", "ru", "ko", "fr", "ja", "pt", "tr", "pl", "ca", "nl", "ar", "sv", "it", "id", "hi", "fi", "vi", "he", "uk", "el", "ms", "cs", "ro", "da", "hu", "ta", "no", "th", "ur", "hr", "bg", "lt", "la", "mi", "ml", "cy", "sk", "te", "fa", "lv", "bn", "sr", "az", "sl", "kn", "et", "mk", "br", "eu", "is", "hy", "ne", "mn", "bs", "kk", "sq", "sw", "gl", "mr", "pa", "si", "km", "sn", "yo", "so", "af", "oc", "ka", "be", "tg", "sd", "gu", "am", "yi", "lo", "uz", "fo", "ht", "ps", "tk", "nn", "mt", "sa", "lb", "my", "bo", "tl", "mg", "as", "tt", "haw", "ln", "ha", "ba", "jw", "su", "arxiv:2212.04356", "license:apache-2.0", "model-index", "endpoints_compatible", "has_space", "region:us" ]
automatic-speech-recognition
openai
null
null
openai/whisper-small
97
103,525
transformers
2022-09-26T06:51:27
--- language: - en - zh - de - es - ru - ko - fr - ja - pt - tr - pl - ca - nl - ar - sv - it - id - hi - fi - vi - he - uk - el - ms - cs - ro - da - hu - ta - no - th - ur - hr - bg - lt - la - mi - ml - cy - sk - te - fa - lv - bn - sr - az - sl - kn - et - mk - br - eu - is - hy - ne - mn - bs - kk - sq - sw - gl - mr - pa - si - km - sn - yo - so - af - oc - ka - be - tg - sd - gu - am - yi - lo - uz - fo - ht - ps - tk - nn - mt - sa - lb - my - bo - tl - mg - as - tt - haw - ln - ha - ba - jw - su tags: - audio - automatic-speech-recognition - hf-asr-leaderboard widget: - example_title: Librispeech sample 1 src: https://cdn-media.huggingface.co/speech_samples/sample1.flac - example_title: Librispeech sample 2 src: https://cdn-media.huggingface.co/speech_samples/sample2.flac model-index: - name: whisper-small results: - task: name: Automatic Speech Recognition type: automatic-speech-recognition dataset: name: LibriSpeech (clean) type: librispeech_asr config: clean split: test args: language: en metrics: - name: Test WER type: wer value: 3.432213777886737 - task: name: Automatic Speech Recognition type: automatic-speech-recognition dataset: name: LibriSpeech (other) type: librispeech_asr config: other split: test args: language: en metrics: - name: Test WER type: wer value: 7.628304527060248 - task: name: Automatic Speech Recognition type: automatic-speech-recognition dataset: name: Common Voice 11.0 type: mozilla-foundation/common_voice_11_0 config: hi split: test args: language: hi metrics: - name: Test WER type: wer value: 87.3 - task: name: Automatic Speech Recognition type: automatic-speech-recognition dataset: name: Common Voice 13.0 type: mozilla-foundation/common_voice_13_0 config: dv split: test args: language: dv metrics: - name: Wer type: wer value: 125.69809089960707 pipeline_tag: automatic-speech-recognition license: apache-2.0 --- # Whisper Whisper is a pre-trained model for automatic speech recognition (ASR) and speech translation. 
Trained on 680k hours of labelled data, Whisper models demonstrate a strong ability to generalise to many datasets and domains **without** the need for fine-tuning. Whisper was proposed in the paper [Robust Speech Recognition via Large-Scale Weak Supervision](https://arxiv.org/abs/2212.04356) by Alec Radford et al. from OpenAI. The original code repository can be found [here](https://github.com/openai/whisper). **Disclaimer**: Content for this model card has partly been written by the Hugging Face team, and parts of it were copied and pasted from the original model card. ## Model details Whisper is a Transformer-based encoder-decoder model, also referred to as a _sequence-to-sequence_ model. It was trained on 680k hours of labelled speech data annotated using large-scale weak supervision. The models were trained on either English-only data or multilingual data. The English-only models were trained on the task of speech recognition. The multilingual models were trained on both speech recognition and speech translation. For speech recognition, the model predicts transcriptions in the *same* language as the audio. For speech translation, the model predicts transcriptions in a *different* language from the audio. Whisper checkpoints come in five configurations of varying model sizes. The smallest four are trained on either English-only or multilingual data. The largest checkpoints are multilingual only. All ten of the pre-trained checkpoints are available on the [Hugging Face Hub](https://huggingface.co/models?search=openai/whisper).
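The checkpoint ids on the Hub follow a regular naming pattern: multilingual models are published as `openai/whisper-<size>`, and the English-only variants append an `.en` suffix. As a minimal illustrative sketch (the size names here are taken from this card), the ten ids can be enumerated programmatically:

```python
# Multilingual checkpoints exist for all six size names; English-only
# variants exist only for the four smallest sizes.
multilingual_sizes = ["tiny", "base", "small", "medium", "large", "large-v2"]
english_only_sizes = ["tiny", "base", "small", "medium"]

multilingual = [f"openai/whisper-{size}" for size in multilingual_sizes]
english_only = [f"openai/whisper-{size}.en" for size in english_only_sizes]

checkpoints = multilingual + english_only
print(len(checkpoints))                         # 10
print("openai/whisper-small" in multilingual)   # True
```

Any of these ids can be passed to `WhisperProcessor.from_pretrained` and `WhisperForConditionalGeneration.from_pretrained`, as shown in the usage examples below.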
The checkpoints are summarised in the following table with links to the models on the Hub: | Size | Parameters | English-only | Multilingual | |----------|------------|------------------------------------------------------|-----------------------------------------------------| | tiny | 39 M | [✓](https://huggingface.co/openai/whisper-tiny.en) | [✓](https://huggingface.co/openai/whisper-tiny) | | base | 74 M | [✓](https://huggingface.co/openai/whisper-base.en) | [✓](https://huggingface.co/openai/whisper-base) | | small | 244 M | [✓](https://huggingface.co/openai/whisper-small.en) | [✓](https://huggingface.co/openai/whisper-small) | | medium | 769 M | [✓](https://huggingface.co/openai/whisper-medium.en) | [✓](https://huggingface.co/openai/whisper-medium) | | large | 1550 M | x | [✓](https://huggingface.co/openai/whisper-large) | | large-v2 | 1550 M | x | [✓](https://huggingface.co/openai/whisper-large-v2) | # Usage To transcribe audio samples, the model has to be used alongside a [`WhisperProcessor`](https://huggingface.co/docs/transformers/model_doc/whisper#transformers.WhisperProcessor). The `WhisperProcessor` is used to: 1. Pre-process the audio inputs (converting them to log-Mel spectrograms for the model) 2. Post-process the model outputs (converting them from tokens to text) The model is informed of which task to perform (transcription or translation) by passing the appropriate "context tokens". These context tokens are a sequence of tokens that are given to the decoder at the start of the decoding process, and take the following order: 1. The transcription always starts with the `<|startoftranscript|>` token 2. The second token is the language token (e.g. `<|en|>` for English) 3. The third token is the "task token". It can take one of two values: `<|transcribe|>` for speech recognition or `<|translate|>` for speech translation 4. 
In addition, a `<|notimestamps|>` token is added if the model should not include timestamp prediction. Thus, a typical sequence of context tokens might look as follows: ``` <|startoftranscript|> <|en|> <|transcribe|> <|notimestamps|> ``` This tells the model to decode in English, under the task of speech recognition, and not to predict timestamps. These tokens can either be forced or un-forced. If they are forced, the model is made to predict each token at each position. This allows one to control the output language and task for the Whisper model. If they are un-forced, the Whisper model will automatically predict the output language and task itself. The context tokens can be set accordingly (here `processor` is a loaded `WhisperProcessor` instance): ```python model.config.forced_decoder_ids = processor.get_decoder_prompt_ids(language="english", task="transcribe") ``` This forces the model to predict in English under the task of speech recognition. ## Transcription ### English to English In this example, the context tokens are 'unforced', meaning the model automatically predicts the output language (English) and task (transcribe).
```python >>> from transformers import WhisperProcessor, WhisperForConditionalGeneration >>> from datasets import load_dataset >>> # load model and processor >>> processor = WhisperProcessor.from_pretrained("openai/whisper-small") >>> model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-small") >>> model.config.forced_decoder_ids = None >>> # load dummy dataset and read audio files >>> ds = load_dataset("hf-internal-testing/librispeech_asr_dummy", "clean", split="validation") >>> sample = ds[0]["audio"] >>> input_features = processor(sample["array"], sampling_rate=sample["sampling_rate"], return_tensors="pt").input_features >>> # generate token ids >>> predicted_ids = model.generate(input_features) >>> # decode token ids to text >>> transcription = processor.batch_decode(predicted_ids, skip_special_tokens=False) ['<|startoftranscript|><|en|><|transcribe|><|notimestamps|> Mr. Quilter is the apostle of the middle classes and we are glad to welcome his gospel.<|endoftext|>'] >>> transcription = processor.batch_decode(predicted_ids, skip_special_tokens=True) [' Mr. Quilter is the apostle of the middle classes and we are glad to welcome his gospel.'] ``` The context tokens can be removed from the start of the transcription by setting `skip_special_tokens=True`. ### French to French The following example demonstrates French to French transcription by setting the decoder ids appropriately. 
```python >>> from transformers import WhisperProcessor, WhisperForConditionalGeneration >>> from datasets import Audio, load_dataset >>> # load model and processor >>> processor = WhisperProcessor.from_pretrained("openai/whisper-small") >>> model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-small") >>> forced_decoder_ids = processor.get_decoder_prompt_ids(language="french", task="transcribe") >>> # load streaming dataset and read first audio sample >>> ds = load_dataset("common_voice", "fr", split="test", streaming=True) >>> ds = ds.cast_column("audio", Audio(sampling_rate=16_000)) >>> input_speech = next(iter(ds))["audio"] >>> input_features = processor(input_speech["array"], sampling_rate=input_speech["sampling_rate"], return_tensors="pt").input_features >>> # generate token ids >>> predicted_ids = model.generate(input_features, forced_decoder_ids=forced_decoder_ids) >>> # decode token ids to text >>> transcription = processor.batch_decode(predicted_ids) ['<|startoftranscript|><|fr|><|transcribe|><|notimestamps|> Un vrai travail intéressant va enfin être mené sur ce sujet.<|endoftext|>'] >>> transcription = processor.batch_decode(predicted_ids, skip_special_tokens=True) [' Un vrai travail intéressant va enfin être mené sur ce sujet.'] ``` ## Translation Setting the task to "translate" forces the Whisper model to perform speech translation. 
### French to English ```python >>> from transformers import WhisperProcessor, WhisperForConditionalGeneration >>> from datasets import Audio, load_dataset >>> # load model and processor >>> processor = WhisperProcessor.from_pretrained("openai/whisper-small") >>> model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-small") >>> forced_decoder_ids = processor.get_decoder_prompt_ids(language="french", task="translate") >>> # load streaming dataset and read first audio sample >>> ds = load_dataset("common_voice", "fr", split="test", streaming=True) >>> ds = ds.cast_column("audio", Audio(sampling_rate=16_000)) >>> input_speech = next(iter(ds))["audio"] >>> input_features = processor(input_speech["array"], sampling_rate=input_speech["sampling_rate"], return_tensors="pt").input_features >>> # generate token ids >>> predicted_ids = model.generate(input_features, forced_decoder_ids=forced_decoder_ids) >>> # decode token ids to text >>> transcription = processor.batch_decode(predicted_ids, skip_special_tokens=True) [' A very interesting work, we will finally be given on this subject.'] ``` ## Evaluation This code snippet shows how to evaluate Whisper Small on [LibriSpeech test-clean](https://huggingface.co/datasets/librispeech_asr): ```python >>> from datasets import load_dataset >>> from transformers import WhisperForConditionalGeneration, WhisperProcessor >>> import torch >>> from evaluate import load >>> librispeech_test_clean = load_dataset("librispeech_asr", "clean", split="test") >>> processor = WhisperProcessor.from_pretrained("openai/whisper-small") >>> model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-small").to("cuda") >>> def map_to_pred(batch): >>> audio = batch["audio"] >>> input_features = processor(audio["array"], sampling_rate=audio["sampling_rate"], return_tensors="pt").input_features >>> batch["reference"] = processor.tokenizer._normalize(batch['text']) >>> >>> with torch.no_grad(): >>> predicted_ids = 
model.generate(input_features.to("cuda"))[0] >>> transcription = processor.decode(predicted_ids) >>> batch["prediction"] = processor.tokenizer._normalize(transcription) >>> return batch >>> result = librispeech_test_clean.map(map_to_pred) >>> wer = load("wer") >>> print(100 * wer.compute(references=result["reference"], predictions=result["prediction"])) 3.432213777886737 ``` ## Long-Form Transcription The Whisper model is intrinsically designed to work on audio samples of up to 30s in duration. However, by using a chunking algorithm, it can be used to transcribe audio samples of arbitrary length. This is possible with the Transformers [`pipeline`](https://huggingface.co/docs/transformers/main_classes/pipelines#transformers.AutomaticSpeechRecognitionPipeline) method. Chunking is enabled by setting `chunk_length_s=30` when instantiating the pipeline. With chunking enabled, the pipeline can be run with batched inference. It can also be extended to predict sequence-level timestamps by passing `return_timestamps=True`: ```python >>> import torch >>> from transformers import pipeline >>> from datasets import load_dataset >>> device = "cuda:0" if torch.cuda.is_available() else "cpu" >>> pipe = pipeline( >>> "automatic-speech-recognition", >>> model="openai/whisper-small", >>> chunk_length_s=30, >>> device=device, >>> ) >>> ds = load_dataset("hf-internal-testing/librispeech_asr_dummy", "clean", split="validation") >>> sample = ds[0]["audio"] >>> prediction = pipe(sample.copy(), batch_size=8)["text"] " Mr. Quilter is the apostle of the middle classes, and we are glad to welcome his gospel." >>> # we can also return timestamps for the predictions >>> prediction = pipe(sample.copy(), batch_size=8, return_timestamps=True)["chunks"] [{'text': ' Mr.
Quilter is the apostle of the middle classes and we are glad to welcome his gospel.', 'timestamp': (0.0, 5.44)}] ``` Refer to the blog post [ASR Chunking](https://huggingface.co/blog/asr-chunking) for more details on the chunking algorithm. ## Fine-Tuning The pre-trained Whisper model demonstrates a strong ability to generalise to different datasets and domains. However, its predictive capabilities can be improved further for certain languages and tasks through *fine-tuning*. The blog post [Fine-Tune Whisper with 🤗 Transformers](https://huggingface.co/blog/fine-tune-whisper) provides a step-by-step guide to fine-tuning the Whisper model with as little as 5 hours of labelled data. ### Evaluated Use The primary intended users of these models are AI researchers studying robustness, generalization, capabilities, biases, and constraints of the current model. However, Whisper is also potentially quite useful as an ASR solution for developers, especially for English speech recognition. We recognize that once models are released, it is impossible to restrict access to only “intended” uses or to draw reasonable guidelines around what is or is not research. The models are primarily trained and evaluated on ASR and speech translation to English tasks. They show strong ASR results in ~10 languages. They may exhibit additional capabilities, particularly if fine-tuned on certain tasks like voice activity detection, speaker classification, or speaker diarization but have not been robustly evaluated in these areas. We strongly recommend that users perform robust evaluations of the models in a particular context and domain before deploying them. In particular, we caution against using Whisper models to transcribe recordings of individuals taken without their consent or purporting to use these models for any kind of subjective classification. We recommend against use in high-risk domains like decision-making contexts, where flaws in accuracy can lead to pronounced flaws in outcomes. 
The models are intended to transcribe and translate speech; using them for classification is not only unevaluated but also inappropriate, particularly for inferring human attributes. ## Training Data The models are trained on 680,000 hours of audio and the corresponding transcripts collected from the internet. 65% of this data (or 438,000 hours) represents English-language audio and matched English transcripts, roughly 18% (or 126,000 hours) represents non-English audio and English transcripts, while the final 17% (or 117,000 hours) represents non-English audio and the corresponding transcript. This non-English data represents 98 different languages. As discussed in [the accompanying paper](https://cdn.openai.com/papers/whisper.pdf), we see that performance on transcription in a given language is directly correlated with the amount of training data we employ in that language. ## Performance and Limitations Our studies show that, compared with many existing ASR systems, the models exhibit improved robustness to accents, background noise, and technical language, as well as zero-shot translation from multiple languages into English; and that accuracy on speech recognition and translation is near the state-of-the-art level. However, because the models are trained in a weakly supervised manner using large-scale noisy data, the predictions may include texts that are not actually spoken in the audio input (i.e. hallucination). We hypothesize that this happens because, given their general knowledge of language, the models combine trying to predict the next word in audio with trying to transcribe the audio itself. Our models perform unevenly across languages, and we observe lower accuracy on low-resource and/or low-discoverability languages or languages where we have less training data.
The models also exhibit disparate performance on different accents and dialects of particular languages, which may include a higher word error rate across speakers of different genders, races, ages, or other demographic criteria. Our full evaluation results are presented in [the paper accompanying this release](https://cdn.openai.com/papers/whisper.pdf). In addition, the sequence-to-sequence architecture of the model makes it prone to generating repetitive texts, which can be mitigated to some degree by beam search and temperature scheduling, but not perfectly. Further analysis of these limitations is provided in [the paper](https://cdn.openai.com/papers/whisper.pdf). It is likely that this behavior and hallucinations may be worse on lower-resource and/or lower-discoverability languages. ## Broader Implications We anticipate that Whisper models’ transcription capabilities may be used for improving accessibility tools. While Whisper models cannot be used for real-time transcription out of the box, their speed and size suggest that others may be able to build applications on top of them that allow for near-real-time speech recognition and translation. The real value of beneficial applications built on top of Whisper models suggests that the disparate performance of these models may have real economic implications. There are also potential dual-use concerns that come with releasing Whisper. While we hope the technology will be used primarily for beneficial purposes, making ASR technology more accessible could enable more actors to build capable surveillance technologies or scale up existing surveillance efforts, as the speed and accuracy allow for affordable automatic transcription and translation of large volumes of audio communication. Moreover, these models may have some capabilities to recognize specific individuals out of the box, which in turn presents safety concerns related both to dual use and disparate performance.
In practice, we expect that the cost of transcription is not the limiting factor of scaling up surveillance projects. ### BibTeX entry and citation info ```bibtex @misc{radford2022whisper, doi = {10.48550/ARXIV.2212.04356}, url = {https://arxiv.org/abs/2212.04356}, author = {Radford, Alec and Kim, Jong Wook and Xu, Tao and Brockman, Greg and McLeavey, Christine and Sutskever, Ilya}, title = {Robust Speech Recognition via Large-Scale Weak Supervision}, publisher = {arXiv}, year = {2022}, copyright = {arXiv.org perpetual, non-exclusive license} } ```
20,116
[ [ -0.0142669677734375, -0.04278564453125, 0.007793426513671875, 0.033935546875, -0.00913238525390625, -0.01006317138671875, -0.02398681640625, -0.03985595703125, 0.01546478271484375, 0.0345458984375, -0.06182861328125, -0.043212890625, -0.05712890625, -0.0003750324249267578, -0.04132080078125, 0.0755615234375, 0.007190704345703125, -0.0006055831909179688, 0.01230621337890625, -0.00804901123046875, -0.032440185546875, -0.0343017578125, -0.05242919921875, -0.0161590576171875, 0.01010894775390625, 0.013275146484375, 0.02996826171875, 0.03765869140625, 0.00789642333984375, 0.0306243896484375, -0.0311737060546875, -0.00252532958984375, -0.031219482421875, -0.00534820556640625, 0.026397705078125, -0.039398193359375, -0.043853759765625, 0.00650787353515625, 0.054168701171875, 0.0391845703125, -0.0193328857421875, 0.0298004150390625, 0.018035888671875, 0.0247650146484375, -0.0185546875, 0.022186279296875, -0.04608154296875, -0.01033782958984375, -0.020660400390625, -0.00267791748046875, -0.02398681640625, -0.019378662109375, 0.039520263671875, -0.041748046875, 0.0213623046875, 0.0018644332885742188, 0.0758056640625, 0.0166168212890625, -0.0119781494140625, -0.0283203125, -0.053314208984375, 0.07708740234375, -0.06536865234375, 0.0355224609375, 0.037200927734375, 0.01352691650390625, -0.005283355712890625, -0.06573486328125, -0.05255126953125, -0.00103759765625, -0.0089874267578125, 0.021087646484375, -0.035369873046875, 0.00022351741790771484, 0.0175018310546875, 0.0203399658203125, -0.036285400390625, -0.00003349781036376953, -0.048370361328125, -0.0526123046875, 0.042388916015625, -0.00033736228942871094, 0.02587890625, -0.020721435546875, -0.01470184326171875, -0.0238800048828125, -0.02362060546875, 0.034576416015625, 0.0271453857421875, 0.03619384765625, -0.04901123046875, 0.0304107666015625, -0.0100250244140625, 0.048492431640625, 0.0123443603515625, -0.049591064453125, 0.0489501953125, -0.01535797119140625, -0.01328277587890625, 0.02728271484375, 0.07476806640625, 
0.020477294921875, 0.01158905029296875, 0.003742218017578125, -0.017791748046875, 0.008453369140625, -0.007198333740234375, -0.053985595703125, 0.0049285888671875, 0.036590576171875, -0.042236328125, -0.024658203125, -0.01812744140625, -0.037689208984375, 0.018280029296875, -0.0180816650390625, 0.053192138671875, -0.042236328125, -0.026885986328125, 0.01143646240234375, -0.029052734375, 0.018035888671875, 0.0017795562744140625, -0.06170654296875, 0.0279083251953125, 0.0303802490234375, 0.06744384765625, 0.004978179931640625, -0.049224853515625, -0.042877197265625, 0.00601959228515625, 0.00441741943359375, 0.03472900390625, -0.0189208984375, -0.04315185546875, -0.006565093994140625, 0.01348114013671875, -0.029998779296875, -0.03802490234375, 0.05169677734375, -0.00862884521484375, 0.035064697265625, -0.00443267822265625, -0.03594970703125, -0.0197906494140625, -0.01275634765625, -0.033843994140625, 0.07080078125, 0.01160430908203125, -0.053253173828125, 0.00893402099609375, -0.0408935546875, -0.03887939453125, -0.0131378173828125, 0.017974853515625, -0.032623291015625, -0.001644134521484375, 0.033599853515625, 0.034881591796875, -0.008819580078125, 0.0071258544921875, 0.0078887939453125, -0.0309295654296875, 0.02655029296875, -0.034088134765625, 0.0753173828125, 0.01181793212890625, -0.02935791015625, 0.0142669677734375, -0.058074951171875, 0.005435943603515625, 0.0049285888671875, -0.019683837890625, 0.006793975830078125, -0.000056684017181396484, 0.0200042724609375, 0.007709503173828125, 0.01413726806640625, -0.0516357421875, -0.0055389404296875, -0.051025390625, 0.065673828125, 0.04248046875, -0.0030765533447265625, 0.0273284912109375, -0.0411376953125, 0.02313232421875, 0.0108184814453125, 0.0262298583984375, -0.018218994140625, -0.047393798828125, -0.06231689453125, -0.0287322998046875, 0.033599853515625, 0.0596923828125, -0.0328369140625, 0.042633056640625, -0.02197265625, -0.044097900390625, -0.09454345703125, -0.005832672119140625, 0.04180908203125, 
0.0479736328125, 0.049713134765625, -0.003993988037109375, -0.049774169921875, -0.0594482421875, -0.01102447509765625, -0.0218963623046875, -0.012725830078125, 0.026123046875, 0.0273284912109375, -0.02972412109375, 0.05267333984375, -0.030059814453125, -0.04144287109375, -0.025634765625, 0.00434112548828125, 0.03448486328125, 0.047271728515625, 0.0251922607421875, -0.056304931640625, -0.0289306640625, -0.0149688720703125, -0.040985107421875, -0.012359619140625, -0.00839996337890625, -0.0008192062377929688, 0.01523590087890625, 0.033416748046875, -0.05255126953125, 0.03558349609375, 0.051025390625, -0.0139923095703125, 0.047149658203125, 0.005008697509765625, -0.002803802490234375, -0.08831787109375, 0.0016460418701171875, -0.017852783203125, -0.012664794921875, -0.05328369140625, -0.0181427001953125, -0.006378173828125, -0.00728607177734375, -0.043731689453125, 0.046295166015625, -0.0251312255859375, 0.004299163818359375, -0.00438690185546875, 0.0103912353515625, -0.0037670135498046875, 0.04876708984375, 0.0191650390625, 0.051788330078125, 0.061920166015625, -0.04296875, 0.0163726806640625, 0.044036865234375, -0.01995849609375, 0.0218048095703125, -0.07135009765625, 0.0092010498046875, 0.006290435791015625, 0.01212310791015625, -0.06646728515625, -0.007965087890625, 0.00621795654296875, -0.0706787109375, 0.03125, -0.0258636474609375, -0.02276611328125, -0.03961181640625, -0.007053375244140625, 0.005970001220703125, 0.06488037109375, -0.036895751953125, 0.053466796875, 0.031982421875, -0.017181396484375, -0.042083740234375, -0.053131103515625, -0.00659942626953125, -0.01010894775390625, -0.057037353515625, 0.037506103515625, -0.0011491775512695312, 0.004791259765625, -0.00687408447265625, -0.004657745361328125, 0.0089263916015625, -0.0154266357421875, 0.035064697265625, 0.029937744140625, -0.0062408447265625, -0.020050048828125, 0.0182342529296875, -0.0186309814453125, 0.00019276142120361328, -0.0208892822265625, 0.04974365234375, -0.018218994140625, 
-0.001285552978515625, -0.058074951171875, 0.0281219482421875, 0.0460205078125, -0.025970458984375, 0.04901123046875, 0.056243896484375, -0.020782470703125, -0.012939453125, -0.046173095703125, -0.01346588134765625, -0.039886474609375, 0.0158233642578125, -0.03668212890625, -0.059661865234375, 0.058929443359375, 0.0169677734375, 0.01175689697265625, 0.04925537109375, 0.039337158203125, -0.01067352294921875, 0.07952880859375, 0.037506103515625, -0.0202789306640625, 0.0191192626953125, -0.050628662109375, -0.006008148193359375, -0.07489013671875, -0.0308380126953125, -0.04144287109375, -0.01446533203125, -0.033477783203125, -0.02032470703125, 0.034271240234375, 0.01387786865234375, -0.00025773048400878906, 0.0391845703125, -0.05279541015625, 0.000988006591796875, 0.0504150390625, 0.0003261566162109375, 0.0044403076171875, -0.0017518997192382812, -0.0211029052734375, -0.0005731582641601562, -0.0372314453125, -0.029266357421875, 0.07318115234375, 0.034515380859375, 0.03521728515625, -0.00247955322265625, 0.05419921875, -0.0008873939514160156, 0.001216888427734375, -0.060089111328125, 0.037689208984375, -0.0100555419921875, -0.038238525390625, -0.0300140380859375, -0.01947021484375, -0.06341552734375, 0.0128021240234375, -0.012176513671875, -0.05523681640625, 0.00937652587890625, -0.0005025863647460938, -0.0220184326171875, 0.01412200927734375, -0.054046630859375, 0.04791259765625, 0.01226806640625, 0.01114654541015625, 0.0011005401611328125, -0.0552978515625, 0.0111846923828125, 0.006694793701171875, 0.0094451904296875, -0.00489044189453125, 0.01178741455078125, 0.076904296875, -0.03790283203125, 0.0714111328125, -0.02276611328125, 0.002368927001953125, 0.033599853515625, -0.00757598876953125, 0.0275421142578125, -0.0158233642578125, -0.007572174072265625, 0.03717041015625, 0.0280914306640625, -0.0213470458984375, -0.0198974609375, 0.039825439453125, -0.08038330078125, -0.0287322998046875, -0.0191650390625, -0.02374267578125, -0.007404327392578125, 0.019256591796875, 
0.06787109375, 0.055419921875, -0.01114654541015625, -0.0021305084228515625, 0.031158447265625, -0.018218994140625, 0.04254150390625, 0.047821044921875, -0.015869140625, -0.037017822265625, 0.06817626953125, 0.0211944580078125, 0.016876220703125, 0.02081298828125, 0.026275634765625, -0.034088134765625, -0.0498046875, -0.041412353515625, 0.0237274169921875, -0.0391845703125, -0.011566162109375, -0.06854248046875, -0.042694091796875, -0.052520751953125, 0.0026950836181640625, -0.027862548828125, -0.0217132568359375, -0.036224365234375, 0.00829315185546875, 0.040802001953125, 0.031524658203125, 0.0005354881286621094, 0.042205810546875, -0.07452392578125, 0.031951904296875, 0.02362060546875, 0.0068511962890625, 0.0013294219970703125, -0.0762939453125, -0.00563812255859375, 0.0163726806640625, -0.01500701904296875, -0.05517578125, 0.04150390625, 0.026824951171875, 0.04205322265625, 0.0198211669921875, 0.00037860870361328125, 0.06048583984375, -0.05584716796875, 0.06439208984375, 0.011383056640625, -0.094970703125, 0.055206298828125, -0.0252685546875, 0.025421142578125, 0.0295562744140625, 0.026519775390625, -0.053863525390625, -0.03594970703125, -0.047027587890625, -0.048004150390625, 0.0628662109375, 0.02783203125, 0.0123443603515625, 0.007534027099609375, 0.0218658447265625, 0.00601959228515625, 0.01004791259765625, -0.03668212890625, -0.032562255859375, -0.03570556640625, -0.0198516845703125, -0.0130767822265625, -0.01076507568359375, -0.003345489501953125, -0.0396728515625, 0.056427001953125, -0.002437591552734375, 0.04290771484375, 0.032745361328125, -0.00453948974609375, -0.0019168853759765625, 0.006988525390625, 0.0439453125, 0.0212554931640625, -0.01436614990234375, -0.026336669921875, 0.0230255126953125, -0.05914306640625, -0.0002830028533935547, 0.0195770263671875, -0.02215576171875, 0.01276397705078125, 0.05938720703125, 0.09100341796875, 0.0160980224609375, -0.0364990234375, 0.054107666015625, -0.00920867919921875, -0.0304412841796875, -0.04168701171875, 
0.0031757354736328125, 0.0218963623046875, 0.01511383056640625, 0.0260009765625, 0.00975799560546875, 0.005767822265625, -0.03558349609375, 0.005306243896484375, 0.0202789306640625, -0.033538818359375, -0.040069580078125, 0.06109619140625, 0.01169586181640625, -0.037841796875, 0.053497314453125, 0.00754547119140625, -0.057403564453125, 0.035400390625, 0.05230712890625, 0.0760498046875, -0.037200927734375, 0.002338409423828125, 0.033233642578125, 0.0183868408203125, -0.0056304931640625, 0.038818359375, -0.00908660888671875, -0.0576171875, -0.03369140625, -0.0751953125, -0.0181427001953125, 0.01259613037109375, -0.069580078125, 0.0233154296875, -0.0175628662109375, -0.0218505859375, 0.0225982666015625, -0.00039076805114746094, -0.05841064453125, 0.0096282958984375, 0.006031036376953125, 0.0784912109375, -0.055816650390625, 0.0775146484375, 0.0184783935546875, -0.0195465087890625, -0.08160400390625, 0.002208709716796875, 0.002239227294921875, -0.07830810546875, 0.03173828125, 0.0255279541015625, -0.0158233642578125, 0.01425933837890625, -0.041168212890625, -0.06378173828125, 0.07305908203125, 0.00948333740234375, -0.05303955078125, -0.00804901123046875, -0.00266265869140625, 0.0391845703125, -0.0234222412109375, 0.00983428955078125, 0.055206298828125, 0.031951904296875, 0.006008148193359375, -0.10382080078125, -0.006679534912109375, -0.0206146240234375, -0.0127716064453125, 0.0007319450378417969, -0.0526123046875, 0.0631103515625, -0.024200439453125, -0.01922607421875, 0.0200958251953125, 0.050323486328125, 0.015716552734375, 0.0160980224609375, 0.04669189453125, 0.03662109375, 0.05230712890625, -0.01268768310546875, 0.07501220703125, -0.020111083984375, 0.010345458984375, 0.06640625, -0.003520965576171875, 0.08489990234375, 0.0219268798828125, -0.0280303955078125, 0.043304443359375, 0.0288238525390625, -0.00020968914031982422, 0.04119873046875, -0.007080078125, -0.0215301513671875, 0.007389068603515625, -0.0030269622802734375, -0.03131103515625, 0.05853271484375, 
0.0308837890625, -0.0202789306640625, 0.02435302734375, 0.0238494873046875, 0.0075531005859375, -0.0105743408203125, -0.0193939208984375, 0.0716552734375, 0.01055908203125, -0.044525146484375, 0.06591796875, 0.0021820068359375, 0.07257080078125, -0.06231689453125, 0.0164031982421875, 0.003917694091796875, 0.011993408203125, -0.01302337646484375, -0.04815673828125, 0.0254364013671875, -0.01078033447265625, -0.024169921875, -0.01424407958984375, 0.04156494140625, -0.0550537109375, -0.0391845703125, 0.042236328125, 0.026519775390625, 0.024169921875, -0.0089874267578125, -0.06634521484375, 0.029632568359375, 0.0166473388671875, -0.0182647705078125, 0.01284027099609375, 0.0134735107421875, 0.0175018310546875, 0.047515869140625, 0.06439208984375, 0.0305633544921875, 0.01099395751953125, 0.013336181640625, 0.060089111328125, -0.047760009765625, -0.051116943359375, -0.050811767578125, 0.03607177734375, 0.00434112548828125, -0.033416748046875, 0.05926513671875, 0.036590576171875, 0.05169677734375, -0.0008692741394042969, 0.05731201171875, 0.00473785400390625, 0.07080078125, -0.04168701171875, 0.06304931640625, -0.031768798828125, 0.0008077621459960938, -0.0246124267578125, -0.055389404296875, 0.00481414794921875, 0.043609619140625, -0.005039215087890625, -0.00893402099609375, 0.0284576416015625, 0.06695556640625, 0.005306243896484375, 0.01271820068359375, 0.01003265380859375, 0.030364990234375, 0.0162200927734375, 0.040618896484375, 0.043487548828125, -0.05731201171875, 0.048797607421875, -0.03741455078125, -0.0180206298828125, 0.003936767578125, -0.044525146484375, -0.07421875, -0.06475830078125, -0.0197906494140625, -0.04229736328125, -0.0182952880859375, 0.05889892578125, 0.0667724609375, -0.06353759765625, -0.0247802734375, 0.021148681640625, -0.0035037994384765625, -0.0308380126953125, -0.0186309814453125, 0.04248046875, -0.002239227294921875, -0.06689453125, 0.04718017578125, 0.002285003662109375, 0.02923583984375, -0.01364898681640625, -0.0165557861328125, 
0.002712249755859375, 0.00804901123046875, 0.04107666015625, 0.0217742919921875, -0.06500244140625, -0.01047515869140625, 0.008209228515625, 0.004032135009765625, -0.0018138885498046875, 0.0318603515625, -0.053863525390625, 0.02642822265625, 0.02783203125, 0.0091552734375, 0.061187744140625, -0.021942138671875, 0.028411865234375, -0.056488037109375, 0.034423828125, 0.01505279541015625, 0.024566650390625, 0.0263671875, -0.022216796875, 0.01085662841796875, 0.0218963623046875, -0.041229248046875, -0.0770263671875, -0.0095367431640625, -0.08349609375, -0.01172637939453125, 0.07452392578125, 0.0018358230590820312, -0.0263214111328125, -0.00811004638671875, -0.0250701904296875, 0.032623291015625, -0.03546142578125, 0.0237579345703125, 0.043304443359375, 0.0037784576416015625, -0.002498626708984375, -0.0439453125, 0.0556640625, 0.0162506103515625, -0.01739501953125, -0.001834869384765625, 0.003047943115234375, 0.04559326171875, 0.0201263427734375, 0.0634765625, -0.0162811279296875, 0.013031005859375, 0.0091705322265625, 0.012176513671875, -0.00852203369140625, -0.01432037353515625, -0.0345458984375, -0.004302978515625, -0.0251617431640625, -0.031951904296875 ] ]
Salesforce/safety-flan-t5-base
2023-05-04T05:23:20.000Z
[ "transformers", "pytorch", "t5", "text2text-generation", "autotrain_compatible", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
text2text-generation
Salesforce
null
null
Salesforce/safety-flan-t5-base
7
103,437
transformers
2023-04-27T05:35:36
--- {} --- # Model Details ## Model Description - **Model type:** Language model - **License:** CC BY-NC - **Related Models:** [All FLAN-T5 Checkpoints](https://huggingface.co/models?search=flan-t5) - **Resources for more information:** - [GitHub Repo](https://github.com/salesforce/AuditNLG) # Usage Find below some example scripts showing how to use the model in `transformers`: ## Using the PyTorch model ### Running the model <details> <summary> Click to expand </summary> ```python from transformers import AutoModelForSeq2SeqLM, AutoTokenizer, AutoConfig config = AutoConfig.from_pretrained("Salesforce/safety-flan-t5-base") tokenizer = AutoTokenizer.from_pretrained("Salesforce/safety-flan-t5-base") config = AutoConfig.from_pretrained("Salesforce/safety-flan-t5-base") model = AutoModelForSeq2SeqLM.from_pretrained("Salesforce/safety-flan-t5-base", config=config) prefix = "Is the <Text> field safe or unsafe?" input_context = "Can you teach me this?" input_text = "You are so stupid" input_ids = tokenizer(prefix + " <Text> " + input_text + " <Context> " + input_context, return_tensors="pt").input_ids outputs = model.generate(input_ids) print(tokenizer.decode(outputs[0])) ``` </details>
1,140
[ [ -0.00954437255859375, -0.0279388427734375, 0.0074310302734375, -0.0031642913818359375, -0.00768280029296875, -0.0161590576171875, 0.01001739501953125, -0.022613525390625, -0.0190887451171875, 0.0207672119140625, -0.046783447265625, -0.03564453125, -0.041778564453125, 0.0029544830322265625, -0.04119873046875, 0.08209228515625, -0.0107879638671875, -0.00957489013671875, 0.01464080810546875, -0.0079345703125, -0.00852203369140625, -0.040008544921875, -0.03985595703125, -0.030975341796875, 0.01509857177734375, 0.0098114013671875, 0.043609619140625, 0.0302886962890625, 0.03363037109375, 0.028472900390625, -0.0157318115234375, -0.010986328125, -0.0169677734375, 0.00640869140625, -0.0232391357421875, -0.041534423828125, -0.042083740234375, -0.002880096435546875, 0.0299224853515625, 0.01224517822265625, -0.01522064208984375, 0.032012939453125, -0.01476287841796875, -0.0027256011962890625, -0.044189453125, 0.0269927978515625, -0.037506103515625, 0.006256103515625, 0.0180816650390625, 0.0023288726806640625, -0.0165557861328125, -0.01401519775390625, 0.0258636474609375, -0.046722412109375, 0.0479736328125, -0.014007568359375, 0.10455322265625, 0.00525665283203125, -0.047698974609375, -0.00818634033203125, -0.039337158203125, 0.05010986328125, -0.070068359375, 0.01263427734375, 0.028839111328125, 0.03668212890625, 0.0128173828125, -0.09490966796875, -0.029693603515625, -0.005367279052734375, -0.0308685302734375, 0.004512786865234375, -0.00994110107421875, 0.02008056640625, 0.036407470703125, 0.027740478515625, -0.045501708984375, -0.0177154541015625, -0.04229736328125, -0.037078857421875, 0.05328369140625, 0.03369140625, 0.006793975830078125, -0.01507568359375, -0.0445556640625, -0.037078857421875, -0.006145477294921875, 0.01177978515625, 0.0102691650390625, 0.00865936279296875, -0.0136566162109375, 0.034027099609375, -0.0240936279296875, 0.030059814453125, 0.006328582763671875, -0.0199432373046875, 0.03515625, -0.02301025390625, -0.0338134765625, -0.004970550537109375, 
0.06561279296875, 0.02789306640625, -0.0009183883666992188, 0.00998687744140625, -0.05438232421875, -0.005268096923828125, 0.01543426513671875, -0.078125, -0.0255889892578125, 0.0318603515625, -0.0338134765625, -0.061798095703125, 0.0254974365234375, -0.051788330078125, -0.01209259033203125, -0.0011377334594726562, 0.072998046875, -0.0247802734375, -0.03680419921875, -0.006450653076171875, -0.0173492431640625, 0.0225830078125, 0.01422882080078125, -0.0653076171875, 0.0035343170166015625, 0.037811279296875, 0.066650390625, 0.01812744140625, -0.02703857421875, -0.022186279296875, 0.011932373046875, -0.004756927490234375, 0.0177459716796875, 0.0045166015625, -0.0240631103515625, 0.0198822021484375, 0.027252197265625, -0.0279388427734375, -0.033203125, 0.019989013671875, -0.038970947265625, 0.0242767333984375, 0.018402099609375, -0.031280517578125, -0.036773681640625, 0.017242431640625, -0.04583740234375, 0.0811767578125, 0.037109375, -0.039703369140625, 0.0210113525390625, -0.0606689453125, -0.035247802734375, -0.01100921630859375, 0.01039886474609375, -0.07476806640625, -0.0080413818359375, 0.0086212158203125, 0.033447265625, 0.007030487060546875, 0.0194854736328125, -0.02093505859375, -0.01099395751953125, 0.017425537109375, -0.0216217041015625, 0.07080078125, 0.0203094482421875, -0.04522705078125, 0.03759765625, -0.049102783203125, 0.001110076904296875, 0.01611328125, -0.01139068603515625, 0.0234375, -0.0302581787109375, 0.0254974365234375, 0.0213623046875, 0.02056884765625, -0.02001953125, 0.0096282958984375, -0.0271148681640625, 0.05279541015625, 0.046722412109375, 0.007373809814453125, 0.0269927978515625, -0.0323486328125, 0.035736083984375, 0.02301025390625, 0.024322509765625, -0.01800537109375, -0.01052093505859375, -0.0797119140625, -0.01202392578125, 0.01232147216796875, 0.040496826171875, -0.043792724609375, 0.050537109375, -0.0120697021484375, -0.044952392578125, 0.002979278564453125, 0.0018835067749023438, 0.0055389404296875, 0.03973388671875, 
0.0212860107421875, -0.005096435546875, -0.0428466796875, -0.059783935546875, -0.0179901123046875, -0.0167388916015625, -0.01082611083984375, -0.00791168212890625, 0.08184814453125, -0.0205078125, 0.053863525390625, -0.037811279296875, -0.03912353515625, -0.024688720703125, 0.01885986328125, 0.0269012451171875, 0.056976318359375, 0.04864501953125, -0.033050537109375, -0.012115478515625, -0.01313018798828125, -0.046722412109375, 0.01189422607421875, -0.0186004638671875, -0.0178375244140625, 0.0191650390625, 0.0228424072265625, -0.050445556640625, 0.0718994140625, 0.02056884765625, -0.05328369140625, 0.043212890625, -0.0015087127685546875, -0.004917144775390625, -0.09429931640625, 0.022125244140625, -0.0079345703125, -0.036590576171875, -0.046051025390625, 0.011016845703125, 0.0182952880859375, -0.0191802978515625, -0.0369873046875, 0.038909912109375, -0.0097198486328125, 0.009185791015625, -0.01059722900390625, -0.02685546875, 0.0190887451171875, 0.035125732421875, 0.012603759765625, 0.046478271484375, 0.051300048828125, -0.061126708984375, 0.03564453125, 0.02685546875, -0.0263671875, 0.005672454833984375, -0.050689697265625, 0.00011157989501953125, 0.021026611328125, 0.0225830078125, -0.05853271484375, -0.034698486328125, 0.0185394287109375, -0.03912353515625, 0.0175933837890625, -0.019134521484375, -0.04022216796875, -0.0203857421875, -0.0032958984375, 0.031707763671875, 0.04510498046875, -0.047821044921875, 0.07940673828125, 0.005710601806640625, 0.02874755859375, -0.0267181396484375, -0.06805419921875, -0.0249176025390625, -0.032012939453125, -0.054229736328125, 0.0243682861328125, 0.03363037109375, 0.0059661865234375, -0.0156402587890625, -0.0167999267578125, -0.0173492431640625, 0.0166778564453125, 0.01145172119140625, 0.01873779296875, -0.022216796875, 0.0019550323486328125, -0.0182342529296875, -0.0228424072265625, 0.017333984375, -0.0301513671875, 0.04296875, -0.031463623046875, -0.005252838134765625, -0.0628662109375, -0.005889892578125, 0.04046630859375, 
-0.029998779296875, 0.071044921875, 0.06610107421875, -0.040771484375, -0.02191162109375, -0.032440185546875, -0.054443359375, -0.0389404296875, 0.045440673828125, -0.017822265625, -0.02667236328125, 0.05615234375, 0.01103973388671875, 0.01416778564453125, 0.0557861328125, 0.038970947265625, -0.010101318359375, 0.07177734375, 0.0298309326171875, 0.0152740478515625, 0.05706787109375, -0.0621337890625, 0.01473236083984375, -0.03057861328125, -0.0175933837890625, -0.019287109375, -0.007373809814453125, -0.0289764404296875, -0.007701873779296875, 0.040069580078125, 0.018218994140625, -0.0518798828125, 0.045928955078125, -0.032135009765625, 0.005374908447265625, 0.058685302734375, 0.0113067626953125, -0.005130767822265625, 0.007320404052734375, -0.01203155517578125, 0.0254974365234375, -0.057769775390625, -0.034332275390625, 0.06634521484375, 0.036773681640625, 0.05047607421875, 0.01325225830078125, 0.049774169921875, 0.0009522438049316406, 0.018402099609375, -0.049713134765625, 0.0259246826171875, 0.0197296142578125, -0.07147216796875, -0.0136260986328125, -0.0201416015625, -0.058685302734375, 0.0255889892578125, 0.005374908447265625, -0.060546875, 0.0106201171875, 0.0194549560546875, -0.0350341796875, 0.03851318359375, -0.054962158203125, 0.08624267578125, -0.0247039794921875, -0.0249786376953125, -0.0043487548828125, -0.04534912109375, 0.034759521484375, 0.01345062255859375, -0.01021575927734375, 0.00563812255859375, 0.00952911376953125, 0.066650390625, -0.06805419921875, 0.06866455078125, -0.0174713134765625, 0.0075531005859375, 0.0411376953125, 0.01369476318359375, 0.032501220703125, 0.0211334228515625, -0.00835418701171875, 0.0270843505859375, 0.03765869140625, -0.0157470703125, -0.04107666015625, 0.0288848876953125, -0.06524658203125, -0.030120849609375, -0.07177734375, -0.0292510986328125, 0.01885986328125, 0.034393310546875, 0.0220489501953125, 0.03436279296875, 0.0108795166015625, 0.00960540771484375, 0.0206451416015625, -0.0135498046875, 0.05126953125, 
0.0305023193359375, -0.026031494140625, -0.05230712890625, 0.051177978515625, -0.00997161865234375, 0.0298614501953125, 0.011444091796875, 0.0220489501953125, -0.04229736328125, 0.0015764236450195312, -0.0277557373046875, 0.005359649658203125, -0.06939697265625, -0.039581298828125, -0.03173828125, -0.0313720703125, -0.050140380859375, 0.00594329833984375, -0.0236358642578125, -0.034698486328125, -0.037078857421875, -0.02508544921875, 0.047576904296875, 0.041748046875, -0.0200347900390625, 0.047393798828125, -0.05877685546875, 0.012542724609375, 0.0143890380859375, 0.029052734375, 0.0026149749755859375, -0.056243896484375, -0.0113067626953125, -0.0008640289306640625, -0.0367431640625, -0.0665283203125, 0.034912109375, 0.007381439208984375, 0.0313720703125, 0.047210693359375, 0.01358795166015625, 0.04071044921875, 0.009246826171875, 0.06158447265625, 0.01036834716796875, -0.0797119140625, 0.0411376953125, -0.02545166015625, 0.01507568359375, 0.038848876953125, -0.004608154296875, -0.034820556640625, -0.0015659332275390625, -0.058502197265625, -0.07940673828125, 0.07855224609375, 0.03857421875, -0.00313568115234375, 0.024200439453125, 0.01012420654296875, -0.01383209228515625, 0.01169586181640625, -0.06903076171875, -0.0175933837890625, -0.05035400390625, -0.024322509765625, 0.01369476318359375, -0.003124237060546875, -0.0233306884765625, -0.0267486572265625, 0.06744384765625, -0.004993438720703125, 0.031951904296875, 0.0205230712890625, -0.025177001953125, 0.00469207763671875, -0.00629425048828125, 0.0540771484375, 0.033233642578125, -0.02777099609375, 0.0093231201171875, 0.0271148681640625, -0.034942626953125, 0.003322601318359375, 0.00876617431640625, -0.0197296142578125, 0.0035877227783203125, 0.03466796875, 0.07440185546875, -0.01003265380859375, -0.0113067626953125, 0.0430908203125, -0.0057373046875, -0.039459228515625, -0.058563232421875, 0.0243988037109375, 0.00021183490753173828, 0.01522064208984375, 0.0223846435546875, 0.01372528076171875, -0.0167236328125, 
-0.0213165283203125, 0.020965576171875, 0.0132904052734375, -0.03057861328125, -0.0130767822265625, 0.07281494140625, 0.0130157470703125, -0.02301025390625, 0.058380126953125, 0.00318145751953125, -0.06353759765625, 0.07147216796875, 0.0280303955078125, 0.079345703125, -0.00379180908203125, -0.009368896484375, 0.060028076171875, 0.0258941650390625, -0.0014619827270507812, 0.01495361328125, -0.0001614093780517578, -0.04107666015625, 0.01294708251953125, -0.042083740234375, 0.003841400146484375, 0.0325927734375, -0.0487060546875, 0.03570556640625, -0.0426025390625, -0.0178375244140625, -0.0218048095703125, -0.00399017333984375, -0.06988525390625, 0.031646728515625, -0.0022430419921875, 0.060028076171875, -0.058563232421875, 0.05072021484375, 0.036285400390625, -0.06689453125, -0.0880126953125, -0.00421142578125, 0.003936767578125, -0.0494384765625, 0.04107666015625, 0.036956787109375, 0.0067138671875, 0.0032215118408203125, -0.02947998046875, -0.06591796875, 0.110595703125, -0.00333404541015625, -0.0235595703125, -0.0044403076171875, 0.026580810546875, 0.0281982421875, -0.01192474365234375, 0.066650390625, 0.0423583984375, 0.038055419921875, -0.0028476715087890625, -0.05841064453125, 0.0120697021484375, -0.01335906982421875, -0.005558013916015625, -0.00925445556640625, -0.0540771484375, 0.068359375, -0.01184844970703125, -0.018341064453125, 0.020660400390625, 0.060577392578125, 0.01192474365234375, 0.0155487060546875, 0.051422119140625, 0.0259246826171875, 0.034515380859375, -0.0225677490234375, 0.0579833984375, -0.042266845703125, 0.07403564453125, 0.061309814453125, -0.0016031265258789062, 0.036468505859375, 0.0224151611328125, -0.0057525634765625, 0.042022705078125, 0.050811767578125, -0.036376953125, 0.0226287841796875, -0.0003724098205566406, -0.0200958251953125, -0.016998291015625, 0.0202484130859375, -0.034820556640625, 0.028594970703125, 0.0227508544921875, -0.057342529296875, -0.022491455078125, -0.0200347900390625, -0.0142364501953125, -0.045928955078125, 
-0.00553131103515625, 0.042877197265625, -0.002414703369140625, -0.045501708984375, 0.052337646484375, 0.0177001953125, 0.042816162109375, -0.060028076171875, -0.00040984153747558594, -0.0098419189453125, 0.0460205078125, -0.0343017578125, -0.035888671875, 0.0131988525390625, -0.0156402587890625, -0.0008358955383300781, -0.00899505615234375, 0.050262451171875, -0.0281829833984375, -0.048004150390625, 0.0081024169921875, -0.0005884170532226562, 0.0016984939575195312, 0.00019633769989013672, -0.054229736328125, -0.019683837890625, 0.0015430450439453125, -0.033355712890625, 0.0015659332275390625, 0.005001068115234375, -0.003757476806640625, 0.0560302734375, 0.04302978515625, -0.0294647216796875, 0.0216522216796875, 0.0156402587890625, 0.04742431640625, -0.046783447265625, -0.053436279296875, -0.049560546875, 0.059661865234375, -0.005359649658203125, -0.042877197265625, 0.040863037109375, 0.057373046875, 0.06695556640625, -0.022003173828125, 0.06146240234375, -0.01140594482421875, 0.01641845703125, -0.0279083251953125, 0.08245849609375, -0.04168701171875, 0.0020847320556640625, -0.00567626953125, -0.040557861328125, -0.01303863525390625, 0.0806884765625, -0.00893402099609375, 0.0305633544921875, 0.0582275390625, 0.09222412109375, -0.0230255126953125, 0.01035308837890625, 0.01013946533203125, 0.017730712890625, 0.04583740234375, 0.030487060546875, 0.05712890625, -0.0697021484375, 0.054901123046875, -0.055328369140625, 0.0022144317626953125, -0.028472900390625, -0.05328369140625, -0.0772705078125, -0.035888671875, -0.037017822265625, -0.07171630859375, -0.032958984375, 0.06695556640625, 0.06268310546875, -0.08892822265625, -0.017913818359375, -0.0257720947265625, -0.002971649169921875, -0.0212249755859375, -0.025299072265625, 0.040496826171875, -0.024627685546875, -0.04705810546875, 0.0038051605224609375, -0.0242767333984375, 0.0254058837890625, -0.0240936279296875, -0.0018281936645507812, -0.01513671875, -0.0202178955078125, 0.005832672119140625, 0.01218414306640625, 
-0.05517578125, -0.02301025390625, -0.01549530029296875, -0.01406097412109375, 0.0027179718017578125, 0.044219970703125, -0.03546142578125, 0.0196380615234375, 0.06085205078125, 0.0187835693359375, 0.050628662109375, -0.0038089752197265625, 0.052276611328125, -0.039642333984375, 0.0185394287109375, 0.0186309814453125, 0.037261962890625, 0.0102996826171875, -0.0220184326171875, 0.037567138671875, 0.025115966796875, -0.0325927734375, -0.0687255859375, 0.002788543701171875, -0.048248291015625, -0.0013666152954101562, 0.08441162109375, -0.01525115966796875, -0.022613525390625, 0.0006971359252929688, -0.006328582763671875, 0.049560546875, -0.024261474609375, 0.043670654296875, 0.030548095703125, 0.002834320068359375, -0.007045745849609375, -0.0268096923828125, 0.044677734375, 0.038726806640625, -0.04095458984375, -0.01209259033203125, -0.002857208251953125, 0.05096435546875, 0.01258087158203125, 0.00250244140625, -0.006011962890625, 0.035430908203125, 0.012542724609375, 0.01445770263671875, -0.0236358642578125, 0.00278472900390625, -0.03558349609375, 0.00298309326171875, -0.021636962890625, -0.0295562744140625 ] ]
pszemraj/flan-t5-large-grammar-synthesis
2023-03-18T19:28:37.000Z
[ "transformers", "pytorch", "onnx", "safetensors", "t5", "text2text-generation", "grammar", "spelling", "punctuation", "error-correction", "grammar synthesis", "FLAN", "dataset:jfleg", "arxiv:2107.06751", "doi:10.57967/hf/0138", "license:cc-by-nc-sa-4.0", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
text2text-generation
pszemraj
null
null
pszemraj/flan-t5-large-grammar-synthesis
67
103,342
transformers
2022-11-26T02:40:52
--- languages: - en license: - cc-by-nc-sa-4.0 - apache-2.0 tags: - grammar - spelling - punctuation - error-correction - grammar synthesis - FLAN datasets: - jfleg widget: - text: "There car broke down so their hitching a ride to they're class." example_title: "compound-1" - text: "i can has cheezburger" example_title: "cheezburger" - text: "so em if we have an now so with fito ringina know how to estimate the tren given the ereafte mylite trend we can also em an estimate is nod s i again tort watfettering an we have estimated the trend an called wot to be called sthat of exty right now we can and look at wy this should not hare a trend i becan we just remove the trend an and we can we now estimate tesees ona effect of them exty" example_title: "Transcribed Audio Example 2" - text: "My coworker said he used a financial planner to help choose his stocks so he wouldn't loose money." example_title: "incorrect word choice (context)" - text: "good so hve on an tadley i'm not able to make it to the exla session on monday this week e which is why i am e recording pre recording an this excelleision and so to day i want e to talk about two things and first of all em i wont em wene give a summary er about ta ohow to remove trents in these nalitives from time series" example_title: "lowercased audio transcription output" - text: "Frustrated, the chairs took me forever to set up." example_title: "dangling modifier" - text: "I would like a peice of pie." example_title: "miss-spelling" - text: "Which part of Zurich was you going to go hiking in when we were there for the first time together? ! ?" example_title: "chatbot on Zurich" - text: "Most of the course is about semantic or content of language but there are also interesting topics to be learned from the servicefeatures except statistics in characters in documents. 
At this point, Elvthos introduces himself as his native English speaker and goes on to say that if you continue to work on social scnce," example_title: "social science ASR summary output" - text: "they are somewhat nearby right yes please i'm not sure how the innish is tepen thut mayyouselect one that istatte lo variants in their property e ere interested and anyone basical e may be applyind reaching the browing approach were" example_title: "medical course audio transcription" parameters: max_length: 128 min_length: 4 num_beams: 8 repetition_penalty: 1.21 length_penalty: 1 early_stopping: True --- # grammar-synthesis-large: FLAN-t5 <a href="https://colab.research.google.com/gist/pszemraj/5dc89199a631a9c6cfd7e386011452a0/demo-flan-t5-large-grammar-synthesis.ipynb"> <img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/> </a> A fine-tuned version of [google/flan-t5-large](https://huggingface.co/google/flan-t5-large) for grammar correction on an expanded version of the [JFLEG](https://paperswithcode.com/dataset/jfleg) dataset. [Demo](https://huggingface.co/spaces/pszemraj/FLAN-grammar-correction) on HF Spaces. ## Example ![example](https://i.imgur.com/PIhrc7E.png) Compare vs. the original [grammar-synthesis-large](https://huggingface.co/pszemraj/grammar-synthesis-large). 
--- ## Usage in Python > There's a Colab notebook that already has this basic version implemented (_click on the Open in Colab button_) After `pip install transformers`, run the following code: ```python from transformers import pipeline corrector = pipeline( 'text2text-generation', 'pszemraj/flan-t5-large-grammar-synthesis', ) raw_text = 'i can has cheezburger' results = corrector(raw_text) print(results) ``` **For Batch Inference:** see [this discussion thread](https://huggingface.co/pszemraj/flan-t5-large-grammar-synthesis/discussions/1) for details, but essentially the dataset consists of several sentences at a time, and so I'd recommend running inference **in the same fashion:** batches of roughly 64-96 tokens (or 2-3 sentences split with a regex) - it is also helpful to **first** check whether a given sentence needs grammar correction before using the text2text model. You can do this with BERT-type models fine-tuned on CoLA like `textattack/roberta-base-CoLA` - I made a notebook demonstrating batch inference [here](https://colab.research.google.com/gist/pszemraj/6e961b08970f98479511bb1e17cdb4f0/batch-grammar-check-correct-demo.ipynb) --- ## Model description The intent is to create a text2text language model that successfully completes "single-shot grammar correction" on a potentially grammatically incorrect text **that could have a lot of mistakes**, with the important qualifier that **it does not semantically change text/information that IS grammatically correct.** Compare some of the heavier-error examples on [other grammar correction models](https://huggingface.co/models?dataset=dataset:jfleg) to see the difference :) ### ONNX Checkpoint This model has been converted to ONNX and can be loaded/used with Hugging Face's `optimum` library. 
First, [install optimum](https://huggingface.co/docs/optimum/installation): ```bash pip install optimum[onnxruntime] # ^ if you want to use a different runtime, read their docs ``` Then load the model with the optimum `pipeline`: ```python from optimum.pipelines import pipeline corrector = pipeline( "text2text-generation", model="pszemraj/flan-t5-large-grammar-synthesis", accelerator="ort" ) # use as normal ``` ### Other checkpoints If trading a slight decrease in grammatical correction quality for faster inference speed makes sense for your use case, check out the **[base](https://huggingface.co/pszemraj/grammar-synthesis-base)** and **[small](https://huggingface.co/pszemraj/grammar-synthesis-small)** checkpoints, fine-tuned from the corresponding t5 checkpoints. ## Limitations - dataset: `cc-by-nc-sa-4.0` - model: `apache-2.0` - this is **still a work-in-progress** and while probably useful for "single-shot grammar correction" in a lot of cases, **give the outputs a glance for correctness ok?** ## Use Cases Obviously, this section is quite general as there are many things one can use "general single-shot grammar correction" for. Some ideas or use cases: 1. Correcting highly error-prone LM outputs. Some examples would be audio transcription (ASR) (this is literally some of the examples) or something like handwriting OCR. - To be investigated further, depending on what model/system is used it _might_ be worth it to apply this after OCR on typed characters. 2. Correcting/infilling text generated by text generation models to be cohesive/remove obvious errors that break the conversation immersion. I use this on the outputs of [this OPT 2.7B chatbot-esque model of myself](https://huggingface.co/pszemraj/opt-peter-2.7B). > An example of this model running on CPU with beam search: ``` Original response: ive heard it attributed to a bunch of different philosophical schools, including stoicism, pragmatism, existentialism and even some forms of post-structuralism. 
i think one of the most interesting (and most difficult) philosophical problems is trying to let dogs (or other animals) out of cages. the reason why this is a difficult problem is because it seems to go against our grain (so to synthesizing took 306.12 seconds Final response in 1294.857 s: I've heard it attributed to a bunch of different philosophical schools, including solipsism, pragmatism, existentialism and even some forms of post-structuralism. i think one of the most interesting (and most difficult) philosophical problems is trying to let dogs (or other animals) out of cages. the reason why this is a difficult problem is because it seems to go against our grain (so to speak) ``` _Note that I have some other logic that removes any periods at the end of the final sentence in this chatbot setting [to avoid coming off as passive-aggressive](https://www.npr.org/2020/09/05/909969004/before-texting-your-kid-make-sure-to-double-check-your-punctuation)_ 3. Somewhat related to #2 above, fixing/correcting so-called [tortured-phrases](https://arxiv.org/abs/2107.06751) that are dead giveaways text was generated by a language model. _Note that SOME of these are not fixed, especially as they venture into domain-specific terminology (e.g. irregular timberland instead of Random Forest)._ --- ## Citation info If you find this fine-tuned model useful in your work, please consider citing it :) ``` @misc {peter_szemraj_2022, author = { {Peter Szemraj} }, title = { flan-t5-large-grammar-synthesis (Revision d0b5ae2) }, year = 2022, url = { https://huggingface.co/pszemraj/flan-t5-large-grammar-synthesis }, doi = { 10.57967/hf/0138 }, publisher = { Hugging Face } } ```
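The batch-inference recommendation above (chunks of 2-3 sentences split with a regex) can be sketched as follows. This is a minimal illustration, not part of the model itself; the specific regex and the chunk size of 3 are assumptions, and a proper sentence tokenizer would be a better fit for production use:

```python
import re

def chunk_sentences(text: str, sentences_per_chunk: int = 3) -> list:
    """Split text on sentence-ending punctuation and group sentences into
    small chunks suitable for passing to the corrector as batch elements."""
    # naive sentence split on ., !, or ? followed by whitespace
    # (an assumption; swap in a real sentence tokenizer if you need accuracy)
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text.strip()) if s.strip()]
    return [
        " ".join(sentences[i : i + sentences_per_chunk])
        for i in range(0, len(sentences), sentences_per_chunk)
    ]

chunks = chunk_sentences(
    "i can has cheezburger. There car broke down. "
    "I would like a peice of pie. so em we estimate the trend."
)
```

Each chunk can then be passed to the corrector as one batch element, optionally after the CoLA-based "does this need correction?" check mentioned above.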
8,775
[ [ -0.014312744140625, -0.0865478515625, 0.035614013671875, 0.033782958984375, 0.0003135204315185547, -0.0134429931640625, -0.029510498046875, -0.019927978515625, 0.0103759765625, 0.0225982666015625, -0.05010986328125, -0.0313720703125, -0.0290069580078125, 0.0276336669921875, -0.0274810791015625, 0.07159423828125, -0.008758544921875, -0.0003936290740966797, 0.003681182861328125, 0.00965118408203125, -0.029388427734375, -0.041473388671875, -0.0672607421875, -0.0101165771484375, 0.0293121337890625, 0.0254669189453125, 0.03594970703125, 0.048583984375, 0.042205810546875, 0.032745361328125, -0.027496337890625, 0.0160675048828125, -0.032958984375, 0.004161834716796875, -0.0211639404296875, -0.037139892578125, -0.04412841796875, 0.007396697998046875, 0.0506591796875, 0.04876708984375, 0.0009975433349609375, 0.00252532958984375, -0.013671875, 0.047332763671875, -0.03466796875, 0.01003265380859375, -0.032501220703125, 0.01323699951171875, -0.0178375244140625, 0.0137176513671875, -0.04193115234375, -0.02783203125, -0.00958251953125, -0.0533447265625, 0.01525115966796875, 0.007709503173828125, 0.09490966796875, 0.0292816162109375, -0.0203094482421875, -0.0172576904296875, -0.045623779296875, 0.05877685546875, -0.058563232421875, 0.009979248046875, 0.0246429443359375, 0.00916290283203125, -0.04339599609375, -0.083984375, -0.05126953125, -0.0003025531768798828, -0.0171661376953125, 0.024078369140625, -0.0264739990234375, 0.0166015625, 0.0374755859375, 0.037017822265625, -0.0308990478515625, -0.02191162109375, -0.0272216796875, -0.0246124267578125, 0.0335693359375, -0.01288604736328125, 0.035125732421875, -0.0163726806640625, -0.0158233642578125, -0.014312744140625, -0.035614013671875, 0.015777587890625, 0.01593017578125, 0.039276123046875, 0.0025920867919921875, 0.057403564453125, -0.0032215118408203125, 0.03790283203125, 0.0237274169921875, -0.0137481689453125, 0.017669677734375, -0.04278564453125, -0.031982421875, -0.0175933837890625, 0.061065673828125, 0.0262298583984375, 
0.0305328369140625, 0.004543304443359375, -0.00945281982421875, 0.032135009765625, 0.00827789306640625, -0.0650634765625, -0.0171661376953125, 0.023101806640625, -0.0157928466796875, -0.0243377685546875, -0.0031490325927734375, -0.049407958984375, -0.00033593177795410156, -0.01392364501953125, 0.030975341796875, -0.047882080078125, -0.00472259521484375, 0.024017333984375, -0.0281982421875, -0.00917816162109375, 0.015777587890625, -0.058441162109375, 0.005275726318359375, 0.04132080078125, 0.061767578125, 0.044952392578125, -0.01849365234375, -0.052459716796875, -0.007190704345703125, -0.04278564453125, 0.0634765625, -0.035125732421875, -0.00450897216796875, -0.0008020401000976562, 0.01727294921875, -0.01702880859375, -0.039306640625, 0.057861328125, 0.0008978843688964844, 0.054962158203125, -0.0286102294921875, -0.04925537109375, -0.00714874267578125, 0.0038967132568359375, -0.0311126708984375, 0.09906005859375, 0.00588226318359375, -0.03936767578125, 0.0215606689453125, -0.050201416015625, -0.044219970703125, -0.02496337890625, 0.02783203125, -0.038818359375, 0.005794525146484375, 0.017730712890625, 0.0224151611328125, 0.0034122467041015625, 0.0270538330078125, -0.021270751953125, -0.01116943359375, 0.010009765625, 0.000576019287109375, 0.076171875, 0.0128021240234375, -0.035888671875, 0.0132904052734375, -0.060150146484375, 0.0127410888671875, 0.01123046875, -0.03314208984375, -0.023223876953125, -0.007488250732421875, -0.0035648345947265625, 0.02325439453125, 0.0014238357543945312, -0.04052734375, 0.02239990234375, -0.0494384765625, 0.054962158203125, 0.052459716796875, -0.00592803955078125, 0.03607177734375, -0.029449462890625, 0.018768310546875, -0.01788330078125, 0.0136871337890625, -0.0213470458984375, -0.0295257568359375, -0.0848388671875, -0.03375244140625, 0.0300445556640625, 0.054290771484375, -0.049591064453125, 0.0533447265625, 0.003993988037109375, -0.0306396484375, -0.034210205078125, 0.0096893310546875, 0.051605224609375, 0.0307464599609375, 
0.032745361328125, -0.0164794921875, -0.06695556640625, -0.048980712890625, -0.0341796875, -0.03192138671875, 0.005672454833984375, -0.0095062255859375, 0.044036865234375, -0.00836944580078125, 0.0701904296875, -0.03253173828125, -0.026885986328125, -0.0169677734375, -0.005641937255859375, 0.00994110107421875, 0.03509521484375, 0.04974365234375, -0.03955078125, -0.032318115234375, -0.011322021484375, -0.06134033203125, -0.03289794921875, -0.0138092041015625, -0.040557861328125, 0.034881591796875, 0.0516357421875, -0.06549072265625, 0.020599365234375, 0.01800537109375, -0.043609619140625, 0.030670166015625, -0.0010118484497070312, 0.0006551742553710938, -0.08746337890625, 0.0198822021484375, 0.0014705657958984375, -0.03729248046875, -0.055511474609375, 0.0221099853515625, -0.00020599365234375, -0.00537109375, -0.0299072265625, 0.04742431640625, -0.026031494140625, 0.0207977294921875, 0.005008697509765625, 0.0181884765625, 0.007137298583984375, 0.036712646484375, -0.020050048828125, 0.0562744140625, 0.021575927734375, -0.036041259765625, 0.0283050537109375, 0.02496337890625, -0.00649261474609375, 0.0291748046875, -0.04815673828125, 0.001956939697265625, -0.01126861572265625, 0.038055419921875, -0.07470703125, -0.0350341796875, 0.0479736328125, -0.0301513671875, 0.018157958984375, 0.0036449432373046875, -0.04046630859375, -0.036224365234375, -0.0301361083984375, -0.0010671615600585938, 0.041351318359375, -0.037841796875, 0.03961181640625, 0.00946044921875, -0.0183868408203125, -0.0394287109375, -0.054473876953125, 0.008056640625, -0.0178070068359375, -0.06390380859375, 0.028106689453125, -0.015106201171875, -0.03753662109375, -0.0021152496337890625, -0.00893402099609375, 0.00676727294921875, -0.00567626953125, 0.0070648193359375, 0.01145172119140625, -0.017547607421875, 0.0193328857421875, 0.0017213821411132812, -0.01041412353515625, -0.0163421630859375, -0.0197296142578125, 0.039215087890625, -0.032684326171875, 0.00896453857421875, -0.033966064453125, 
0.01629638671875, 0.040740966796875, -0.00848388671875, 0.04119873046875, 0.04534912109375, -0.032470703125, -0.01947021484375, -0.04412841796875, -0.0191497802734375, -0.0369873046875, 0.013092041015625, -0.0171661376953125, -0.0537109375, 0.03460693359375, 0.0002104043960571289, 0.0083465576171875, 0.04693603515625, 0.0362548828125, -0.0182647705078125, 0.060089111328125, 0.0335693359375, 0.0059051513671875, 0.05511474609375, -0.002498626708984375, 0.03863525390625, -0.0382080078125, -0.001682281494140625, -0.0399169921875, -0.01486968994140625, -0.04278564453125, -0.0200347900390625, 0.0193023681640625, 0.0218048095703125, -0.029937744140625, 0.039764404296875, -0.022430419921875, 0.032867431640625, 0.04315185546875, 0.00614166259765625, 0.01629638671875, 0.012451171875, -0.0104522705078125, -0.005741119384765625, -0.045501708984375, -0.040313720703125, 0.0614013671875, 0.0194091796875, 0.0382080078125, 0.009674072265625, 0.039276123046875, 0.004055023193359375, 0.0004925727844238281, -0.06329345703125, 0.053741455078125, -0.03594970703125, -0.05718994140625, -0.018768310546875, -0.0263214111328125, -0.05987548828125, 0.017547607421875, -0.022552490234375, -0.04510498046875, -0.006214141845703125, 0.0200347900390625, -0.029083251953125, 0.0149688720703125, -0.0810546875, 0.07745361328125, -0.0024127960205078125, -0.04180908203125, 0.004901885986328125, -0.0628662109375, 0.0264739990234375, 0.00522613525390625, 0.007549285888671875, -0.00441741943359375, 0.01374053955078125, 0.033416748046875, -0.06689453125, 0.0797119140625, -0.00510406494140625, -0.0015583038330078125, 0.0318603515625, -0.00970458984375, 0.032562255859375, -0.01374053955078125, -0.0013093948364257812, -0.0076751708984375, 0.003940582275390625, -0.0252532958984375, -0.040924072265625, 0.0290679931640625, -0.042633056640625, -0.044921875, -0.040771484375, -0.0307464599609375, 0.004241943359375, 0.0302581787109375, 0.0184173583984375, 0.032440185546875, -0.0128173828125, 0.0207366943359375, 
0.0283966064453125, -0.019683837890625, 0.045806884765625, 0.0165863037109375, -0.037811279296875, -0.032257080078125, 0.041107177734375, 0.001708984375, 0.025848388671875, 0.033966064453125, 0.029937744140625, -0.0122528076171875, 0.003665924072265625, -0.04254150390625, 0.038543701171875, -0.05438232421875, -0.01373291015625, -0.044219970703125, -0.01483154296875, -0.0548095703125, -0.018798828125, -0.03594970703125, -0.056915283203125, -0.0224151611328125, -0.0015773773193359375, 0.02911376953125, 0.035491943359375, -0.005413055419921875, 0.0301513671875, -0.035675048828125, 0.02178955078125, 0.0073089599609375, 0.01491546630859375, -0.030975341796875, -0.036285400390625, 0.0093536376953125, 0.00661468505859375, -0.0292205810546875, -0.05047607421875, 0.0230560302734375, 0.03936767578125, 0.021575927734375, 0.0208892822265625, -0.006649017333984375, 0.07659912109375, -0.0390625, 0.067138671875, 0.002758026123046875, -0.0926513671875, 0.059173583984375, -0.021697998046875, 0.0183563232421875, 0.0372314453125, 0.00580596923828125, -0.061767578125, -0.0469970703125, -0.047607421875, -0.07611083984375, 0.052490234375, 0.03253173828125, 0.008148193359375, -0.0088958740234375, 0.045013427734375, -0.01245880126953125, 0.00862884521484375, -0.0645751953125, -0.00954437255859375, -0.01468658447265625, -0.021484375, 0.004123687744140625, -0.016632080078125, -0.00795745849609375, -0.0251922607421875, 0.0906982421875, -0.0015277862548828125, 0.0247039794921875, 0.0251007080078125, -0.007537841796875, -0.0162811279296875, 0.028472900390625, 0.05401611328125, 0.0360107421875, -0.0122222900390625, 0.0062103271484375, 0.00804901123046875, -0.0250091552734375, 0.0037555694580078125, 0.002285003662109375, -0.020477294921875, 0.01068878173828125, 0.0389404296875, 0.055938720703125, 0.004150390625, -0.05120849609375, 0.041351318359375, -0.0003018379211425781, -0.016876220703125, -0.037933349609375, 0.007076263427734375, 0.0005626678466796875, 0.0233612060546875, 
-0.00218963623046875, 0.026092529296875, -0.00940704345703125, -0.06829833984375, 0.006694793701171875, 0.0166473388671875, -0.0010280609130859375, -0.0240020751953125, 0.046112060546875, 0.00856781005859375, -0.023590087890625, 0.045074462890625, -0.0306854248046875, -0.05035400390625, 0.04119873046875, 0.055633544921875, 0.053985595703125, -0.0282440185546875, 0.01552581787109375, 0.051971435546875, 0.02392578125, -0.03070068359375, 0.0211334228515625, 0.0220184326171875, -0.050018310546875, -0.0308837890625, -0.0552978515625, -0.00945281982421875, 0.00525665283203125, -0.02740478515625, 0.05096435546875, -0.054168701171875, -0.0238037109375, 0.01250457763671875, -0.01548004150390625, -0.02349853515625, 0.021881103515625, -0.00852203369140625, 0.054443359375, -0.0693359375, 0.04022216796875, 0.06671142578125, -0.0506591796875, -0.0689697265625, 0.0020732879638671875, 0.01422119140625, -0.040313720703125, 0.028106689453125, 0.0112152099609375, -0.007328033447265625, -0.01995849609375, -0.04400634765625, -0.05487060546875, 0.061676025390625, 0.041046142578125, -0.0400390625, -0.0205078125, -0.0003719329833984375, 0.065185546875, -0.00890350341796875, 0.0207977294921875, 0.04534912109375, 0.048248291015625, 0.014312744140625, -0.07928466796875, 0.019500732421875, -0.03778076171875, 0.00920867919921875, 0.007442474365234375, -0.06756591796875, 0.0787353515625, -0.01026153564453125, -0.02862548828125, 0.0233612060546875, 0.03863525390625, 0.0012845993041992188, 0.014801025390625, 0.035552978515625, 0.04608154296875, 0.0711669921875, -0.0226287841796875, 0.0758056640625, -0.00930023193359375, 0.04510498046875, 0.07080078125, -0.0009703636169433594, 0.060089111328125, 0.0399169921875, -0.023040771484375, 0.04034423828125, 0.05072021484375, -0.0018053054809570312, 0.030975341796875, 0.005535125732421875, -0.017333984375, -0.01349639892578125, 0.0005488395690917969, -0.0294342041015625, 0.036773681640625, 0.0193634033203125, -0.0408935546875, -0.00676727294921875, 
-0.007015228271484375, 0.05206298828125, 0.007061004638671875, -0.004085540771484375, 0.057861328125, 0.0009202957153320312, -0.07275390625, 0.051971435546875, 0.025360107421875, 0.047607421875, -0.0384521484375, 0.00867462158203125, -0.0335693359375, 0.041595458984375, -0.021942138671875, -0.038848876953125, 0.035308837890625, 0.01007843017578125, -0.0099639892578125, -0.028533935546875, 0.066162109375, -0.049468994140625, -0.047332763671875, 0.0207366943359375, 0.0204620361328125, 0.01446533203125, -0.00548553466796875, -0.03485107421875, 0.004375457763671875, 0.00839996337890625, -0.0125274658203125, -0.00870513916015625, 0.033966064453125, -0.003917694091796875, 0.03533935546875, 0.05218505859375, 0.004852294921875, 0.006557464599609375, 0.006748199462890625, 0.056915283203125, -0.043731689453125, -0.02984619140625, -0.0718994140625, 0.050811767578125, -0.01479339599609375, -0.03436279296875, 0.0489501953125, 0.040924072265625, 0.0828857421875, -0.0174407958984375, 0.07122802734375, -0.034210205078125, 0.0242462158203125, -0.054046630859375, 0.0302886962890625, -0.055084228515625, 0.0029430389404296875, -0.036712646484375, -0.068115234375, -0.0516357421875, 0.0758056640625, -0.0247802734375, 0.000059545040130615234, 0.082763671875, 0.07843017578125, -0.00643157958984375, 0.004398345947265625, 0.00928497314453125, 0.03863525390625, 0.0270233154296875, 0.040130615234375, 0.04827880859375, -0.051361083984375, 0.047760009765625, -0.0223236083984375, -0.01313018798828125, -0.018280029296875, -0.06396484375, -0.0718994140625, -0.045196533203125, -0.034515380859375, -0.05316162109375, 0.0193328857421875, 0.093994140625, 0.033111572265625, -0.06329345703125, -0.0050811767578125, -0.01488494873046875, -0.002593994140625, -0.0263824462890625, -0.0196075439453125, 0.01461029052734375, -0.04376220703125, -0.0777587890625, 0.027374267578125, 0.0168609619140625, -0.003963470458984375, 0.018707275390625, 0.0174713134765625, -0.032684326171875, 0.009490966796875, 0.05859375, 
0.0226898193359375, -0.07110595703125, -0.049285888671875, 0.006084442138671875, -0.0144805908203125, 0.00966644287109375, 0.055023193359375, -0.0244903564453125, 0.0400390625, 0.0474853515625, 0.0338134765625, 0.035614013671875, 0.0093536376953125, 0.033599853515625, -0.062225341796875, 0.01415252685546875, -0.00943756103515625, 0.03497314453125, 0.03460693359375, -0.0185699462890625, 0.05816650390625, 0.05206298828125, -0.036376953125, -0.057373046875, 0.007183074951171875, -0.0758056640625, -0.0277099609375, 0.1025390625, -0.01322174072265625, -0.033660888671875, 0.01263427734375, -0.04437255859375, 0.039215087890625, -0.034637451171875, 0.053680419921875, 0.07879638671875, -0.00305938720703125, -0.0010232925415039062, -0.02362060546875, 0.0390625, 0.052886962890625, -0.06719970703125, -0.003002166748046875, 0.05047607421875, 0.0379638671875, 0.019287109375, 0.046539306640625, -0.01137542724609375, 0.0099334716796875, 0.01139068603515625, 0.0163421630859375, 0.0015354156494140625, -0.0230865478515625, -0.01364898681640625, 0.00829315185546875, -0.0175018310546875, -0.00333404541015625 ] ]
guillaumekln/faster-whisper-small
2023-05-12T18:58:54.000Z
[ "ctranslate2", "audio", "automatic-speech-recognition", "en", "zh", "de", "es", "ru", "ko", "fr", "ja", "pt", "tr", "pl", "ca", "nl", "ar", "sv", "it", "id", "hi", "fi", "vi", "he", "uk", "el", "ms", "cs", "ro", "da", "hu", "ta", "no", "th", "ur", "hr", "bg", "lt", "la", "mi", "ml", "cy", "sk", "te", "fa", "lv", "bn", "sr", "az", "sl", "kn", "et", "mk", "br", "eu", "is", "hy", "ne", "mn", "bs", "kk", "sq", "sw", "gl", "mr", "pa", "si", "km", "sn", "yo", "so", "af", "oc", "ka", "be", "tg", "sd", "gu", "am", "yi", "lo", "uz", "fo", "ht", "ps", "tk", "nn", "mt", "sa", "lb", "my", "bo", "tl", "mg", "as", "tt", "haw", "ln", "ha", "ba", "jw", "su", "license:mit", "has_space", "region:us" ]
automatic-speech-recognition
guillaumekln
null
null
guillaumekln/faster-whisper-small
6
102,485
ctranslate2
2023-03-23T10:21:29
--- language: - en - zh - de - es - ru - ko - fr - ja - pt - tr - pl - ca - nl - ar - sv - it - id - hi - fi - vi - he - uk - el - ms - cs - ro - da - hu - ta - 'no' - th - ur - hr - bg - lt - la - mi - ml - cy - sk - te - fa - lv - bn - sr - az - sl - kn - et - mk - br - eu - is - hy - ne - mn - bs - kk - sq - sw - gl - mr - pa - si - km - sn - yo - so - af - oc - ka - be - tg - sd - gu - am - yi - lo - uz - fo - ht - ps - tk - nn - mt - sa - lb - my - bo - tl - mg - as - tt - haw - ln - ha - ba - jw - su tags: - audio - automatic-speech-recognition license: mit library_name: ctranslate2 --- # Whisper small model for CTranslate2 This repository contains the conversion of [openai/whisper-small](https://huggingface.co/openai/whisper-small) to the [CTranslate2](https://github.com/OpenNMT/CTranslate2) model format. This model can be used in CTranslate2 or projects based on CTranslate2 such as [faster-whisper](https://github.com/guillaumekln/faster-whisper). ## Example ```python from faster_whisper import WhisperModel model = WhisperModel("small") segments, info = model.transcribe("audio.mp3") for segment in segments: print("[%.2fs -> %.2fs] %s" % (segment.start, segment.end, segment.text)) ``` ## Conversion details The original model was converted with the following command: ``` ct2-transformers-converter --model openai/whisper-small --output_dir faster-whisper-small \ --copy_files tokenizer.json --quantization float16 ``` Note that the model weights are saved in FP16. This type can be changed when the model is loaded using the [`compute_type` option in CTranslate2](https://opennmt.net/CTranslate2/quantization.html). ## More information **For more information about the original model, see its [model card](https://huggingface.co/openai/whisper-small).**
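As a sketch of the `compute_type` note above: the snippet below assumes the `faster_whisper` package is installed, and `int8` is just one of the computation types CTranslate2 supports; see the CTranslate2 quantization documentation for the full list.

```python
from faster_whisper import WhisperModel

# The checkpoint stores FP16 weights, but CTranslate2 can compute in a
# different type chosen at load time; "int8" and "cpu" here are example
# choices, not requirements.
model = WhisperModel("small", device="cpu", compute_type="int8")

segments, info = model.transcribe("audio.mp3")
for segment in segments:
    print("[%.2fs -> %.2fs] %s" % (segment.start, segment.end, segment.text))
```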
2,003
[ [ 0.0039825439453125, -0.028839111328125, 0.0240631103515625, 0.025665283203125, -0.0343017578125, -0.0316162109375, -0.036407470703125, -0.0274658203125, 0.00685882568359375, 0.049896240234375, -0.036712646484375, -0.03741455078125, -0.038299560546875, -0.0232086181640625, -0.0247802734375, 0.06817626953125, -0.0108184814453125, 0.0215606689453125, 0.02734375, -0.00408935546875, -0.03076171875, -0.00937652587890625, -0.060577392578125, -0.028076171875, 0.0160980224609375, 0.0257720947265625, 0.045440673828125, 0.032867431640625, 0.034332275390625, 0.01898193359375, -0.025634765625, -0.006496429443359375, -0.0207061767578125, -0.01776123046875, 0.0174560546875, -0.05303955078125, -0.049072265625, 0.00804901123046875, 0.051666259765625, 0.01517486572265625, -0.0228424072265625, 0.040802001953125, -0.01483154296875, 0.0230255126953125, -0.03411865234375, 0.0196685791015625, -0.038360595703125, 0.0013570785522460938, -0.014739990234375, -0.01174163818359375, -0.041748046875, -0.0284423828125, 0.032196044921875, -0.061187744140625, 0.01238250732421875, 0.006557464599609375, 0.06011962890625, 0.0240325927734375, -0.0345458984375, -0.026092529296875, -0.06982421875, 0.0673828125, -0.05877685546875, 0.01026153564453125, 0.0184783935546875, 0.043121337890625, 0.01540374755859375, -0.08441162109375, -0.021240234375, -0.00911712646484375, 0.0036678314208984375, 0.016754150390625, -0.032562255859375, 0.0208740234375, 0.0136566162109375, 0.03662109375, -0.047332763671875, -0.003955841064453125, -0.05487060546875, -0.0321044921875, 0.034820556640625, 0.0132904052734375, 0.01617431640625, -0.0245208740234375, -0.018096923828125, -0.044403076171875, -0.042266845703125, 0.0018463134765625, 0.03515625, 0.02362060546875, -0.049468994140625, 0.0501708984375, 0.0033016204833984375, 0.0268402099609375, 0.006938934326171875, -0.0228729248046875, 0.0216064453125, -0.0231170654296875, -0.014739990234375, 0.03350830078125, 0.0518798828125, 0.031890869140625, 0.0014963150024414062, 
0.027191162109375, -0.0228424072265625, -0.0060882568359375, 0.00560760498046875, -0.08843994140625, -0.041961669921875, 0.015777587890625, -0.0755615234375, -0.0217437744140625, -0.0007886886596679688, -0.024261474609375, 0.004306793212890625, -0.01111602783203125, 0.0499267578125, -0.034271240234375, -0.033416748046875, 0.0242462158203125, -0.0364990234375, 0.006023406982421875, 0.03662109375, -0.058074951171875, 0.03656005859375, 0.036895751953125, 0.08441162109375, 0.0089111328125, -0.0036258697509765625, -0.02520751953125, 0.0164337158203125, -0.0057525634765625, 0.045257568359375, 0.0029964447021484375, -0.038482666015625, -0.006778717041015625, -0.00445556640625, -0.007320404052734375, -0.045379638671875, 0.045684814453125, -0.01416015625, 0.0261383056640625, 0.022430419921875, -0.0152130126953125, -0.01294708251953125, -0.0034389495849609375, -0.05255126953125, 0.07159423828125, 0.030517578125, -0.056365966796875, -0.01348876953125, -0.0635986328125, -0.0115509033203125, -0.01529693603515625, 0.03167724609375, -0.0360107421875, 0.024261474609375, -0.006397247314453125, -0.0020389556884765625, -0.0362548828125, 0.0120086669921875, -0.01493072509765625, -0.0322265625, 0.0233612060546875, -0.0311737060546875, 0.06622314453125, 0.02691650390625, 0.007659912109375, 0.026336669921875, -0.04376220703125, 0.0030670166015625, 0.002506256103515625, -0.025360107421875, -0.0239105224609375, -0.0149078369140625, 0.03863525390625, -0.00019097328186035156, 0.021881103515625, -0.040863037109375, 0.0236053466796875, -0.04266357421875, 0.06353759765625, 0.03143310546875, 0.00001436471939086914, 0.03863525390625, -0.035736083984375, 0.0075531005859375, 0.0166015625, 0.0305633544921875, 0.00997161865234375, -0.042083740234375, -0.05743408203125, -0.0108184814453125, 0.038360595703125, 0.026611328125, -0.044158935546875, 0.01088714599609375, -0.0241851806640625, -0.06915283203125, -0.072021484375, -0.0292816162109375, 0.0158843994140625, 0.0170440673828125, 0.038818359375, 